Project Database
This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for those projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in their office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.
Search filter: only projects matching the keyword Sensor Fusion are shown here.
Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics
Amphibious robotics
757 – Development of radio and vision electronics for a salamander-inspired robot
Category: semester project, master project (full-time)
Keywords: Bio-inspiration, Biomimicry, Communication, Electronics, Embedded Systems, Firmware, Programming, Prototyping, Radio, Robotics, Sensor Fusion, Vision, sensor
Type: 70% hardware, 30% software
Responsible: (MED 1 1626, phone: 38676)
Description: This project has been taken. Pleurobot is a salamander-inspired robot capable of moving in and transitioning between terrestrial and aquatic environments. Research projects in our lab have demonstrated the effectiveness of vision-guided and human-controlled locomotion transition strategies. However, the current Pleurobot cannot use such strategies robustly, especially outdoors, because it lacks a vision system and a robust wireless controller. In this project, the student will add a vision system (e.g., an RGB-D camera) to Pleurobot that can operate in amphibious environments. In addition, a robust radio controller is needed to operate the robot outdoors. Alternatively, the student can choose to implement algorithms that let the vision system recognize terrain and obstacles in real time. Both systems need to be integrated into the ROS 2 controller running on the onboard computer. The major challenges are the waterproofing requirements, the limited space for electronics, and the fusion of multiple sensory systems in an embedded system. The student is expected to have a solid background in circuit design for embedded systems and firmware programming, and to be familiar with ROS 2. Interested students should send their transcript, CV, and materials demonstrating past project experience to qiyuan.fu@epfl.ch.
Last edited: 02/09/2025
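To give a flavour of the ROS 2 integration work this project involves, below is a minimal sketch of a Python (rclpy) node that fuses an RGB-D depth stream with a radio command into one velocity output. The topic names, message types, and the simple obstacle-gating rule are illustrative assumptions, not Pleurobot's actual interfaces.

```python
# Hypothetical sketch: fuse a depth camera and a radio command into a single
# velocity command on a ROS 2 robot. Topic names and the gating rule are
# illustrative assumptions.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist


class AmphibiousTeleopNode(Node):
    def __init__(self):
        super().__init__('amphibious_teleop')
        self.min_clearance_m = 0.4          # assumed safety threshold
        self.obstacle_close = False
        # Depth image from the (assumed) RGB-D camera driver.
        self.depth_sub = self.create_subscription(
            Image, '/camera/depth/image_raw', self.on_depth, 10)
        # Velocity command coming from the (assumed) radio receiver node.
        self.radio_sub = self.create_subscription(
            Twist, '/radio/cmd_vel', self.on_radio_cmd, 10)
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_depth(self, msg: Image):
        # Placeholder perception step: a real node would decode the depth
        # image (e.g., with cv_bridge) and estimate the frontal clearance.
        self.obstacle_close = self.estimate_clearance(msg) < self.min_clearance_m

    def on_radio_cmd(self, msg: Twist):
        # Simple fusion rule: pass the operator command through, but stop
        # forward motion when the camera reports an obstacle ahead.
        cmd = Twist()
        cmd.angular.z = msg.angular.z
        cmd.linear.x = 0.0 if self.obstacle_close else msg.linear.x
        self.cmd_pub.publish(cmd)

    def estimate_clearance(self, msg: Image) -> float:
        return float('inf')  # stub; replace with real depth processing


def main():
    rclpy.init()
    rclpy.spin(AmphibiousTeleopNode())
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```

In the actual project, the perception stub would be replaced by real-time terrain and obstacle recognition, and the radio link would come from the student's own receiver firmware.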
Miscellaneous
763 – Workload Estimation and Action Classification in Basketball Using IMU Sensors
Category: master project (full-time)
Keywords: Data Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
Type: 10% theory, 90% software
Responsible: (MED 0 1016, phone: 32468)
Description: In modern basketball, accurately monitoring player workload and identifying specific movement patterns are critical for optimizing performance, reducing injury risk, and tailoring individualized training programs. However, many existing workload assessment tools are not fine-tuned to capture the complex and explosive actions typical of basketball. This project aims to develop a sensor-based system that can estimate physical workload and classify basketball-specific movements using only Inertial Measurement Unit (IMU) sensors. Data will be collected from athletes during structured training sessions, with a focus on high-intensity basketball actions such as rebounds, layups, jump shots, sprints, direction changes, and defensive movements. The primary objective is to create algorithms capable of estimating workload and classifying these basketball-specific actions from the IMU signals.
Last edited: 29/08/2025
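A minimal sketch of the kind of pipeline such a project might start from is shown below: window raw accelerometer data, extract simple statistical features, and train a classifier. The sampling rate, window length, feature set, and choice of a random forest are assumptions for illustration, not the project's prescribed method.

```python
# Hypothetical sketch: window IMU data, extract features, train a classifier
# for basketball actions. Sampling rate, window length, feature set, and model
# choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

FS = 100          # assumed IMU sampling rate (Hz)
WINDOW = 2 * FS   # 2-second windows


def window_signal(acc: np.ndarray, labels: np.ndarray):
    """Split a (n_samples, 3) accelerometer stream into fixed-size windows."""
    windows, window_labels = [], []
    for start in range(0, len(acc) - WINDOW + 1, WINDOW):
        windows.append(acc[start:start + WINDOW])
        # Majority label within the window (assumes per-sample annotations).
        vals, counts = np.unique(labels[start:start + WINDOW], return_counts=True)
        window_labels.append(vals[np.argmax(counts)])
    return np.array(windows), np.array(window_labels)


def extract_features(windows: np.ndarray) -> np.ndarray:
    """Per-axis statistics plus acceleration-magnitude statistics."""
    mag = np.linalg.norm(windows, axis=2, keepdims=True)
    sig = np.concatenate([windows, mag], axis=2)
    feats = [sig.mean(axis=1), sig.std(axis=1), sig.min(axis=1), sig.max(axis=1)]
    return np.concatenate(feats, axis=1)


if __name__ == '__main__':
    # Synthetic stand-in data; real data would come from the training sessions.
    rng = np.random.default_rng(0)
    acc = rng.normal(size=(60 * FS, 3))
    labels = rng.integers(0, 3, size=60 * FS)  # e.g., 0=sprint, 1=jump, 2=defence
    X_win, y = window_signal(acc, labels)
    X = extract_features(X_win)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print(cross_val_score(clf, X, y, cv=5).mean())
```

In practice, the student would replace the hand-crafted statistics with features (or learned representations) tuned to explosive basketball actions, and validate against the motion capture ground truth.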
762 – Multimodal sensor fusion for enhanced biomechanical profiling in football: integrating IMU and video data from vertical jump tests
Category: master project (full-time)
Keywords: Data Processing, Image Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
Type: 100% software
Responsible: (MED 0 1016, phone: 32468)
Description: Raw video shows the motion; IMUs capture the accelerations and orientation. Combined, they unlock new biomechanical precision. This project focuses on developing a sensor fusion framework that synchronizes video recordings and inertial measurement unit (IMU) data to compute enhanced biomechanical metrics from jump tests (bilateral and unilateral CMJ, drop jump). The core aim is to overcome the limitations of each modality alone by combining the spatial richness of video with the temporal and acceleration precision of IMUs. You will have access to a dataset of 25 players collected in the lab with an infrared motion tracking system. Traditional biomechanical analysis in sport often relies on expensive lab equipment and manual video inspection. Your work could lay the foundation for next-generation performance monitoring systems that are low-cost, field-deployable, and data-rich.
Last edited: 29/08/2025
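One typical first step in such a fusion framework is temporal alignment of the two streams. The sketch below estimates the time offset between a video-derived vertical position trace and an IMU vertical acceleration trace by cross-correlating their jump signatures; the sampling rates, signal names, and alignment criterion are assumptions for illustration only.

```python
# Hypothetical sketch: align a video-derived vertical position trace with an
# IMU vertical acceleration trace via cross-correlation. Sampling rates and
# signal names are illustrative assumptions.
import numpy as np
from scipy.signal import correlate, correlation_lags

VIDEO_FS = 50    # assumed video frame rate (Hz)
IMU_FS = 200     # assumed IMU sampling rate (Hz)


def estimate_offset(video_pos_z: np.ndarray, imu_acc_z: np.ndarray) -> float:
    """Return the time offset (s) that best aligns the two modalities.

    Differentiating the video position twice gives an acceleration-like
    signal comparable to the IMU; both streams are brought to a common
    rate before cross-correlation.
    """
    dt = 1.0 / VIDEO_FS
    # Acceleration estimated from video position (finite differences).
    video_acc = np.gradient(np.gradient(video_pos_z, dt), dt)
    # Downsample the IMU acceleration to the video rate (simple decimation).
    imu_ds = imu_acc_z[::IMU_FS // VIDEO_FS]
    # Normalize both signals so the correlation reflects shape, not amplitude.
    a = (video_acc - video_acc.mean()) / (video_acc.std() + 1e-9)
    b = (imu_ds - imu_ds.mean()) / (imu_ds.std() + 1e-9)
    corr = correlate(a, b, mode='full')
    lags = correlation_lags(len(a), len(b), mode='full')
    return lags[np.argmax(corr)] * dt


if __name__ == '__main__':
    # Synthetic example: the same jump-like bump, shifted by 0.4 s in the IMU.
    t_video = np.arange(0, 4, 1.0 / VIDEO_FS)
    pos = np.exp(-((t_video - 2.0) ** 2) / 0.05)
    t_imu = np.arange(0, 4, 1.0 / IMU_FS)
    acc = np.gradient(np.gradient(np.exp(-((t_imu - 2.4) ** 2) / 0.05)))
    print(f"estimated offset: {estimate_offset(pos, acc):.2f} s")
```

A real framework would extend this with pose estimation to obtain the video trace, anti-aliasing before decimation, and per-jump metrics (e.g., flight time, jump height) computed from the aligned signals.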
3 projects found.