Project Database
This page contains the database of possible research projects for master and bachelor students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for those projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in their office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.
Search filter: only projects matching the keyword Data Processing are shown here.
Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics
Amphibious robotics
759 – Snakes vs Anguilla - a kinematic and energetic comparison
Category: master project (full-time)
Keywords: Bio-inspiration, Data Evaluation, Data Processing, Experiments
Type: 20% theory, 40% hardware, 40% software
Responsible: (MED 0 2326, phone: 38499)
Description: Snakes use undulatory swimming to move efficiently through water, and different species show distinct patterns depending on their morphology and environment. With datasets available from live swimming tests and access to an undulatory robot, we can combine data analysis and physical experiments to explore their performance systematically and address biological questions.
Project Description: The goal of this project is to analyse and replicate the diversity of swimming gaits in snakes using kinematic data from different species and a robot to explore insights into energy consumption. We’ll use reduced-order modelling to extract dominant motion patterns and look for structure in the data, for example, clustering gaits by species or lifestyle. Based on these results, we’ll test a few representative cases on a robotic platform to evaluate how well the robot can replicate biological gaits and how performance (e.g. speed, efficiency) varies with different strategies.
Specific Goals:
Last edited: 14/07/2025
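To give a flavour of the kind of analysis this project involves, below is a minimal sketch of reduced-order modelling plus gait clustering, assuming each swimming trial is stored as a frames-by-joints array of body angles. The synthetic data, mode count, and cluster count are placeholders, not the project's actual dataset or method.

```python
# Illustrative sketch only: reduced-order analysis of undulatory swimming
# kinematics, assuming each trial is a (frames x joints) array of body angles.
# Data layout, trial generation, and cluster count are hypothetical placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def gait_features(joint_angles, n_modes=3):
    """Project one swimming trial onto its dominant spatial modes (PCA)."""
    pca = PCA(n_components=n_modes)
    pca.fit(joint_angles)
    return pca.explained_variance_ratio_

# Hypothetical dataset: a few trials of simulated "snake-like" kinematics.
rng = np.random.default_rng(0)
trials = [np.sin(np.linspace(0, 10, 500))[:, None] * rng.normal(1, 0.1, 12)
          + 0.05 * rng.normal(size=(500, 12)) for _ in range(6)]

# Summarise each trial by how much variance its first modes capture,
# then cluster trials to look for groups of similar gaits.
summary = np.array([gait_features(t) for t in trials])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(summary)
print("per-trial variance captured by first 3 modes:", summary.round(2))
print("cluster labels:", labels)
```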
Computational Neuroscience
755 – High-performance encoder-decoder design for computational neural signal processing
Category: semester project, master project (full-time), internship
Keywords: Computational Neuroscience, Data Processing, Linux, Programming, Python
Type: 20% theory, 5% hardware, 75% software
Responsible: (MED11626, phone: 41783141830)
Description: Background: Brain-computer interfaces (BCIs) using signals acquired with intracortical implants have achieved successful high-dimensional control of robotic devices for completing daily tasks. However, the substantial medical and surgical expertise required to correctly implant and operate these systems greatly limits their use beyond a few clinical cases. A non-invasive counterpart requiring less intervention that can provide high-quality control would profoundly improve the integration of BCIs into multiple settings, and represents a nascent research field: brain robotics. This is challenging, however, due to the inherent complexity of neural signals and the difficulty of online neural decoding with efficient algorithms. Moreover, brain signals evoked by an external stimulus (e.g., vision) are the most widely used in BCI-based applications, but such stimuli are impractical in dynamic yet constrained environments. Several questions arise here: how can we circumvent the constraints associated with stimulus-based signals? Is it feasible to read brain signals with non-invasive BCIs, and how? Going a step further, could complete semantic command phrases be decoded accurately in real time, enabling seamless and natural brain-robot control and interaction?
The project is for a team of 1-2 Master's students; tasks will be broken down and assigned to each student according to their skill set. What needs to be implemented and delivered at the end of the project?
1) A method package for brain signal pre-processing and feature formulation.
2) An algorithm package with an encoder and a decoder of neural signals.
3) A model for training on brain signals with spatial and temporal features.
Last edited: 13/05/2025
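As a purely illustrative sketch of what pre-processing, feature formulation, and an encoder-decoder of neural signals could look like in Python, the snippet below band-pass filters a synthetic multichannel signal, cuts it into windows, and trains a small autoencoder. The sampling rate, frequency band, window length, and network sizes are all assumptions, and a generic autoencoder merely stands in for whatever encoder/decoder design the project ultimately develops.

```python
# A minimal sketch, not the project's actual pipeline: band-pass pre-processing
# followed by a small encoder-decoder (autoencoder) over windows of a
# multichannel signal. Channel count, window length, and architecture are
# hypothetical; real neural recordings would replace the synthetic array.
import numpy as np
from scipy.signal import butter, filtfilt
import torch
import torch.nn as nn

fs, n_ch, n_samp = 250, 8, 5000                    # assumed sampling setup
x = np.random.randn(n_ch, n_samp)                  # stand-in for recorded signals

# Pre-processing: 8-30 Hz band-pass (a common sensorimotor band, assumed here)
b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")
x = filtfilt(b, a, x, axis=1)

# Feature formulation: non-overlapping 1 s windows, flattened per window
win = fs
windows = x[:, : (n_samp // win) * win].reshape(n_ch, -1, win)
feats = torch.tensor(windows.transpose(1, 0, 2).reshape(-1, n_ch * win),
                     dtype=torch.float32)

# Encoder-decoder: compress each window to a small latent code and reconstruct it
model = nn.Sequential(
    nn.Linear(n_ch * win, 64), nn.ReLU(), nn.Linear(64, 16),   # encoder
    nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, n_ch * win),   # decoder
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for epoch in range(20):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(feats), feats)
    loss.backward()
    opt.step()
print("final reconstruction loss:", float(loss))
```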
Human-exoskeleton dynamics and control
735 – Hip exoskeleton to assist walking - multiple projects
Category: semester project, master project (full-time), bachelor semester project, internship
Keywords: Bio-inspiration, C, C++, Communication, Compliance, Control, Data Processing, Dynamics Model, Electronics, Experiments, Inverse Dynamics, Kinematics Model, Learning, Locomotion, Machine learning, Online Optimization, Optimization, Programming, Python, Robotics, Treadmill
Type: 30% theory, 35% hardware, 35% software
Responsible: (MED 3 1015, phone: 31153)
Description: Exoskeletons have experienced unprecedented growth in recent years, and hip-targeting active devices have demonstrated their potential in assisting walking activities. Portable exoskeletons are designed to provide assistive torques while offsetting their own added weight, with the overall goal of increasing endurance, reducing energetic expenditure, and improving performance during walking. The design of exoskeletons involves the development of the sensing, the actuation, the control, and the human-robot interface. In our lab, an active hip orthosis (“eWalk”) has been prototyped and tested in recent years. Currently, multiple projects are available to address open research questions. Does the exoskeleton reduce the effort of walking? How can we model human-exoskeleton interaction? How can we design effective controllers? How can we optimize the interfaces and the control? Which movements can we assist with exoskeletons? To address these challenges, the field requires knowledge in biology, mechanics, electronics, physiology, informatics (programming, learning algorithms), and human-robot interaction. If you are interested in collaborating on one of these topics, please send an email to giulia.ramella@epfl.ch with (1) your CV and transcripts, (2) your previous experiences that could be relevant to the project, and (3) what interests you the most about this research topic (to be discussed during the interview).
Last edited: 21/07/2025
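As a hedged illustration of one common way hip-assistance controllers are parameterised (this is not the eWalk controller), the sketch below defines an assistive torque profile as a function of gait phase; the peak magnitude, timing, and width are hypothetical tuning parameters.

```python
# Illustrative sketch only (not the eWalk controller): one common way to shape
# hip assistance is a torque profile defined over the gait cycle. Peak torque,
# peak timing, and width below are hypothetical tuning parameters.
import numpy as np

def assistive_torque(gait_phase, peak_nm=6.0, peak_phase=0.12, width=0.10):
    """Gaussian torque bump (Nm) centred at a chosen fraction of the gait cycle."""
    phase = np.asarray(gait_phase) % 1.0             # wrap to [0, 1)
    return peak_nm * np.exp(-0.5 * ((phase - peak_phase) / width) ** 2)

# Example: evaluate the profile over one stride (e.g. for plotting or a lookup table)
phases = np.linspace(0.0, 1.0, 101)
torques = assistive_torque(phases)
print(f"peak assistance: {torques.max():.1f} Nm at phase {phases[torques.argmax()]:.2f}")
```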
Miscellaneous
765 – Validity and Reliability of IMU-Based Jump Test Analysis
Category: master project (full-time)
Keywords: Data Processing, Motion Capture, Programming, Python, Statistical analysis
Type: 10% theory, 90% software
Responsible: (MED 0 1016, phone: 32468)
Description: Optizone has created a cutting-edge suite of algorithms that estimate athletes’ fitness levels through widely recognized performance tests such as the drop jump, squat jump, repetitive jump, hop test, and velocity-based training. These algorithms have the potential to transform athletic monitoring by providing fast, data-driven insights into performance and readiness. This project focuses on putting these algorithms to the test. Their accuracy and consistency will be rigorously evaluated against a gold-standard motion analysis system in a controlled laboratory setting. Using a structured protocol, athlete performance data will be collected, preprocessed, and subjected to in-depth statistical analysis to determine both the reliability (how consistent the results are) and validity (how well the algorithms reflect true performance). By bridging advanced algorithm development with scientific validation, this study aims to strengthen confidence in Optizone’s technology and lay the foundation for smarter, evidence-based training and injury-prevention strategies.
Jump Tests:
Last edited: 29/08/2025
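For a flavour of the statistical validation described above, the sketch below compares hypothetical IMU-derived jump heights against a motion-capture reference using Pearson correlation and Bland-Altman bias/limits of agreement; the numbers are invented placeholders, and test-retest reliability would additionally use an intraclass correlation over repeated trials.

```python
# A minimal sketch of the kind of validity analysis described above: agreement
# between an IMU-derived metric and a gold-standard motion-capture reference.
# The arrays below are hypothetical jump heights (cm); real data would replace them.
import numpy as np
from scipy import stats

mocap = np.array([31.2, 28.5, 35.0, 40.1, 26.8, 33.3, 29.9, 37.4])   # reference
imu   = np.array([30.5, 29.1, 34.2, 41.0, 27.5, 32.8, 30.4, 36.6])   # algorithm

# Validity: correlation plus Bland-Altman bias and limits of agreement
r, p = stats.pearsonr(imu, mocap)
diff = imu - mocap
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
print(f"Bland-Altman bias = {bias:.2f} cm, limits of agreement = ±{loa:.2f} cm")

# Reliability (test-retest) would typically use an intraclass correlation (ICC)
# computed over repeated trials per athlete; pingouin.intraclass_corr is one option.
```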
763 – Workload Estimation and Action Classification in Basketball Using IMU Sensors
Category: master project (full-time)
Keywords: Data Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
Type: 10% theory, 90% software
Responsible: (MED 0 1016, phone: 32468)
Description: In modern basketball, accurately monitoring player workload and identifying specific movement patterns are critical for optimizing performance, reducing injury risk, and tailoring individualized training programs. However, many existing workload assessment tools are not fine-tuned to capture the complex and explosive actions typical in basketball. This project aims to develop a sensor-based system that can estimate physical workload and classify basketball-specific movements using only Inertial Measurement Unit (IMU) sensors. Data will be collected from athletes during structured training sessions, with a focus on high-intensity basketball actions such as rebounds, layups, jump shots, sprints, direction changes, and defensive movements. The primary objective is to create algorithms capable of:
Last edited: 29/08/2025
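As an illustrative sketch (not the project's algorithm), one standard pipeline for IMU-based action classification is sliding-window feature extraction followed by a supervised classifier; the synthetic accelerometer windows and the run/jump labels below are placeholders.

```python
# Illustrative sketch only: windowed IMU features fed to a supervised classifier.
# The synthetic accelerometer data, labels, and window length are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs, win = 100, 100                                   # assumed 100 Hz, 1 s windows

def window_features(acc_window):
    """Simple per-window features from a (samples x 3) accelerometer block."""
    mag = np.linalg.norm(acc_window, axis=1)
    return [mag.mean(), mag.std(), mag.max(), np.abs(np.diff(mag)).mean()]

# Hypothetical dataset: "jump-like" windows have larger spikes than "run-like" ones
X, y = [], []
for label, scale in [(0, 1.0), (1, 3.0)]:            # 0 = run, 1 = jump (assumed)
    for _ in range(60):
        acc = rng.normal(0, scale, size=(win, 3))
        X.append(window_features(acc))
        y.append(label)
X, y = np.array(X), np.array(y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(2))
```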
762 – Multimodal sensor fusion for enhanced biomechanical profiling in football: integrating IMU and video data from vertical jump tests
Category: master project (full-time)
Keywords: Data Processing, Image Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
Type: 100% software
Responsible: (MED 0 1016, phone: 32468)
Description: Raw video shows the motion; IMUs reveal the accelerations and orientation. Combined, they unlock new biomechanical precision. This project focuses on developing a sensor fusion framework that synchronizes video recordings and inertial measurement unit (IMU) data to compute enhanced biomechanical metrics from jump tests (bilateral and unilateral CMJ, drop jump). The core aim is to overcome the limitations of each modality alone, combining the spatial richness of video with the temporal and acceleration precision of IMUs. You have access to a dataset consisting of 25 players, collected in the lab with an infrared motion tracking system. Traditional biomechanical analysis in sport often relies on expensive lab equipment and manual video inspection. Your work could lay the foundation for next-generation performance monitoring systems that are low-cost, field-deployable, and data-rich.
Last edited: 29/08/2025
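As a minimal sketch of one prerequisite for video-IMU fusion, the snippet below estimates the time offset between a video-derived signal and an IMU signal by cross-correlation after resampling to a common rate; the sampling rates and the simulated jump signal are assumptions, not the lab's actual data.

```python
# A minimal sketch of one fusion prerequisite: estimating the time offset between
# a video-derived signal and an IMU signal by cross-correlation, before resampling
# both onto a common clock. Sampling rates and the simulated jump are assumptions.
import numpy as np
from scipy import signal

fs_imu, fs_video = 200, 50                           # assumed sampling rates (Hz)
t_imu = np.arange(0, 4, 1 / fs_imu)

# Simulated vertical acceleration of a jump, plus a video stream delayed by 120 ms
true_offset = 0.12
acc_imu = np.exp(-((t_imu - 2.0) ** 2) / 0.01)
t_video = np.arange(0, 4, 1 / fs_video)
acc_video = np.exp(-((t_video - 2.0 - true_offset) ** 2) / 0.01)

# Resample the video-derived signal to the IMU rate, then cross-correlate
acc_video_rs = np.interp(t_imu, t_video, acc_video)
corr = signal.correlate(acc_video_rs, acc_imu, mode="full")
lags = signal.correlation_lags(len(acc_video_rs), len(acc_imu), mode="full")
est_offset = lags[np.argmax(corr)] / fs_imu
print(f"estimated offset: {est_offset * 1000:.0f} ms (true: {true_offset * 1000:.0f} ms)")
```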
6 projects found.