Project Database
This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but please note that no funding is offered for those projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in his/her office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.
To limit the list to the projects matching a given keyword, click on it.
3D, Agility, Architecture, Artificial muscles, Balance Control, Bio-inspiration, Biomimicry, Biped Locomotion, C, C#, C++, Coman, Communication, Compliance, Computational Neuroscience, Computer Science, Control, Data Evaluation, Data Processing, Dynamics Model, Electronics, Embedded Systems, Estimator, Experiments, FPGA, Feedback, Firmware, Footstep Planning, GUI, Hybrid Balance Control, Image Processing, Inverse Dynamics, Kinect, Kinematics Model, Laser Scanners, Learning, Leg design, Linux, Localization, Locomotion, Machine learning, Mechanical Construction, Motion Capture, Muscle modeling, Online Optimization, Optic Flow, Optimization, Probabilistics, Processor, Programming, Prototyping, Python, Quadruped Locomotion, Radio, Reflexes, Robotics, Sensor Fusion, Simulator, Soft robotics, Statistical analysis, Synchronization, Treadmill, VHDL, Vision, sensor
Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics
Amphibious robotics
757 – Development of radio and vision electronics for a salamander inspired robot |
Category: | semester project, master project (full-time) | |
Keywords: | Bio-inspiration, Biomimicry, Communication, Electronics, Embedded Systems, Firmware, Programming, Prototyping, Radio, Robotics, Sensor Fusion, Vision, sensor | |
Type: | 70% hardware, 30% software | |
Responsible: | (MED 1 1626, phone: 38676) | |
Description: | This project has been taken. Pleurobot is a salamander-inspired robot that is capable of moving in and transitioning between terrestrial and aquatic environments. Some research projects in our lab have demonstrated the effectiveness of vision-guided or human-controlled locomotion transition strategies. However, the present Pleurobot cannot use similar strategies robustly, especially outdoors, because it lacks a vision system and a robust wireless controller. In this project, the student will add a vision system (e.g., an RGB-D camera) to Pleurobot that can operate in amphibious environments. In addition, a robust radio controller is needed to operate the robot outdoors. Alternatively, the student can choose to implement algorithms for the vision system to recognize terrain and obstacles in real time. Both systems need to be integrated into the ROS 2 controller running on the onboard computer. The major challenges include the waterproofing requirements, the limited space for electronics, and the fusion of multiple sensory systems in an embedded system. The student is expected to have a solid background in circuit design for embedded systems, firmware programming, and familiarity with ROS 2. Interested students should send their transcript, CV, and materials demonstrating past project experience to qiyuan.fu@epfl.ch. Last edited: 02/09/2025 |
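For orientation only, here is a minimal sketch of how a simple depth-based perception node could plug into the ROS 2 controller mentioned above; the topic names, depth encoding, and distance threshold are assumptions for illustration, not Pleurobot's actual interfaces.

```python
# Hypothetical sketch: a ROS 2 node that subscribes to an RGB-D depth topic and
# publishes a coarse "obstacle ahead" flag. Topics and encoding are assumed.
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from std_msgs.msg import Bool


class DepthObstacleFlag(Node):
    def __init__(self):
        super().__init__("depth_obstacle_flag")
        self.sub = self.create_subscription(
            Image, "/camera/depth/image_rect_raw", self.on_depth, 10)
        self.pub = self.create_publisher(Bool, "/pleurobot/obstacle_ahead", 10)

    def on_depth(self, msg: Image):
        # Assumes 16-bit depth in millimetres (typical for RealSense-style cameras).
        depth = np.frombuffer(msg.data, dtype=np.uint16).reshape(msg.height, msg.width)
        centre = depth[msg.height // 3: 2 * msg.height // 3,
                       msg.width // 3: 2 * msg.width // 3]
        valid = centre[centre > 0]
        flag = Bool()
        flag.data = bool(valid.size and np.median(valid) < 500)  # closer than 0.5 m
        self.pub.publish(flag)


def main():
    rclpy.init()
    rclpy.spin(DepthObstacleFlag())


if __name__ == "__main__":
    main()
```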
760 – Cable-driven leg design for salamander robots |
Category: | semester project | |
Keywords: | Bio-inspiration, Biomimicry, Leg design, Quadruped Locomotion | |
Type: | 5% theory, 75% hardware, 20% software | |
Responsible: | (MED 1 1626, phone: 38676) | |
Description: | This project has been taken. Robots can be useful tools to study animal locomotion in physical environments. However, present robot actuators can barely reach the high power density of animal muscles. In addition, the differences in morphology between motors and muscles lead to differences in geometry and dynamics between robots and animals. Cable-driven mechanisms offer a promising way to bridge this gap, because they enable more flexible placement of actuators and the integration of mechanisms similar to animal musculoskeletal systems. In this project, the student will refine the design of a cable-driven leg for our salamander robots. The objectives are to reduce the weight, rotational inertia, and size, and to increase the output torque along different axes. The design needs to be rigorously tested in a standalone setup and on the real robot. Last edited: 14/08/2025 |
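As a hedged illustration of one trade-off such a design involves, the short sketch below estimates the cable tension required to reach a target joint torque for a few candidate pulley (moment-arm) radii; all numbers are placeholders, not specifications of the robot.

```python
# Back-of-the-envelope sizing for a cable-driven joint: tension needed to reach
# a target joint torque for several pulley radii (illustrative values only).
import numpy as np

target_torque = 2.0                         # N*m at the joint (assumed)
radii = np.array([0.010, 0.015, 0.020])     # candidate pulley radii in metres

tension = target_torque / radii             # T = tau / r
print("pulley radius (mm):", (radii * 1e3).round(1))
print("required cable tension (N):", tension.round(1))
# Smaller pulleys reduce size and inertia but push the cable and its anchoring
# toward higher tensions, which increases friction and stretch.
```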
759 – Snakes vs Anguilla - a kinematic and energetic comparison |
Category: | master project (full-time) | |
Keywords: | Bio-inspiration, Data Evaluation, Data Processing, Experiments | |
Type: | 20% theory, 40% hardware, 40% software | |
Responsible: | (MED 0 2326, phone: 38499) | |
Description: | Snakes use undulatory swimming to move efficiently through water, and different species show distinct patterns depending on their morphology and environment. With datasets available from live swimming tests and access to an undulatory robot, we can combine data analysis and physical experiments to explore their performance systematically and address biological questions. Project Description: The goal of this project is to analyse and replicate the diversity of swimming gaits in snakes, using kinematic data from different species and a robot to gain insight into energy consumption. We’ll use reduced-order modelling to extract dominant motion patterns and look for structure in the data, for example, clustering gaits by species or lifestyle. Based on these results, we’ll test a few representative cases on a robotic platform to evaluate how well the robot can replicate biological gaits and how performance (e.g. speed, efficiency) varies with different strategies. Specific Goals:
Last edited: 14/07/2025 |
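As a hedged illustration of the reduced-order modelling step described above, the sketch below runs a PCA over a stand-in kinematics matrix (frames x body points); the data layout and dimensions are assumptions made for the example, not the project's actual dataset.

```python
# Illustrative reduced-order analysis: PCA on midline kinematics to extract
# dominant swimming modes. Real data would replace the random stand-in array.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# kinematics: (n_frames, n_body_points) curvature or joint angles along the midline
kinematics = rng.standard_normal((500, 20))     # stand-in for digitised swimming data

pca = PCA(n_components=3)                       # PCA centres the data internally
scores = pca.fit_transform(kinematics)
print("variance explained:", pca.explained_variance_ratio_.round(3))
# The leading modes can then be compared across species, or replayed on the
# undulatory robot as low-dimensional gait commands.
```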
758 – Optimization of compliant structure designs in a salamander robot using physics simulation |
Category: | master project (full-time) | |
Keywords: | Bio-inspiration, Biomimicry, Compliance, Dynamics Model, Experiments, Locomotion, Optimization, Programming, Python, Robotics, Simulator, Soft robotics | |
Type: | 30% theory, 20% hardware, 50% software | |
Responsibles: |
(MED 1 1611, phone: 36620)
(MED 1 1626, phone: 38676) | |
Description: | In nature, animals have many compliant structures that benefit their locomotion. For example, compliant foot/leg structures help adapt to uneven terrain or negotiate obstacles, flexible tails allow efficient undulatory swimming, and muscle-tendon structures help absorb shock and reduce energy loss. Similar compliant structures may benefit salamander-inspired robots as well. In this study, the student will simulate compliant structures (the feet of the robot) in MuJoCo and optimize their design. To bridge the sim-to-real gap, the student will first work with other lab members to perform experiments measuring the mechanical properties of a few simple compliant structures. Then, the student will simulate these experiments using the flexcomp plugin of MuJoCo or theoretical solid mechanics models, and tune the simulation models so that the dynamic response in simulation matches the experiments. Afterward, the student will optimize the design parameters of the compliant structures in simulation to improve the locomotion performance of the robot while maintaining a small sim-to-real gap. Finally, prototypes of the optimal design will be tested on the physical robot to verify the results. The student is thus required to be familiar with Python programming, physics engines (preferably MuJoCo), and optimization/learning algorithms. The student should also have basic mechanical design skills to design models and perform experiments. Students who have taken the Computational Motor Control course or have experience with data-driven design and solid mechanics are preferred. Interested students should send the following materials to the assistants: (1) resume, (2) transcript showing relevant courses and grades, and (3) other materials that demonstrate your skills and project experience (such as videos, slides, Git repositories, etc.). Last edited: 17/06/2025 |
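For illustration only, the sketch below shows the overall shape of such an optimization loop in MuJoCo, using a plain grid search over a stiffness parameter and forward displacement as the objective; the model file name, the tuned parameter, the placeholder gait, and the objective are all assumptions, and the real project would optimize calibrated flexcomp-based foot models instead.

```python
# Sketch of a design-parameter sweep in MuJoCo (grid search over stiffness).
import mujoco
import numpy as np

MODEL_PATH = "salamander_with_compliant_feet.xml"   # hypothetical MJCF file


def forward_distance(stiffness: float, duration: float = 5.0) -> float:
    """Simulate for `duration` seconds and return forward displacement of the root."""
    model = mujoco.MjModel.from_xml_path(MODEL_PATH)
    model.jnt_stiffness[:] = stiffness          # crude stand-in for tunable foot compliance
    data = mujoco.MjData(model)
    start_x = float(data.qpos[0])               # assumes a free joint at the root
    for _ in range(int(duration / model.opt.timestep)):
        data.ctrl[:] = 0.3 * np.sin(2 * np.pi * data.time)   # placeholder open-loop gait
        mujoco.mj_step(model, data)
    return float(data.qpos[0]) - start_x


candidates = np.linspace(0.5, 20.0, 8)           # stiffness values to sweep (arbitrary units)
scores = [forward_distance(k) for k in candidates]
print("best stiffness in sweep:", candidates[int(np.argmax(scores))])
```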
Computational Neuroscience
755 – High-performance encoder-decoder design for computational neural signal processing |
Category: | semester project, master project (full-time), internship | |
Keywords: | Computational Neuroscience, Data Processing, Linux, Programming, Python | |
Type: | 20% theory, 5% hardware, 75% software | |
Responsible: | (MED 1 1626, phone: 41783141830) | |
Description: | Background Brain-computer interfaces (BCIs) using signals acquired with intracortical implants have achieved successful high-dimensional robotic device control useful for completing daily tasks. However, the substantial amount of medical and surgical expertise required to correctly implant and operate these systems greatly limits their use beyond a few clinical cases. A non-invasive counterpart requiring less intervention that can provide high-quality control would profoundly improve the integration of BCIs into multiple settings, and represents a nascent research field: brain robotics. However, this is challenging due to the inherent complexity of neural signals and the difficulty of online neural decoding with efficient algorithms. Moreover, brain signals evoked by an external stimulus (e.g., vision) are the most widely used in BCI-based applications, but such stimuli are impractical in dynamic yet constrained environments. Two questions arise: how can the constraints associated with stimulus-based signals be circumvented, and is it feasible to read brain signals with non-invasive BCIs, and if so, how? Going a step further, could complete semantic command phrases be decoded accurately in real time, enabling seamless and natural brain-robot systems for control and interaction? The project is intended for a team of 1-2 Master's students; tasks will be broken down and assigned to each student according to their skill set. Deliverables at the end of the project: 1) a method package for brain-signal pre-processing and feature formulation; 2) an algorithm package comprising an encoder and a decoder of neural signals; 3) a model trained on brain signals with spatial and temporal features. Last edited: 13/05/2025 |
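As a rough, non-authoritative sketch of what deliverable (2) might look like structurally, the snippet below defines a small encoder-decoder over multichannel signal windows in PyTorch; the channel count, window length, and layer sizes are arbitrary placeholders.

```python
# Hedged sketch of an encoder-decoder skeleton for windows of shape (channels, time).
import torch
import torch.nn as nn


class SignalAutoencoder(nn.Module):
    def __init__(self, n_channels: int = 32, latent_dim: int = 16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),  # temporal features
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),                              # collapse the time axis
            nn.Flatten(),
            nn.Linear(64, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 64),
            nn.ReLU(),
            nn.Linear(64, n_channels * 250),                      # 250-sample window (assumed)
        )
        self.n_channels = n_channels

    def forward(self, x):                       # x: (batch, channels, 250)
        z = self.encoder(x)
        recon = self.decoder(z).view(-1, self.n_channels, 250)
        return z, recon


model = SignalAutoencoder()
latent, reconstruction = model(torch.randn(8, 32, 250))   # 8 windows, 32 channels
print(latent.shape, reconstruction.shape)
```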
Human-exoskeleton dynamics and control
735 – Hip exoskeleton to assist walking - multiple projects |
Category: | semester project, master project (full-time), bachelor semester project, internship | |
Keywords: | Bio-inspiration, C, C++, Communication, Compliance, Control, Data Processing, Dynamics Model, Electronics, Experiments, Inverse Dynamics, Kinematics Model, Learning, Locomotion, Machine learning, Online Optimization, Optimization, Programming, Python, Robotics, Treadmill | |
Type: | 30% theory, 35% hardware, 35% software | |
Responsible: | (MED 3 1015, phone: 31153) | |
Description: | Exoskeletons have experienced unprecedented growth in recent years, and hip-targeting active devices have demonstrated their potential in assisting walking activities. Portable exoskeletons are designed to provide assistive torques while compensating for their own added weight, with the overall goal of increasing endurance, reducing energy expenditure, and improving performance during walking. The design of exoskeletons involves the development of sensing, actuation, control, and the human-robot interface. In our lab, an active hip orthosis (“eWalk”) has been prototyped and tested in recent years. Currently, multiple projects are available to address open research questions. Does the exoskeleton reduce the effort of walking? How can we model human-exoskeleton interaction? How can we design effective controllers? How can we optimize the interfaces and the control? Which movements can we assist with exoskeletons? To address these challenges, the field requires knowledge in biology, mechanics, electronics, physiology, informatics (programming, learning algorithms), and human-robot interaction. If you are interested in collaborating on one of these topics, please send an email to giulia.ramella@epfl.ch with (1) your CV+transcripts, (2) your previous experiences that could be relevant to the project, and (3) what interests you the most about this research topic (to be discussed during the interview). Last edited: 21/07/2025 |
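Purely as an illustration of one common control idea in this space (not the eWalk controller itself), the sketch below generates a phase-indexed assistive hip torque profile parameterised by peak magnitude, peak timing, and width; all numbers are generic placeholders.

```python
# Illustrative phase-based assistance profile for a hip exoskeleton.
import numpy as np


def assistive_torque(gait_phase: np.ndarray,
                     peak: float = 0.2,         # N*m per kg of body mass (assumed)
                     peak_phase: float = 0.55,  # fraction of the gait cycle
                     width: float = 0.12) -> np.ndarray:
    """Gaussian torque bump centred on peak_phase; phase in [0, 1)."""
    return peak * np.exp(-0.5 * ((gait_phase - peak_phase) / width) ** 2)


phase = np.linspace(0.0, 1.0, 101)
profile = assistive_torque(phase)
print("peak assistance at gait phase:", phase[int(np.argmax(profile))])
# In practice, the three parameters are the kind of knobs an online
# (human-in-the-loop) optimization would tune per subject.
```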
Quadruped robotics
A small selection of possible projects is listed here. Highly interested students may also propose projects, or continue an existing topic.
747 – Learning frog gaits and their transitions |
Category: | semester project, master project (full-time) | |
Keywords: | Artificial muscles, Learning, Locomotion, Python | |
Type: | 100% software | |
Responsibles: |
(MED 1 1024, phone: 30563)
(MED 1 1611, phone: 36714) | |
Description: | During terrestrial locomotion, some frog species display both out-of-phase walking and in-phase hopping limb movements. It has been suggested that changes between these gaits arise to minimize energy consumption. In this project we will explore this hypothesis by simulating frog terrestrial locomotion using reinforcement learning. We will use a biomechanical model of the frog equipped with artificial muscles to investigate the optimal gaits for different terrain conditions (low-medium-high ground stiffness). The plantaris longus tendon has been associated with the frog's crucial ability to store elastic energy during jumping. We will test this hypothesis in simulation. The goals can be divided into these subgoals (in order of priority/time): 1. Compute the inertial properties of the frog and create the URDF file 2. Design the cost function and train a neural network controller using reinforcement learning 3. Test the ability of the model to walk and hop in simplified scenarios Last edited: 12/11/2024 (revalidated 16/09/2025) |
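As a minimal sketch of the cost-function design mentioned in subgoal 2, the snippet below combines a velocity-tracking term with an actuation-effort penalty so that energy use is implicitly minimised; the weights and target speed are placeholders, not values from the project.

```python
# Hedged sketch of a reward trading off forward progress against muscle effort.
import numpy as np


def gait_reward(forward_velocity: float,
                muscle_activations: np.ndarray,
                target_velocity: float = 0.3,   # m/s, placeholder
                w_track: float = 1.0,
                w_effort: float = 0.05) -> float:
    tracking = -w_track * (forward_velocity - target_velocity) ** 2
    effort = -w_effort * float(np.sum(muscle_activations ** 2))
    return tracking + effort


print(gait_reward(0.28, np.array([0.2, 0.5, 0.1, 0.4])))
```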
Miscellaneous
765 – Validity and Reliability of IMU-Based Jump Test Analysis |
Category: | master project (full-time) | |
Keywords: | Data Processing, Motion Capture, Programming, Python, Statistical analysis | |
Type: | 10% theory, 90% software | |
Responsible: | (MED 0 1016, phone: 32468) | |
Description: | Optizone has created a cutting-edge suite of algorithms that estimate athletes’ fitness levels through widely recognized performance tests such as the drop jump, squat jump, repetitive jump, hop test, and velocity-based training. These algorithms have the potential to transform athletic monitoring by providing fast, data-driven insights into performance and readiness. This project focuses on putting these algorithms to the test. Their accuracy and consistency will be rigorously evaluated against a gold-standard motion analysis system in a controlled laboratory setting. Using a structured protocol, athlete performance data will be collected, preprocessed, and subjected to in-depth statistical analysis to determine both the reliability (how consistent the results are) and validity (how well the algorithms reflect true performance). By bridging advanced algorithm development with scientific validation, this study aims to strengthen confidence in Optizone’s technology and lay the foundation for smarter, evidence-based training and injury-prevention strategies. Jump Tests:
Last edited: 29/08/2025 |
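For the project above, a typical validity analysis compares each IMU-derived metric against the gold-standard system; the hedged sketch below computes Bland-Altman bias, limits of agreement, and a Pearson correlation on synthetic jump-height data purely to show the shape of such an analysis.

```python
# Illustrative agreement analysis between an IMU estimate and motion capture.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
mocap = rng.uniform(0.25, 0.45, size=30)          # jump height in metres (synthetic)
imu = mocap + rng.normal(0.005, 0.01, size=30)    # IMU estimate with small error

diff = imu - mocap
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                     # Bland-Altman limits of agreement
r, _ = pearsonr(imu, mocap)
print(f"bias = {bias*100:.2f} cm, LoA = +/-{loa*100:.2f} cm, r = {r:.3f}")
# Reliability would additionally use repeated trials (e.g. intraclass correlation).
```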
763 – Workload Estimation and Action Classification in Basketball Using IMU Sensors |
Category: | master project (full-time) | |
Keywords: | Data Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion | |
Type: | 10% theory, 90% software | |
Responsible: | (MED 0 1016, phone: 32468) | |
Description: | In modern basketball, accurately monitoring player workload and identifying specific movement patterns are critical for optimizing performance, reducing injury risk, and tailoring individualized training programs. However, many existing workload assessment tools are not fine-tuned to capture the complex and explosive actions typical in basketball. This project aims to develop a sensor-based system that can estimate physical workload and classify basketball-specific movements using only Inertial Measurement Unit (IMU) sensors. Data will be collected from athletes during structured training sessions, with a focus on high-intensity basketball actions such as rebounds, layups, jump shots, sprints, direction changes, and defensive movements. The primary objective is to create algorithms capable of:
Last edited: 29/08/2025 |
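For the project above, a hedged sketch of the classification half of such a pipeline is shown below: fixed-length IMU windows are reduced to simple statistical features and fed to a standard classifier. The window length, feature set, and labels are assumptions, and the data are synthetic.

```python
# Sketch of windowed feature extraction + action classification from IMU data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n_windows, win_len, n_axes = 200, 100, 6            # 6 = 3-axis accel + 3-axis gyro
windows = rng.standard_normal((n_windows, win_len, n_axes))
labels = rng.integers(0, 4, size=n_windows)          # e.g. sprint / jump / shuffle / idle


def window_features(w: np.ndarray) -> np.ndarray:
    # Per-axis mean, standard deviation, and accumulated absolute change.
    return np.concatenate([w.mean(axis=0), w.std(axis=0),
                           np.abs(np.diff(w, axis=0)).sum(axis=0)])


X = np.stack([window_features(w) for w in windows])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print("training accuracy (synthetic data):", clf.score(X, labels))
```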
762 – Multimodal sensor fusion for enhanced biomechanical profiling in football: integrating IMU and video data from vertical jump tests |
Category: | master project (full-time) | |
Keywords: | Data Processing, Image Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion | |
Type: | 100% software | |
Responsible: | (MED 0 1016, phone: 32468) | |
Description: | Raw video shows the motion; IMUs reveal the accelerations and orientations. Combined, they unlock new biomechanical precision. This project focuses on developing a sensor fusion framework that synchronizes video recordings and inertial measurement unit (IMU) data to compute enhanced biomechanical metrics from jump tests (bilateral and unilateral CMJ, drop jump). The core aim is to overcome the limitations of each modality alone, combining the spatial richness of video with the temporal and acceleration precision of IMUs. You will have access to a dataset of 25 players collected in the lab with an infrared motion-tracking system. Traditional biomechanical analysis in sport often relies on expensive lab equipment and manual video inspection. Your work could lay the foundation for next-generation performance monitoring systems that are low-cost, field-deployable, and data-rich. Last edited: 29/08/2025 |
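One prerequisite for this kind of fusion is temporal alignment of the two streams; the sketch below estimates the video-IMU offset by cross-correlating vertical accelerations, using synthetic signals and an assumed common sampling rate, purely to illustrate the idea.

```python
# Illustrative video/IMU synchronization via cross-correlation of accelerations.
import numpy as np

fs = 100.0                                    # both streams resampled to 100 Hz (assumed)
t = np.arange(0, 5, 1 / fs)
burst = np.exp(-((t - 2.0) / 0.2) ** 2) * np.sin(2 * np.pi * 3 * t)  # stand-in jump signal

rng = np.random.default_rng(3)
imu_acc = burst + 0.05 * rng.standard_normal(t.size)
video_acc = np.roll(burst, 23)                # video stream trails the IMU by 230 ms

xcorr = np.correlate(video_acc - video_acc.mean(), imu_acc - imu_acc.mean(), mode="full")
lag = int(np.argmax(xcorr)) - (t.size - 1)    # positive lag: video trails the IMU
print(f"estimated video delay: {lag / fs * 1000:.0f} ms")   # shift applied before fusing
```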
761 – Developing an IMU-based algorithm to quantify the workload of soccer goalkeepers |
Category: | master project (full-time) | |
Keywords: | Motion Capture, Programming, Python, sensor | |
Type: | 20% theory, 80% software | |
Responsible: | (MED 0 1016, phone: 32468) | |
Description: | Workload monitoring is a fundamental component in designing and optimizing training sessions for athletes. In football, several established methods exist to assess the workload during training and matches, particularly for outfield players. However, these methods often fall short when applied to goalkeepers, whose movements and physical demands differ significantly. As a result, there is currently no widely accepted or accurate approach for quantifying goalkeeper workload. This project aims to bridge that gap by developing a reliable method for monitoring and estimating the workload of football goalkeepers. Data will be collected during structured goalkeeper training sessions using a combination of video recordings and Inertial Measurement Unit (IMU) sensors, following a carefully designed protocol. The dataset will capture key movement patterns specific to goalkeeping, such as jumping, diving, lateral shuffling, and rapid direction changes. Using this data, the project will involve the development of an algorithm capable of analysing these movements and estimating the overall workload of a session. The algorithm will classify and quantify various types of activities, providing objective metrics that can inform training design, load management, and performance evaluation tailored specifically to goalkeepers. Note: data collection is part of the project. Last edited: 29/08/2025 |
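As a hedged example of the kind of objective session metric such an algorithm could output, the sketch below computes a PlayerLoad-style accumulated acceleration-change measure from synthetic 3-axis IMU data; the sampling rate and scaling are assumptions, not the project's definition of workload.

```python
# Sketch of an accelerometer-derived load metric (accumulated acceleration change).
import numpy as np

rng = np.random.default_rng(4)
fs = 100                                        # Hz, assumed IMU sampling rate
acc = rng.standard_normal((fs * 60, 3))         # one minute of 3-axis acceleration (synthetic)

delta = np.diff(acc, axis=0)                    # sample-to-sample change per axis
session_load = np.sum(np.sqrt(np.sum(delta ** 2, axis=1))) / 100.0
print(f"accumulated load (arbitrary units): {session_load:.1f}")
# A goalkeeper-specific version would weight or segment this by detected
# actions (dives, jumps, shuffles) rather than treating all samples equally.
```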
739 – Radio communication tests on 169.4 MHz |
Category: | semester project | |
Keywords: | Electronics, Embedded Systems, Firmware, Radio | |
Type: | 10% theory, 70% hardware, 20% software | |
Responsible: | (MED 1 1025, phone: 36630) | |
Description: | Mobile robots often communicate over the 2.4 GHz band using standard off-the-shelf technologies such as WiFi or Bluetooth, or sometimes custom radio protocols on either the 2.4 GHz or 868 MHz ISM bands, both in the UHF part of the radio spectrum. This project aims at evaluating the possibility of using the 169.4 MHz band (VHF) for controlling robots and obtaining telemetry, as it might give much better results in terms of range and transmission through obstacles or water, even if the available bandwidth is much more restricted. The project involves:
Requirements: experience with digital electronics and basic understanding of radio communications and related concepts (e.g. transmission lines, antennas). Previous experience with radio frequency and/or PCB design is a plus. Last edited: 11/06/2024 (revalidated 16/09/2025) |
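To make the range argument concrete, the sketch below compares free-space path loss at 169.4 MHz with the 868 MHz and 2.4 GHz bands over the same distance; real links would additionally account for antenna gains, obstacles, and water attenuation, so this is only a first-order comparison.

```python
# Free-space path loss (FSPL) comparison across the three bands mentioned above.
import math


def fspl_db(distance_m: float, freq_hz: float) -> float:
    """FSPL in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m) + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / 299_792_458))


for f in (169.4e6, 868e6, 2.4e9):
    print(f"{f / 1e6:7.1f} MHz: FSPL over 1 km = {fspl_db(1000, f):.1f} dB")
# The lower carrier frequency gives roughly a 20+ dB advantage over 2.4 GHz
# in free space, at the cost of much lower available bandwidth.
```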
Mobile robotics
740 – Firmware development and teleoperation control of robotic assistive furniture |
Category: | semester project, master project (full-time) | |
Keywords: | C, C++, Communication, Embedded Systems, Firmware, Linux, Programming, Robotics | |
Type: | 10% theory, 20% hardware, 70% software | |
Responsible: | (phone: 37432) | |
Description: | This project aims to develop an application for remote teleoperation of a swarm of mobile assistive furniture. The program will allow a user to securely operate mobile furniture remotely and to define a desired furniture arrangement in the room. On the firmware side, we currently use an Arduino Mega board to control the robot and rely on an ESP32 board or Bluetooth for teleoperation. On the software side, we use ROS or MQTT for communication and Android for the tablet control interface. Related work: [1] Real-Time Localization for Closed-Loop Control of Assistive Furniture, Published in: IEEE Robotics and Automation Letters (Volume: 8, Issue: 8, August 2023) https://ieeexplore.ieee.org/document/10155264 [2] Velocity Potential Field Modulation for Dense Coordination of Polytopic Swarms and Its Application to Assistive Robotic Furniture, Published in: IEEE Robotics and Automation Letters (Volume: 10, Issue: 7, July 2025) https://ieeexplore.ieee.org/document/11027457 Last edited: 24/08/2025 |
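As a minimal, hypothetical sketch of the MQTT side of such a teleoperation link, the snippet below publishes a velocity command that the furniture's firmware would subscribe to; the broker address, topic name, and message format are assumptions, not the project's actual protocol.

```python
# Hypothetical MQTT teleoperation command for one furniture robot.
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()  # paho-mqtt >= 2.0 requires mqtt.Client(mqtt.CallbackAPIVersion.VERSION1)
client.connect("192.168.1.10", 1883)                    # hypothetical local broker

command = {"robot_id": "chair_01", "v": 0.15, "omega": 0.0}   # m/s, rad/s (assumed format)
client.publish("furniture/chair_01/cmd_vel", json.dumps(command), qos=1)
client.disconnect()
```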
732 – Body language control interface of a swarm of assistive robotic furniture using machine learning |
Category: | semester project, master project (full-time) | |
Keywords: | C++, Kinect, Linux, Machine learning, Python, Robotics, Vision | |
Type: | 45% theory, 10% hardware, 45% software | |
Responsible: | (phone: 37432) | |
Description: | Furniture is undergoing a significant transformation, evolving from static objects within the indoor environment into active and mobile entities. These enhanced capabilities not only enable novel modes of interaction but also introduce fundamental questions concerning how such systems should communicate with their users. In collaboration with Prof. Emmanuel Senft from the Human-Centered Robotics and AI group at EPFL IDIAP, and building upon recent advances in assistive robotic furniture developed at EPFL BioRob, this project aims to investigate how robotic furniture can communicate with its users by adapting its motions to achieve defined communication goals. This work builds on established systems in which the poses of both mobile furniture and human users are estimated and tracked using multi-view Kinect RGB-D cameras combined with a learning-based algorithm. Human motions, or sequences of human poses, can be categorized into different meanings based on current studies of human body language, and can further be classified by the provided visual perception system using either geometric rules or a learning-based motion recognition algorithm (for example, a spatio-temporal graph neural network or a transformer). Once the user commands are correctly identified, they can be sent to the mobile furniture robots via the Robot Operating System (ROS 2) and executed to meet the user's requirements in the assistive environment. During execution, a multi-robot coordination algorithm is responsible for avoiding collisions and resolving deadlocks. The project opens multiple avenues for exploration. Potential directions include the development of more robust learning-based human action recognition algorithms, the design of systematic and user-friendly body language communication protocols, enabling feedback from the user, the extension of the system to multi-user scenarios, and accelerating the whole pipeline. Comprehensive real-world experiments will be conducted to evaluate and validate both the functional capabilities and overall performance of the proposed system. Related work: [1] Real-Time Localization for Closed-Loop Control of Assistive Furniture, Published in: IEEE Robotics and Automation Letters (Volume: 8, Issue: 8, August 2023) https://ieeexplore.ieee.org/document/10155264 [2] Velocity Potential Field Modulation for Dense Coordination of Polytopic Swarms and Its Application to Assistive Robotic Furniture, Published in: IEEE Robotics and Automation Letters (Volume: 10, Issue: 7, July 2025) https://ieeexplore.ieee.org/document/11027457 Last edited: 24/08/2025 |
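Purely as a structural sketch of the recognition stage (not the project's actual model), the snippet below classifies a sequence of 3D body keypoints into a handful of command classes with a small Transformer encoder; the joint count, sequence length, and number of classes are placeholders.

```python
# Hedged sketch: pose-sequence -> body-language command classification.
import torch
import torch.nn as nn


class PoseCommandClassifier(nn.Module):
    def __init__(self, n_joints: int = 25, n_classes: int = 5, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(n_joints * 3, d_model)            # flatten x, y, z per joint
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, poses):                   # poses: (batch, frames, n_joints * 3)
        h = self.encoder(self.embed(poses))
        return self.head(h.mean(dim=1))         # average over time, then classify


model = PoseCommandClassifier()
logits = model(torch.randn(4, 60, 25 * 3))      # 4 clips of 60 frames
print(logits.shape)                             # (4, 5) command scores
# The predicted command class would then be forwarded to the ROS 2 coordination layer.
```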
754 – Vision-language model-based mobile robotic manipulation |
Category: | master project (full-time), internship | |
Keywords: | Control, Experiments, Learning, Python, Robotics, Vision | |
Type: | 20% theory, 20% hardware, 60% software | |
Responsible: | (MED 1 1626, phone: 41783141830) | |
Description: | INTRODUCTION Recent vision-language-action models (VLAs) build upon pretrained vision-language models and leverage diverse robot datasets to demonstrate strong task execution, language following ability, and semantic generalization. Despite these successes, VLAs struggle with novel robot setups and require fine-tuning to achieve good performance, yet how to most effectively fine-tune them is unclear, given many possible strategies. This project aims to 1) develop a customised mobile robot platform composed of a ROS 2-based mobile base and 6-DOF robot arms (ViperX 300 S and WidowX 250), 2) establish a vision system equipped with RGB-D cameras for data collection, 3) deploy a pre-trained VLA model locally for robot manipulation with a focus on household environments, and 4) test, validate, and deliver the platform. Excellent programming skills (Python) are a plus. For applicants not from EPFL, the following conditions must be fulfilled to obtain student status at EPFL (an attestation has to be provided during the online registration): [1] To be registered at a university for the whole duration of the project [2] The project must be required in the academic program and recognized by the home university [3] The duration of the project is a minimum of 2 months and a maximum of 12 months [4] To be accepted by an EPFL professor to do a project under their supervision For an internship, a duration of at least six months is suggested. WHAT WE HAVE: [1] Ready-and-easy-to-use robot platforms, including ViperX 300S and WidowX-250, configured with four RealSense D405 cameras, various grippers, and a mobile robot platform [2] Computing resources: two desktop PCs with NVIDIA RTX 4090 GPUs [3] HPC cluster providing 1000 hours/month on NVIDIA A100 and A100-fat GPUs, supporting large-scale training and fine-tuning. Interested students can apply by sending an email to sichao.liu@epfl.ch. Please attach your transcript and past/current experience on the related topics. The position remains open until final candidates are selected. Recommended reading: [1] https://www.physicalintelligence.company/blog [2] Kim, Moo Jin, Chelsea Finn, and Percy Liang. "Fine-tuning vision-language-action models: Optimizing speed and success." arXiv preprint arXiv:2502.19645 (2025). [3] https://docs.trossenrobotics.com/aloha_docs/2.0/specifications.html Benchmark: [1] LeRobot: Making AI for Robotics more accessible with end-to-end learning [2] DROID: A Large-Scale In-the-Wild Robot Manipulation Dataset [3] DiT-Block Policy: The Ingredients for Robotic Diffusion Transformers [4] Open X-Embodiment: Robotic Learning Datasets and RT-X Models Last edited: 30/06/2025 |
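As a structural sketch only, the snippet below shows the observe-policy-act loop a locally deployed VLA policy would run on the platform; the policy is a stub and the camera frame is synthetic, since the actual model, checkpoints, and ROS 2 interfaces are outside the scope of this example.

```python
# Structural sketch of a VLA deployment loop (policy stubbed out).
import time
import numpy as np


def policy(rgb: np.ndarray, instruction: str) -> np.ndarray:
    """Placeholder for a fine-tuned VLA model; returns a 7-D arm action."""
    return np.zeros(7)                           # e.g. 6 joint deltas + gripper command


def get_camera_frame() -> np.ndarray:
    return np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for a RealSense frame


instruction = "pick up the cup from the table"
for _ in range(10):                              # 10 control steps at roughly 5 Hz
    action = policy(get_camera_frame(), instruction)
    # here the action would be sent to the arm/base controllers (e.g. via ROS 2)
    time.sleep(0.2)
```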
15 projects found.