Project Database
This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for these projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in his/her office, by phone, or by e-mail). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.
Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics
Amphibious robotics
| 758 – Optimization of compliant structure designs in a salamander robot using physics simulation |
| Category: | master project (full-time) | |
| Keywords: | Bio-inspiration, Biomimicry, Compliance, Dynamics Model, Experiments, Locomotion, Optimization, Programming, Python, Robotics, Simulator, Soft robotics | |
| Type: | 30% theory, 20% hardware, 50% software | |
| Responsibles: |
(MED 1 1611, phone: 36620)
(MED 1 1626, phone: 38676) | |
| Description: | In nature, animals have many compliant structures that benefit their locomotion. For example, compliant foot/leg structures help adapt to uneven terrain or negotiate obstacles, flexible tails allow efficient undulatory swimming, and muscle-tendon structures help absorb shock and reduce energy loss. Similar compliant structures may benefit salamander-inspired robots as well. In this project, the student will simulate compliant structures (the feet of the robot) in MuJoCo and optimize their design. To bridge the sim-to-real gap, the student will first work with other lab members on experiments measuring the mechanical properties of a few simple compliant structures. The student will then simulate these experiments using MuJoCo's flexcomp plugin or theoretical solid-mechanics models, and tune the simulation models so that the dynamical response in simulation matches the experiments. Afterward, the student will optimize the design parameters of the compliant structures in simulation to improve the locomotion performance of the robot while keeping the sim-to-real gap small. Finally, prototypes of the optimal design will be tested on the physical robot to verify the results. The student is thus required to be familiar with Python programming, physics engines (preferably MuJoCo), and optimization/learning algorithms. The student should also have basic mechanical design skills to design mechanical structures and perform experiments. Students who have taken the Computational Motor Control course or have experience with data-driven design and solid mechanics are preferred. Students interested in this project should send the following materials to the assistants: (1) a resume, (2) a transcript showing relevant courses and grades, and (3) any other materials that demonstrate their skills and project experience (such as videos, slides, Git repositories, etc.). Last edited: 14/04/2026 | |
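The simulation-tuning step in project 758 can be illustrated in miniature. This is a hedged sketch that reduces each compliant element to a one-DOF mass-spring-damper (the real project would use MuJoCo's flexcomp models); all function names and parameter values are illustrative, not from the project:

```python
def response(k, c, m=0.1, x0=0.01, dt=1e-3, steps=200):
    """Displacement trajectory of a 1-DOF mass-spring-damper released
    from x0, integrated with semi-implicit Euler -- a stand-in for the
    measured dynamical response of one compliant foot element."""
    x, v = x0, 0.0
    traj = []
    for _ in range(steps):
        v += (-k * x - c * v) / m * dt
        x += v * dt
        traj.append(x)
    return traj

def fit(measured, k_grid, c_grid):
    """Grid-search the stiffness k and damping c that minimise the
    squared error between simulated and measured trajectories."""
    best = None
    for k in k_grid:
        for c in c_grid:
            err = sum((a - b) ** 2 for a, b in zip(response(k, c), measured))
            if best is None or err < best[0]:
                best = (err, k, c)
    return best[1], best[2]

# Pretend a bench experiment measured a part with k = 50 N/m, c = 0.5 N*s/m.
measured = response(50.0, 0.5)
k_hat, c_hat = fit(measured, [10.0, 30.0, 50.0, 70.0], [0.1, 0.5, 1.0])
```

Swapping the grid search for a gradient-free optimizer (e.g. CMA-ES) and the toy model for a flexcomp simulation would turn the same structure into something closer to the tuning loop the project describes.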
| 767 – Data collection pipeline for sensorized amphibious robot experiments |
| Category: | semester project, master project (full-time) | |
| Keywords: | 3D, C, C++, Communication, Computer Science, Data Processing, Experiments, Firmware, Image Processing, Motion Capture, Programming, Python, Robotics, Synchronization, Vision | |
| Type: | 5% theory, 20% hardware, 75% software | |
| Responsible: | (MED 1 1626, phone: 38676) | |
| Description: | This project has been taken. In this project, the student will work closely with other team members to develop data collection pipelines for the experiments of a sensorized amphibious robot and, optionally, use them to collect and analyze experimental data. Specifically, the student needs to:
The student is expected to be familiar with C/C++ and Python programming, ROS2, and robot kinematics. Experience with Docker, the Linux kernel, communication protocols, and computer vision algorithms would be a bonus. Students interested in this project should send the following materials to the assistant: (1) a resume, (2) a transcript showing relevant courses and grades, and (3) any other materials that demonstrate their skills and project experience (such as videos, slides, code repositories, etc.). Last edited: 17/01/2026 | |
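One recurring sub-problem in such data collection pipelines, pairing samples from sensor streams that tick at different rates, can be shown without any ROS2 machinery. A minimal sketch with hypothetical timestamps (`cam` and `mocap` are invented stand-ins, not the project's actual streams):

```python
import bisect

def align_nearest(ref_stamps, other_stamps, tol=0.02):
    """For each reference timestamp, return (t, index) of the nearest
    sample in other_stamps (sorted ascending), or (t, None) if nothing
    lies within tol seconds."""
    pairs = []
    for t in ref_stamps:
        i = bisect.bisect_left(other_stamps, t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other_stamps)]
        j = min(candidates, key=lambda j: abs(other_stamps[j] - t))
        pairs.append((t, j if abs(other_stamps[j] - t) <= tol else None))
    return pairs

# Hypothetical logs: a ~30 Hz camera matched against a 100 Hz motion-capture stream.
cam = [0.000, 0.033, 0.066]
mocap = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07]
pairs = align_nearest(cam, mocap)
```

In a real ROS2 pipeline this nearest-neighbour matching is what approximate-time synchronization performs across topics; the sketch only makes the idea concrete.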
| 770 – Improvement of passive feet design for sprawling type quadruped robots |
| Category: | semester project, master project (full-time) | |
| Keywords: | Bio-inspiration, Compliance, Experiments, Leg design, Locomotion, Prototyping, Quadruped Locomotion, Robotics, Soft robotics | |
| Type: | 20% theory, 50% hardware, 30% software | |
| Responsible: | (MED 1 1626, phone: 38676) | |
| Description: | This project has been taken. Many quadruped robots use simple ball feet, while animals usually have complex foot structures. Some studies have tried designing more complex actuated or adaptive feet for quadruped robots, but few have systematically investigated the benefits of such feet once they are integrated into the robot, especially for sprawling-type quadrupeds. A similar gap exists in our understanding of animal locomotion, because of the complexity and small dimensions of these structures. To start understanding the role of biomimetic foot structures, we have run several projects designing passive feet for our salamander-inspired robots. This project aims to extend those results by improving the design and collecting data more systematically in different environments. The semester student will: (1) improve the design of the feet based on previous studies, (2) perform systematic tests in different environments, and (3) analyze the results. The student is expected to be experienced in mechanical design and manufacturing, Python programming, and robot kinematics. Knowledge of robot dynamics and elastic rod theories is also helpful. If the student aims to finish a master's thesis based on this project, they additionally need to complete one of the following tasks: (1) model the passive feet dynamics from first principles or with neural networks, (2) develop novel sensors to monitor the states of the feet, or (3) design novel structures to integrate the feet with the entire leg. Students interested in this project should send the following materials to the assistant: (1) a resume, (2) a transcript showing relevant courses and grades, and (3) any other materials that demonstrate their skills and project experience (such as videos, slides, Git repositories, etc.). Last edited: 17/01/2026 | |
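As a toy illustration of the beam-bending side of the "elastic rod theories" mentioned above (not the project's actual model; all dimensions, loads, and the modulus are invented), small-deflection Euler-Bernoulli theory gives the tip deflection of a cantilevered flexible foot segment:

```python
def rect_second_moment(width, thickness):
    """Second moment of area of a rectangular cross-section: I = w*t^3 / 12."""
    return width * thickness ** 3 / 12.0

def cantilever_tip_deflection(force, length, e_modulus, second_moment):
    """Euler-Bernoulli tip deflection of a cantilever under a point end
    load: delta = F*L^3 / (3*E*I). Valid only for small deflections."""
    return force * length ** 3 / (3.0 * e_modulus * second_moment)

# Invented example: a 60 mm strip, 10 mm wide, 3 mm thick, E = 100 MPa, 0.2 N end load.
I = rect_second_moment(0.010, 0.003)
delta = cantilever_tip_deflection(0.2, 0.06, 100e6, I)  # metres
```

For the large deflections soft feet actually undergo, this linear formula breaks down and full elastic-rod (elastica) models or the data-driven approaches named in the project become necessary.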
Quadruped robotics
A small excerpt of possible projects is listed here. Highly interested students may also propose projects, or continue an existing topic.
| 769 – Learning Morphology-Specific Emergence of Gaits |
| Category: | master project (full-time) | |
| Keywords: | Biomimicry, Computational Neuroscience, Learning, Python, Simulator | |
| Type: | 20% theory, 80% software | |
| Responsible: | (MED 1 1226, phone: 32658) | |
| Description: | Why do horses and camels both walk at slow speeds and gallop at fast speeds, but at intermediate speeds horses prefer to trot while camels pace? While gait transitions have been well studied for a given morphology, existing models rarely explain when and why animals prefer different gaits despite being quite similar, or the same gaits despite having very different morphologies. This project tackles this question through the lens of reinforcement learning (RL), with a focus on the role of entrainment between an internal oscillator model and the mechanical dynamics, i.e. the morphology. You will explore both top-down and bottom-up coupling mechanisms as well as unconventional reward functions such as viability measures, and benchmark these approaches across different morphological parameters (e.g. length-to-height and width-to-height ratios, and mass). Stretch goals can include evaluating the role of active exploration in a hierarchical RL setup, exploring sprawling or bipedal morphologies, changing morphology during learning (e.g. growth), or you may propose something in discussion with the supervisors. NOTE: this is a collaboration project, to be conducted at Cornell University, USA. To apply, e-mail Steve Heim stating why you are interested in this project (brief, 1-2 sentences each), and attach your CV and transcript. | |
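The entrainment mechanism at the heart of this project can be demonstrated with the simplest possible model: a phase oscillator coupled to an external "mechanical" phase. When the coupling gain exceeds the frequency mismatch, the oscillator locks onto the mechanical frequency. A sketch under these toy assumptions (frequencies and gains are illustrative, not from the project):

```python
import math

def mean_frequency(omega_osc, omega_mech, coupling, dt=1e-3, t_end=50.0):
    """Integrate the phase oscillator
        d(theta)/dt = omega_osc + coupling * sin(omega_mech * t - theta)
    and return its mean frequency over the second half of the run,
    i.e. after transients have died out."""
    theta, freq_sum = 0.0, 0.0
    steps = int(t_end / dt)
    half = steps // 2
    for n in range(steps):
        dtheta = omega_osc + coupling * math.sin(omega_mech * n * dt - theta)
        theta += dtheta * dt
        if n >= half:
            freq_sum += dtheta
    return freq_sum / (steps - half)

# A 1.0 Hz internal oscillator driven by 1.1 Hz 'mechanics'.
f_locked = mean_frequency(2 * math.pi * 1.0, 2 * math.pi * 1.1, coupling=2.0)
f_free = mean_frequency(2 * math.pi * 1.0, 2 * math.pi * 1.1, coupling=0.0)
```

With coupling 2.0 (larger than the 0.2 Hz mismatch in rad/s) the mean frequency converges to the mechanical 1.1 Hz (entrainment); with zero coupling it stays at the oscillator's own 1.0 Hz.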
Miscellaneous
| 771 – Diffusing Elementary Dynamics Actions |
| Category: | semester project, master project (full-time) | |
| Keywords: | Bio-inspiration, Control, Robotics | |
| Type: | 20% hardware, 80% software | |
| Responsible: | (Martigny, phone: none) | |
| Description: | The project will use a diffusion policy as a high-level model-predictive controller operating at a relatively low frequency. This controller will activate feedforward actions consistent with the theory of Elementary Dynamic Actions (EDA), inspired by principles of human motor control. Using this framework, the student will test hypotheses about the structure and organization of human motor control. In this project, the student will collect motion and force data during contact interactions (using an OptiTrack motion-capture system and a Bota Systems force-torque sensor), use Idiap's high-performance computing (HPC) grid to train models, and evaluate the controller on a Franka robot. This project will be done in collaboration with James Hermus at Idiap. Last edited: 03/12/2025 | |
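Elementary Dynamic Actions decompose motor commands into a small set of primitives, commonly submovements, oscillations, and mechanical impedances. As a hedged illustration only (this is not the controller to be built, and all numbers are invented), two of these primitives in isolation:

```python
def min_jerk(t, duration, amplitude):
    """Minimum-jerk displacement profile, a standard model for the
    'submovement' primitive: smooth rest-to-rest motion over `duration`."""
    s = min(max(t / duration, 0.0), 1.0)
    return amplitude * (10 * s**3 - 15 * s**4 + 6 * s**5)

def impedance_force(x, v, x_ref, v_ref, stiffness, damping):
    """Virtual spring-damper force of one mechanical-impedance primitive:
    f = K*(x_ref - x) + B*(v_ref - v)."""
    return stiffness * (x_ref - x) + damping * (v_ref - v)

# A 10 cm reach completed in 0.5 s, and the restoring force 5 mm off the reference.
x_end = min_jerk(0.5, 0.5, 0.10)
f_off = impedance_force(0.095, 0.0, 0.100, 0.0, stiffness=300.0, damping=20.0)
```

In the EDA view, a high-level policy (here, the diffusion model) would select and parameterize such feedforward primitives rather than output raw torques at every timestep.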
Mobile robotics
| 768 – Aria2Robot: Egocentric data-driven policy learning for robot manipulation |
| Category: | semester project, master project (full-time), internship | |
| Keywords: | Computer Science, Machine learning, Programming, Python, Robotics, Vision | |
| Type: | 30% theory, 10% hardware, 60% software | |
| Responsibles: |
(undefined, phone: 37432)
(MED11626, phone: 41783141830) | |
| Description: | INTRODUCTION: Egocentric wearable sensing is becoming a key enabler for embodied AI and robotics. Meta's Project Aria (https://www.projectaria.com/) research glasses provide rich, multimodal, first-person observations (RGB video, scene cameras, IMUs, microphones, eye-tracking, and more) in a socially acceptable, all-day wearable form factor, specifically designed to advance egocentric AI, robotics, and contextual perception research. In collaboration with Meta, we aim to tightly couple Aria research glasses with our existing manipulation platforms at EPFL. This project will: (1) integrate the Aria Research Kit (Aria Gen 2) with our ROS2-based robot platforms (ViperX 300S and WidowX-250 arms on a mobile base), including calibration and time-synchronisation with RGB-D cameras and robot state; (2) design and execute egocentric data collection in household-like environments (Aria + RealSense + robot joint trajectories + language annotations); (3) develop egocentric data-driven policies for robotic manipulation through learning from demonstration (the robot can be a robot arm or a humanoid robot (Reaman)); (4) explore one or more robotics applications powered by Aria signals, such as intention-aware teleoperation, egocentric demonstrations for policy learning, or vision-language(-action) fine-tuning for assistance tasks; and (5) perform systematic platform testing, validation, and documentation to deliver a reusable research pipeline for future projects. Excellent programming skills (Python) are a plus. IMPORTANT: We have well-documented tutorials on using the robots, teleoperation interfaces for data collection, and the HPC cluster, as well as a complete pipeline for training robot policies. The Aria Research Kit and tools (recording, calibration, dataset tooling, SDK) will be integrated into this ecosystem, so the student can focus on the research questions rather than on low-level setup. We already have a good base and results from the ongoing project.
What makes Aria special for robotics? Project Aria glasses are multi-sensor "research smart glasses": multiple cameras (wide FOV), IMUs, microphones, eye gaze, and a Machine Perception Service (MPS) that provides SLAM poses, hand poses, and more. They are explicitly positioned by Meta as a research kit for contextual AI and robotics, i.e., using egocentric sensing to build embodied agents that understand and act in the world. Compared to a normal RGB-D camera, Aria provides: an egocentric view ("what the human (or robot) sees" while acting); a calibrated head pose/trajectory (via SLAM in MPS); hand/gaze information (depending on which parts you use); and a portable, wearable, socially acceptable form factor. WHAT WE HAVE: (1) ready-and-easy-to-use robot platforms, including ViperX 300S and WidowX-250 arms, configured with 4 RealSense D405 cameras, various grippers, and a mobile robot platform; (2) egocentric sensing hardware: Meta Project Aria research glasses (Aria Gen 2), including access to the Aria Research Kit and tooling for data recording and processing; (3) computing resources: two desktop PCs with NVIDIA 5090 and 4090 GPUs. Interested students can apply by emailing sichao.liu@epfl.ch or lixuan.tang@epfl.ch. Please attach your transcript and a short description of your past/current experience on related topics (robotics, computer vision, machine learning, AR/egocentric perception). The position remains open until final candidates are found. Recommended reading: [1] Kareer, Simar, et al. "EgoMimic: Scaling imitation learning via egocentric video." 2025 IEEE International Conference on Robotics and Automation (ICRA). IEEE, 2025. [2] Liu, Vincent, et al. "EgoZero: Robot learning from smart glasses." arXiv preprint arXiv:2505.20290 (2025). [3] Punamiya, Ryan, et al. "EgoVerse: An Egocentric Human Dataset for Robot Learning from Around the World." arXiv preprint arXiv:2604.07607 (2026). [4] Saroha, Abhishek, et al. "EgoFlow: Gradient-Guided Flow Matching for Egocentric 6DoF Object Motion Generation." arXiv preprint arXiv:2604.01421 (2026). [5] Zheng, Ruijie, et al. "EgoScale: Scaling dexterous manipulation with diverse egocentric human data." arXiv preprint arXiv:2602.16710 (2026). [6] Aria project: https://www.projectaria.com/resources/ [7] Aria GitHub: https://github.com/facebookresearch/projectaria_tools Last edited: 22/04/2026 | |
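Step 3 of the project plan, learning manipulation policies from demonstrations, reduces in its smallest form to regressing actions on observations. A toy sketch with synthetic 1-D data standing in for egocentric features and joint commands (all names and data are invented for illustration):

```python
def fit_linear_policy(obs, acts, lr=0.1, epochs=500):
    """Behavioural cloning in miniature: fit a 1-D linear policy
    a = w*o + b to demonstration pairs by mean-squared-error
    gradient descent."""
    w, b = 0.0, 0.0
    n = len(obs)
    for _ in range(epochs):
        gw = gb = 0.0
        for o, a in zip(obs, acts):
            err = (w * o + b) - a
            gw += 2.0 * err * o / n
            gb += 2.0 * err / n
        w -= lr * gw
        b -= lr * gb
    return w, b

# Synthetic 'demonstrations' generated by the target policy a = 2*o + 1.
obs = [0.0, 0.25, 0.5, 0.75, 1.0]
acts = [2.0 * o + 1.0 for o in obs]
w, b = fit_linear_policy(obs, acts)
```

Real egocentric policy learning replaces the scalar observation with Aria image/pose features and the linear map with a deep (e.g. vision-language-action) model, but the supervised structure is the same.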
6 projects found.