Project Database

This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for those projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in person in their office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.

Search filter: only projects matching the keyword Machine learning are shown here.

Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics


Human-exoskeleton dynamics and control

735 – Hip exoskeleton to assist daily activities
Category: semester project, master project (full-time), internship
Keywords: Bio-inspiration, C, C++, Communication, Compliance, Control, Data Processing, Dynamics Model, Electronics, Embedded Systems, Experiments, Inverse Dynamics, Kinematics Model, Learning, Locomotion, Machine learning, Online Optimization, Optimization, Programming, Python, Robotics, Treadmill
Type: 30% theory, 35% hardware, 35% software
Responsible: (MED 3 1015, phone: 31153)
Description: Exoskeletons have experienced unprecedented growth in recent years, and portable active devices have demonstrated their potential in assisting locomotion, increasing endurance, and reducing walking effort. In our lab, an active hip orthosis ("eWalk") has been prototyped and tested in recent years. Several projects are available to address open research questions around control, experimental evaluation, sensing, and embedded systems optimization. If you are interested in working on one of these topics, please send an email to giulia.ramella@epfl.ch with (1) your CV and transcripts, (2) your previous experience relevant to the project, and (3) what interests you most about this research topic (to be discussed during the interview). Please send the email from your institutional account, and include the type of project and the semester in which you would like to do it.

Last edited: 17/11/2025

Miscellaneous

763 – Workload Estimation and Action Classification in Basketball Using IMU Sensors
Category: master project (full-time)
Keywords: Data Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
Type: 10% theory, 90% software
Responsible: (MED 0 1016, phone: 32468)
Description: In modern basketball, accurately monitoring player workload and identifying specific movement patterns are critical for optimizing performance, reducing injury risk, and tailoring individualized training programs. However, many existing workload assessment tools are not fine-tuned to capture the complex and explosive actions typical of basketball. This project aims to develop a sensor-based system that can estimate physical workload and classify basketball-specific movements using only Inertial Measurement Unit (IMU) sensors. Data will be collected from athletes during structured training sessions, with a focus on high-intensity basketball actions such as rebounds, layups, jump shots, sprints, direction changes, and defensive movements. The primary objective is to create algorithms capable of:
  • Estimating workload metrics (e.g., jump count, movement intensity, acceleration patterns)
  • Classifying basketball actions based on IMU-derived motion signatures.
Video recordings will be used solely to verify and annotate the IMU data, serving as ground truth for validating the accuracy of the developed classification and workload estimation models. This project will result in a practical, sport-specific tool for coaches, trainers, and sports scientists to monitor performance and manage training loads using compact wearable technology, without relying on complex camera setups or external tracking systems. Data collection is part of the project.
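The concrete algorithms are left open to the student. As a starting point, here is a minimal sketch of one possible pipeline for the two objectives above, assuming a tri-axial accelerometer sampled at 100 Hz; the sampling rate, thresholds, and feature set are illustrative assumptions, not project specifications.

    import numpy as np
    from scipy.signal import find_peaks
    from sklearn.ensemble import RandomForestClassifier

    FS = 100  # assumed IMU sampling rate (Hz); illustrative, not a project spec

    def count_jumps(acc_xyz: np.ndarray) -> int:
        """Count jumps from landing impacts in the acceleration magnitude."""
        mag = np.linalg.norm(acc_xyz, axis=1)  # |a| in m/s^2, input shape (N, 3)
        # Landings appear as sharp peaks well above 1 g; the 3 g threshold and
        # 0.5 s refractory period below are illustrative assumptions.
        peaks, _ = find_peaks(mag, height=3 * 9.81, distance=int(0.5 * FS))
        return len(peaks)

    def window_features(acc_xyz: np.ndarray, win_s: float = 1.0) -> np.ndarray:
        """Mean, std, and range per axis over fixed windows: a simple feature set."""
        n = int(win_s * FS)
        w = acc_xyz[: len(acc_xyz) // n * n].reshape(-1, n, 3)
        return np.concatenate([w.mean(1), w.std(1), w.max(1) - w.min(1)], axis=1)

    def train_action_classifier(X: np.ndarray, y: np.ndarray) -> RandomForestClassifier:
        """Supervised action classifier; labels come from the video annotation."""
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X, y)
        return clf

In practice, the thresholds, window length, and feature set would be tuned and validated against the video-annotated training data.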

Last edited: 29/08/2025
762 – Multimodal Sensor Fusion for Enhanced Biomechanical Profiling in Football: Integrating IMU and Video Data from Vertical Jump Tests
Category: master project (full-time)
Keywords: Data Processing, Image Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
Type: 100% software
Responsible: (MED 0 1016, phone: 32468)
Description: Raw video shows the motion; IMUs reveal the accelerations and orientation. Combined, they unlock new biomechanical precision. This project focuses on developing a sensor fusion framework that synchronizes video recordings and inertial measurement unit (IMU) data to compute enhanced biomechanical metrics from jump tests (bilateral and unilateral countermovement jumps (CMJ) and drop jumps). The core aim is to overcome the limitations of each modality alone by combining the spatial richness of video with the temporal and acceleration precision of IMUs. You will have access to a dataset of 25 players collected in the lab with an infrared motion-tracking system. Traditional biomechanical analysis in sport often relies on expensive lab equipment and manual video inspection; your work could lay the foundation for next-generation performance monitoring systems that are low-cost, field-deployable, and data-rich.
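Time synchronization between the two modalities is the first hurdle in such a fusion framework. Below is a minimal sketch of cross-correlation-based alignment, together with the classic flight-time jump-height formula; the signal names, the assumption that both streams are resampled to a common rate, and the lag sign convention are illustrative and should be verified against the actual recordings.

    import numpy as np
    from scipy.signal import correlate, correlation_lags

    def estimate_offset(imu_acc_z: np.ndarray, video_pos_y: np.ndarray,
                        fs: float) -> float:
        """Return the IMU-vs-video time offset in seconds.

        Both signals are assumed already resampled to the same rate fs.
        Vertical position from video is differentiated twice so both series
        are in acceleration units before correlating; check the sign
        convention of the returned lag against your data.
        """
        video_acc = np.gradient(np.gradient(video_pos_y)) * fs ** 2
        a = (imu_acc_z - imu_acc_z.mean()) / imu_acc_z.std()
        b = (video_acc - video_acc.mean()) / video_acc.std()
        xcorr = correlate(a, b, mode="full")
        lags = correlation_lags(len(a), len(b), mode="full")
        return lags[np.argmax(xcorr)] / fs

    def jump_height(t_takeoff: float, t_landing: float, g: float = 9.81) -> float:
        """Classic flight-time estimate of jump height: h = g * t_flight^2 / 8."""
        t_flight = t_landing - t_takeoff
        return g * t_flight ** 2 / 8

Once the streams are aligned, per-jump metrics can draw on both: takeoff and landing events from the IMU, joint and segment geometry from the video.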

Last edited: 29/08/2025

Mobile robotics

768 – Aria2Robot: Egocentric Meta Zürich Wearable Glasses for Robot Manipulation
Category: semester project, master project (full-time), internship
Keywords: Machine learning, Programming, Python, Robotics, Vision
Type: 30% theory, 10% hardware, 60% software
Responsible: (MED 1 1626, phone: +41 78 314 18 30)
Description: INTRODUCTION: Egocentric wearable sensing is becoming a key enabler for embodied AI and robotics. Meta's Project Aria (https://www.projectaria.com/) research glasses provide rich, multimodal, first-person observations (RGB video, scene cameras, IMUs, microphones, eye tracking, and more) in a socially acceptable, all-day wearable form factor, specifically designed to advance egocentric AI, robotics, and contextual perception research. In collaboration with Meta Zürich, we aim to tightly couple Aria research glasses with our existing manipulation platforms at EPFL. This project will: 1) integrate the Aria Research Kit with our ROS 2-based robot platforms (ViperX 300S and WidowX-250 arms on a mobile base), including calibration and time synchronization with RGB-D cameras and robot state; 2) design and execute egocentric data collection in household-like environments (Aria + RealSense + robot joint trajectories + language annotations); 3) explore one or more robotics applications powered by Aria signals, such as intention-aware teleoperation, egocentric demonstrations for policy learning, or vision-language(-action) fine-tuning for assistance tasks; and 4) perform systematic platform testing, validation, and documentation to deliver a reusable research pipeline for future projects. Excellent programming skills (Python) are a plus.

IMPORTANCE: We have well-documented tutorials on using the robots, teleoperation interfaces for data collection, using the HPC cluster, and a complete pipeline for training robot policies. The Aria Research Kit and tools (recording, calibration, dataset tooling, SDK) will be integrated into this ecosystem, so the student can focus on the research questions rather than low-level setup.

WHAT MAKES ARIA SPECIAL FOR ROBOTICS: Project Aria glasses are multi-sensor "research smart glasses": multiple cameras (wide FOV), IMUs, microphones, eye gaze, and a Machine Perception Service (MPS) that provides SLAM poses, hand poses, etc. They are explicitly positioned by Meta as a research kit for contextual AI and robotics, i.e., using egocentric sensing to build embodied agents that understand and act in the world. Compared to a normal RGB-D camera, Aria gives you: an egocentric view ("what the human (or robot) sees" while acting); a calibrated head pose and trajectory (via SLAM in MPS); hand and gaze information (depending on which parts you use); and a portable, wearable, socially acceptable form factor.

WHAT WE HAVE: [1] Ready-and-easy-to-use robot platforms, including ViperX 300S and WidowX-250 arms configured with 4 RealSense D405 cameras, various grippers, and a mobile robot platform. [2] Egocentric sensing hardware: Meta Project Aria research glasses (via collaboration with Meta Zürich), including access to the Aria Research Kit and tooling for data recording and processing. [3] Computing resources: two desktop PCs with NVIDIA 5090 and 4090 GPUs.

Interested students can apply by sending an email to sichao.liu@epfl.ch. Please attach your transcript and a short description of your past/current experience on related topics (robotics, computer vision, machine learning, AR/egocentric perception). The position remains open until final candidates are selected.

RECOMMENDED READING:
[1] Aria project: https://www.projectaria.com/resources/
[2] Aria GitHub: https://github.com/facebookresearch/projectaria_tools
[3] Liu V, Adeniji A, Zhan H, Haldar S, Bhirangi R, Abbeel P, Pinto L. EgoZero: Robot learning from smart glasses. arXiv preprint arXiv:2505.20290.
[4] Zhu LY, Kuppili P, Punamiya R, Aphiwetsa P, Patel D, Kareer S, Ha S, Xu D. EMMA: Scaling mobile manipulation via egocentric human data. arXiv preprint arXiv:2509.04443.
[5] Lai Y, Yuan S, Zhang B, Kiefer B, Li P, Deng T, Zell A. FAM-HRI: Foundation-model assisted multi-modal human-robot interaction combining gaze and speech. arXiv preprint arXiv:2503.16492.
[6] Banerjee P, Shkodrani S, Moulon P, Hampali S, Zhang F, Fountain J, Miller E, Basol S, Newcombe R, Wang R, Engel JJ. Introducing HOT3D: An egocentric dataset for 3D hand and object tracking. arXiv preprint arXiv:2406.09598.
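Step 1) above is essentially glue code between the Aria Research Kit and ROS 2. As a rough illustration, here is a minimal sketch of a node that replays the RGB stream of a recorded Aria VRS file onto a ROS 2 image topic; the projectaria_tools calls follow its documented Python API but should be checked against the installed version, and the topic name, replay rate, and file path are placeholder assumptions.

    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image
    from projectaria_tools.core import data_provider  # Aria Research Kit tooling

    class AriaRgbPublisher(Node):
        """Replays the RGB stream of a recorded Aria VRS file onto a ROS 2 topic."""

        def __init__(self, vrs_path: str):
            super().__init__("aria_rgb_publisher")
            self.pub = self.create_publisher(Image, "/aria/rgb", 10)  # illustrative topic
            self.provider = data_provider.create_vrs_data_provider(vrs_path)
            self.stream = self.provider.get_stream_id_from_label("camera-rgb")
            self.index = 0
            self.timer = self.create_timer(0.1, self.publish_next)  # 10 Hz replay, arbitrary

        def publish_next(self):
            if self.index >= self.provider.get_num_data(self.stream):
                return
            frame, _record = self.provider.get_image_data_by_index(self.stream, self.index)
            arr = frame.to_numpy_array()  # H x W x 3 uint8 RGB image
            msg = Image()
            msg.header.stamp = self.get_clock().now().to_msg()
            msg.height, msg.width = arr.shape[0], arr.shape[1]
            msg.encoding = "rgb8"
            msg.step = msg.width * 3
            msg.data = arr.tobytes()
            self.pub.publish(msg)
            self.index += 1

    def main():
        rclpy.init()
        rclpy.spin(AriaRgbPublisher("recording.vrs"))  # placeholder path
        rclpy.shutdown()

    if __name__ == "__main__":
        main()

Live streaming from the glasses and time synchronization with the RealSense cameras and robot state would build on the same structure.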

Last edited: 25/11/2025

4 projects found.

Contact: Alessandro Crespi, +41 21 693 66 30