EPFL – École polytechnique fédérale de Lausanne
Biorobotics Laboratory (BioRob)
    Project Database

    This page contains the database of possible research projects for master and bachelor students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for these projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in person at his/her office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.

    Search filter: only projects matching the keyword Data Processing are shown here.


    Amphibious robotics

    767 – Data collection pipeline for sensorized amphibious robot experiments
    Category: semester project, master project (full-time)
    Keywords: 3D, C, C++, Communication, Computer Science, Data Processing, Experiments, Firmware, Image Processing, Motion Capture, Programming, Python, Robotics, Synchronization, Vision
    Type: 5% theory, 20% hardware, 75% software
    Responsible: (MED 1 1626, phone: 38676)
    Description:

    This project has been taken

    In this project, the student will work closely with other team members to develop data-collection pipelines for experiments with a sensorized amphibious robot and, optionally, use them to collect and analyze experimental data. Specifically, the student needs to:

    • Build a multi-camera system for tracking 3-D kinematics of the robots, ideally in real time. The system is expected to work in both indoor and outdoor experiments. We already have a few working setups, and the student needs to replicate them using new hardware, calibrate the system, and implement real-time tracking nodes in ROS2.
    • Synchronize data collected from multiple sources: cameras, force sensors, motors, etc.
    • Visualize the data collected in Mujoco viewer, RViz, Blender, or other 3D visualizers.
    • Help collect experimental data.
    • (For master project only) Help analyze data or use learning algorithms to find underlying patterns.
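    As a sketch of the synchronization step above, recordings from multiple sources (cameras, force sensors, motors) can be aligned offline by matching each reference timestamp to the nearest sample in every other stream. This is a minimal, hypothetical illustration with made-up function names, not the lab's actual pipeline:

```python
import bisect

def nearest_sample(timestamps, t):
    """Index of the sample whose timestamp is closest to t.

    timestamps must be sorted in ascending order.
    """
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    # choose whichever neighbor is closer to t
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def synchronize(reference_ts, streams):
    """For each reference timestamp, pick the nearest sample of each stream.

    streams: dict name -> (timestamps, values), timestamps sorted ascending.
    Returns one dict per reference timestamp.
    """
    rows = []
    for t in reference_ts:
        row = {"t": t}
        for name, (ts, values) in streams.items():
            row[name] = values[nearest_sample(ts, t)]
        rows.append(row)
    return rows
```

    In a real setup, clocks would first be brought to a common time base (e.g., via hardware triggers or ROS 2 time), and nearest-neighbor matching applied afterwards.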

    The student is expected to be familiar with programming in C/C++ and Python, using ROS2, and robot kinematics. Experience with Docker, Linux kernel, communication protocols, and computer vision algorithms would be a bonus.

    Students interested in this project should send the following materials to the assistant: (1) a resume, (2) a transcript showing relevant courses and grades, and (3) any other materials that demonstrate their skills and project experience (videos, slides, code repositories, etc.).



    Last edited: 17/01/2026

    Computational Neuroscience

    755 – High-performance encoder-decoder design for computational neural signal processing
    Category: semester project, master project (full-time), internship
    Keywords: Computational Neuroscience, Data Processing, Linux, Programming, Python
    Type: 20% theory, 5% hardware, 75% software
    Responsible: (MED 1 1626, phone: +41 78 314 18 30)
    Description:
    Background: Brain-computer interfaces (BCIs) using signals acquired with intracortical implants have enabled successful high-dimensional robotic device control, making it possible to complete daily tasks. However, the substantial medical and surgical expertise required to correctly implant and operate these systems greatly limits their use beyond a few clinical cases. A non-invasive counterpart that requires less intervention and can provide high-quality control would profoundly improve the integration of BCIs into multiple settings, representing a nascent research field known as brain robotics. This is challenging, however, due to the inherent complexity of neural signals and the difficulty of online neural decoding with efficient algorithms. Moreover, brain signals evoked by an external stimulus (e.g., vision) are the most widely used in BCI-based applications, yet they are impractical in dynamic and constrained environments. Two questions arise: how can the constraints associated with stimulus-based signals be circumvented, and is it feasible to read brain signals with non-invasive BCIs, and if so, how? Going a step further, could complete, semantic-based command phrases be decoded accurately in real time, enabling seamless and natural brain-robot systems for control and interaction?
    The project is for a team of 1-2 Master's students; tasks will be assigned to each student according to their skill set. What needs to be implemented and delivered at the end of the project:
    1. A method package for brain-signal (MEG and EEG) pre-processing and feature formulation.
    2. An algorithm package with an encoder and a decoder for neural signals.
    3. A model trained on brain signals with spatial and temporal features.
    Importance: we have well-documented tutorials on how to use the brain-signal dataset, how to use the HPC cluster to train the encoder and decoder, and a complete pipeline to decode EEG-image pairs and MEG-audio pairs. Please come to office MED 0 1612.
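    As a rough illustration of the pre-processing deliverable, a common first step for EEG/MEG data is re-referencing followed by epoching around event onsets. The sketch below assumes a NumPy channel-by-sample array layout and made-up parameter names; it is not the lab's tutorial pipeline:

```python
import numpy as np

def preprocess(eeg, fs, event_samples, tmin, tmax):
    """Minimal EEG pre-processing: common average reference + epoching.

    eeg: (n_channels, n_samples) array
    fs: sampling rate in Hz
    event_samples: sample indices of event onsets
    tmin, tmax: epoch window in seconds relative to each event
    Returns an (n_events, n_channels, n_epoch_samples) array.
    """
    # Common average reference: subtract the mean across channels
    # at every time point to suppress noise shared by all electrodes.
    referenced = eeg - eeg.mean(axis=0, keepdims=True)

    start = int(round(tmin * fs))
    stop = int(round(tmax * fs))
    epochs = []
    for s in event_samples:
        # keep only epochs that fit entirely inside the recording
        if s + start >= 0 and s + stop <= referenced.shape[1]:
            epochs.append(referenced[:, s + start:s + stop])
    return np.stack(epochs)
```

    A real pipeline would add band-pass filtering and artifact rejection before epoching; the epochs would then feed the encoder/decoder stage.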

    Last edited: 11/12/2025

    Miscellaneous

    765 – Validity and Reliability of IMU-Based Jump Test Analysis
    Category: master project (full-time)
    Keywords: Data Processing, Motion Capture, Programming, Python, Statistical analysis
    Type: 10% theory, 90% software
    Responsible: (MED 0 1016, phone: 32468)
    Description: Optizone has created a cutting-edge suite of algorithms that estimate athletes’ fitness levels through widely recognized performance tests such as the drop jump, squat jump, repetitive jump, hop test, and velocity-based training. These algorithms have the potential to transform athletic monitoring by providing fast, data-driven insights into performance and readiness. This project focuses on putting these algorithms to the test: their accuracy and consistency will be rigorously evaluated against a gold-standard motion analysis system in a controlled laboratory setting. Using a structured protocol, athlete performance data will be collected, preprocessed, and subjected to in-depth statistical analysis to determine both the reliability (how consistent the results are) and validity (how well the algorithms reflect true performance). By bridging advanced algorithm development with scientific validation, this study aims to strengthen confidence in Optizone’s technology and lay the foundation for smarter, evidence-based training and injury-prevention strategies.
    Jump tests:
    • Drop Jump
    • Squat Jump
    • Velocity-Based Training
    Project Phases:
    • Data Collection: Acquire athlete performance data in a motion analysis lab under standardized conditions.
    • Data Preprocessing: Clean, structure, and prepare the dataset for analysis.
    • Statistical Analysis: Apply statistical methods to assess Optizone’s algorithms against the gold-standard reference system.
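    For the statistical-analysis phase above, validity against a gold-standard system is often summarized with Bland-Altman statistics: the mean difference (bias) and the 95% limits of agreement. A minimal, self-contained sketch of that computation (an illustration, not Optizone's actual method):

```python
import math

def bland_altman(device, gold):
    """Bland-Altman agreement between a device and a gold-standard reference.

    Returns (bias, lower_loa, upper_loa): the mean difference and the
    95% limits of agreement (bias +/- 1.96 * SD of the differences).
    """
    diffs = [d - g for d, g in zip(device, gold)]
    n = len(diffs)
    bias = sum(diffs) / n
    # sample variance of the paired differences
    var = sum((x - bias) ** 2 for x in diffs) / (n - 1)
    sd = math.sqrt(var)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

    Reliability (consistency across repeated trials) would typically be reported separately, e.g., with an intraclass correlation coefficient.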


    Last edited: 29/08/2025
    763 – Workload Estimation and Action Classification in Basketball Using IMU Sensors
    Category: master project (full-time)
    Keywords: Data Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
    Type: 10% theory, 90% software
    Responsible: (MED 0 1016, phone: 32468)
    Description: In modern basketball, accurately monitoring player workload and identifying specific movement patterns are critical for optimizing performance, reducing injury risk, and tailoring individualized training programs. However, many existing workload assessment tools are not fine-tuned to capture the complex and explosive actions typical in basketball. This project aims to develop a sensor-based system that can estimate physical workload and classify basketball-specific movements using only Inertial Measurement Unit (IMU) sensors. Data will be collected from athletes during structured training sessions, with a focus on high-intensity basketball actions such as rebounds, layups, jump shots, sprints, direction changes, and defensive movements. The primary objective is to create algorithms capable of:
    • Estimating workload metrics (e.g., jump count, movement intensity, acceleration patterns)
    • Classifying basketball actions based on IMU-derived motion signatures.
    Video recordings will be used solely to verify and annotate the IMU data, serving as ground truth for validating the accuracy of the developed classification and workload-estimation models. This project will result in a practical, sport-specific tool for coaches, trainers, and sports scientists to monitor performance and manage training loads using compact wearable technology, without relying on complex camera setups or external tracking systems. Data collection is part of the project.
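    As an illustrative sketch of one workload metric mentioned above, jump count can be estimated from vertical acceleration with a simple threshold-and-refractory peak detector. Units of g and the default parameter values are assumptions for illustration, not the project's algorithm:

```python
def count_jumps(accel_z, fs, threshold=2.0, min_separation=0.3):
    """Count jumps from vertical acceleration (in g) using a simple
    threshold-crossing detector with a refractory period.

    fs: sampling rate in Hz
    threshold: acceleration (g) a sample must exceed to count as a jump
    min_separation: minimum time (s) between two distinct jumps
    """
    min_gap = int(min_separation * fs)
    jumps = 0
    last_peak = -min_gap  # allows a detection at index 0
    for i, a in enumerate(accel_z):
        # count a new jump only after the refractory window has passed
        if a > threshold and i - last_peak >= min_gap:
            jumps += 1
            last_peak = i
    return jumps
```

    Action classification would build on richer features (multi-axis signatures, gyroscope data) and a learned model rather than fixed thresholds.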

        Last edited: 29/08/2025
    762 – Multimodal sensor fusion for enhanced biomechanical profiling in football: integrating IMU and video data from vertical jump tests
    Category: master project (full-time)
    Keywords: Data Processing, Image Processing, Machine learning, Motion Capture, Programming, Python, Sensor Fusion
    Type: 100% software
    Responsible: (MED 0 1016, phone: 32468)
    Description: Raw video shows the motion; IMUs reveal the accelerations and orientations. Combined, they unlock new biomechanical precision. This project focuses on developing a sensor fusion framework that synchronizes video recordings and inertial measurement unit (IMU) data to compute enhanced biomechanical metrics from jump tests (bilateral and unilateral CMJ, drop jump). The core aim is to overcome the limitations of each modality alone, combining the spatial richness of video with the temporal and acceleration precision of IMUs. You have access to a dataset consisting of 25 players, collected in the lab with an infrared motion-tracking system. Traditional biomechanical analysis in sport often relies on expensive lab equipment and manual video inspection. Your work could lay the foundation for next-generation performance monitoring systems that are low-cost, field-deployable, and data-rich.
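    One common way to temporally align video-derived and IMU signals is to find the sample lag that maximizes their cross-correlation. A minimal sketch, under the assumption that both signals have already been resampled to a common rate (an illustration, not the project's prescribed method):

```python
def best_lag(a, b, max_lag):
    """Lag (in samples) of b relative to a that maximizes cross-correlation.

    A positive result means b is delayed with respect to a by that many
    samples. Both signals must share the same sampling rate.
    """
    def xcorr(lag):
        # sum of a[i] * b[i + lag] over the overlapping region
        if lag >= 0:
            pairs = zip(a, b[lag:])
        else:
            pairs = zip(a[-lag:], b)
        return sum(x * y for x, y in pairs)

    return max(range(-max_lag, max_lag + 1), key=xcorr)
```

    In practice one would correlate a feature visible in both modalities, e.g., the vertical acceleration burst at take-off against the video-derived center-of-mass velocity.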

    Last edited: 29/08/2025

    5 projects found.

    Contact: Alessandro Crespi, +41 21 693 66 30

    © 2026 EPFL, all rights reserved