Project Database
This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for those projects (see https://biorob.epfl.ch/students/ for instructions). To enroll in a project, please contact one of the assistants directly (in his/her office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to BioRob's research topics; see the BioRob Research pages and the results of previous student projects.
Search filter: only projects matching the keyword Linux are shown here.
Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics
Computational Neuroscience
755 – High-performance encoder-decoder design for computational neural signal processing
Category: semester project, master project (full-time), internship
Keywords: Computational Neuroscience, Data Processing, Linux, Programming, Python
Type: 20% theory, 5% hardware, 75% software
Responsible: (MED11626, phone: 41783141830)
Description:
Background: Brain-computer interfaces (BCIs) using signals acquired with intracortical implants have achieved high-dimensional control of robotic devices for completing daily tasks. However, the substantial medical and surgical expertise required to correctly implant and operate these systems greatly limits their use beyond a few clinical cases. A non-invasive counterpart that requires less intervention while still providing high-quality control would profoundly improve the integration of BCIs into multiple settings, and represents a nascent research field: brain robotics. This is challenging, however, due to the inherent complexity of neural signals and the difficulty of online neural decoding with efficient algorithms. Moreover, brain signals evoked by an external stimulus (e.g., vision) are the most widely used in BCI-based applications, but such stimuli are impractical in dynamic yet constrained environments. This raises several questions: How can the constraints associated with stimulus-based signals be circumvented? Is it feasible to read brain signals with non-invasive BCIs, and if so, how? Going a step further, could complete semantic command phrases be decoded accurately in real time, enabling seamless and natural brain-robot systems for control and interaction?
The project is intended for a team of 1-2 Master's students; tasks will be broken down and assigned to each student according to their skill set. To be implemented and delivered at the end of the project: 1) a method package for brain-signal pre-processing and feature formulation (see the sketch after this entry); 2) an algorithm package comprising an encoder and a decoder for neural signals; 3) a model for training on brain signals with spatial and temporal features.
Last edited: 13/05/2025
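As an illustration of deliverable 1, here is a minimal sketch in Python of a pre-processing and feature-formulation step. The sampling rate, channel count, and frequency bands are illustrative assumptions, not project specifications; the actual pipeline will depend on the recording hardware and paradigm chosen.

# Minimal sketch of pre-processing + band-power features for multichannel
# EEG-like data. FS, BANDS, and the window size are assumed placeholders.
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 250.0  # assumed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # example bands

def bandpass(x, low=1.0, high=40.0, fs=FS, order=4):
    # Zero-phase band-pass filter, applied per channel (x: channels x samples)
    b, a = butter(order, [low, high], btype="bandpass", fs=fs)
    return filtfilt(b, a, x, axis=-1)

def band_power_features(x, fs=FS):
    # Average spectral power per band and channel -> flat feature vector
    f, pxx = welch(x, fs=fs, nperseg=int(fs), axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (f >= lo) & (f < hi)
        feats.append(pxx[..., mask].mean(axis=-1))
    return np.concatenate(feats)  # length: n_bands * n_channels

# Example: one 2-second window of 8-channel synthetic data
window = np.random.randn(8, int(2 * FS))
features = band_power_features(bandpass(window))

Feature vectors of this kind (or raw filtered windows) would then feed the encoder-decoder of deliverable 2.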
Mobile robotics
740 – Firmware development and teleoperation control of robotic assistive furniture
Category: semester project, master project (full-time)
Keywords: C, C++, Communication, Embedded Systems, Firmware, Linux, Programming, Robotics
Type: 10% theory, 20% hardware, 70% software
Responsible: (phone: 37432)
Description:
This project aims to develop an application for remote teleoperation of a swarm of mobile assistive furniture. The program will allow a user to securely operate the mobile furniture remotely, as well as to define a desired furniture arrangement in the room. On the firmware side, we currently use an Arduino Mega board to control the robot and rely on an ESP32 board or Bluetooth for teleoperation. On the software side, we use ROS or MQTT for communication and Android for the tablet control interface (a minimal MQTT sketch follows this entry).
Related work:
[1] Real-Time Localization for Closed-Loop Control of Assistive Furniture, IEEE Robotics and Automation Letters, vol. 8, no. 8, August 2023. https://ieeexplore.ieee.org/document/10155264
[2] Velocity Potential Field Modulation for Dense Coordination of Polytopic Swarms and Its Application to Assistive Robotic Furniture, IEEE Robotics and Automation Letters, vol. 10, no. 7, July 2025. https://ieeexplore.ieee.org/document/11027457
Last edited: 24/08/2025
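For the MQTT option mentioned above, the teleoperation command path could look like the following Python sketch. The broker address, topic layout, and JSON command schema are assumptions made for illustration, not the lab's existing protocol.

# Hypothetical teleoperation command publisher (paho-mqtt).
import json
import paho.mqtt.publish as publish

BROKER = "192.168.1.10"            # assumed LAN broker reachable by the ESP32
TOPIC = "furniture/{id}/cmd_vel"   # hypothetical per-robot command topic

def send_velocity(robot_id, v, omega):
    # Publish one linear/angular velocity setpoint for a single furniture robot
    payload = json.dumps({"v": v, "omega": omega})
    publish.single(TOPIC.format(id=robot_id), payload, qos=1, hostname=BROKER)

send_velocity(robot_id=3, v=0.2, omega=0.0)  # drive robot 3 straight at 0.2 m/s

On the firmware side, the ESP32 could subscribe to the same topic and forward the setpoints to the Arduino Mega motor controller over a serial link.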
732 – Body language control interface of a swarm of assistive robotic furniture using machine learning
Category: semester project, master project (full-time)
Keywords: C++, Kinect, Linux, Machine learning, Python, Robotics, Vision
Type: 45% theory, 10% hardware, 45% software
Responsible: (phone: 37432)
Description:
Furniture is undergoing a significant transformation, evolving from static objects in the indoor environment into active, mobile entities. These enhanced capabilities not only enable novel modes of interaction but also raise fundamental questions about how such systems should communicate with their users. In collaboration with Prof. Emmanuel Senft from the Human-Centered Robotics and AI group at EPFL IDIAP, and building upon recent advances in assistive robotic furniture developed at EPFL BioRob, this project investigates how robotic furniture can communicate with its user by adapting its motions to achieve defined communication goals.
The work builds on established systems in which the poses of both the mobile furniture and the human users are estimated and tracked using multi-view Kinect RGB-D cameras combined with a learning-based algorithm. Human motions, i.e., sequences of human poses, can be assigned meanings based on current studies of human body language, and can then be classified by the provided visual perception system using either geometric rules or a learning-based motion recognition algorithm (for example, a spatio-temporal graph neural network or a transformer). Once user commands are correctly identified, they are sent to the mobile furniture robots via the Robot Operating System (ROS 2) for execution, so that the user's requirements in the assistive environment are met (a minimal sketch of this step follows this entry). During execution, a multi-robot coordination algorithm is responsible for avoiding collisions and resolving deadlocks.
The project opens multiple avenues for exploration. Potential directions include developing more robust learning-based human action recognition algorithms, designing systematic and user-friendly body-language communication protocols, enabling feedback from the user, extending the system to multi-user scenarios, and accelerating the whole pipeline. Comprehensive real-world experiments will be conducted to evaluate and validate both the functional capabilities and the overall performance of the proposed system.
Related work:
[1] Real-Time Localization for Closed-Loop Control of Assistive Furniture, IEEE Robotics and Automation Letters, vol. 8, no. 8, August 2023. https://ieeexplore.ieee.org/document/10155264
[2] Velocity Potential Field Modulation for Dense Coordination of Polytopic Swarms and Its Application to Assistive Robotic Furniture, IEEE Robotics and Automation Letters, vol. 10, no. 7, July 2025. https://ieeexplore.ieee.org/document/11027457
Last edited: 24/08/2025
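As a rough illustration of the command-forwarding step above, the Python sketch below publishes a classified body-language label as a furniture command over ROS 2 (rclpy). The topic name, message type, and gesture-to-command mapping are illustrative assumptions, not the existing system's interface.

# Hypothetical bridge from a gesture classifier to a ROS 2 command topic.
import rclpy
from std_msgs.msg import String

# Assumed mapping from recognized gestures to furniture commands
GESTURE_TO_COMMAND = {
    "beckon": "approach_user",
    "wave_away": "retreat",
    "point_left": "move_left",
}

def main():
    rclpy.init()
    node = rclpy.create_node("gesture_command_bridge")
    pub = node.create_publisher(String, "/furniture/command", 10)

    label = "beckon"  # stand-in for the output of the motion classifier
    msg = String()
    msg.data = GESTURE_TO_COMMAND.get(label, "idle")
    pub.publish(msg)
    node.get_logger().info("sent command: " + msg.data)

    node.destroy_node()
    rclpy.shutdown()

if __name__ == "__main__":
    main()

In the full system, the classifier output would arrive continuously (e.g., via a subscription to the perception pipeline) rather than from a hard-coded label.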
3 projects found.