Project Database
This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for these projects (see https://biorob.epfl.ch/students/ for instructions). To enrol in a project, please contact one of the assistants directly (in his/her office, by phone, or by e-mail). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.
Search filter: only projects matching the keyword Machine learning are shown here.
Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics
Mobile robotics
| 768 – Aria2Robot: Egocentric Meta Zürich Wearable Glasses for Robot Manipulation |
| Category: | semester project, master project (full-time), internship | |
| Keywords: | Machine learning, Programming, Python, Robotics, Vision | |
| Type: | 30% theory, 10% hardware, 60% software | |
| Responsible: | (MED11626, phone: +41 78 314 18 30) | |
Description:

INTRODUCTION: Egocentric wearable sensing is becoming a key enabler for embodied AI and robotics. Meta's Project Aria (https://www.projectaria.com/) research glasses provide rich, multimodal, first-person observations (RGB video, scene cameras, IMUs, microphones, eye tracking, and more) in a socially acceptable, all-day wearable form factor, specifically designed to advance research on egocentric AI, robotics, and contextual perception. In collaboration with Meta Zürich, we aim to tightly couple Aria research glasses with our existing manipulation platforms at EPFL. This project will:
1) integrate the Aria Research Kit with our ROS2-based robot platforms (ViperX 300S and WidowX-250 arms on a mobile base), including calibration and time synchronisation with RGB-D cameras and robot state;
2) design and execute egocentric data collection in household-like environments (Aria + RealSense + robot joint trajectories + language annotations);
3) explore one or more robotics applications powered by Aria signals, such as intention-aware teleoperation, egocentric demonstrations for policy learning, or vision-language(-action) fine-tuning for assistance tasks; and
4) perform systematic platform testing, validation, and documentation to deliver a reusable research pipeline for future projects.
Excellent programming skills (Python) are a plus.

IMPORTANCE: We have well-documented tutorials on using the robots, teleoperation interfaces for data collection, and the HPC cluster, as well as a complete pipeline for training robot policies. The Aria Research Kit and tools (recording, calibration, dataset tooling, SDK) will be integrated into this ecosystem, so the student can focus on the research questions rather than low-level setup.

WHAT MAKES ARIA SPECIAL FOR ROBOTICS: Project Aria glasses are multi-sensor "research smart glasses": multiple cameras (wide FOV), IMUs, microphones, eye gaze, and a Machine Perception Service (MPS) that provides SLAM poses, hand poses, etc. Meta explicitly positions them as a research kit for contextual AI and robotics, i.e. using egocentric sensing to build embodied agents that understand and act in the world. Compared to a normal RGB-D camera, Aria gives you:
- an egocentric view: "what the human (or robot) sees" while acting;
- a calibrated head pose/trajectory (via SLAM in MPS);
- hand/gaze information (depending on which parts you use);
- a portable, wearable, socially acceptable form factor.
(See the sketches after this project entry for how these signals can be read and bridged into ROS2.)

WHAT WE HAVE:
[1] Ready-and-easy-to-use robot platforms: ViperX 300S and WidowX-250 arms, configured with four RealSense D405 cameras, various grippers, and a mobile robot platform.
[2] Egocentric sensing hardware: Meta Project Aria research glasses (via the collaboration with Meta Zürich), including access to the Aria Research Kit and tooling for data recording and processing.
[3] Computing resources: two desktop PCs with NVIDIA RTX 5090 and RTX 4090 GPUs.

Interested students can apply by sending an e-mail to sichao.liu@epfl.ch. Please attach your transcript and a short description of your past/current experience on related topics (robotics, computer vision, machine learning, AR/egocentric perception). The position remains open until final candidates are found.

Recommended reading:
[1] Project Aria resources: https://www.projectaria.com/resources/
[2] Project Aria GitHub: https://github.com/facebookresearch/projectaria_tools
[3] Liu V, Adeniji A, Zhan H, Haldar S, Bhirangi R, Abbeel P, Pinto L. EgoZero: Robot learning from smart glasses. arXiv preprint arXiv:2505.20290.
[4] Zhu LY, Kuppili P, Punamiya R, Aphiwetsa P, Patel D, Kareer S, Ha S, Xu D. EMMA: Scaling mobile manipulation via egocentric human data. arXiv preprint arXiv:2509.04443.
[5] Lai Y, Yuan S, Zhang B, Kiefer B, Li P, Deng T, Zell A. FAM-HRI: Foundation-model assisted multi-modal human-robot interaction combining gaze and speech. arXiv preprint arXiv:2503.16492.
[6] Banerjee P, Shkodrani S, Moulon P, Hampali S, Zhang F, Fountain J, Miller E, Basol S, Newcombe R, Wang R, Engel JJ. Introducing HOT3D: An egocentric dataset for 3D hand and object tracking. arXiv preprint arXiv:2406.09598.

Please come to office MED 01612.
Last edited: 11/12/2025
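To give a feel for the tooling, here is a minimal sketch of how recorded Aria data can be loaded with the projectaria_tools Python package (see [2] above). The file names recording.vrs and closed_loop_trajectory.csv are placeholders for the outputs of a recording session and the MPS service, and the exact calls should be verified against the current projectaria_tools documentation.

```python
# Minimal sketch: read RGB frames and MPS head poses from an Aria recording.
# Assumes `pip install projectaria-tools`; the file names below are
# placeholders for a real recording and its MPS trajectory output.
from projectaria_tools.core import data_provider
from projectaria_tools.core.mps import read_closed_loop_trajectory

provider = data_provider.create_vrs_data_provider("recording.vrs")

# Aria sensor streams are addressed by label; "camera-rgb" is the RGB camera.
rgb_stream = provider.get_stream_id_from_label("camera-rgb")

for i in range(provider.get_num_data(rgb_stream)):
    image_data, record = provider.get_image_data_by_index(rgb_stream, i)
    frame = image_data.to_numpy_array()      # HxWx3 uint8 image
    t_ns = record.capture_timestamp_ns       # device capture time in ns
    # ... feed (frame, t_ns) into a dataset or policy-learning pipeline ...

# MPS closed-loop trajectory: world-frame device poses from Aria SLAM.
poses = read_closed_loop_trajectory("closed_loop_trajectory.csv")
for p in poses:
    T_world_device = p.transform_world_device  # SE(3) head pose
    t = p.tracking_timestamp                   # time since device start
```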
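A complementary sketch of the ROS2 side: republishing Aria head poses as PoseStamped messages so they can be aligned with RealSense images and robot joint states. Node and topic names are hypothetical, and the constant clock offset stands in for the real Aria-to-robot time synchronisation the project would need to calibrate.

```python
# Minimal ROS2 sketch: republish Aria head poses as PoseStamped messages.
# Node/topic names are hypothetical; a real bridge needs a calibrated clock
# offset between the glasses and the robot (a constant placeholder here).
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class AriaPoseBridge(Node):
    def __init__(self):
        super().__init__("aria_pose_bridge")
        self.pub = self.create_publisher(PoseStamped, "/aria/head_pose", 10)
        self.clock_offset_ns = 0  # Aria-to-ROS clock offset, to be calibrated

    def publish_pose(self, t_device_ns, position, quaternion):
        """Publish one head pose, remapping the Aria timestamp to ROS time."""
        msg = PoseStamped()
        t_ros_ns = t_device_ns + self.clock_offset_ns
        msg.header.stamp.sec = t_ros_ns // 1_000_000_000
        msg.header.stamp.nanosec = t_ros_ns % 1_000_000_000
        msg.header.frame_id = "world"
        msg.pose.position.x, msg.pose.position.y, msg.pose.position.z = position
        (msg.pose.orientation.x, msg.pose.orientation.y,
         msg.pose.orientation.z, msg.pose.orientation.w) = quaternion
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = AriaPoseBridge()
    # ... in a real bridge, stream poses from the glasses, call
    # node.publish_pose(...) for each sample, then rclpy.spin(node) ...
    node.destroy_node()
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Downstream, a tool such as message_filters' ApproximateTimeSynchronizer could then align these pose messages with the RealSense and joint-state topics for synchronised data collection.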
One project found.