ROS4HRI is a collection of open-source resources built around the Human-Robot Interaction (HRI) paradigm, enabling robots to meaningfully perceive, interpret and respond to humans in shared spaces and applications.
At its heart, ROS4HRI is a set of ROS 1/ROS 2-compatible conventions, libraries, and tools designed to streamline the development of HRI systems. It aligns with, and implements, ROS REP-155.
➡️ Go to the DOCUMENTATION ⬅️
Within this organization you’ll find standard implementations and libraries for the REP-155 messages and abstractions, as well as modules for human skeleton tracking, face detection, full-body modeling, engagement estimation, and other key components for HRI applications. See the documentation above for a full list.
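As a taste of what the REP-155 conventions look like in practice, here is a minimal, hypothetical ROS 2 sketch that lists the faces currently tracked by a ROS4HRI-compliant perception node. It assumes the `hri_msgs` package (the REP-155 message definitions) is installed and that a face-detection module is publishing on `/humans/faces/tracked`; adapt topic names and QoS to your own setup.

```python
# Minimal sketch: subscribe to the REP-155 list of tracked faces.
# Assumes hri_msgs is installed and a ROS4HRI face detector is running.
import rclpy
from rclpy.node import Node
from hri_msgs.msg import IdsList  # REP-155 list of currently tracked IDs


class FaceListener(Node):
    def __init__(self):
        super().__init__('face_listener')
        # /humans/faces/tracked carries the IDs of all faces currently seen
        self.create_subscription(
            IdsList, '/humans/faces/tracked', self.on_faces, 10)

    def on_faces(self, msg: IdsList):
        # Each ID can be used to resolve per-face sub-topics
        # (e.g. /humans/faces/<id>/...), as described in REP-155.
        self.get_logger().info(f'Tracked faces: {list(msg.ids)}')


def main():
    rclpy.init()
    rclpy.spin(FaceListener())


if __name__ == '__main__':
    main()
```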
We welcome contributions of all kinds: bug reports, feature requests, pull requests, documentation improvements, and new HRI modules.
To contribute:
- Fork the target repository.
- Create a feature or fix branch.
- Make sure your code follows the existing style conventions and passes any included tests.
- Submit a Pull Request and reference the issue(s) it addresses.
- Engage in review — we aim for clarity, correctness and maintainability.
- Check the Issues tab of each repository for upcoming work and current priorities.
- If you build an HRI use-case with ROS4HRI, we'd love to hear about it: drop us a link or a case study.
Relevant academic references include:
- Mohamed, Lemaignan, ROS for Human-Robot Interaction
- Ros, Lemaignan, Ferrini, Andriella, Irisarri, ROS4HRI: Standardising an Interface for Human-Robot Interaction
- Lemaignan, Ferrini, Probabilistic fusion of persons' body features: the Mr. Potato algorithm
- Lemaignan, Ferrini, Gebelli, Ros, Juricic, Cooper, Hands-on: From Zero to an Interactive Social Robot using ROS4HRI and LLMs
- Alameda-Pineda et al., Socially Pertinent Robots in Gerontological Healthcare (first real-world deployment of a fully autonomous robot using ROS4HRI)
Drop us a line (by starting a new discussion ➡️) if you want to see your research featured here!
Unless otherwise stated in a specific module, all material in this organization is available under the Apache License, Version 2.0; see each repository for details.

