🤖 RIO Revolution - Transform your smartphone into a fully-featured ROS2 robot! Use the phone's wide array of built-in sensors (accelerometer, gyroscope, compass, GPS, NFC, IR, ambient light, fingerprint scanner, cameras, and mic/speaker) as ROS2 topics, services, and actions. With Lidar integration, RIO becomes an autonomously navigating companion robot that expresses emotions through animated facial expressions, and our custom hardware platform extends its functionality further.
- 📑 Table of Contents
- 🏛️ Rio Architecture
- ⚙️ Requirements
- 🚀 Getting Started
- 📡 RIO Interfaces
- 🔗 Reference Links
- Future Scope
- 🤝 Contributing
- 📄 License
- 📡 15+ built-in sensors as ROS2 interfaces
- 👁️ Programmable facial expressions displayed on the screen, enabling the robot to engage in conversation using its microphone and speaker.
- 🔌 Unified mobile-to-robot communication
- ⚡ RIO Control Board based on ESP32 (MicroROS compatible)
- 🔌 Easy-to-use plug-and-play ports:
- 2x Motors (controlled via a 2-channel L293D driver)
- 2x Quadrature Encoders
- 2x Servos with a 5V 2A power output
- 1x I2C port with 5V power supply
- 1x WS2812B LED Strip
- 1x Lidar sensor
- 1x 7-12V DC power jack
- Type-C programming connector
- 📡 Seamless integration with RPLIDAR A1M8 for enhanced mapping capabilities
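The two motors with quadrature encoders form a standard differential-drive base, so `/cmd_vel` Twist commands map to wheel speeds with the usual inverse kinematics. A minimal sketch (the wheel radius and separation below are hypothetical placeholders, not RIO's actual dimensions):

```python
import math

# Hypothetical geometry -- substitute your robot's measured values.
WHEEL_RADIUS = 0.033      # meters
WHEEL_SEPARATION = 0.16   # meters, distance between the two wheels

def twist_to_wheel_speeds(linear_x, angular_z):
    """Differential-drive inverse kinematics: map a Twist command
    (m/s, rad/s) to left/right wheel angular velocities (rad/s)."""
    v_left = linear_x - angular_z * WHEEL_SEPARATION / 2.0
    v_right = linear_x + angular_z * WHEEL_SEPARATION / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Pure rotation: the wheels spin in opposite directions at equal speed.
left, right = twist_to_wheel_speeds(0.0, 1.0)
```

The same relation runs in reverse on the ESP32 firmware side to turn encoder feedback into odometry.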
- Assembled RIO Robot with Control PCB (BOM, assembly and firmware flashing.)
- Android Smartphone
- ROS 2 Humble (Recommended)
- Ollama Installation (Local LLM Execution)
- Groq API Key (optional; only needed for cloud LLM execution when Ollama is not used). Create a Groq account, get an API key, and set it in the GROQ_API_KEY environment variable:
export GROQ_API_KEY=<your_groq_api_key>
To permanently set the API key, add it to your ~/.bashrc file:
echo "export GROQ_API_KEY=<your_groq_api_key>" >> ~/.bashrc
- RIO Companion App (Play Store)
Note: For a complete ROS 2 installation directly on the Android device itself, refer to our Android Installation Guide
# Source ROS installation
source /opt/ros/$ROS_DISTRO/setup.bash
# Set up Micro-ROS workspace
mkdir -p ~/uros_ws/src
cd ~/uros_ws/src
git clone -b $ROS_DISTRO https://github.com/micro-ROS/micro_ros_setup.git
# Build Micro-ROS
cd ~/uros_ws
rosdep update && rosdep install --from-paths src --ignore-src -y
colcon build
source install/setup.bash
# Build Micro-ROS Agent
ros2 run micro_ros_setup create_agent_ws.sh
ros2 run micro_ros_setup build_agent.sh
source install/setup.bash
# Set up RIO workspace
mkdir -p ~/rio_ws/src
cd ~/rio_ws/src
git clone https://github.com/botforge-robotics/rio_ros2.git
# Build RIO packages
cd ~/rio_ws
rosdep install --from-paths src --ignore-src -r -y
colcon build
source install/setup.bash
Add these to your ~/.bashrc for automatic sourcing in new terminals:
# 2.1 Add ROS source
echo "source /opt/ros/$ROS_DISTRO/setup.bash" >> ~/.bashrc
# 2.2 Add Micro-ROS workspace source
echo "source ~/uros_ws/install/setup.bash" >> ~/.bashrc
# 2.3 Add RIO workspace source
echo "source ~/rio_ws/install/setup.bash" >> ~/.bashrc
Important: For existing terminals, manually run these commands or restart your shell:
# 2.4 Manual sourcing for existing terminals
source /opt/ros/$ROS_DISTRO/setup.bash
source ~/uros_ws/install/setup.bash
source ~/rio_ws/install/setup.bash
The mobile nodes launch file (mobile_nodes.launch.py) starts components related to the smartphone functionality:
# Launch mobile-related nodes
ros2 launch rio_bringup mobile_nodes.launch.py llm_backend:=ollama
This launch file includes:
- Ollama LLM Node: local LLM for robot interactions
- Groq LLM Node: cloud LLM for robot interactions
- WebRTC Node: Video streaming server (port 8080)
- Rosbridge WebSocket: Enables ROS2-to-WebSocket communication
Mobile Node Launch Parameters:
| Parameter | Description | Default Value | Options |
|---|---|---|---|
| llm_backend | LLM backend | ollama | ollama, groq |
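The Rosbridge WebSocket node speaks the rosbridge v2 protocol: clients exchange JSON text frames over a WebSocket connection, so any language with a WebSocket library can talk to RIO. A sketch of the two most common frames (the topic names below come from the RIO interface list; the helper functions themselves are illustrative, not part of RIO):

```python
import json

def subscribe_msg(topic, msg_type):
    """Build a rosbridge v2 subscription request, sent as a JSON
    text frame once the WebSocket connection is open."""
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

def publish_msg(topic, msg):
    """Build a rosbridge v2 publish request carrying a message payload
    expressed as a plain dict mirroring the ROS message fields."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# Frames a WebSocket client could send to rosbridge:
sub = subscribe_msg("/battery", "sensor_msgs/BatteryState")
pub = publish_msg("/cmd_vel", {"linear": {"x": 0.1, "y": 0.0, "z": 0.0},
                               "angular": {"x": 0.0, "y": 0.0, "z": 0.3}})
```

After subscribing, incoming frames arrive as JSON objects with `"op": "publish"` and the message under `"msg"`.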
The PCB nodes launch file (pcb_nodes.launch.py) manages hardware-related components:
# Launch PCB-related nodes
ros2 launch rio_bringup pcb_nodes.launch.py agent_port:=8888
PCB Launch Parameters:
| Parameter | Description | Default Value |
|---|---|---|
| agent_port | Micro-ROS agent UDP port | 8888 |
This launch file includes:
- Micro-ROS Agent: Handles communication with ESP32
- Odometry TF Broadcaster: Publishes transform data
- LIDAR UDP Node: Manages LIDAR sensor data
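The odometry published on `/odom` is dead-reckoned from the quadrature encoder ticks. A minimal sketch of the update step (the tick count, wheel radius, and separation are hypothetical placeholders, not RIO's actual specs):

```python
import math

# Hypothetical constants -- replace with your encoder/wheel specs.
TICKS_PER_REV = 1440
WHEEL_RADIUS = 0.033      # meters
WHEEL_SEPARATION = 0.16   # meters

def update_odometry(x, y, theta, d_ticks_left, d_ticks_right):
    """Dead-reckoning pose update from quadrature encoder tick deltas,
    using the midpoint heading for the position integration."""
    dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS / TICKS_PER_REV
    d_left = d_ticks_left * dist_per_tick
    d_right = d_ticks_right * dist_per_tick
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_SEPARATION
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Straight-line motion: equal tick counts leave the heading unchanged.
x, y, theta = update_odometry(0.0, 0.0, 0.0, 720, 720)
```

The Odometry TF Broadcaster then publishes the resulting pose as the odom-to-base transform.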
# 5.1 Launch real robot nodes
ros2 launch rio_bringup rio_real_robot.launch.py \
use_sim_time:=false \
agent_port:=8888 \
llm_backend:=ollama
Real Robot Parameters:
| Parameter | Description | Default Value | Options |
|---|---|---|---|
| use_sim_time | Use simulation clock (must be false for real hardware) | false | true/false |
| agent_port | Micro-ROS agent UDP port | 8888 | Any available port number |
| llm_backend | LLM backend | ollama | ollama, groq |
# 3.1 Launch Gazebo simulation
ros2 launch rio_simulation gazebo.launch.py \
world:=house.world \
use_sim_time:=true
Simulation Parameters:
| Parameter | Description | Default Value | Options |
|---|---|---|---|
| world | Gazebo world file | empty.world | house.world, warehouse.world |
| use_sim_time | Use simulation clock | true | true/false |
# 5.1.1 Launch SLAM mapping
ros2 launch rio_mapping mapping.launch.py \
use_sim_time:=false \
use_gui:=false
Note: Refer to the mapping parameters in rio_mapping/params/mapping_config.yaml for any modifications.
Mapping Parameters:
| Parameter | Description | Default Value | Options |
|---|---|---|---|
| use_sim_time | Use simulation clock | false | true/false |
| use_gui | Open RViz2 GUI | false | true/false |
# 5.2.1.1 Launch joystick teleop
ros2 launch rio_teleop teleop_joy.launch.py
Note: Refer to the joystick parameters in rio_teleop/params/joystick.yaml for any modifications.
# 5.2.2.1 Install RQT if not already installed
sudo apt install ros-$ROS_DISTRO-rqt-robot-steering
# 5.2.2.2 Launch RQT Robot Steering
ros2 run rqt_robot_steering rqt_robot_steering
Tip: Ensure your robot's teleop topic is correctly configured. Typical topics include:
/cmd_vel for driving the robot
# 5.3.1 Save created map
ros2 launch rio_mapping save_map.launch.py map_name:=<map_name_here>
This saves the map files inside the rio_mapping/maps/ folder.
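A saved map is a pair of files: an occupancy-grid image and a YAML metadata file describing it, in the standard ROS map_server format. A sketch of what the metadata typically looks like (the file name and values below are illustrative, not RIO defaults):

```yaml
# <map_name_here>.yaml -- standard map_server metadata (illustrative values)
image: <map_name_here>.pgm      # occupancy-grid image saved alongside
resolution: 0.05                # meters per pixel
origin: [0.0, 0.0, 0.0]         # x, y, yaw of the lower-left pixel
negate: 0                       # 0 = white is free, black is occupied
occupied_thresh: 0.65           # occupancy probability above this = occupied
free_thresh: 0.196              # occupancy probability below this = free
```

The navigation launch file loads this YAML (via its map_name parameter) to localize against the saved map.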
# 5.4.1 Launch navigation
ros2 launch rio_navigation navigation.launch.py \
map_name:=house.yaml \
params_file:=nav2_real_params.yaml \
use_sim_time:=false \
use_gui:=false
Navigation Parameters:
| Parameter | Description | Default Value | Options |
|---|---|---|---|
| map_name | Map file for navigation | house.yaml | Any YAML map file name |
| params_file | Navigation parameters file | nav2_real_params.yaml | nav2_real_params.yaml, nav2_sim_params.yaml |
| use_sim_time | Use simulation clock | false | true/false |
| use_gui | Open RViz2 GUI | false | true/false |
# 6.1 Launch RViz
ros2 launch rio_simulation rviz.launch.py \
rviz_config:=default.rviz
Visualization Parameters:
| Parameter | Description | Default Value | Options |
|---|---|---|---|
| rviz_config | RViz config file | default.rviz | Any .rviz config |
- /battery (sensor_msgs/BatteryState) - Battery status information
- /expression (std_msgs/String) - Current facial expression
- /gps (sensor_msgs/NavSatFix) - GPS location data
- /illuminance (sensor_msgs/Illuminance) - Ambient light sensor readings
- /imu/absolute_orientation (geometry_msgs/Vector3Stamped) - Absolute orientation using magnetic reference
- /imu/data (sensor_msgs/Imu) - Combined IMU data with acceleration, velocity, and orientation
- /imu/heading (std_msgs/Float32) - Compass heading in degrees (0-360°)
- /imu/linear_acceleration (geometry_msgs/Vector3Stamped) - User acceleration without gravity (m/s²)
- /imu/mag (sensor_msgs/MagneticField) - Magnetometer readings in μT (microtesla)
- /imu/orientation (geometry_msgs/Vector3Stamped) - Device orientation (pitch, roll, yaw)
- /odom (nav_msgs/Odometry) - Robot odometry data
- /scan (sensor_msgs/LaserScan) - LIDAR scan data
- /sonar (sensor_msgs/Range) - Ultrasonic sensor range data
- /speech_recognition/hotword_detected (std_msgs/Empty) - Wake word detection
- /speech_recognition/result (std_msgs/String) - Recognized speech text
- /speech_recognition/status (std_msgs/String) - Speech recognition system status ("listening", "done")
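Note that the compass heading on /imu/heading follows the navigation convention (degrees clockwise from north), while ROS orientation follows REP 103 (radians counterclockwise, ENU frame), so a conversion is needed before feeding it to ROS tooling. A sketch (the helper names are illustrative, not part of RIO):

```python
import math

def compass_to_enu_yaw(heading_deg):
    """Convert a compass heading (degrees clockwise from north, as
    published on /imu/heading) to an ENU yaw angle in radians
    (counterclockwise from east), per ROS REP 103."""
    return math.radians((90.0 - heading_deg) % 360.0)

def yaw_to_quaternion(yaw):
    """Pack a yaw-only rotation into a quaternion (x, y, z, w)."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# A heading of 90 deg (due east) corresponds to zero ENU yaw.
q = yaw_to_quaternion(compass_to_enu_yaw(90.0))
```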
- /cmd_vel (geometry_msgs/Twist) - Control the robot's linear and angular velocity
- /left_led (std_msgs/ColorRGBA) - Control left LED color with RGBA values (RGB: 0-255, Alpha: 0-255)
- /right_led (std_msgs/ColorRGBA) - Control right LED color with RGBA values (RGB: 0-255, Alpha: 0-255)
- /servoA (std_msgs/Int16) - Control servo A position in degrees (0-180)
- /servoB (std_msgs/Int16) - Control servo B position in degrees (0-180)
- /torch (std_msgs/Bool) - Control phone's flashlight (true = on, false = off)
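The LED topics can be driven straight from the ROS2 CLI with `ros2 topic pub`. Since std_msgs/ColorRGBA fields are floats, the channel values go out as floats even though the list above interprets them on a 0-255 scale. A hypothetical helper that clamps the channels and formats the command (not part of RIO itself):

```python
def led_pub_command(topic, r, g, b, a=255):
    """Format a `ros2 topic pub --once` command for the LED topics,
    clamping each channel to the 0-255 range the app expects and
    emitting floats to match the ColorRGBA field types."""
    clamp = lambda v: float(max(0, min(255, v)))
    return (f"ros2 topic pub --once {topic} std_msgs/msg/ColorRGBA "
            f'"{{r: {clamp(r)}, g: {clamp(g)}, b: {clamp(b)}, a: {clamp(a)}}}"')

# Out-of-range red is clamped to 255 before formatting.
cmd = led_pub_command("/left_led", 300, 0, 128)
```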
- Type: rio_interfaces/action/Auth
- Description: Authenticate using the phone's biometric sensors
- Usage:
# Send goal
ros2 action send_goal /auth rio_interfaces/action/Auth \
  "{message: 'Please authenticate to continue'}"
- Type: rio_interfaces/action/TTS
- Description: Converts text to speech with facial expressions
- Usage:
# Send goal
ros2 action send_goal /tts rio_interfaces/action/TTS \
  "{text: 'Hello, how are you?', voice_output: true, start_expression: 'happy', end_expression: 'neutral', expression_sound: false}"
- Type: rio_interfaces/srv/Camera
- Description: Control the phone's front/back cameras
- Usage:
# Enable front camera
ros2 service call /enable_camera rio_interfaces/srv/Camera \
  "{direction: 0, status: true}"
# Enable back camera
ros2 service call /enable_camera rio_interfaces/srv/Camera \
  "{direction: 1, status: true}"
# Disable camera
ros2 service call /enable_camera rio_interfaces/srv/Camera \
  "{direction: 0, status: false}"
- Parameters:
  - direction: 0 (front) or 1 (back)
  - status: true (enable) or false (disable)
- Type: rio_interfaces/srv/GetExpression
- Description: Get the current facial expression
- Usage:
# Get current expression
ros2 service call /expression_status rio_interfaces/srv/GetExpression "{}"
- Type: rio_interfaces/srv/Expression
- Description: Set the robot's facial expression
- Available Expressions:

| Expression | Description |
|---|---|
| afraid | Displays fear or concern |
| angry | Shows frustration or anger |
| blush | Embarrassed or shy |
| curious | Shows interest or curiosity |
| happy | Shows joy or pleasure |
| idle | Default neutral state |
| listening | Active listening mode |
| sad | Displays sadness |
| sleep | Power saving mode |
| speaking | Talking animation |
| surprise | Displays astonishment |
| thinking | Processing or computing |
| wakeup | Activation animation |

- Usage:
# Set happy expression with sound
ros2 service call /set_expression rio_interfaces/srv/Expression \
  "{expression: 'happy', expression_sound: true}"
- RIO Hardware - Hardware design files, BOM and assembly instructions
- RIO Firmware - Micro-ROS firmware for the RIO controller board
- ROS2 Android - Run ROS2 Humble directly on Android using Termux
- Implement existing mobile sensors to enhance RIO's capabilities, including:
- IR Sensor: Utilize for remote controlling appliances.
- Touch Gestures: Implement tap and swipe gestures for user interaction.
- On-board Object Detection: Develop capabilities for recognizing objects with an on-board CNN.
- Auto Pilot - TensorFlow: Implement CNNs for steering control based on camera input, inspired by PilotNet.
- ADAS Features: Incorporate advanced driver assistance systems similar to those in FlowPilot.
- And many more...
- Fork the Repository
- Create Feature Branch
- Commit Changes
- Push to Branch
- Open Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.