
Planet ROS

Planet ROS - http://planet.ros.org



ROS Discourse General: 🚀 [New Release] BUNKER PRO 2.0 – Reinforced Tracked Chassis for Extreme Terrain and Developer-Friendly Integration

Hello ROS community,

AgileX Robotics is excited to introduce the BUNKER PRO 2.0, a reinforced tracked chassis designed for demanding off-road conditions and versatile field robotics applications.

Key Features:

Intelligent Expansion, Empowering the Future

Typical Use Cases:

AgileX Robotics provides full ROS driver support and SDK documentation to accelerate your development process. We welcome collaboration opportunities and field testing partnerships with the community.

For detailed technical specifications or to discuss integration options, please contact us at sales@agilex.ai.

Learn more at https://global.agilex.ai/

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-release-bunker-pro-2-0-reinforced-tracked-chassis-for-extreme-terrain-and-developer-friendly-integration/49275

ROS Discourse General: Cloud Robotics WG Meeting 2025-07-28 | Heex Technologies Tryout and Anomaly Detection Discussion

Please come and join us for this coming meeting at Mon, Jul 28, 2025 4:00 PM UTC → Mon, Jul 28, 2025 5:00 PM UTC, where we will be trying out Heex Technologies' service offering from their website and discussing anomaly detection for Logging & Observability.

Last meeting, we heard from Bruno Mendes De Silva, Co-Founder and CEO of Heex Technologies, and Benoit Hozjan, Project Manager in charge of customer experience at Heex Technologies. The two discussed the company and purpose of the service they offer, then demonstrated a showcase workspace for the visualisation and anomaly detection capabilities of the server. If you’d like to see the meeting, it is available on YouTube.

The meeting link for the next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications, or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/cloud-robotics-wg-meeting-2025-07-28-heex-technologies-tryout-and-anomaly-detection-discussion/49274

ROS Discourse General: Sponsoring open source project, what do you think?

Hi,

I just saw this and I was thinking about the ROS community.

We have a large and amazing ecosystem of free software, free as in beer and speech!

That accelerated robotic development and we are all very grateful for it.

But I think it is also interesting to discuss how to financially support maintainers, while keeping the software free for small companies (pre-revenue), students, and individuals.

Thoughts?

4 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/sponsoring-open-source-project-what-do-you-think/49257

ROS Discourse General: Baxter Robot Troubleshooting Tips

Hey everyone,

I’ve been working with the Baxter robot recently and ran into a lot of common issues that come up when dealing with an older platform with limited support. Since official Rethink Robotics docs are gone, I compiled this troubleshooting guide from my experience and archived resources. Hopefully, this saves someone hours of frustration!


Finding Documentation


Startup & Boot Issues

1. Baxter not powering on / unresponsive screen

2. BIOS password lockout

3. Real-time clock shows wrong date (e.g., 2016)


Networking & Communication

4. IP mismatch between Baxter and workstation

5. Static IP configuration on Linux (example: 192.168.42.1)

6. Ping test: can’t reach baxter.local

7. ROS Master URI not resolving

export ROS_MASTER_URI=http://baxter.local:11311

8. SSH into Baxter fails


ROS & Intera SDK Issues

9. Wrong catkin workspace sourcing

source ~/ros_ws/devel/setup.bash

10. enable_robot.py or joint_trajectory_action_server.py missing

11. intera.sh script error

12. MoveIt integration not working


Hardware & Motion Problems

13. Arms not enabled or unresponsive

rosrun baxter_tools enable_robot.py -e

14. Joint calibration errors


Software/Configuration Mismatches

15. Time sync errors causing ROS disconnect


Testing, Debugging, & Logging

16. Check robot state:

rostopic echo /robot/state

17. Helpful debug commands:

rostopic list
rosnode list
rosservice list

18. Reading logs:

19. Confirm joint angles:

rostopic echo /robot/joint_states
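On item 7 above: mistyped master URIs (a missing http:// scheme or a missing port) are a frequent cause of "not resolving" errors. A small, hypothetical Python helper (not part of the Baxter SDK) that checks only the shape of the URI might look like:

```python
from urllib.parse import urlparse

def check_master_uri(uri):
    """Hypothetical helper: sanity-check a ROS_MASTER_URI string.

    Only validates the shape of the URI (scheme, host, port);
    it does not contact the master itself.
    """
    parsed = urlparse(uri)
    if parsed.scheme != "http":
        return False, "scheme must be http"
    if not parsed.hostname:
        return False, "missing hostname"
    if parsed.port is None:
        return False, "missing port (the ROS master conventionally uses 11311)"
    return True, "looks well-formed"

print(check_master_uri("http://baxter.local:11311"))  # shape is valid
print(check_master_uri("baxter.local:11311"))         # missing http:// prefix
```

A passing check still doesn't prove the hostname resolves; follow up with the ping test from item 6.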

If you have more tips or fixes, add them in the comments. Let’s keep these robots running.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/baxter-robot-troubleshooting-tips/49223

ROS Discourse General: Remote (Between Internet Networks) Control of Robot Running Micro-ROS

Hello,
I am looking into solutions for communicating with a robot running Micro-ROS that is not on the same network as the host computer (the computer running ROS 2).
The only solution I have found till now is this blog post by Husarnet. The only problem is that this use-case no longer works, and the Husarnet team does not plan to resolve the issue any time soon.
Does anybody know of any solution for this that works?

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/remote-between-internet-networks-control-of-robot-running-micro-ros/49213

ROS Discourse General: AgileX Robotics at 2025 ROS Summer School: PiPER & LIMO Hands-on Tracks and Schedule

AgileX Robotics at 2025 ROS Summer School

AgileX Robotics is thrilled to announce our participation in the upcoming 2025 ROS Summer School
:date: July 26 – August 1, 2025
:round_pushpin: Zhejiang University International Science and Innovation Center, Hangzhou, China
:globe_with_meridians: Official site: http://www.roseducation.org.cn/ros2025/


Hands-on Tracks

This year, we are bringing two dedicated hands-on tracks designed to empower developers with practical skills in robot navigation and mobile manipulation.


:wrench: PiPER – Mobile Manipulation Track

Our PiPER-based curriculum introduces core concepts in robotic grasping, visual perception, and motion control. Ideal for those exploring real-world robotic manipulation with ROS!

Date Time Session Topic
Day 4 AM Session 1 Introduction to PiPER
Day 4 AM Session 2 Motion analysis
Day 4 PM Session 1 Overview of PiPER-sdk
Day 4 PM Session 2 MoveIt + Gazebo simulation
Day 5 AM Session 1 QR code recognition grasping
Day 5 AM Session 2 Code-level analysis of grasping logic
Day 5 PM Session 1 YOLO-based Object Recognition and Grasping with Code Analysis
Day 5 PM Session 2 Frontier Insights on Embodied Intelligence

:automobile: LIMO – Navigation & AI Track

Focused on the LIMO platform, this track offers structured ROS-based training in navigation, SLAM, perception, and deep learning.

Date Time Session Topic
Day 1 AM Session 1 LIMO basic functions overview
Day 1 AM Session 2 Chassis Kinematics Analysis
Day 1 PM Session 1 ROS communication mechanisms
Day 1 PM Session 2 LiDAR-based Mapping
Day 2 AM Session 1 Path planning
Day 2 AM Session 2 Navigation frameworks
Day 2 PM Session 1 Navigation practice
Day 2 PM Session 2 Visual perception
Day 3 AM Session 1 Intro to deep reinforcement learning
Day 3 AM Session 2 DRL hands-on session
Day 3 PM Session 1 Multi-robot systems intro
Day 3 PM Session 2 Multi-robot simulation practice

We look forward to meeting all ROS developers, enthusiasts, and learners at the event. Come join us for hands-on learning and exciting robotics innovation!

— AgileX Robotics

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/agilex-robotics-at-2025-ros-summer-school-piper-limo-hands-on-tracks-and-schedule/49209

ROS Discourse General: Is DDS suitable for RF datalink communication with intermittent connection?

I’m not using ROS myself, but I understand that ROS 2 relies on DDS as its middleware, so I thought this community might be a good place to ask.

I’m working on a UAV system that includes a secondary datalink between the drone and the ground segment, used for control/status messages. The drone flies up to 35 km away and communicates over an RF-based datalink with an estimated bandwidth of around 2 Mbps, though the link is prone to occasional disconnections and packet loss due to the nature of the environment.

I’m considering whether DDS is a suitable protocol for this kind of scenario, or if its overhead and discovery/heartbeat mechanisms might cause issues in a lossy or intermittent RF link.
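For a rough sense of scale, periodic protocol traffic is usually small relative to a 2 Mbps link. The figures below are illustrative assumptions, not DDS defaults:

```python
# Back-of-envelope check (all figures are assumptions, not DDS defaults):
# suppose each heartbeat/ACKNACK exchange costs ~64 bytes and a
# participant announcement ~500 bytes, sent at the rates below.

LINK_BPS = 2_000_000  # ~2 Mbps RF datalink from the post

def overhead_fraction(msgs_per_sec, bytes_per_msg, link_bps=LINK_BPS):
    """Fraction of link capacity consumed by periodic protocol traffic."""
    return msgs_per_sec * bytes_per_msg * 8 / link_bps

heartbeats = overhead_fraction(10, 64)    # assumed 10 Hz keep-alive traffic
discovery = overhead_fraction(0.5, 500)   # assumed announcement every 2 s

print(f"heartbeats: {heartbeats:.4%} of link")
print(f"discovery:  {discovery:.4%} of link")
```

Even with generous assumptions, this kind of periodic overhead stays well under 1% of the link, which suggests the harder question is how reliable QoS and discovery behave across dropouts, not raw bandwidth.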

Has anyone here tried using DDS over real-world RF communication (not simulated Wi-Fi or Ethernet), and can share experiences or advice?

Thanks in advance!
S.

10 posts - 6 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/is-dds-suitable-for-rf-datalink-communication-with-intermittent-connection/49145

ROS Discourse General: Feature freeze for Gazebo Jetty (x-post from Gazebo Community)

Hello everyone!

The feature freeze period for Gazebo Jetty starts on Fri, Jul 25, 2025 12:00 AM UTC.

During the feature freeze period, we will not accept new features to Gazebo. This includes new features for Jetty as well as for currently stable versions. If you have a new feature you want to contribute, please open a PR before we go into feature freeze; note that changes can still be made to open PRs during the feature freeze period. This period will close when we go into code freeze on Mon, Aug 25, 2025 12:00 AM UTC.

Bug fixes and documentation changes will still be accepted after the freeze date.

More information on the release timeline can be found here: Release Jetty · Issue #1271 · gazebo-tooling/release-tools · GitHub

The Gazebo Dev Team :gazebo:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/feature-freeze-for-gazebo-jetty-x-post-from-gazebo-community/45257

ROS Discourse General: Donate your rosbag (Cloudini benchmark)

Hi,

As my presentation about Cloudini was accepted at ROSCon 2025, I want to come prepared with an automated benchmarking suite that measures performance over a wide range of datasets.

You can contribute to this by donating a rosbag!!!

Thanks for your help. Let’s make pointclouds smaller together :pinched_fingers:

How to

Data Donation Disclaimer: Public Availability for CI Benchmarking

By donating your data files, you acknowledge and agree to the following terms regarding their use and public availability:

Purpose: The donated data will be used for research purposes, specifically to perform and validate benchmarking within Continuous Integration (CI) environments.

Public Availability: You understand and agree that the donated data, or subsets thereof, will be made publicly available. This public release is essential for researchers and the wider community to reproduce, verify, and build upon the benchmarking results, fostering transparency and collaborative progress in pointcloud compression.

Anonymization/Pseudonymization: Please ensure that no personally identifiable information is included in the data you submit, as it will be made public as-is.

5 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/donate-your-rosbag-cloudini-benchmark/45230

ROS Discourse General: Everything I Know About ROS Interfaces: Explainer Video

I made a video about everything I’ve learned about ROS Interfaces (messages/services/actions) in my fifteen years of working with ROS.

The ROS Interface Primer

Text Version: ROS Interface Primer - Google Docs (Google Doc)

Featuring:
:information_source: Information about Interfaces, from Super Basic to Complex Design Issues
:microscope: Original Research analyzing all the interfaces in ROS 2 Humble
:magic_wand: Best Practices for designing new interfaces
:supervillain: Hot takes (i.e. the things that I think ROS 2 Interfaces do wrong)
:name_badge: Three different ways to divide information among topics
:waffle: Fun with multidimensional arrays
:nine: Nine different recipes for “optional” components of interfaces
:thought_balloon: Strong opinions that defy the ROS Orthodoxy
:prohibited: Zero content generated by AI/LLM
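To give a concrete flavor of the “optional components” topic: one common community pattern (not necessarily one of the nine recipes from the video) is a bounded array of at most one element, since ROS interfaces have no native optional type. A hypothetical message definition:

```
# Hypothetical Target.msg: a bounded array of size <= 1 emulates an
# optional field in a ROS 2 interface.
geometry_msgs/Pose[<=1] target   # empty array means "no target set"
```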

Making video is hard, so I’m calling this version 1.0 of the video. Please let me know what I got wrong and what I’m missing, and I may make another version in the future.

In closing: bring back Pose2D you monsters.

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/everything-i-know-about-ros-interfaces-explainer-video/45225

ROS Discourse General: ROS and ROS2 Logging Severity Level

Hi All!

I’m working on an application for containerizing ROS (1 & 2) projects.

I’m asking for the help of everyone experienced with ROS loggers.
In particular, I’m looking for a solution to generalize the definition of the minimum severity level for all the nodes running in a project.

This configuration should be possible outside of the node source code, i.e. using parameters, environment variables, or configuration files.
I know that in ROS 1 (C++-based nodes) it is possible to set the minimum severity level from rosconsole.config. (What about ROS 1 Python nodes? Do they still use rosconsole.config?)

I also have some doubts about how named loggers work: does each node have its own logger? Is it possible at all to define the minimum severity level for all the nodes running in a project?

In ROS 2 (C++ and Python nodes) I know that the --log-level argument works to configure the severity when running a node. But again, I’m looking for a global solution…

Anyone with useful resources or insights on this aspect?
As mentioned above, the final goal is to have an environment variable or a configuration file that can be used to set the severity level of all the nodes that will be executed when the project starts (for example, multiple nodes running from a launch file).
Moreover, I want it to be independent of the language used to write the node (Python or C++).
I’m not referring to a “global parameters” because I know that ROS 2 is structured such that each node has its parameters.
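As a sketch of the pattern being asked for (the environment variable name here is invented for illustration, not an existing ROS mechanism), each node could resolve its minimum severity from one shared variable at startup:

```python
import logging
import os

# Hypothetical pattern: one environment variable (name invented here)
# that every node in the project reads at startup.
LEVELS = {"DEBUG": logging.DEBUG, "INFO": logging.INFO,
          "WARN": logging.WARNING, "ERROR": logging.ERROR,
          "FATAL": logging.CRITICAL}

def min_severity(default="INFO"):
    """Resolve the project-wide minimum severity from the environment."""
    name = os.environ.get("PROJECT_LOG_LEVEL", default).upper()
    return LEVELS.get(name, logging.INFO)

os.environ["PROJECT_LOG_LEVEL"] = "warn"     # e.g. set by the container
print(logging.getLevelName(min_severity()))  # WARNING
```

In ROS 2 the supported per-node equivalent is the --log-level argument passed via --ros-args; a container entrypoint could translate such a variable into that argument for every node it launches.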

Thanks to all of you!
(I hope the question is not badly formulated; I’m not very experienced with these aspects or with the different ways ROS 1 and ROS 2 manage loggers… Study resources on these topics would also be very helpful for me.)

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-and-ros2-logging-severity-level/45217

ROS Discourse General: Ros2top - top-like utility for ROS2

Hi everyone!

Repo: GitHub - AhmedARadwan/ros2top

I’ve always found it hard to track each node’s resource usage, so I thought it might be a good idea to build a tool that works for ROS 2 and essentially any Python or C++ process to monitor resource usage in real time. The goal? Quickly see which processes are consuming the most resources and gain better visibility into a running system.

This is an initial release: it relies on each node registering itself in order to become visible to and tracked by the ros2top utility.

What it does so far:

How it works:

Why it might help:

I’d love to hear your thoughts:

This is very early-stage, but I hope it can evolve into a valuable tool for the ROS 2 community. Feedback, suggestions, or even contributions are all welcome! :blush:

9 posts - 7 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros2top-top-like-utility-for-ros2/45206

ROS Discourse General: Best GPU for Large-Scale Multi-Robot Simulation (20–50 Robots) with Open RMF in ROS2

Hello everyone,

I’m planning to run a large-scale multi-robot simulation using ROS 2. The setup involves simulating 100 or more robots in a shared environment, using:

Simulation tools like Gazebo or Ignition

Visualization through RViz2

Open RMF for fleet coordination, traffic scheduling, and path planning

I’m looking for suggestions regarding a suitable GPU that can smoothly handle the simulation load without performance issues.

Specifically, I’d like to ask:

Which NVIDIA GPU models are recommended for this scale of simulation?

Would GPUs like RTX 3060 / 3070 / 3080 / 4090 or Quadro series be sufficient?

Is CUDA support helpful for improving performance in Gazebo/Ignition + RViz2?

What minimum VRAM (GPU memory) is advisable (e.g., 8GB vs 16GB+)?

Will the suggested GPU models work well across all ROS2 distributions and Ubuntu versions, including future upgrades?

My aim is to choose a future-ready GPU that supports high-scale multi-robot simulation involving Open RMF logic and visual rendering, with consistent performance.

Any guidance or shared experiences would be greatly appreciated.

Also, how many robots can Gazebo and RViz realistically handle in simulation?

Thank you!

7 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/best-gpu-for-large-scale-multi-robot-simulation-20-50-robots-with-open-rmf-in-ros2/45190

ROS Discourse General: Installation and configuration of the Raspberry Pi Camera on a ROS 2/Jazzy Raspberry Pi 5

We are pleased to release a document, posted in a repository at Raspberry Pi Camera ROS Install, that describes the steps to get a Raspberry Pi™ (or compliant 3rd-party) V1, V2, or V3 camera working on a Raspberry Pi 5 configured with Ubuntu 24.04/ROS 2 Jazzy. It may also be applicable to selected Raspberry Pi 4 configurations. The document is the result of an ongoing dialog on content, posted on the HBRobotics Forum, drawing on notes, Linux terminal scripts, and libraries contributed by Alan Federman, Marco Walther, Sergei Grichine, and Ross Lunan. The necessary libraries are installed from downloaded binaries. The purpose was to enable the “camera_ros” package developed by Christian Rauch, which publishes the camera image as ROS 2 messages: /camera/camera_info, /camera/image_raw, /camera/image_compressed, /parameter_events, and /rosout.

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/installation-and-configuration-of-the-raspberry-pi-camera-on-a-ros-2-jazzy-raspberry-pi-5/45177

ROS Discourse General: ROS 2 Rust Meeting: July 2025

The next ROS 2 Rust Meeting will be Mon, Jul 14, 2025 2:00 PM UTC

The meeting room will be at https://meet.google.com/rxr-pvcv-hmu

In the unlikely event that the room needs to change, we will update this thread with the new info!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-rust-meeting-july-2025/45167

ROS Discourse General: 🌉 California ROS Events for July 2025 (including Open Sauce!)

ROS Events in California for July 2025

Hi Everyone,

I’ve put together a string of exciting ROS and open source events in California this July!

I had a fantastic time at Open Sauce last year talking to other open source projects (like OpenSCAD, and FreeCAD). This year I’ve organized a joint Open Robotics / ROS / Open Source Hardware Association / OpenCV booth. If you are attending Open Sauce we would love for you to stop by (we’ll have tons of free OSHWA / OpenCV / ROS Stickers).

I’ve also worked with our friends at Hackster.io to organize an open source @ Open Sauce after party at the Studio 45 fabrication space in San Francisco on Saturday night. The after party is open to everyone, regardless of whether you are planning to attend Open Sauce.


RSVP for After Party Here

ROS By-The-Bay

We’re planning to hold our next ROS By-The-Bay Meetup on Fri, Jul 25, 2025 1:00 AM UTC. I’ve lined up two fantastic speakers from Ember Robotics and Orangewood Robotics.


ROS By-The-Bay Meetup

ROS Meetup in LA

Finally, @mrpollo and I will be in LA the last week of July for IEEE SCM-IT/SCC 2025. @mrpollo and @ivanperez have organized a workshop on open source software for space missions.

We are tentatively planning to hold a joint ROS / Dronecode meetup on July 31st but we’re still looking for space and speakers (we just had our venue fall through). We were hoping to find a space in the El Segundo / Long Beach area but we’re open to anything right about now (perhaps Pasadena?). If you have suggestions please reach out.

I’ll post additional information as we figure it out.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/california-ros-events-for-july-2025-including-open-sauce/45153

ROS Discourse General: Embodied AI Community group meeting #9

The Embodied AI Community Group, dedicated to the topic of applications of Generative AI to ROS 2 robotics, will hold its ninth meeting on 9 July at 16:00 UTC (9:00 am PST) - in less than 24h!

Join us to keep up with the newest advancements in embodied AI field.
We have some exciting topics in the agenda:

Here is the meeting link, meetings take place every month, so feel free to subscribe to the calendar and visit the group landing page.
Detailed agenda can be found in the meeting document. You can also find there all materials and recordings of past meetings.

See you there!

4 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/embodied-ai-community-group-meeting-9/45115

ROS Discourse General: 🏎️ ROS 2 Online Robot Racing Contest — Fun & Challenge Await This July!

Hi community!

This July, we’ve prepared something fun for you — an Online ROS 2 Robot Racing Contest!

Robot Racing Contest

This 100% free, simulation-based competition invites robotics developers from around the world to compete. Build your fastest robot lap — and the winner will receive a real ROS 2 robot equipped with a camera, LiDAR, and a hot-swappable battery system!

:chequered_flag: How to Participate

:trophy: Winners will be announced during a live online event on July 31st


This contest is more than just a race — it’s a fun way to strengthen your ROS 2 skills and connect with the global ROS community.

We invite you to race, learn, and enjoy with this robot contest!

The Construct Robotics Institute
theconstruct.ai

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-online-robot-racing-contest-fun-challenge-await-this-july/45100

ROS Discourse General: AI Worker Redefines Agility in Logistics with Swerve Drive

:robot: AI Worker Redefines Agility in Logistics with Swerve Drive :rocket:

AI WORKER #4: Swerve Drive at Work – Logistics Task Demo

Are you interested in the logistics and distribution environment? Our team is thrilled to finally release a new video showcasing our AI Worker’s enhanced driving and operational capabilities!

This video vividly demonstrates how our AI Worker, equipped with Swerve Drive technology, moves with incredible flexibility and intelligence in a real-world logistics setting. While “Omni-Directional” methods typically include Omni wheels and Mecanum wheels, both rely on friction with the floor, which can lead to lower positional accuracy and even floor damage. In contrast, the Swerve Drive type offers superior positional accuracy, a significant advantage in terms of reduced data noise from a Physical AI perspective. After all, if the data crucial for learning isn’t accurate, the learning outcomes won’t be good either. This Swerve Drive technology allows the AI Worker to navigate narrow spaces within the work area, and most importantly, its horizontal movement capability significantly enhances efficiency for tasks involving conveyor belts or tabletop operations.

While this video still features some teleoperated segments, our ultimate goal is for this AI Worker to evolve into a fully autonomous system, capable of self-judgment and movement, by integrating with Robot Foundation Models (RFM). Your continued support and encouragement as we pursue this journey would mean a great deal to us! :folded_hands:

As with all ROBOTIS robot systems, this AI Worker is also open-source and built on Open Robotics ROS 2 and Hugging Face LeRobot, so those interested in the technical aspects will find it engaging.

You can watch the original YouTube video at the link below:

AI WORKER #4: Swerve Drive at Work – Logistics Task Demo
:backhand_index_pointing_right: https://youtu.be/WNpRlIr4zbw

Our open-source GitHub repositories are here:
:backhand_index_pointing_right: GitHub - ROBOTIS-GIT/ai_worker: AI Worker: FFW (Freedom From Work)
:backhand_index_pointing_right: GitHub - ROBOTIS-GIT/physical_ai_tools: ROBOTIS Physical AI Tools: Physical AI Development Interface with LeRobot and ROS 2
:backhand_index_pointing_right: GitHub - ROBOTIS-GIT/robotis_lab: robotis_lab

A comprehensive overview of the AI Worker is available on our webpage:
:backhand_index_pointing_right: https://ai.robotis.com/

Please feel free to leave any questions or feedback in the comments after watching the video! :wink:

#ROBOTIS #AIWorker #Humanoid #DYNAMIXEL #robot #OpenSource #ROS #PhysicalAI #EmbodiedAI

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ai-worker-redefines-agility-in-logistics-with-swerve-drive/45099

ROS Discourse General: New ROS2 Package: ros2_teleoperation

Hello ROS2 Community,


I just released the ros2_teleoperation package, a ROS 2 package that lets you visualize different kinds of information under one Qt-based UI. It provides a clean and efficient graphical interface to monitor and teleoperate your robot in real time. With built-in support for GPS visualization, waypoint creation, camera streaming, LiDAR point clouds, IMU data, and more, you can select any topic and start visualizing the data you need, all in a few clicks!

Whether you’re debugging, monitoring, or controlling your robot remotely, ros2_teleoperation offers the flexibility and clarity you need.

More viewers will be added soon, and if there’s a specific data type you’d like to see, let me know, let’s build the ultimate open-source visualization tool together!

LINK: ros2_teleoperation

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-ros2-package-ros2-teleoperation/45080

ROS Discourse General: Discussion with Bruno Mendes De Silva | Cloud Robotics WG Meeting 2025-07-14

The next meeting for the Cloud Robotics Working Group will be at 1600-1700 UTC on Monday 14th July 2025, where Bruno Mendes De Silva, Co-Founder and CEO of Heex Technologies, has agreed to join us as a guest expert in Logging and Observability.

Heex Technologies is a cloud data platform accelerating Physical AI adoption. The company helps customers with their autonomous systems by collecting and presenting only the most relevant data for a required view. We will see some slides and demos, plus gather as much information from Bruno as possible for writing a community guide on Logging and Observability.

Last meeting, we invited Benji Barash, Co-Founder and CEO of Roboto AI, as a guest expert, also for Logging & Observability. The meeting became more of a presentation and Q&A, as Benji had slides and demos ready to run. He showed very impressive tech from Roboto.AI, including the ability to select a clip of robot data and search other datasets for similar clips to find similar motions! If you would like to see the meeting, the recording is available on YouTube.

The meeting link is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

P.S. I couldn’t find the control to insert a date/time, hence writing it in UTC. Has something happened to that?

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/discussion-with-bruno-mendes-de-silva-cloud-robotics-wg-meeting-2025-07-14/45068

ROS Discourse General: New contact message for sensor_msgs

Hi!

This post will be about the proposal of a Contact.msg for the sensor_msgs package.

If you have any ideas for changes or usage, please share it in this post and send PR to my message-only package contact_msgs.

Motivation

I could not find a simple message that defines whether there is contact between some defined sensor and another undefined object or objects. Messages like ros_gz_interfaces/msg/Contact.msg are too complicated and simulator-specific for simple contact detection. For a contact sensor with force and torque readings, geometry_msgs/WrenchStamped.msg would be enough.

Design / Implementation Considerations

# This is a message to hold data from a simple contact sensor

std_msgs/Header header      # timestamp is the time the contact was measured
                            # frame_id is the location of the contact sensor

bool contact                # True if there is contact

Additional Information

I believe that this message could be used for creating future hardware interfaces and broadcasters for the ros2_control framework. It would also be helpful in fields where contact sensors are widely used, such as legged robotics.
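As an illustration of how a broadcaster might fill the proposed bool field from a raw force reading (plain Python, no ROS dependencies; the threshold values are made-up examples), a two-threshold hysteresis keeps the flag from chattering when the force hovers near a single threshold:

```python
class ContactDetector:
    """Sketch: derive the proposed bool `contact` from a scalar force.

    Two thresholds (hysteresis) prevent rapid on/off chattering.
    Threshold values here are example assumptions, not recommendations.
    """

    def __init__(self, on_threshold=5.0, off_threshold=2.0):
        assert on_threshold > off_threshold
        self.on_threshold = on_threshold
        self.off_threshold = off_threshold
        self.contact = False

    def update(self, force_n):
        # Latch on above the high threshold, release below the low one.
        if not self.contact and force_n >= self.on_threshold:
            self.contact = True
        elif self.contact and force_n <= self.off_threshold:
            self.contact = False
        return self.contact

det = ContactDetector()
readings = [0.0, 6.0, 3.0, 1.0]           # newtons
print([det.update(f) for f in readings])  # [False, True, True, False]
```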

Links

Contact message repository
common_interfaces issue related to message
common_interfaces PR related to message
Old issue related to message

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-contact-message-for-sensor-msgs/45055

ROS Discourse General: Real-time Raspberry Pi ROS 2 image updated for Jazzy and 24.04

Some of you may remember the ROS 2 real-time image for Raspberry Pi that I presented at ROScon 2022 or in a previous post. @razr and I have worked on it and updated it to support ROS 2 Jazzy and Ubuntu 24.04. I just created a release for it here: Releases · ros-realtime/ros-realtime-rpi4-image · GitHub

Contrary to previous efforts, this image now supports Raspberry Pi 3, 4, and 5! This gives a wider range of choices for people interested in running ROS on Raspberry Pis. I’m particularly excited to start using this on the Raspberry Pi 5, which gives exceptional performance both for normal code and in terms of delivering low scheduling latency.

Here are some preliminary real-time scheduling latency results based on measuring cyclictest latency during a heavy CPU stress test.

Model Result
Raspberry Pi 3
Raspberry Pi 4
Raspberry Pi 5

Enjoy!

4 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/real-time-raspberry-pi-ros-2-image-updated-for-jazzy-and-24-04/45047

ROS Discourse General: Introduction to OMY and Physical AI Tool

The Beginning of Physical AI Research: Easily Learn Robotics with OMY & Physical AI Tools!

Hello everyone! In today’s video, we’re excited to introduce an innovative solution that lowers the barrier to Physical AI research: OMY and Physical AI Tool

:backhand_index_pointing_right: Introducing OMY
Our OMY is a 6-axis manipulator & gripper specifically designed for Physical AI research. Notably, its Leader-Follower configuration provides an optimized environment for Imitation Learning. Equipped with collision detection and gravity compensation, OMY enables safe and efficient acquisition of training data.

:hammer_and_wrench: Introducing Physical AI Tools
Physical AI Tools is a user-friendly software developed to allow anyone to easily participate in Physical AI research. Built upon ROS 2 and LeRobot, it offers intuitive UI-based data acquisition and monitoring without complex procedures. It’s designed to be easily usable by beginners in robotics learning and part-time data acquisition specialists. Furthermore, with Web UI support, it can be accessed from various devices like mobile phones and tablets.

:light_bulb: What You’ll See in This Video:
In this video, we’ll walk you through the entire workflow, from introducing OMY to the detailed processes of data acquisition, training, and inference using Physical AI Tools. For the inference, we are currently utilizing the ACT Policy model.

:sparkles: The Future of Physical AI Tools:
While currently focused on data acquisition and inference, we will soon be adding training functionalities. We also plan to integrate various open-source models like NVIDIA GR00T N1.5 to provide even more powerful features. All the code for our Physical AI Tools is open-source, emphasizing transparency and extensibility.

We kindly ask for your continued interest and anticipation for our future contributions to the advancement of Physical AI research.

Thank you so much for watching the video! Please feel free to leave any questions in the comments.

:inbox_tray: GitHub Repo

#ROBOTIS #OMY #ROS #nvidia #HuggingFace #LeRobot #robotics #physicalai #ai #artificialIntelligence #robot #opensource #robotarm #LeaderFollower #ImitationLearning

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/introduction-to-omy-and-physical-ai-tool/45027

ROS Discourse General: Need of ROS in Industry

I wanted everyone’s opinion on using ROS in industry. After working for about 3 years in the robotics industry, a few questions have popped up in my mind, and I would like to hear others’ perspectives and answers on this:

During my work with industrial robotic arms I have seen very few setups using ROS. Most of the industrial robot-arm space is dominated by PLC-robot communication. My question is: why do we need ROS if PLC-robot communication is used in industry? What are the advantages of using ROS compared to the standard motion programming of industrial robots (whose controllers have integrated motion planners)?

I would like to hear everyone’s thoughts on this.

6 posts - 6 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/need-of-ros-in-industry/45004


2025-07-26 13:54