Planet ROS

Planet ROS - http://planet.ros.org


ROS Discourse General: LinkForge: Robot modeling does not have to be complicated

I recorded a short video to show how easy it is to build a simple mobile robot with LinkForge, a Blender extension designed to bridge the gap between 3D modeling and robotics simulation.

All in a few straightforward steps.

LinkForge: Robot modeling does not have to be complicated.

The goal is simple: remove friction from robot modeling so engineers can focus on simulation, control, and behavior, not file formats and repetitive setup.

If you are working with ROS or robot simulation and want a faster, cleaner workflow, this is worth a look.

Blender Extensions: https://extensions.blender.org/add-ons/linkforge/

GitHub: https://github.com/arounamounchili/linkforge

Documentation: https://linkforge.readthedocs.io/

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/topic/51883

ROS Industrial: First of 2026 ROS-I Developers' Meeting Looks at Upcoming Releases and Collaboration

The ROS-Industrial Developers’ Meeting provided updates on open-source robotics tools, with a focus on advancements in Tesseract, support for developers still using MoveIt 2, and Trajopt. These updates underscore the global push to innovate motion planning, perception, and tooling systems for industrial automation. Key developments revolved around stabilizing existing frameworks, improving performance, and leveraging modern technologies like GPUs for acceleration.

The Tesseract project, designed to address traditional motion planning tools' limitations, is moving steadily toward a 1.0 release. With about half of the work complete, remaining tasks include API polishing, unit test enhancements, and transitioning the motion planning pipeline to a plugin-based architecture. Tesseract is also integrating improved collision checkers and tools like the Task Composer, which supports modular backends, making it more adaptable for high-complexity manufacturing tasks.

On the MoveIt 2 front, ongoing community support will be critical as the prior support team shifts to supporting the commercial MoveIt Pro. To ensure Tesseract maintainability, updates include the migration of documentation directly into repositories via GitHub. This step simplifies synchronization between code and documentation, helping developers maintain robust, open-source solutions. There are plans to provide migration tutorials for those whose development needs are not met by MoveIt 2, who want to investigate Tesseract but are not ready to move to MoveIt Pro. The ability to utilize MoveIt 2 components within Tesseract is also being investigated.

Trajopt, another critical component of the Tesseract ecosystem, is undergoing a rewrite to better handle complex trajectories and cost constraints. The new version, expected within weeks, will enable better time parameterization and overall performance improvements. Discussions also explored GPU acceleration, focusing on opportunities to optimize constraint and cost calculations using emerging GPU libraries, though some modifications will be needed to fully realize this potential.

Toolpath optimization also gained attention, with updates on the noether repository, which supports industrial toolpath generation and reconstruction. While still a work in progress, noether is set to play a pivotal role in enabling advanced workflows once the planned updates are implemented.

As the meeting concluded, contributors emphasized the importance of community engagement to further modernize and refine these tools. Upcoming events across Europe and Asia will foster collaboration and showcase advancements in the ROS-Industrial ecosystem. This collective effort promises to drive a smarter, more adaptable industrial automation landscape, ensuring open-source solutions stay at the forefront of global manufacturing innovation.

The next Developers' Meeting is slated to be hosted by the ROS-I Consortium EU. You can find all the info for Developers' Meetings over at the Developer Meeting page.

[WWW] https://rosindustrial.org/news/2026/1/16/first-ros-i-developers-meeting-looks-at-upcoming-releases-and-collaboration

ROS Discourse General: Simple status webpage for a robot in localhost?

Hi, I’m just collecting info on how you’re building simple status pages that run locally on robots and show basic information like battery status, driver status, sensor health, etc. But nothing fancy like camera streaming, teleoperation and such. No cloud, everything local!

The use-case is just being able to quickly connect to a robot AP and see the status of important things. This can of course be done via rqt or remote desktop, but a status webpage is much more accessible from phones, tablets etc.

I’ve seen statically generated pages with autoreload (easiest to implement, but very custom).

I guess some people have something on top of rosbridge/RobotWebTools, right? But I haven’t found much info about this.

Introducing Robotics UI: A Web Interface Solution for ROS 2 Robots by Sciota Robotics seemed interesting, but it never made it past 8 commits…

So what do you use?

Is there some automatic /diagnostics_agg → HTML+JS+WS framework? :slight_smile: And no, I don’t count Foxglove, because self-hosted costs… who knows what :slight_smile:
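
For reference, a minimal sketch of the statically generated approach mentioned above, assuming rclcpp and the standard diagnostic_msgs/DiagnosticArray on /diagnostics_agg; the output path and refresh interval are arbitrary, and any static file server on the robot could serve the result:

#include <fstream>
#include <memory>

#include "diagnostic_msgs/msg/diagnostic_array.hpp"
#include "rclcpp/rclcpp.hpp"

// Dumps /diagnostics_agg into a self-refreshing HTML file that any static
// web server on the robot can serve to phones and tablets.
class StatusPage : public rclcpp::Node
{
public:
  StatusPage()
  : Node("status_page")
  {
    sub_ = create_subscription<diagnostic_msgs::msg::DiagnosticArray>(
      "/diagnostics_agg", 10,
      [](const diagnostic_msgs::msg::DiagnosticArray & msg) {
        std::ofstream out("/tmp/status.html");
        // <meta refresh> gives the "autoreload" without any JS or websockets.
        out << "<meta http-equiv=\"refresh\" content=\"2\"><ul>\n";
        for (const auto & s : msg.status) {
          out << "<li>" << s.name << ": " << s.message << "</li>\n";
        }
        out << "</ul>\n";
      });
  }

private:
  rclcpp::Subscription<diagnostic_msgs::msg::DiagnosticArray>::SharedPtr sub_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<StatusPage>());
  rclcpp::shutdown();
  return 0;
}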

7 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/simple-status-webpage-for-a-robot-in-localhost/51864

ROS Discourse General: Tbai - towards better athletic intelligence

Introducing tbai, a framework designed to democratize robotics and embodied AI and to help us move towards better athletic intelligence.

Drawing inspiration from Hugging Face (more specifically lerobot :hugs:), tbai implements and makes fully open-source countless state-of-the-art methods for controlling various sorts of robots, including quadrupeds, humanoids, and industrial robotic arms.

With its well-established API and levels of abstraction, users can easily add new controllers while reusing the rest of the infrastructure, including utilities for time synchronization, visualization, config interaction, and state estimation, to name a few.

Everything is built out of Lego-like components that can be seamlessly combined into a single, high-performing robot controller pipeline. Its wide pool of already implemented state-of-the-art controllers (many from the Robotic Systems Lab), state estimators, and robot interfaces, together with simulation and real-robot deployment abstractions, lets anyone using tbai start experimenting and working on novel methods right away, using the existing framework as a baseline or swapping out one component while keeping the rest, thus accelerating the iteration cycle.

No more starting from scratch, no more boilerplate code. Tbai takes care of all of that.

Tbai seeks to support as many robotic platforms as possible. Currently, there are nine robots that have at least one demo prepared, with many more to come. Specifically, we have controllers readily available for ANYmal B, ANYmal C, and ANYmal D from ANYbotics; Go2, Go2W, and G1 from Unitree Robotics; Franka Emika from Franka Robotics; and finally, Spot and Spot with arm from Boston Dynamics.

Tbai is an ongoing project that will continue making strides towards democratizing robotics and embodied AI. If you are a researcher or a tinkerer who is building cool controllers for a robot, be it an already supported robot or a completely new one, please do consider contributing to tbai so that as many people as possible can benefit from your work.

Finally, a huge thanks goes to all researchers and tinkerers who do robotics and publish papers together with their code for other people to learn from. Tbai would not be where it is now if it weren’t for the countless open-source projects it has drawn inspiration from. I hope tbai becomes an inspiration for other projects too.

Thank you all!

Link: https://github.com/tbai-lab/tbai

Link: https://github.com/tbai-lab/tbai_ros

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/tbai-towards-better-athletic-intelligence/51848

ROS Discourse General: [Humble] Upcoming behavior change: Improved log file flushing in rcl_logging_spdlog

Summary

The ROS PMC has approved backporting an improved log file flushing behavior to ROS 2 Humble. This change will be included in an upcoming Humble sync and affects how rcl_logging_spdlog flushes log data to the filesystem.

What’s Changing?

Previously, rcl_logging_spdlog did not explicitly configure flushing behavior, which could result in:

  • Missing log messages when an application crashes
  • Empty or incomplete log files during debugging sessions

After this update, the logging behavior will:

  • Flush log files every 5 seconds (periodic flush)
  • Immediately flush on ERROR level messages (flush on error)

This provides a much better debugging experience, especially when investigating crashes or unexpected application terminations.
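
For reference, the new behavior corresponds roughly to the following plain spdlog calls; this is a minimal sketch assuming direct use of the spdlog API, not the actual rcl_logging_spdlog source:

#include <chrono>

#include <spdlog/sinks/basic_file_sink.h>
#include <spdlog/spdlog.h>

int main()
{
  // File logger standing in for the per-process ROS log file.
  auto logger = spdlog::basic_logger_mt("ros_log", "/tmp/node.log");

  // Flush on ERROR and above, so crash-adjacent messages hit the disk.
  logger->flush_on(spdlog::level::err);

  // Periodic flush every 5 seconds (spdlog runs a background flusher thread).
  spdlog::flush_every(std::chrono::seconds(5));

  logger->info("this may sit in the buffer for up to 5 seconds");
  logger->error("this is flushed to disk immediately");
}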

Compatibility

  • :white_check_mark: API/ABI compatible — No rebuild of your packages is required
  • :warning: Behavior change — Log files will be flushed more frequently

How to Revert to the Old Behavior

If you need to restore the previous flushing behavior (no explicit flushing), you can set the following environment variable:

export RCL_LOGGING_SPDLOG_EXPERIMENTAL_OLD_FLUSHING_BEHAVIOR=1

Note: This environment variable is marked as EXPERIMENTAL and is intended as a temporary measure. It may be removed in future ROS 2 releases when full logging configuration file support is implemented. Please do not rely on this variable being available in future versions.

Related Links

Questions or Concerns?

If you experience any issues with this change or have feedback, please:

Thanks,
Tomoya

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/humble-upcoming-behavior-change-improved-log-file-flushing-in-rcl-logging-spdlog/51825

ROS Discourse General: Guidance on next steps after ROS 2 Jazzy fundamentals for a hospitality robot project

I’m keenly working on a hospitality robot project driven by personal interest and a genuine enthusiasm for robotics, and I’m seeking guidance on what to focus on next.

I currently have a solid grasp of ROS 2 Jazzy fundamentals, including nodes, topics, services, actions, lifecycle nodes, URDF/Xacro, launch files, and executors. I’m comfortable bringing up a robot model and understanding how the ROS 2 system fits together.

My aim is to build a simulation-first MVP for a lobby scenario (greeter, wayfinding, and escort use cases). I’m deliberately keeping the scope practical and do not plan to add arms initially unless they become necessary.

At this stage, I would really value direction from more experienced practitioners on how to progress from foundational ROS knowledge toward a real, working robot.

In particular, I’d appreciate insights on:

  • What are the most important areas to focus on after mastering ROS 2 basics?

  • Which subsystems are best tackled first, and in what sequence?

  • What level of completeness is typically expected in simulation before transitioning to physical hardware?

  • Are there recommended ROS 2 packages, example bringups, or architectural patterns well suited for this type of robot?

Any advice, lessons learned, or references that could help shape the next phase of development would be greatly appreciated.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/guidance-on-next-steps-after-ros-2-jazzy-fundamentals-for-a-hospitality-robot-project/51809

ROS Discourse General: [Announcing] LinkForge: A Native Blender Extension for Visual URDF/Xacro Editing (ROS 2 Support)

Hi everyone,

I’d like to share a tool I’ve been working on: LinkForge. It was just approved on the Blender Extensions Platform (v1.1.1).

The Problem

We all know the workflow: export meshes from CAD, write URDFs by hand, guess inertia tensors, launch Gazebo, realize a link is rotated 90 degrees, kill Gazebo, edit XML, repeat. It separates the “design” from the “engineering.”

The Solution

LinkForge allows you to rig, configure, and export simulation-ready robots directly inside Blender. It is not just a mesh exporter; it manages the entire URDF/Xacro structure.

Key Features for Roboticists:

  • Visual Editor: Import/Export URDF & Xacro files seamlessly
  • Physics: Auto-calculates mass & inertia tensors
  • ROS2 Control Support: Automatically generates hardware interface configurations for ros2_control
  • Complete Sensor Suite: Integrated support for Camera, Depth Camera, LiDAR, IMU, GPS, and Force/Torque sensors with configurable noise models
  • Xacro Support: Preserves macros and properties where possible.

Workflow

  1. Import your existing .urdf or .xacro.
  2. Edit joints and limits visually in the viewport.
  3. Add collision geometry (convex hulls/primitives).
  4. Export valid XML.

Links

This is an open-source project. I’m actively looking for feedback on the “Round-trip” capability and Xacro support.

Happy forging!

4 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/announcing-linkforge-a-native-blender-extension-for-visual-urdf-xacro-editing-ros-2-support/51808

ROS Discourse General: Update on ROS native buffers

Hello ROS community,

as you may have heard, NVIDIA has been working on proposing and prototyping a mechanism to add native buffer types to ROS 2, allowing it to natively and efficiently handle accelerated buffers like CUDA memory or Torch tensors. We had briefly touched on this in a previous discourse post. Since then, a lot of design discussion in the SIG PAI, as well as prototyping on our side, has happened to turn that outline into a full-fledged proposal and prototype.

Below is a rundown of our current status, as well as an outlook of where the work is heading. We are looking forward to discussions and feedback on the proposal.

Native Buffers in ROS 2

Problem statement

Modern robots use advanced, high-resolution sensors to perceive their environment. Whether it’s cameras, LIDARs, time-of-flight sensors or tactile sensor arrays, data rates to be processed are ever-increasing.

Processing of those data streams has for the most part moved onto accelerated hardware that can exploit the parallel nature of the data. Whether that is GPUs, DSPs, NPUs/TPUs, ASICS or other approaches, those hardware engines have some common properties:

  • They are inherently parallel, and as such well suited to processing many small samples at the same time
  • They are dedicated hardware with dedicated interfaces and often dedicated memory

The second property of dedicated memory regions is problematic in ROS2, as the framework currently does not have a way to handle non-CPU memory.

Consider for example the sensor_msgs/PointCloud2 message, which stores data like this:

uint8[] data         # Actual point data, size is (row_step*height)

A similar approach is used by sensor_msgs/Image. In rclcpp, this will map to a member like

std::vector<uint8_t> data;

This is problematic for large pieces of data that are never going to be touched by the CPU. It forces the data to be present in CPU memory whenever the framework handles it, in particular for message transport, and every time it crosses a node boundary.

For truly efficient, fully accelerated pipelines, this is undesirable. In cases where there are one or more hardware engines handling the data, it is preferable for the data to stay resident in the accelerator, and never be copied into CPU memory unless a node specifically requests to do so.

We are therefore proposing to add the notion of pluggable memory backends to ROS2 by introducing a concept of buffers that share a common API, but are implemented with vendor-specific plugins to allow efficient storage and transport with vendor-native, optimized facilities.

Specifically, we are proposing to map uint8[] in rosidl to a custom buffer type in rclcpp that behaves like a std::vector<uint8_t> if used for CPU code, but will automatically keep the data resident to the vendor’s accelerator memory otherwise. This buffer type is also integrated with rmw to allow the backend to move the buffer between nodes using vendor-specific side channels, allowing for transparent zero-copy transport of the data if implemented by the vendor.

Architecture overview

Message encoding

The following diagram shows the overview of a message containing a uint8[] array, and how it is mapped to C++, and then serialized:

It shows the following parts, which we will discuss in more detail later:

  • Declaration of a buffer using uint8[] in a message definition as before
  • Mapping onto a custom buffer type in rclcpp, called Buffer<T> here
  • The internals of the Buffer<T> type, in particular its std::vector<T>-compatible interface, as well as a pointer to a vendor-specific implementation
  • A vendor-specific backend providing serialization, as well as custom APIs

Finally, it shows the message being encoded into a vendor-specific buffer descriptor message, which is serialized in place of the raw byte array in the message.

Choice of uint8[] as trigger

It is worth noting the choice to utilize uint8[] as a trigger to generate Buffer<T> instances. An alternative approach would have been to add a new Buffer type to the IDL, and to translate that into Buffer<T>. However, this would not only introduce a break in compatibility of the IDL, but also force the introduction of a sensor_msgs/PointCloud3 and similar data types, fracturing the message ecosystem further.

We believe the cost of maintaining a std::vector compatible interface and the slight loss of semantics is outweighed by the benefit of being drop-in compatible with both existing messages and existing code bases.

Integration with rclcpp (and rclpy and rclrs)

rclcpp exposes all uint8[] fields as rosidl_runtime_cpp::Buffer<T> members in their respective generated C++ structs.

rosidl_runtime_cpp::Buffer<T> has an interface fully compatible with std::vector<T>, including size(), operator[](size_type pos), etc. If any of the std::vector<T> APIs are used, the vector is copied onto the CPU as necessary, and all members work as expected. This maintains full compatibility with existing code - any code that expects a std::vector<T> in the message will be able to use the corresponding fields as such without any code changes.
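
To make the mechanics concrete, here is a minimal sketch of what such a wrapper could look like; the BackendImpl interface and member names are hypothetical illustrations, not the proposed rosidl_runtime_cpp code:

#include <cstddef>
#include <memory>
#include <vector>

template <typename T>
class Buffer
{
public:
  // Hypothetical vendor hook: copies accelerator memory into `out` on demand.
  struct BackendImpl
  {
    virtual ~BackendImpl() = default;
    virtual void download(std::vector<T> & out) = 0;
  };

  // std::vector-compatible subset: any CPU access forces a download first.
  std::size_t size() const { ensure_on_cpu(); return cpu_data_.size(); }
  T & operator[](std::size_t pos) { ensure_on_cpu(); return cpu_data_[pos]; }
  const T * data() const { ensure_on_cpu(); return cpu_data_.data(); }

  // Used by vendor backends (e.g. a from_buffer() static method) to reach
  // the native handle without touching CPU memory.
  BackendImpl * impl() const { return impl_.get(); }

private:
  void ensure_on_cpu() const
  {
    if (impl_ && !cpu_valid_) {
      impl_->download(cpu_data_);
      cpu_valid_ = true;
    }
  }

  std::shared_ptr<BackendImpl> impl_;  // vendor-specific storage
  mutable std::vector<T> cpu_data_;    // lazily filled CPU mirror
  mutable bool cpu_valid_ = false;
};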

In order to access the underlying hardware buffers, the vendor-specific APIs are being used. Suppose a vendor backend named vendor_buffer_backend exists, then the backend would usually contain a static method to convert a buffer to the native type. Our hypothetical vendor backend may then be used as follows:

void topic_callback(const msg::MessageWithTensor & input_msg) {
  // extract the native buffer handle from the input message
  vendor_native_handle input_h =
    vendor_buffer_backend::from_buffer(input_msg.data);

  // allocate an output message whose buffer is backed by the vendor backend
  msg::MessageWithTensor output_msg =
    vendor_buffer_backend::allocate<msg::MessageWithTensor>();

  // get the native handle of the output buffer
  vendor_native_handle output_h =
    vendor_buffer_backend::from_buffer(output_msg.data);

  // perform a vendor-native operation; results land in the output buffer
  output_h = input_h.some_operation();

  // publish as usual
  publisher_.publish(output_msg);
}

This code snippet does the following:

First, it extracts the native buffer handle from the message using a static method provided by the vendor backend. Vendors are free to expose whatever interface they choose here, but are encouraged to provide a static method for ease of use.

It then allocates the output message to be published using another vendor-specific interface. Note that this allocation creates an empty buffer; it only sets up the relationship between output_msg.data and the vendor_buffer_backend by creating an instance of the backend buffer and registering it in the impl field of the rosidl_runtime_cpp::Buffer<T> class.

The native handle from the output message is also extracted, so it can be used with the native interfaces provided.

Afterwards, it performs some native operations on the input data, and assigns the result of that operation to the output data. Note that this is happening on the vendor native data types, but since the handles are linked to the buffers, the results show up in the output message without additional code.

Finally, the output message is published the same as any other ROS2 message. rmw then takes care of vendor-specific serialization, see the following sections on details of that process.

This design keeps any vendor-specific code completely out of rclcpp. All that rclcpp sees and links against is the generic rosidl_runtime_cpp::Buffer<T> class, which has no direct ties to any specific vendor. Hence there is no need for rclcpp to even know about all vendor backends that exist.

It also allows vendors to provide specific interfaces for their respective platforms, allowing them to implement allocation and handling schemes particular to their underlying systems.

A similar type would exist for rclpy and rclrs. We anticipate both of those will be easier to implement, thanks to the duck typing facilities in rclpy and the traits-based object system in rclrs, respectively, which make it much easier to build drop-in compatible systems.

Backends as plugins

Backends are implemented as plugins using ROS’s pluginlib. On startup, each rmw instance scans for available backend-compatible plugins on the system, and registers them through pluginlib.

A standard implementation of a backend using CPU memory to offer std::vector<T> compatibility is provided by default through the ROS2 distribution, to ensure that there is always a CPU implementation available.

Additional vendor-specific plugins are implemented by the respective hardware vendors. For example, NVIDIA would implement and provide a CUDA backend, while AMD might implement and provide a ROCm backend.

Backends can either be distributed as individual packages, or be pre-installed on the target hardware. As an example, the NVIDIA Jetson systems would likely have a CUDA backend pre-installed as part of their system image.

Instances of rosidl_runtime_cpp::Buffer<T> are tied to a particular backend at allocation time, as illustrated in the section above.
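
As a sketch, registering such a backend would look like any other pluginlib export; the class and base-interface names below are hypothetical placeholders for whatever the proposal settles on:

#include <pluginlib/class_list_macros.hpp>

#include "vendor_buffer_backend/cuda_backend.hpp"

// Exports the hypothetical CUDA backend so that rmw can discover it at
// startup through the regular pluginlib scan described above.
PLUGINLIB_EXPORT_CLASS(
  vendor_buffer_backend::CudaBackend,
  rosidl_runtime_cpp::BufferBackendInterface)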

Integration with rmw

rmw implementations can choose to integrate with vendor backends to provide accelerated transports through the backends. rmw implementations that do not choose to integrate with backends, or any existing legacy backends, automatically fall back onto converting all data to CPU data, and will continue working without any changes.

An rmw implementation that chooses to integrate with vendor backends does the following. At graph startup, when publishers and subscribers are being created, each endpoint shares a list of installed backends alongside vendor-specific data to establish any required side channels, and establishes dedicated channels for passing backend-enabled messages based on four data points:

  • The message type for determining if it contains any buffer-typed fields
  • The list of backends supported by the current endpoint
  • The list of backends supported by the associated endpoint on the other side
  • The distance between the two endpoints (same process, different process, across a network etc.)

rmw can choose any mechanism it wants to perform this task, since this step is happening entirely internal to the currently loaded rmw implementation. Side channel creation is entirely hidden inside the vendor plugins, and not visible to rmw.

For publishing a message type that contains buffer-typed fields, if the publisher and the subscriber(s) share the same supported backend list, and there is a matching serialization method implemented in the backend for the distance to the subscriber(s), then instead of serializing the payload of the buffer bytewise, the backend can choose to use a custom serialization method instead.

The backend is then free to serialize into a ROS message type of its choice. This backend-custom message type is called a descriptor. It should contain all information the backend needs to deserialize the message at the subscriber side, and reconstruct the buffer. This descriptor message may contain pointer values, virtual memory handles, IPC handles or even the raw payload if the backend chooses to send that data through rmw.

The descriptor message can be inspected as usual if desired since it is just a normal ROS2 message, but deserializing requires the matching backend. However, since the publisher knows the backends available to the subscriber(s), it is guaranteed that a subscriber only receives a descriptor message if it is able to deserialize it.
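
As an illustration, a CUDA-style backend’s descriptor could be an ordinary message definition along these lines (all field names are hypothetical, not part of the proposal):

# Hypothetical descriptor for a CUDA-style backend (illustrative only)
uint32 device_id      # GPU the buffer resides on
uint64 size_bytes     # payload size in accelerator memory
uint8[64] ipc_handle  # opaque IPC handle used to map the memory remotely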

Integration with rosidl

While the above sections show the implications visible in rclcpp, the bulk of the changes necessary to make this happen go into rosidl. It is rosidl that generates the C++ message structures, and hence rosidl where uint8[] is mapped to the Buffer type instead of std::vector; most of the work for this scheme is therefore done in rosidl, not in rclcpp.

Layering semantics on top

Having only a buffer is not very useful, as most robotics data has higher-level semantics, like images, tensors, and point clouds.

However, all of those data types ultimately map to one or more large, contiguous regions of memory, in CPU or accelerator memory.

We also observe that a healthy ecosystem of higher-level abstractions already exists: PCL for point clouds, Torch for tensor handling, and so on. Hence, we propose not to replicate those ecosystems in ROS, but instead to let them bridge into ROS and use the buffer abstraction as their backend for storage and transport.

As a demonstration of this, we are providing a Torch backend that allows linking (Py)Torch tensors to the ROS buffers. This allows users to use the rich ecosystem of Torch to perform tensor operations, while relying on the ROS buffers to provide accelerator-native storage and zero-copy transport between nodes, even across processes and chips if supported by the backend.

The Torch backend does not provide a raw buffer type itself, but relies on vendors implementing backends for their platforms (CUDA, ROCm, TPUs, etc.). The Torch backend then depends on the vendor-specific backends and provides the binding of the low-level buffers to the Torch tensors. The coupling between the Torch backend and the hardware vendor buffer types is loose: it is not visible from the node’s code, but is established after the fact.

From a developer’s perspective, all of this is hidden. All a developer writing a node does is interact with a Torch buffer, and it maps to the correct backend available on the current hardware automatically. An example of such code could look like this:

void topic_callback(const msg::MessageWithTensor & input_msg) {
  // extract tensor from input message
  torch::Tensor input_tensor =
    torch_backend::from_buffer(input_msg.tensor);

  // allocate output message
  msg::MessageWithTensor output_msg =
    torch_backend::allocate<msg::MessageWithTensor>();

  // get handle to allocated output tensor
  torch::Tensor & output_tensor =
    torch_backend::from_buffer(output_msg.tensor);

  // perform some torch operations
  output_tensor = torch::abs(input_tensor);

  // publish message as usual
  publisher_.publish(output_msg);
}

Note how this code segment is using Torch-native datatypes (torch::Tensor), and is performing Torch-native operations on the tensors (in this case, torch::abs). There is no mention of any hardware backend in the code.

By keeping the coupling loose, this node can run unmodified on NVIDIA, AMD, TPU or even CPU hardware, with the framework (in this case Torch) being mapped to the correct hardware and receiving locally available acceleration for free.

Prior work

NITROS

https://docs.nvidia.com/learning/physical-ai/getting-started-with-isaac-ros/latest/an-introduction-to-ai-based-robot-development-with-isaac-ros/05-what-is-nitros.html

NITROS is NVIDIA’s implementation of a similar design based on Type Negotiation. It is specific to NVIDIA and not broadly compatible, nor is it currently possible to layer hardware-agnostic frameworks like Torch on top.

AgnoCast

https://github.com/tier4/agnocast

AgnoCast creates a zero-copy regime for CPU data. However, it is limited to CPU data, and does not have a plugin architecture for accelerator memory regions. It also requires kernel modifications, which some may find intrusive.

Future work

NVIDIA has been working on this proposal, alongside a prototype implementation that implements support for the mechanisms described above. We are working on CPU, CUDA and Torch backends, as well as integration with the Zenoh rmw implementation.

The prototype will move into a branch on the respective ROS repositories in the next two weeks, where development toward a full-fledged implementation will continue in public.

In parallel, a dedicated working group tasked with formalizing this effort is being formed, with the goal of reaching consensus on the design, and getting the required changes into ROS2 Lyrical.

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/update-on-ros-native-buffers/51771

ROS Discourse General: Pixi as a co-official way of installing ROS on Linux

It’s that time of the year when someone with too much spare time on their hands proposes a radical change to the way ROS is distributed and built. This time, it’s my turn.

So let me start this by acknowledging that without all the tooling the ROS community has developed over the years (rosdep, bloom, the buildfarm (donate if you can, I did!), colcon, etc.) we wouldn’t be here. Twenty, even ten years ago it was almost impossible to run a multilanguage federated distributed project without these tools; nothing like that existed! So I’m really grateful for all that.

However, the landscape is different now. We now have projects like Pixi, conda-forge and so on.

As per the title of my post, I’m proposing that Pixi be the recommended way of installing ROS 2 not only on Windows but also on Linux, or at least be co-recommended, for ROS 2 Lyrical Luth and onwards.

One of the first challenges that new users of ROS face is learning a new build tool and a ROS-specific development workflow. Although historically we really needed to develop all the tools I’ve mentioned, the optics of having our own build tool and package management system don’t help, reinforcing the perception some users still have of ROS as a silo that doesn’t play nice with the outside world.

The main two tools that a user can replace with Pixi are colcon and rosdep, and to some extent bloom.

  • colcon has noble goals, becoming the one build tool for multilanguage workspaces, and as someone who has contributed to it (e.g. extensions for colcon to support Gradle and Cargo) I appreciate having it all under the same tool. However, it hasn’t achieved widespread adoption outside ROS.
  • rosdep makes it easy to install multilanguage dependencies, but it still has some long-standing issues ( Add support for version_eq · Issue #803 · ros-infrastructure/rosdep · GitHub ) that are taken for granted in other package managers, and because of the distribution model we have, ROS packages are installed at a system level, not everything is available via APT, etc.
  • bloom works great for submitting packages to the buildfarm. Pixi provides rattler-build, where the process only requires a YAML file and can publish not only to prefix.dev, but also to Anaconda.org and JFrog Artifactory.

I’ve been using Pixi for over a year for my own projects, some use ROS some don’t, and the experience couldn’t have been better:

  • No need for vendor packages thanks to conda-forge and robostack (over 43k packages available!)
  • No need for root access, all software is installed in a workspace, and workspaces are reproducible thanks to lockfiles, so I have the same environment on my CI as on my computer.
  • Distro-independent. I’m running AlmaLinux and Debian, I no longer have to worry whether ROS supports my distro or not.
  • Pixi can replace colcon thanks to the pixi build backends ( Building a ROS Package - Pixi )
  • Pixi is fast! It’s written in Rust :wink:

Also, from the ROS side, this would reduce the burden of maintaining the buildfarm, the infrastructure, all the tools, etc., but that’s probably too far in the future, and realistically it’d take a while even if there’s consensus to replace it with something else.

Over the years, like the good open-source citizens we are, we have collaborated with other projects outside the ROS realm. For example, instead of rolling our own transport like we had in ROS 1, we’ve worked with FastDDS, OpenSplice, CycloneDDS and now Zenoh. I’d say this has been quite symbiotic and we’ve helped each other. I believe collaborating with the Pixi and Robostack projects would be extremely beneficial for everyone involved.

@ruben-arts can surely say more about the benefits of using Pixi for ROS

11 posts - 6 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/pixi-as-a-co-official-way-of-installing-ros-on-linux/51764

ROS Discourse General: Ferronyx – Real-Time ROS2 Observability & Automated RCA

We’ve been building robots with ROS2 for years, and we hit the same wall every time a robot fails in production:

The debugging process:

  • SSH into the machine

  • Grep through logs

  • Check ROS2 topics (which ones stopped publishing?)

  • Replay bag files

  • Cross-reference with deployment changes

  • Try to correlate infrastructure issues with ROS state

This takes 3-4 hours. Every time.

The problem: ROS gives you raw telemetry, but zero intelligence connecting infrastructure metrics + ROS topology + deployment history. You’re manually stitching pieces together.

So we built Ferronyx to be that intelligence layer.

What we did:

  • Real-time monitoring of ROS2 topics, nodes, actions + infrastructure (CPU, GPU, memory, network)

  • When something breaks, AI analyzes the incident chain and suggests probable root causes

  • Deployment markers show exactly which release caused the failure

  • Track sensor health degradation before failures happen

Real results from our beta customers:

  • MTTR: 3-4 hours → 12-15 minutes

  • One customer caught sensor drift they couldn’t see manually

  • Another correlated a specific firmware version with navigation failures

We’re looking for 8-12 more teams to beta test and help us refine this. We want teams that:

  • Run ROS2 in production (warehouses, humanoids, autonomous vehicles)

  • Actually deal with downtime/reliability issues

  • Will give honest feedback

Free beta access. You help shape the product, we learn what breaks.

If you’re dealing with robot reliability headaches, reply here or send a DM. Would genuinely love to hear your toughest debugging stories.

Links:
https://ferronyx.com/

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ferronyx-real-time-ros2-observability-automated-rca/51747

ROS Discourse General: ROS 2 Rust Meeting: January 2026

The next ROS 2 Rust Meeting will be Mon, Jan 12, 2026 2:00 PM UTC

The meeting room will be at https://meet.google.com/rxr-pvcv-hmu

In the unlikely event that the room needs to change, we will update this thread with the new info!

Agenda:

  1. Changes to generated message consumption (https://github.com/ros2-rust/ros2_rust/pull/556)
  2. Upgrade to Rust 1.85 (build!: require rustc 1.85 and Rust 2024 edition by esteve · Pull Request #566 · ros2-rust/ros2_rust · GitHub)
  3. Migration from Element to Zulip chat (Open Robotics launches Zulip chat server)

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-rust-meeting-january-2026/51726

ROS Discourse General: Easier Protobuf and ROS 2 Integration

For anyone integrating ROS 2 with Protobuf-based systems, we at the RAI Institute want to highlight one of our open-source tools: proto2ros!

proto2ros generates ROS 2 message definitions and bi-directional conversion code directly from .proto files, reducing boilerplate and simplifying integration between Protobuf-based systems and ROS 2 nodes.

Some highlights:

  • Automatic ROS 2 message generation from Protobuf

  • C++ and Python conversion utilities

  • Supports Protobuf v2 and v3

It is currently available for both Humble and Jazzy and can be installed with
apt install ros-<distro>-proto2ros

Check out the full repo here: https://github.com/bdaiinstitute/proto2ros

Thanks to everyone who has contributed to this project including @hidmic @khughes1 @jbarry !
As always, feedback and contributions are welcome!

The RAI Institute

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/easier-protobuf-and-ros-2-integration/51712

ROS Discourse General: ROSCon Review Continued | Cloud Robotics WG Meeting 2026-01-14

Please come and join us for this coming meeting on Wed, Jan 14, 2026, from 4:00 PM to 5:00 PM UTC, where we plan to dive deeper into the ROSCon talks collected during the last session. By examining more details about the talks, we can highlight any that are relevant to Logging & Observability, the current focus of the group. We can also pull out interesting tips to release as part of a blog post.

The details for the talks have been gathered into the Links/Notes column of this document. Please feel free to read ahead and take a look at the notes and videos ahead of the meeting, if you’re interested.

The meeting link for next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/roscon-review-continued-cloud-robotics-wg-meeting-2026-01-14/51710

ROS Discourse General: Goodbye RQt, Hello RQml [NEW RELEASE]

RQml announcement video

Greetings fellow roboticists,

During our transition to ROS 2 and the build of our new robot Athena, we’ve encountered quite a few issues, both in ROS 2 and its middleware, but also with rqt.
For instance, when testing our manipulator, we noticed that the ControllerManager in rqt gives you around 20 seconds before the application freezes completely when used over WiFi.
This is not the only issue, but that’s also not the point of this post.

You could chime in and say, “Hey, you could’ve fixed that and made a PR :index_pointing_up:”, and you would be right, and we did this in several instances.
But I’m not a fan of using Python for UI, and this presented the perfect opportunity to demonstrate how easy it is to create a nice ROS interface using my QML ROS 2 module.
So, instead, I’ve spent that time quickly developing a modern alternative, fixing all the issues that bothered me in rqt.

:waving_hand: Hello RQml :rocket:

Please note that this is still in beta and not all plugins exist yet.
You are very welcome to point me to the ones that you think would be great to have, or even implement them yourself and make a PR :blush:

Currently, the following plugins are available:

  • ActionCaller: Interface for calling ROS 2 Actions.
  • Console: A log viewer for ROS 2 messages.
  • ControllerManager: Manage and switch ROS 2 controllers.
  • ImageView: View camera streams and images.
  • JointTrajectoryController: Interface for sending joint trajectory commands.
  • MessagePublisher: Publish custom ROS 2 messages.
  • RobotSteering: Teleoperation tool for mobile robots.
  • ServiceCaller: Interface for calling ROS 2 Services.

Notably, the ImageView now also uses transparency for depth image values that are not valid (instead of using black, which also represents very close values).

As always, I hope this is of interest to you, and I would love to hear from you if you build something cool with this :rocket:
If it wasn’t, my little turtle buddy will be very disappointed because he already considered you a special friend :worried:

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/goodbye-rqt-hello-rqml-new-release/51697

ROS Discourse General: New packages for Humble Hawksbill 2026-01-07

Package Updates for Humble

Added Packages [27]:

  • ros-humble-ardrone-sdk: 2.0.3-1
  • ros-humble-ardrone-sdk-dbgsym: 2.0.3-1
  • ros-humble-ardrone-sumo: 2.0.3-1
  • ros-humble-ardrone-sumo-dbgsym: 2.0.3-1
  • ros-humble-cloudini-lib: 0.11.1-2
  • ros-humble-cloudini-lib-dbgsym: 0.11.1-2
  • ros-humble-cloudini-ros: 0.11.1-2
  • ros-humble-cloudini-ros-dbgsym: 0.11.1-2
  • ros-humble-event-camera-tools: 3.1.1-1
  • ros-humble-event-camera-tools-dbgsym: 3.1.1-1
  • ros-humble-fibar-lib: 1.0.2-1
  • ros-humble-frequency-cam: 3.1.0-1
  • ros-humble-frequency-cam-dbgsym: 3.1.0-1
  • ros-humble-hitch-estimation-apriltag-array: 0.0.1-1
  • ros-humble-mavros-examples: 2.14.0-1
  • ros-humble-mujoco-vendor: 0.0.6-1
  • ros-humble-mujoco-vendor-dbgsym: 0.0.6-1
  • ros-humble-olive-interfaces: 0.1.0-1
  • ros-humble-olive-interfaces-dbgsym: 0.1.0-1
  • ros-humble-persist-parameter-server: 1.0.4-1
  • ros-humble-persist-parameter-server-dbgsym: 1.0.4-1
  • ros-humble-pointcloud-to-ply: 0.0.1-1
  • ros-humble-qml6-ros2-plugin: 0.25.121-1
  • ros-humble-qml6-ros2-plugin-dbgsym: 0.25.121-1
  • ros-humble-yasmin-editor: 4.2.2-1
  • ros-humble-yasmin-factory: 4.2.2-1
  • ros-humble-yasmin-factory-dbgsym: 4.2.2-1

Updated Packages [390]:

  • ros-humble-ackermann-steering-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-ackermann-steering-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-admittance-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-admittance-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-apriltag-detector: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-detector-dbgsym: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-detector-mit: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-detector-mit-dbgsym: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-detector-umich: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-detector-umich-dbgsym: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-draw: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-draw-dbgsym: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-tools: 3.0.3-1 → 3.1.0-1
  • ros-humble-apriltag-tools-dbgsym: 3.0.3-1 → 3.1.0-1
  • ros-humble-aruco-opencv: 2.3.1-1 → 2.4.1-1
  • ros-humble-aruco-opencv-dbgsym: 2.3.1-1 → 2.4.1-1
  • ros-humble-aruco-opencv-msgs: 2.3.1-1 → 2.4.1-1
  • ros-humble-aruco-opencv-msgs-dbgsym: 2.3.1-1 → 2.4.1-1
  • ros-humble-automatika-ros-sugar: 0.4.1-1 → 0.4.2-1
  • ros-humble-automatika-ros-sugar-dbgsym: 0.4.1-1 → 0.4.2-1
  • ros-humble-autoware-internal-debug-msgs: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-debug-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-localization-msgs: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-localization-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-metric-msgs: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-metric-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-msgs: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-perception-msgs: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-perception-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-planning-msgs: 1.12.0-2 → 1.12.1-1
  • ros-humble-autoware-internal-planning-msgs-dbgsym: 1.12.0-2 → 1.12.1-1
  • ros-humble-behaviortree-cpp: 4.7.1-1 → 4.8.3-1
  • ros-humble-behaviortree-cpp-dbgsym: 4.7.1-1 → 4.8.3-1
  • ros-humble-beluga: 2.0.2-1 → 2.1.0-1
  • ros-humble-beluga-amcl: 2.0.2-1 → 2.1.0-1
  • ros-humble-beluga-amcl-dbgsym: 2.0.2-1 → 2.1.0-1
  • ros-humble-beluga-ros: 2.0.2-1 → 2.1.0-1
  • ros-humble-bicycle-steering-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-bicycle-steering-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-camera-calibration: 3.0.8-1 → 3.0.9-1
  • ros-humble-camera-ros: 0.5.0-1 → 0.5.2-1
  • ros-humble-camera-ros-dbgsym: 0.5.0-1 → 0.5.2-1
  • ros-humble-clearpath-common: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-control: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-customization: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-description: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-generator-common: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-generator-common-dbgsym: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-manipulators: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-manipulators-description: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-mounts-description: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-platform-description: 1.3.7-1 → 1.3.8-1
  • ros-humble-clearpath-sensors-description: 1.3.7-1 → 1.3.8-1
  • ros-humble-control-toolbox: 3.6.2-1 → 3.6.3-1
  • ros-humble-control-toolbox-dbgsym: 3.6.2-1 → 3.6.3-1
  • ros-humble-controller-interface: 2.52.2-1 → 2.53.0-1
  • ros-humble-controller-interface-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-controller-manager: 2.52.2-1 → 2.53.0-1
  • ros-humble-controller-manager-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-controller-manager-msgs: 2.52.2-1 → 2.53.0-1
  • ros-humble-controller-manager-msgs-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-depth-image-proc: 3.0.8-1 → 3.0.9-1
  • ros-humble-depth-image-proc-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-depthai: 2.30.0-1 → 2.31.0-1
  • ros-humble-depthai-bridge: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-bridge-dbgsym: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-dbgsym: 2.30.0-1 → 2.31.0-1
  • ros-humble-depthai-descriptions: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-examples: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-examples-dbgsym: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-filters: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-filters-dbgsym: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-ros: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-ros-driver: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-ros-driver-dbgsym: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-ros-msgs: 2.11.2-1 → 2.12.1-1
  • ros-humble-depthai-ros-msgs-dbgsym: 2.11.2-1 → 2.12.1-1
  • ros-humble-diff-drive-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-diff-drive-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-dynamixel-hardware-interface: 1.4.16-1 → 1.5.0-2
  • ros-humble-dynamixel-hardware-interface-dbgsym: 1.4.16-1 → 1.5.0-2
  • ros-humble-effort-controllers: 2.50.2-1 → 2.52.0-1
  • ros-humble-effort-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-event-camera-codecs: 2.0.1-1 → 3.0.0-1
  • ros-humble-event-camera-codecs-dbgsym: 2.0.1-1 → 3.0.0-1
  • ros-humble-event-camera-msgs: 2.0.0-1 → 2.0.1-1
  • ros-humble-event-camera-msgs-dbgsym: 2.0.0-1 → 2.0.1-1
  • ros-humble-event-camera-py: 2.0.1-1 → 3.0.0-1
  • ros-humble-event-camera-renderer: 2.0.1-1 → 3.0.0-1
  • ros-humble-event-camera-renderer-dbgsym: 2.0.1-1 → 3.0.0-1
  • ros-humble-examples-tf2-py: 0.25.17-1 → 0.25.18-1
  • ros-humble-fastcdr: 1.0.24-2 → 1.0.29-1
  • ros-humble-fastcdr-dbgsym: 1.0.24-2 → 1.0.29-1
  • ros-humble-fastrtps: 2.6.10-1 → 2.6.11-1
  • ros-humble-fastrtps-cmake-module: 2.2.3-1 → 2.2.4-1
  • ros-humble-fastrtps-dbgsym: 2.6.10-1 → 2.6.11-1
  • ros-humble-force-torque-sensor-broadcaster: 2.50.2-1 → 2.52.0-1
  • ros-humble-force-torque-sensor-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-forward-command-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-forward-command-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-generate-parameter-library: 0.5.0-1 → 0.6.0-1
  • ros-humble-generate-parameter-library-py: 0.5.0-1 → 0.6.0-1
  • ros-humble-geometry2: 0.25.17-1 → 0.25.18-1
  • ros-humble-gpio-controllers: 2.50.2-1 → 2.52.0-1
  • ros-humble-gpio-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-gripper-controllers: 2.50.2-1 → 2.52.0-1
  • ros-humble-gripper-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-hardware-interface: 2.52.2-1 → 2.53.0-1
  • ros-humble-hardware-interface-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-hardware-interface-testing: 2.52.2-1 → 2.53.0-1
  • ros-humble-hardware-interface-testing-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-husarion-components-description: 0.0.2-1 → 0.1.0-1
  • ros-humble-image-pipeline: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-proc: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-proc-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-publisher: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-publisher-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-rotate: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-rotate-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-view: 3.0.8-1 → 3.0.9-1
  • ros-humble-image-view-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-imu-sensor-broadcaster: 2.50.2-1 → 2.52.0-1
  • ros-humble-imu-sensor-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-joint-limits: 2.52.2-1 → 2.53.0-1
  • ros-humble-joint-limits-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-joint-state-broadcaster: 2.50.2-1 → 2.52.0-1
  • ros-humble-joint-state-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-joint-trajectory-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-joint-trajectory-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-kitti-metrics-eval: 2.2.1-1 → 2.4.0-1
  • ros-humble-kitti-metrics-eval-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-kompass: 0.3.2-1 → 0.3.3-1
  • ros-humble-kompass-interfaces: 0.3.2-1 → 0.3.3-1
  • ros-humble-kompass-interfaces-dbgsym: 0.3.2-1 → 0.3.3-1
  • ros-humble-launch-pal: 0.19.0-1 → 0.20.0-1
  • ros-humble-libmavconn: 2.12.0-1 → 2.14.0-1
  • ros-humble-libmavconn-dbgsym: 2.12.0-1 → 2.14.0-1
  • ros-humble-mapviz: 2.5.10-1 → 2.6.0-1
  • ros-humble-mapviz-dbgsym: 2.5.10-1 → 2.6.0-1
  • ros-humble-mapviz-interfaces: 2.5.10-1 → 2.6.0-1
  • ros-humble-mapviz-interfaces-dbgsym: 2.5.10-1 → 2.6.0-1
  • ros-humble-mapviz-plugins: 2.5.10-1 → 2.6.0-1
  • ros-humble-mapviz-plugins-dbgsym: 2.5.10-1 → 2.6.0-1
  • ros-humble-mavlink: 2025.9.9-1 → 2025.12.12-1
  • ros-humble-mavros: 2.12.0-1 → 2.14.0-1
  • ros-humble-mavros-dbgsym: 2.12.0-1 → 2.14.0-1
  • ros-humble-mavros-extras: 2.12.0-1 → 2.14.0-1
  • ros-humble-mavros-extras-dbgsym: 2.12.0-1 → 2.14.0-1
  • ros-humble-mavros-msgs: 2.12.0-1 → 2.14.0-1
  • ros-humble-mavros-msgs-dbgsym: 2.12.0-1 → 2.14.0-1
  • ros-humble-mecanum-drive-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-mecanum-drive-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-metavision-driver: 2.0.1-1 → 3.0.0-1
  • ros-humble-metavision-driver-dbgsym: 2.0.1-1 → 3.0.0-1
  • ros-humble-mola: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-bridge-ros2: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-bridge-ros2-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-demos: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-gnss-to-markers: 0.1.0-1 → 0.1.2-1
  • ros-humble-mola-gnss-to-markers-dbgsym: 0.1.0-1 → 0.1.2-1
  • ros-humble-mola-imu-preintegration: 1.14.0-1 → 1.14.1-1
  • ros-humble-mola-imu-preintegration-dbgsym: 1.14.0-1 → 1.14.1-1
  • ros-humble-mola-input-euroc-dataset: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-euroc-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-kitti-dataset: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-kitti-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-kitti360-dataset: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-kitti360-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-lidar-bin-dataset: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-lidar-bin-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-mulran-dataset: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-mulran-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-paris-luco-dataset: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-paris-luco-dataset-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-rawlog: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-rawlog-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-rosbag2: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-rosbag2-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-video: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-input-video-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-kernel: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-kernel-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-launcher: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-launcher-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-lidar-odometry: 1.2.2-1 → 1.3.1-1
  • ros-humble-mola-lidar-odometry-dbgsym: 1.2.2-1 → 1.3.1-1
  • ros-humble-mola-metric-maps: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-metric-maps-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-msgs: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-msgs-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-pose-list: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-pose-list-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-relocalization: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-relocalization-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-traj-tools: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-traj-tools-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-viz: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-viz-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-yaml: 2.2.1-1 → 2.4.0-1
  • ros-humble-mola-yaml-dbgsym: 2.2.1-1 → 2.4.0-1
  • ros-humble-mp2p-icp: 2.1.1-1 → 2.2.0-1
  • ros-humble-mp2p-icp-dbgsym: 2.1.1-1 → 2.2.0-1
  • ros-humble-mrpt-apps: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-apps-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libapps: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libapps-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libbase: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libbase-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libgui: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libgui-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libhwdrivers: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libhwdrivers-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libmaps: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libmaps-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libmath: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libmath-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libnav: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libnav-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libobs: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libobs-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libopengl: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libopengl-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libposes: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libposes-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libros-bridge: 3.0.2-1 → 3.1.1-1
  • ros-humble-mrpt-libros-bridge-dbgsym: 3.0.2-1 → 3.1.1-1
  • ros-humble-mrpt-libslam: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libslam-dbgsym: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-libtclap: 2.15.1-2 → 2.15.4-1
  • ros-humble-mrpt-path-planning: 0.2.3-1 → 0.2.4-1
  • ros-humble-mrpt-path-planning-dbgsym: 0.2.3-1 → 0.2.4-1
  • ros-humble-multires-image: 2.5.10-1 → 2.6.0-1
  • ros-humble-multires-image-dbgsym: 2.5.10-1 → 2.6.0-1
  • ros-humble-mvsim: 0.14.2-1 → 0.15.0-1
  • ros-humble-mvsim-dbgsym: 0.14.2-1 → 0.15.0-1
  • ros-humble-parameter-traits: 0.5.0-1 → 0.6.0-1
  • ros-humble-pid-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-pid-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-plotjuggler: 3.13.2-1 → 3.15.0-1
  • ros-humble-plotjuggler-dbgsym: 3.13.2-1 → 3.15.0-1
  • ros-humble-plotjuggler-ros: 2.3.1-1 → 2.3.1-2
  • ros-humble-plotjuggler-ros-dbgsym: 2.3.1-1 → 2.3.1-2
  • ros-humble-pluginlib: 5.1.2-1 → 5.1.3-1
  • ros-humble-pluginlib-dbgsym: 5.1.2-1 → 5.1.3-1
  • ros-humble-pose-broadcaster: 2.50.2-1 → 2.52.0-1
  • ros-humble-pose-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-position-controllers: 2.50.2-1 → 2.52.0-1
  • ros-humble-position-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-python-mrpt: 2.15.1-1 → 2.15.3-1
  • ros-humble-range-sensor-broadcaster: 2.50.2-1 → 2.52.0-1
  • ros-humble-range-sensor-broadcaster-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-rclcpp: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-action: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-action-dbgsym: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-components: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-components-dbgsym: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-dbgsym: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-lifecycle: 16.0.16-1 → 16.0.17-1
  • ros-humble-rclcpp-lifecycle-dbgsym: 16.0.16-1 → 16.0.17-1
  • ros-humble-rcutils: 5.1.7-1 → 5.1.8-1
  • ros-humble-rcutils-dbgsym: 5.1.7-1 → 5.1.8-1
  • ros-humble-realtime-tools: 2.14.1-1 → 2.15.0-1
  • ros-humble-realtime-tools-dbgsym: 2.14.1-1 → 2.15.0-1
  • ros-humble-rko-lio: 0.1.6-1 → 0.2.0-1
  • ros-humble-rko-lio-dbgsym: 0.1.6-1 → 0.2.0-1
  • ros-humble-robotraconteur: 1.2.6-1 → 1.2.7-1
  • ros-humble-robotraconteur-dbgsym: 1.2.6-1 → 1.2.7-1
  • ros-humble-ros-babel-fish: 0.25.2-1 → 0.25.120-1
  • ros-humble-ros-babel-fish-dbgsym: 0.25.2-1 → 0.25.120-1
  • ros-humble-ros-babel-fish-test-msgs: 0.25.2-1 → 0.25.120-1
  • ros-humble-ros-babel-fish-test-msgs-dbgsym: 0.25.2-1 → 0.25.120-1
  • ros-humble-ros2-control: 2.52.2-1 → 2.53.0-1
  • ros-humble-ros2-control-test-assets: 2.52.2-1 → 2.53.0-1
  • ros-humble-ros2-controllers: 2.50.2-1 → 2.52.0-1
  • ros-humble-ros2-controllers-test-nodes: 2.50.2-1 → 2.52.0-1
  • ros-humble-ros2cli-common-extensions: 0.1.1-4 → 0.1.2-1
  • ros-humble-ros2controlcli: 2.52.2-1 → 2.53.0-1
  • ros-humble-ros2plugin: 5.1.2-1 → 5.1.3-1
  • ros-humble-rosbag2rawlog: 3.0.2-1 → 3.1.1-1
  • ros-humble-rosbag2rawlog-dbgsym: 3.0.2-1 → 3.1.1-1
  • ros-humble-rosidl-adapter: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-cli: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-cmake: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-generator-c: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-generator-cpp: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-parser: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-runtime-c: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-runtime-c-dbgsym: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-runtime-cpp: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-typesupport-fastrtps-c: 2.2.3-1 → 2.2.4-1
  • ros-humble-rosidl-typesupport-fastrtps-c-dbgsym: 2.2.3-1 → 2.2.4-1
  • ros-humble-rosidl-typesupport-fastrtps-cpp: 2.2.3-1 → 2.2.4-1
  • ros-humble-rosidl-typesupport-fastrtps-cpp-dbgsym: 2.2.3-1 → 2.2.4-1
  • ros-humble-rosidl-typesupport-interface: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-typesupport-introspection-c: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-typesupport-introspection-c-dbgsym: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-typesupport-introspection-cpp: 3.1.7-1 → 3.1.8-1
  • ros-humble-rosidl-typesupport-introspection-cpp-dbgsym: 3.1.7-1 → 3.1.8-1
  • ros-humble-rqt-controller-manager: 2.52.2-1 → 2.53.0-1
  • ros-humble-rqt-joint-trajectory-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-rviz-assimp-vendor: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-common: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-common-dbgsym: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-default-plugins: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-default-plugins-dbgsym: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-ogre-vendor: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-ogre-vendor-dbgsym: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-rendering: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-rendering-dbgsym: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-rendering-tests: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz-visual-testing-framework: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz2: 11.2.23-1 → 11.2.25-1
  • ros-humble-rviz2-dbgsym: 11.2.23-1 → 11.2.25-1
  • ros-humble-septentrio-gnss-driver: 1.4.5-1 → 1.4.6-1
  • ros-humble-septentrio-gnss-driver-dbgsym: 1.4.5-1 → 1.4.6-1
  • ros-humble-simple-launch: 1.11.0-1 → 1.11.1-1
  • ros-humble-slider-publisher: 2.4.1-1 → 2.4.2-1
  • ros-humble-steering-controllers-library: 2.50.2-1 → 2.52.0-1
  • ros-humble-steering-controllers-library-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-stereo-image-proc: 3.0.8-1 → 3.0.9-1
  • ros-humble-stereo-image-proc-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-tcb-span: 1.0.2-2 → 1.2.0-1
  • ros-humble-tf2: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-bullet: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-dbgsym: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-eigen: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-eigen-kdl: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-eigen-kdl-dbgsym: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-geometry-msgs: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-kdl: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-msgs: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-msgs-dbgsym: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-py: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-py-dbgsym: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-ros: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-ros-dbgsym: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-ros-py: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-sensor-msgs: 0.25.17-1 → 0.25.18-1
  • ros-humble-tf2-tools: 0.25.17-1 → 0.25.18-1
  • ros-humble-tile-map: 2.5.10-1 → 2.6.0-1
  • ros-humble-tile-map-dbgsym: 2.5.10-1 → 2.6.0-1
  • ros-humble-tl-expected: 1.0.2-2 → 1.2.0-1
  • ros-humble-tracetools-image-pipeline: 3.0.8-1 → 3.0.9-1
  • ros-humble-tracetools-image-pipeline-dbgsym: 3.0.8-1 → 3.0.9-1
  • ros-humble-transmission-interface: 2.52.2-1 → 2.53.0-1
  • ros-humble-transmission-interface-dbgsym: 2.52.2-1 → 2.53.0-1
  • ros-humble-tricycle-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-tricycle-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-tricycle-steering-controller: 2.50.2-1 → 2.52.0-1
  • ros-humble-tricycle-steering-controller-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-turtlebot3: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-bringup: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-cartographer: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-description: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-example: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-navigation2: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-node: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-node-dbgsym: 2.3.3-1 → 2.3.6-1
  • ros-humble-turtlebot3-teleop: 2.3.3-1 → 2.3.6-1
  • ros-humble-ur: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-bringup: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-calibration: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-calibration-dbgsym: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-client-library: 2.6.0-1 → 2.6.1-1
  • ros-humble-ur-client-library-dbgsym: 2.6.0-1 → 2.6.1-1
  • ros-humble-ur-controllers: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-controllers-dbgsym: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-dashboard-msgs: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-dashboard-msgs-dbgsym: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-description: 2.8.0-1 → 2.9.0-1
  • ros-humble-ur-moveit-config: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-robot-driver: 2.10.0-1 → 2.11.0-1
  • ros-humble-ur-robot-driver-dbgsym: 2.10.0-1 → 2.11.0-1
  • ros-humble-vector-pursuit-controller: 1.0.1-1 → 1.0.2-2
  • ros-humble-vector-pursuit-controller-dbgsym: 1.0.1-1 → 1.0.2-2
  • ros-humble-velocity-controllers: 2.50.2-1 → 2.52.0-1
  • ros-humble-velocity-controllers-dbgsym: 2.50.2-1 → 2.52.0-1
  • ros-humble-yasmin: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-dbgsym: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-demos: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-demos-dbgsym: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-msgs: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-msgs-dbgsym: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-ros: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-ros-dbgsym: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-viewer: 3.5.1-1 → 4.2.2-1
  • ros-humble-yasmin-viewer-dbgsym: 3.5.1-1 → 4.2.2-1
  • ros-humble-zmqpp-vendor: 0.0.2-1 → 0.1.0-3
  • ros-humble-zmqpp-vendor-dbgsym: 0.0.2-1 → 0.1.0-3

Removed Packages [7]:

  • ros-humble-feetech-ros2-driver
  • ros-humble-feetech-ros2-driver-dbgsym
  • ros-humble-generate-parameter-library-example
  • ros-humble-generate-parameter-library-example-dbgsym
  • ros-humble-generate-parameter-library-example-external
  • ros-humble-generate-parameter-library-example-external-dbgsym
  • ros-humble-generate-parameter-module-example

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

  • Adam Serafin
  • Automatika Robotics
  • Autoware
  • Bence Magyar
  • Berkay Karaman
  • Bernd Pfrommer
  • Chris Lalancette
  • Christian Rauch
  • Davide Faconti
  • Felix Exner
  • Fictionlab
  • Gerardo Puga
  • Haroon Rasheed
  • Husarion
  • Ivan Paunovic
  • Jacob Perron
  • Jeremie Deray
  • John Wason
  • Jordan Palacios
  • Jose Luis Blanco-Claraco
  • Jose-Luis Blanco-Claraco
  • José Luis Blanco-Claraco
  • Kostubh Khandelwal
  • Luis Camero
  • M. Fatih Cırıt
  • Markus Bader
  • Masaya Kataoka
  • Meher Malladi
  • Michel Hidalgo
  • Miguel Ángel González Santamarta
  • Olivier Kermorgant
  • Pyo
  • Raul Sanchez Mateos
  • Ryohsuke Mitsudome
  • Sai Kishor Kothakota
  • Samuel Hafner
  • Shane Loretz
  • Southwest Research Institute
  • Stefan Fabian
  • Steven! Ragnarök
  • Temkei Kem
  • Tibor Dome
  • Tomoya Fujita
  • Tyler Weaver
  • Vincent Rabaud
  • Vladimir Ermakov
  • Víctor Mayoral-Vilches
  • Yukihiro Saito
  • bmagyar
  • li9i
  • miguel
  • victor

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-packages-for-humble-hawksbill-2026-01-07/51696

ROS Industrial: ROSCon 2025 & RIC-AP Summit 2025 Blog Series: Singapore’s Defining Week for Open-Source Robotics

As we look back on 2025, this blog is a recap of one of the most impactful weeks for open-source robotics in the Asia-Pacific region.

On 30 October, the RIC-AP Summit expanded beyond conference halls into the real world with a series of curated site tours across Singapore. These tours showcased how ROS and Open-RMF are not just concepts but living deployments across manufacturing, healthcare, and smart infrastructure.

If the Summit sessions were about vision and strategy, the tours were about seeing robotics in motion—from factory floors to hospitals, airports, and digital districts.

Importantly, the tours brought together participants from different companies and countries, reflecting the truly international nature of the ROS-Industrial community and the collaborative spirit of Asia Pacific’s robotics ecosystem.

1. ROS in Manufacturing: SIMTech & ARTC + Black Sesame Technologies, Singapore Polytechnic

SIMTech & ARTC

  • Spotlight on smart manufacturing innovations.

  • Demonstrations of autonomous material handling and intelligent inspection systems.

  • ROS-powered robotics showing how open-source frameworks are shaping industrial transformation.

  • Reinforced Singapore’s role as a hub for advanced automation and digitalisation.

Singapore Polytechnic – Robotics, Automation and Control (RAC) Hub

  • Cutting-edge RAC Hub at the School of Electrical and Electronic Engineering.

  • Co-location labs with industry partners like ShenHao and JP Neura.

  • Demonstrations of collaborative and inspection robotics powered by ROS.

  • Clear example of academia-industry collaboration driving automation and intelligent control systems.

2. RMF Deployment in Healthcare & Reconfigurable Robotics: CHART, SUTD

CHART – Centre for Healthcare Assistive & Robotics Technology (CGH)

  • Demonstration of RoMi-H (Robotic Middleware for Healthcare), built on Open-RMF.

  • Multi-fleet interoperability enabling ROS and non-ROS robots to work seamlessly in hospitals.

  • Integration with lifts, automatic doors, and enterprise systems for streamlined operations.

  • Showcased how robotics enhance patient care and operational efficiency in smart hospitals.

SUTD – Reconfigurable Robotics Showcase

  • Outdoor mosquito-catching robot “Dragonfly” and snake-repulsing “Naja.”

  • Infrastructure-focused robots like “Meerkat” and “Panthera 2.0.”

  • Nested reconfigurable robots demonstrating adaptability across environments.

  • A creative exploration of embodied AI, blending research ingenuity with real-world challenges.


3. RMF/ROS Deployments: CAG, CPCA, KABAM Robotics, Punggol Digital District – Panasonic

Panasonic – Fleet Management with RMF

  • Proprietary AI-enhanced RMF integration.

  • Features like congestion detection, human presence recognition in elevators, and prevention of unintended companion following.

  • Practical, operationally relevant fleet management for smart districts.

KABAM Robotics

  • Smart+ RMF Solution integrating multi-robot coordination with PABX and access systems.

  • Security robots tied into surveillance, access control, and facility management.

  • Tour of R&D facilities showcasing innovation in robotics for secure, automated environments.

Changi Airport Group (CAG)

  • Firsthand insights into CAG’s Open-RMF journey.

  • Live demonstrations of RMF features supporting airport operations.

  • Strategic vision for scaling interoperability across one of the world’s busiest airports.

CPCA – Hospitality Robotics Integration

  • Work-in-progress deployment of cleaning and delivery robots in hotel operations.

  • Robots integrated with lifts and automated doors via RMF dashboard.

  • Future vision: hotel staff requesting ad hoc robot tasks via StayPlease app.

  • Demonstrations of robots performing floor cleaning, restaurant bussing, and seamless interaction with smart infrastructure.

RIC-AP Summit Tour 2025: Key Takeaways

  • Manufacturing track: ROS is powering industrial transformation, bridging academia and industry.

  • Healthcare track: Open-RMF is operationalised in hospitals, enhancing patient care and efficiency.

  • Smart infrastructure track: Airports, hotels, and digital districts are adopting RMF for multi-robot orchestration.

The tours underscored a powerful message: Singapore is not just hosting conversations about robotics—it is living them. From labs to live deployments, the RIC-AP Summit tours demonstrated how open-source robotics is shaping industries, communities, and everyday life.

[WWW] https://rosindustrial.org/news/2026/1/6/roscon-2025-amp-ric-ap-summit-2025-blog-series-singapores-defining-week-for-open-source-robotics

ROS Discourse General: High frequency log persistence on Jetson Orin (Rosbag alternative?)

Hi everyone,

My team has been working on a storage engine specifically optimized for the Jetson/Orin architecture to handle high bandwidth sensor streams (Lidar/Cameras) that tend to choke rosbag record or mcap writing at the edge.

The main architectural difference is that we bypass the kernel page cache and stream directly to NVMe using custom drivers. We are seeing sustained writes of ~1GB/s with <10us latency on Orin AGX, even ensuring persistence during power cuts (no RAM buffer loss).

We are looking for 3-5 teams running ROS 2 on hardware to test a binary adapter we wrote. It exposes a standard ROS 2 subscriber but pipes the data into our crash-proof storage instead of the standard recorder.

If you are hitting bottlenecks with dropped messages at high frequency or struggling with data corruption on power loss, this might solve it.

DM me or reply here and I can send over the binary for aarch64.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/high-frequency-log-persistence-on-jetson-orin-rosbag-alternative/51657

ROS Discourse General: Best practices for thermal camera intrinsics (FLIR A400) in sensor fusion

I’m working with a FLIR A400 thermal camera as part of a sensor-fusion pipeline (thermal + radar / LiDAR).

I just found that, unlike RGB cameras, FLIR does not expose factory intrinsics, and traditional OpenCV checkerboard calibration has proven unreliable due to the limited thermal contrast of standard targets.

I wanted to start a discussion on what practitioners typically do in this case:

  • Using FOV-derived pinhole intrinsics (fx, fy from datasheet FOV; see the sketch after this list)
  • Optimizing intrinsics during downstream tasks (SLAM / NeRF / reconstruction)
  • Avoiding explicit intrinsics and relying on extrinsics only
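
For the FOV-derived option, the pinhole focal lengths follow directly from the image size and the datasheet field of view. A minimal sketch; the resolution and FOV numbers here are illustrative placeholders, not the A400’s actual datasheet values:

import math

def intrinsics_from_fov(width, height, hfov_deg, vfov_deg):
    """Approximate pinhole intrinsics from image size and datasheet FOV."""
    fx = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    fy = (height / 2.0) / math.tan(math.radians(vfov_deg) / 2.0)
    cx, cy = width / 2.0, height / 2.0  # assume principal point at image center
    return fx, fy, cx, cy

# Illustrative values only -- substitute your camera's datasheet numbers.
print(intrinsics_from_fov(640, 480, 24.0, 18.0))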

I’m especially interested in what has worked in real robotic systems rather than textbook calibration.

Looking forward to hearing how others approach this.

5 posts - 5 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/best-practices-for-thermal-camera-intrinsics-flir-a400-in-sensor-fusion/51651

ROS Discourse General: Importing PiPER URDF into Isaac_Sim

Importing PiPER URDF into Isaac_Sim

Preface

With the continuous development of robotics technology, simulation platforms play an increasingly important role in the research, development and testing of robotic arms. As a high-performance simulation tool launched by NVIDIA, Isaac Sim helps developers efficiently model, simulate and verify algorithms for robotic arms. This article details how to import the URDF model of the PiPER robotic arm into Isaac Sim and perform the relevant configuration and operations, providing a reference for subsequent development and applications.

Tags

PiPER robotic arm, Isaac Sim

Repositories

Environment Configuration

  • Operating System: Ubuntu 24.04
  • ROS Version: ROS 2 Jazzy
  • Graphics Card: 5090

Install Graphics Card Driver

sudo apt update
sudo apt upgrade
sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt update
sudo ubuntu-drivers autoinstall
#reboot
reboot

After rebooting, you can use the following command to verify if the driver is installed successfully:

nvidia-smi

Install Isaac Sim

Installation Method: Installation — Isaac Sim Documentation

After downloading from the link above, unzip it as follows:

cd ~/Downloads
unzip "isaac-sim-standalone-5.1.0-linux-x86_64.zip" -d ~/
cd isaac-sim-standalone-5.1.0-linux-x86_64/
./post_install.sh
./isaac-sim.selector.sh

Select isaacsim.ros2.bridge for the ROS Bridge Extension; then click Start to launch:

Once successfully opened, you can prepare to import the URDF model.

Import URDF

Download URDF Model

Download Link: GitHub - agilexrobotics/piper_isaac_sim: piper_isaac_sim

The URDF files and USD for the Piper series will be continuously updated in the future.

After opening Isaac Sim, select File->Import in the upper left corner, then choose the URDF model to import from its actual path:

After a successful import, the robotic arm appears at the center of the scene; you can add a ground plane and increase the brightness:

Click the play (triangle) button on the left to start the simulation; you will notice that the gripper moves on its own. This is because some physical parameters are not defined in the imported URDF and need to be set in Isaac Sim:

The parameter setting method is as follows: open joint1 of the robotic arm and, under Drive->Angular, set Damping to 80 and Stiffness to 400; configure all movable joints in the same way.

These parameters are for reference only.

After setting, start the simulation again, and the robotic arm is successfully imported.
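
If you prefer to script this step instead of setting each joint by hand, something along these lines should work from Isaac Sim’s Script Editor using the USD physics schema. This is a sketch only: the joint prim paths below are hypothetical and must be adjusted to wherever the importer placed the PiPER joints.

import omni.usd
from pxr import UsdPhysics

stage = omni.usd.get_context().get_stage()

# Hypothetical prim paths -- adjust to match your imported robot.
joint_paths = ["/piper/joints/joint{}".format(i) for i in range(1, 7)]

for path in joint_paths:
    prim = stage.GetPrimAtPath(path)
    if not prim.IsValid():
        continue
    # Revolute joints use the "angular" drive instance.
    drive = UsdPhysics.DriveAPI.Apply(prim, "angular")
    drive.CreateDampingAttr().Set(80.0)
    drive.CreateStiffnessAttr().Set(400.0)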

Add Camera

Right-click in the blank space and select Create->Camera to create a new camera perspective:

After creation, adjust the camera perspective under Property->Transform; then, under Visual->Visibility, select invisible to hide the camera:

Right-click in the blank space and select Create->Visual Scripting->Action Graph to create an Action Graph that publishes the camera perspective via ROS 2:

The content in the Action Graph is shown below:

After connecting the modules, you need to set some parameters:

For Isaac Create Render Product, select the newly created camera perspective for Camera Prim:

For ROS2 Camera Helper, you can set the frame id and topicName of the camera topic:

Press Ctrl+S to save the USD model, and the import of the Piper USD model is completed.
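
To confirm the camera is actually publishing, it can help to check from a sourced ROS 2 terminal while the simulation is running. The topic name below is an assumption; use whatever topicName you configured in ROS2 Camera Helper:

ros2 topic list
ros2 topic hz /camera/image_raw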

This article details the complete process of importing the PiPER robotic arm URDF model into the Isaac Sim environment, including environment configuration, model import, physical parameter setting, and camera perspective creation and ROS2 topic publishing. Through these steps, developers can quickly realize visualization and interaction of the PiPER robotic arm in the simulation environment, laying a solid foundation for subsequent algorithm development and system integration. If you encounter problems during actual operation, you can refer to relevant official documents or community resources for further study and communication.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/importing-piper-urdf-into-isaac-sim/51526

ROS Discourse General: ROS 2 Kilted on macOS Apple Silicon — TurtleBot4 + Navigation Stack Working End-to-End

I’d like to share the third major milestone in my ongoing work to run ROS 2 natively on macOS Apple Silicon.

At this point, TurtleBot4 is fully functional on ROS 2 Kilted, tested end-to-end with the complete mobile robotics stack:

  • Gazebo Ionic

  • slam_toolbox

  • Navigation2

  • ros2_control (gz_ros2_control)

This setup validates the full mobile robotics pipeline on macOS: simulation, control, SLAM, and autonomous navigation.

What is working reliably

  • SLAM + Navigation2 running stably on Apple Silicon

  • gz_ros2_control integrated and functioning correctly with TurtleBot4

  • Clean integration with ROS 2 Kilted (much closer to upstream than Humble)

  • No runtime hacks, manual relinking, or environment-variable workarounds

After successfully validating Gazebo, MoveIt 2, ros2_control, and now Navigation2, I’ve published the Kilted branch publicly:

ROS 2 macOS (Kilted):
https://github.com/idesign0/ros2_macOS/tree/kilted

To make TurtleBot4 work correctly with Gazebo Ionic + ROS 2 Kilted, some targeted changes were required compared to the current official simulator instructions.
Those changes are now packaged and ready to test here:

TurtleBot4 (Gazebo Ionic + Kilted, macOS):
https://github.com/idesign0/ROS2_Humble/tree/kilted-mac/turtlebot4

Reference documentation (for comparison):
https://turtlebot.github.io/turtlebot4-user-manual/software/turtlebot4_simulator.html


Toolchain improvements (Kilted, macOS)

One of the biggest improvements over my earlier Humble-on-macOS setup is the maturity of the Kilted toolchain, particularly around RPATH handling and merged installs.

The updated toolchain provides:

  • Proper RPATH configuration for merged installs

  • Reliable runtime library resolution using @loader_path

  • Clean handling of external dependencies (e.g., Boost)

  • No reliance on DYLD_LIBRARY_PATH or manual relinking

As a result, I’ve seen no dynamic library loading issues at runtime, even when launching large stacks like Navigation2 and MoveIt 2.
Startup behavior is noticeably smoother, lifecycle transitions are clean, and systems come up consistently, avoiding issues that were common for me on macOS Humble.

At this point, the toolchain feels mature enough to support Gazebo, MoveIt 2, ros2_control, and Nav2 under a single, consistent build setup.

Toolchain reference:
https://github.com/idesign0/ros2_macOS/blob/kilted/cmake/toolchain.cmake
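
For readers less familiar with the macOS specifics, the RPATH behavior described above corresponds roughly to CMake settings like the following. This is a simplified sketch of the general pattern, not an excerpt from the linked toolchain file:

# Embed install-relative RPATHs so binaries resolve libraries via @loader_path
set(CMAKE_MACOSX_RPATH ON)
set(CMAKE_INSTALL_RPATH "@loader_path;@loader_path/../lib")
set(CMAKE_INSTALL_RPATH_USE_LINK_PATH ON)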


Demo video

A short demo of TurtleBot4 running SLAM and Navigation2 on macOS Apple Silicon (Gazebo Ionic + ROS 2 Kilted) is available here:

https://www.linkedin.com/posts/classy29_ros2-ros-navigation2-activity-7409757575837036544-vJtq


This feels like the point where ROS 2 on macOS Apple Silicon moves from experimental to genuinely usable for larger systems.

If you are running ROS 2 on Apple Silicon, feedback and testing are very welcome.
I’ll also be adding Kilted-specific setup and usage instructions to the README soon.

More demos and upstream-related work to follow.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-kilted-on-macos-apple-silicon-turtlebot4-navigation-stack-working-end-to-end/51522

ROS Discourse General: How to get a type of the message having its string type name in C++?

As an input I have a topic name. I want to get the message type of this topic.

  1. I know how to find the string type name using node->get_topic_names_and_types();. The resulting string type name is something like rcl_interfaces/msg/ParameterEvent (just for example).
  2. Is there any way in C++ to get an instance of rcl_interfaces::msg::ParameterEvent from that string, or does this run up against C++ being a statically typed language?

In Python I found the following way of doing that:

import argparse

from ros2cli.node.strategy import NodeStrategy
import rosidl_runtime_py.utilities

requested_topic = "/parameter_events"  # the topic whose type we want to resolve

requested_type_str = None
with NodeStrategy(argparse.Namespace()) as node:
    for topic, topic_types in node.daemon_node.get_topic_names_and_types():
        if topic == requested_topic:
            requested_type_str = topic_types[0]
            break

# Resolve the string type name to the actual message class
topic_type = rosidl_runtime_py.utilities.get_message(requested_type_str)
print(topic_type)  # prints something like <class 'my_custom_package.msg._my_custom_message.MyCustomMessage'>
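
For the C++ side: you cannot materialize a compile-time class from a runtime string, but rclcpp provides generic (serialized) subscriptions that accept the type name as a string, and libraries such as ros_babel_fish build full runtime introspection on top of that. A minimal sketch of the generic-subscription route, with the topic name chosen purely for illustration:

#include <memory>
#include <string>
#include <rclcpp/rclcpp.hpp>

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  auto node = std::make_shared<rclcpp::Node>("type_probe");

  // Look up the type name for a topic at runtime.
  // (In real code, allow discovery time or retry if the topic is not yet known.)
  const std::string topic = "/parameter_events";  // example topic
  const auto names_and_types = node->get_topic_names_and_types();
  const std::string type_name = names_and_types.at(topic).front();
  RCLCPP_INFO(node->get_logger(), "%s has type %s", topic.c_str(), type_name.c_str());

  // Subscribe without a compile-time type; messages arrive serialized.
  auto sub = node->create_generic_subscription(
    topic, type_name, rclcpp::QoS(10),
    [&](std::shared_ptr<rclcpp::SerializedMessage> msg) {
      RCLCPP_INFO(node->get_logger(), "received %zu bytes", msg->size());
    });

  rclcpp::spin(node);
  rclcpp::shutdown();
  return 0;
}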

3 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/how-to-get-a-type-of-the-message-having-its-string-type-name-in-c/51516

ROS Discourse General: UrdfArchitect: An AI-powered visual editor to build robot models without manual XML coding

Hi ROS Community,

I’m excited to share a new tool I’ve been developing called UrdfArchitect.

UrdfArchitect is a state-of-the-art, web-based visual environment engineered for the seamless creation, manipulation, and export of Unified Robot Description Format (URDF) models. By abstracting the complexities of raw XML authoring into an intuitive graphical interface, it empowers roboticists to focus on design and innovation.

This platform orchestrates the entire robotic modeling lifecycle—from kinematic skeleton definition to high-fidelity geometric detailing and precise hardware specification. Enhanced by Generative AI, UrdfArchitect accelerates prototyping and ensures compatibility with industry-standard simulation ecosystems like MuJoCo.

Live demo: https://urdf.d-robotics.cc/

GitHub Link: https://github.com/OpenLegged/URDF-Architect

Core Capabilities

  • :bone: Multi-Mode Design: Seamlessly switch between Skeleton (kinematics), Detail (meshes/collisions), and Hardware (actuator/transmission) design phases.

  • :artist_palette: Immersive 3D Workspace: Real-time, high-fidelity visualization powered by Three.js. Includes professional transformation gizmos and instant visual analytics for joint axes and frames.

  • :robot: AI-Augmented Engineering: A natural language interface (OpenAI/DeepSeek) to automate complex tasks—generate entire quadruped platforms or integrate sensors using simple text prompts.

  • :inbox_tray: Seamless Interoperability:

    • Import: One-click ZIP ingest of URDFs and meshes.

    • Export: Production-ready packages including standard URDFs, automated BOM (CSV), and pre-configured MuJoCo XML.

  • :gear: Built-in Motor Library: Instant access to industry-standard actuators from Unitree (Go1/A1) and RobStride, with easy custom extensions.

I’d love to hear your feedback or feature requests! If you find it useful, feel free to give it a :star: on GitHub or contribute to the development.

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/urdfarchitect-a-ai-powered-visual-editor-to-build-robot-models-without-manual-xml-coding/51484

ROS Industrial: ROSCon 2025 & RIC-AP Summit 2025 Blog Series: Singapore’s Defining Week for Open-Source Robotics

On 30 October, the focus shifted from global to regional at the ROS-Industrial Consortium Asia Pacific Summit, held in conjunction with SWITCH and ROSCon.

ROSCon focused more on the global community, while the RIC-AP Summit was about the growth and contributions within Asia Pacific: deployments, testbeds, standards, talent, and industry partnerships.

1. Launch of ELEVATE @ BCA Braddell Campus

A headline moment of the Annual Summit was the showcase of ELEVATE @ BCA Braddell Campus—an Open-RMF sandbox designed for robot OEMs, system integrators and even end users to validate interoperable solutions, test technical compliance and adopt standard practices for robotics within the built environment space (a-star.edu.sg).

Key aspects:

  • Cyber-physical test environment to validate how robots interact with lifts, doors, building management systems and shared infrastructure using Open-RMF.

  • Open to end-users, OEMs, system integrators and startups to trial solutions in real settings before scaling into commercial deployments.

  • Backed by a multi-agency partnership (A*STAR ARTC, BCA and S&TPPO) and seeded with leading industry participants including robotics and infrastructure providers.

ELEVATE positions Singapore as a reference site for Open-RMF adoption and conformance, giving both local and international players a low-friction path to test, integrate and prove interoperable systems.

During the summit, eight companies committed to using ELEVATE for testing to advance Open-RMF. The companies that signed a Letter of Support (LoS) are:

  1. Black Sesame Technologies

  2. Delta Electronics 

  3. HOPE Technik

  4. KABAM Robotics

  5. Lionsbot

  6. Megazo Technologies

  7. Panasonic

  8. SIMPPLE

[1] ELEVATE stands for EvaLuatE, VAlidate, Test Environment.

Photographs used with permission from Singapore Week of Innovation and Technology (SWITCH) www.switchsg.org

2. Expansion of the ROS-Industrial Train and Place Programme with SGInnovate

The Annual Summit also marked the announcement of a partnership between ROS-Industrial Consortium Asia Pacific (led by ARTC) and SGInnovate to leverage the Deep Tech Central platform to accelerate ROS and robotics talent placement into industry roles.

Key Highlights:

  • Future-Ready Robotics Workforce: ROS-Industrial Consortium Asia Pacific (ROS-I AP), led by A*STAR ARTC, partners with SGInnovate to strengthen Singapore’s robotics talent pipeline.

  • Industry Placement Focus: The new partnership emphasizes connecting talent directly with robotics companies such as AiTreat, Fabrica AI, Griffin Labs, Hivebotics, Vilota, and many others.

  • Hands-On Experience: Trainees gain practical deployment exposure, addressing talent gaps in robotics and embodied AI.

  • Data-Driven Insights: Deep Tech Central provides analytics to understand talent needs and industry demand, strengthening Singapore’s robotics ecosystem and contributing globally.

Through this collaboration, we connect talent with real deployment opportunities, strengthening Singapore’s position in robotics and deep tech.


What We Shared at RIC-AP Summit 2025 @ SWITCH Beyond: Highlights from the Stage

The summit showcased cutting-edge developments in robotics and embodied AI, starting with a keynote on open-source frameworks like ROS 2 and Open-RMF driving adaptive systems, with Yadu from Intrinsic setting the stage for this transformative conversation.

Global leaders shared the state of ROS-Industrial across APAC, America, and Europe, emphasizing collaboration and talent exchange, with insights from Paul Evans (Executive Director, SwRI), Vishnuprasad Prachandabhanu (Consortium Manager, ROS-I Europe at Fraunhofer), and Maria Vergo (Consortium Manager, ROS-I APAC).


Photographs used with permission from Singapore Week of Innovation and Technology (SWITCH) www.switchsg.org

Real-world Open-RMF deployments in airports and hospitals demonstrated its maturity as production infrastructure, while the ELEVATE Sandbox at BCA Braddell Campus was introduced as the national testbed for interoperability.

Panels spotlighted diversity through Women in Robotics and explored multidisciplinary challenges in embodied AI beyond code. It was refreshing to hear from women leaders such as Suchitra Narayan (SGInnovate), Chan Min Ling (HMGICS), Samantha Su (IMDA), and Prof. Malika Meghjani (SUTD), who shared inspiring stories of leadership, innovation, and resilience in robotics.

Photographs used with permission from Singapore Week of Innovation and Technology (SWITCH) www.switchsg.org

Future-facing sessions explored two critical themes: the rise of humanoid robotics and the need for a strong talent pipeline. Prof. Han Boon Siew (Schaeffler) delivered an insightful presentation on innovations in humanoid design, mobility, and interaction, framing their societal impact and strategic opportunities in Asia.

Adding to the excitement, Panasonic’s Duyhinh Nguyen shared their journey with Open-RMF, underscoring growing interest from Japanese companies in interoperability and real-world deployments.

On the talent front, Priscilla (SGInnovate) and Sheila (ROS-I APAC) introduced the ROS-Industrial Train-and-Place Programme, calling on industry partners to collaborate in building a future-ready robotics workforce through Deep Tech Central.


Photographs used with permission from Singapore Week of Innovation and Technology (SWITCH) www.switchsg.org

The RIC-AP Summit 2025 made one thing clear: Asia Pacific is not just participating in the robotics revolution—it is leading it. From the heart of Singapore, the region is building the future of interoperable robotics, where open-source frameworks, industry partnerships, and talent converge to transform industries and societies.

A special thanks to all our speakers and panellists for sharing their insights and driving meaningful conversations that shape the future of robotics.

[WWW] https://rosindustrial.org/news/2025/12/21/roscon-2025-amp-ric-ap-summit-2025-blog-series-singapores-defining-week-for-open-source-robotics

ROS Discourse General: Low-cost ROS2/SLAM educational kit upgraded to Jazzy

Hello,

A while ago I designed, manufactured and made commercially available a low-cost LiDAR robot kit for ROS 2 beginners. This includes Nav2, SLAM, Gazebo and complete step-by-step from-scratch video instructions. Everything is open source.

I’d like to share that the kit’s software has been upgraded to ROS2 Jazzy.

This includes porting all Gazebo simulations. :sweat_droplets: This means that if you are building (or migrating) a differential-drive LiDAR robot for Jazzy or later, you can copy my working Gazebo simulations.

Here are pointers to more information:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/low-cost-ros2-slam-educational-kit-upgraded-to-jazzy/51450

ROS Discourse General: Upcoming a new RMW Implementation RMW_SWIFTDDS by Greenstone

We are excited to announce that a new ROS2 RMW (ROS Middleware) layer implementation based on SWIFT_DDS will be released soon! This integration enables ROS2 developers to leverage the high-performance, safety-certified commercial DDS communication middleware developed by Greenstone Technology Co., Ltd.

This RMW implementation contains one project:

About Greenstone

Greenstone Technology Co., Ltd. (Greenstone) was founded in 2020 by a team with strong roots in Tsinghua University. The company brings together years of research and commercial experience across multiple domains including communications, computer science, intelligent driving, and artificial intelligence. Greenstone is dedicated to building fully autonomous and controllable intelligent driving foundational software platforms through its proprietary core technologies.

For more information, please visit https://www.greenstonesoft.com/en_homepage.

About SWIFT_DDS

1. Remarkable Performance & High reliability

  • A commercial DDS product with 6 years of R&D, verified in projects from OEMs and Tier 1 suppliers

  • Proprietary deterministic execution/communication technology that guarantees real-time data communication

  • Low overhead and high throughput

  • Efficient resource utilization

2. Industry-Leading Safety Guarantees

  • ISO 26262 ASIL-D functional safety product certification

  • Suitable for safety-critical applications in autonomous driving

  • Compliant with MISRA C/C++ coding standards and passes static code analysis in QAC

  • Comprehensive tests (unit, integration, embedded, and fault injection) achieving 100% safety requirements coverage and mandatory MC/DC code coverage, as required by ISO 26262 ASIL-D certification

3. Minimum Dependency

  • No third-party codes included

  • Can be easily customized/extended according to users’ needs

  • Supports a wide range of SoC/OS and MCU/RTOS platforms; easy cross-platform porting

4. Full DDS Standard Compliance and performance-enhancing extensions

  • Full compliance with DDS standard specifications

  • Support for all DDS QoS policies

  • Seamless interoperability with other DDS implementations

  • Supported Customized Features:

    • ZeroCopy: Enables direct data transfer mechanisms to eliminate unnecessary memory copying operations

    • Inter-core communication: Supports direct inter-core communication on heterogeneous chips (Horizon Robotics Journey 6, TDA4, S32G, etc.), such as an Arm A-core with an R-core or an Arm A-core with an M-core

    • UDP_FlowControl: Supports configuring DataWriter transmission and DataReader reception bandwidth at the process level, with flexible options to set either individually or simultaneously—the most restrictive flow limit will take effect

    • PreferTransportOrder: Dynamically selects the optimal communication channel based on the configured channel priority order

    • NetworkPortRangeQoSPolicy: Restricts processes to operate within a specified UDP port range

    • ThreadInstanceConfigQoSPolicy: When enabled, elevates thread resources from the participant level to the process level, reducing the number of threads created for nodes

5. Comprehensive toolchains that further accelerate system integration

  • Developers only need to focus on intelligent driving logic and algorithms. Compatible with common intelligent driving chips, helping fast integration and verification of intelligent driving products

Performance Results

Based on the ROS2 standard performance testing framework (https://github.com/ros2/performance_test), we executed systematic performance evaluations of SWIFT_DDS for both fixed-length array data and variable-length string data.

I. Hardware Configuration

1. CPU: x86_64, Intel(R) Xeon(R) E-2314 CPU, 4 cores/4 threads, 2.80 GHz

2. Memory: 32 GB DDR4 3200 MHz

3. Network Card: NetXtreme BCM5720 Gigabit Ethernet PCIe

II. System Configuration

1. Operating System: Ubuntu 24.04 LTS, kernel 6.8.0-87-generic, x86_64

2. Compiler: GCC 13.3.0

The tests covered key metrics including latency, throughput, and CPU utilization. The results show that SWIFT_DDS delivers excellent performance across all tests. Detailed data and testing methodologies are documented in the attached test reports (https://greenstonesoft.github.io/performance_test_report/swift_array_result.html and https://greenstonesoft.github.io/performance_test_report/swift_string_result.html), and we welcome the community to review and provide feedback.

The Merge Request for the RMW_SWIFTDDS binary package has now been submitted. Once the implementation is approved and officially released, we will post an update announcement here. Stay tuned for further updates! Thanks.

Below are some test results excerpted from the attached test reports. Latency results are shown for different transport channels: intra-process communication using local transport (INTRA), inter-process communication using shared-memory transport (SHMEM), and inter-process communication using zero-copy transport (Zero Copy), for fixed-length (array) packets of sizes 1k, 4k, 16k, 64k, 256k, 1M, and 4M, with ROS_DISABLE_LOANED_MESSAGES=0.
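
For anyone who wants to try it once the binaries are released, switching RMW implementations in ROS 2 is done through an environment variable. The package name below is a placeholder guess until the official release announcement confirms the real one:

# Hypothetical package name -- check the release announcement for the actual one
export RMW_IMPLEMENTATION=rmw_swiftdds_cpp
ros2 doctor --report   # the middleware section should show the active RMW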

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/upcoming-a-new-rmw-implementation-rmw-swiftdds-by-greenstone/51447

Wiki: TullyFoote/TestPlanetRSS (last edited 2014-09-25 22:49:53 by TullyFoote)