RTX Lidar Sensors#

Isaac Sim RTX (ray-traced) Lidar supports both solid-state and rotating Lidar configurations via a JSON config file. Each RTX sensor must be attached to its own viewport or Render Product to simulate properly.

Important

Docking windows in the Isaac Sim UI while an RTX Lidar simulation is running will likely lead to a crash. Pause the simulation before re-docking windows.

Learning Objectives#

In this example, you will:

  • Be briefly introduced to how to use RTX Lidar sensors.

  • Create an RTX Lidar sensor and attach it to a Turtlebot3 robot.

  • Publish sensor data to ROS as LaserScan and PointCloud2 messages.

  • Use the menu shortcut to create RTX Lidar sensor publishers.

  • Put it all together and visualize multiple sensors in RViz.

Getting Started#

Note

Make sure to source your ROS installation from the terminal before running Isaac Sim. If sourcing ROS is part of your bashrc, Isaac Sim can be run directly.

Prerequisites

Adding an RTX Lidar Publisher#

  1. Add a Lidar sensor to the robot. Go to Create > Isaac > Sensors > RTX Lidar > Rotating. Rename the prim from Rotating to Lidar.

  2. To place the synthetic Lidar sensor at the same place as the robot’s Lidar unit, drag the Lidar prim under /World/turtlebot3_burger/base_scan. Zero out any displacement in the Transform fields inside the Property tab. Verify that the Lidar prim now overlaps with the scanning unit of the robot.

  3. Connect the ROS bridge with the sensor output via Omnigraph Nodes. Open a visual scripting editor by going to Window > Visual Scripting > Action Graph. Add the following nodes to this graph:

    1. On Playback Tick: This is the node responsible for triggering all the other nodes after Play is pressed.

    2. Isaac Create Render Product: In the input camera target prim, select the Lidar prim created in step 1.

    3. ROS1 RTX Lidar Helper: This node publishes the laser scan message from the RTX Lidar. For the input render product, connect the output of the Isaac Create Render Product node above. Change the frameId to Lidar to match the name of the Lidar prim.

    4. To also publish point cloud data, add a second ROS1 RTX Lidar Helper node. Under input type, select point_cloud and change the topic name to point_cloud. As before, connect the input render product to the output of the Isaac Create Render Product node, and change the frameId to Lidar to match the name of the Lidar prim.

      Action Graph Layout

4. After the graph has been set up correctly, press Play to begin the simulation. Verify that the RTX Lidar is publishing LaserScan and PointCloud2 messages and that they can be visualized in RViz.

For RViz visualization:

  1. Run RViz (rosrun rviz rviz) in a ROS-sourced terminal.

  2. The fixed frame name in Isaac Sim for the RTX Lidar is set to Lidar. Update the Fixed Frame under the Global Options tab in RViz accordingly.

  3. Add LaserScan visualization and set topic to /scan.

  4. Add PointCloud2 visualization and set topic to /point_cloud.

    RViz Visualization for RTX Lidar
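Under the hood, RViz's LaserScan display converts each range reading into a Cartesian point using the scan's angular parameters. The sketch below mirrors that polar-to-Cartesian conversion (a simplified illustration; in practice the angle values come from the fields of the LaserScan message published on /scan):

```python
import math

def laser_scan_to_points(ranges, angle_min, angle_increment):
    """Convert LaserScan range readings (polar) to 2D Cartesian points,
    the same transformation RViz's LaserScan display performs."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            continue  # invalid readings are skipped, as in RViz
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Example: three beams at -90, 0, and +90 degrees, each returning 2 m.
pts = laser_scan_to_points([2.0, 2.0, 2.0], -math.pi / 2, math.pi / 2)
```

Each resulting point is expressed in the frame named by the message's frameId, which is why the Fixed Frame in RViz must match the Lidar prim name.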

Multiple Sensors in RViz#

To display multiple sensors in RViz, a few settings are needed to make sure all the messages are synced and timestamped correctly.

Simulation Timestamp

Use the Isaac Read Simulation Time node to feed the simulation timestamp into all of the publishing nodes.
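ROS message headers carry time as separate seconds and nanoseconds fields, so the floating-point simulation time must be split before it can stamp a message. A minimal sketch of that split (a hypothetical helper for illustration; the OmniGraph publisher nodes handle this internally):

```python
def sim_time_to_stamp(sim_time_seconds):
    """Split a floating-point simulation time into the (secs, nsecs)
    pair used by a ROS Header/Time field."""
    secs = int(sim_time_seconds)
    nsecs = int(round((sim_time_seconds - secs) * 1e9))
    # Guard against rounding pushing nsecs up to a full second.
    if nsecs >= 1_000_000_000:
        secs += 1
        nsecs -= 1_000_000_000
    return secs, nsecs

stamp = sim_time_to_stamp(12.25)
```

Because every publisher is stamped from the same simulation clock, RViz can interpolate TF and sensor data consistently across all streams.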

ROS clock

To publish the simulation time to the ROS clock topic, you can set up the graph as shown in the Running ROS Clock Publisher tutorial:

ROS Clock publisher

frameId and topicName

  1. To visualize all of the sensors as well as the TF tree at once inside RViz, the frameId and topicName values must follow certain conventions for RViz to recognize them. The table below describes these conventions. For the multi-sensor example below, consult the USD asset Isaac/Samples/ROS/Scenario/turtlebot_tutorial.usd.

    | Source       | frameId                   | nodeNamespace             | topicName      | type        |
    |--------------|---------------------------|---------------------------|----------------|-------------|
    | Camera RGB   | (device_name):(data_type) | (device_name)/(data_type) | image_raw      | rgb         |
    | Camera Depth | (device_name):(data_type) | (device_name)/(data_type) | image_rect_raw | depth       |
    | Lidar        | Lidar                     |                           | scan           | laser_scan  |
    | Lidar        | Lidar                     |                           | point_cloud    | point_cloud |
    | TF           |                           |                           | tf             | tf          |
  2. To see the RViz image below, make sure the simulation is playing. In a ROS-sourced terminal, open RViz with the provided configuration using the command: rviz -d <noetic_ws>/src/isaac_tutorials/rviz/camera_lidar.rviz. After the RViz window finishes loading, you can enable and disable the sensor streams in the Displays panel on the left-hand side.

Camera Lidar sensors in RViz
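The camera naming conventions from the table above can be sketched as a small helper (illustrative only; this is a plain string convention, not an Isaac Sim API, and the device name below is hypothetical):

```python
def camera_ids(device_name, data_type):
    """Compose the frameId and nodeNamespace for a camera stream using
    the (device_name):(data_type) and (device_name)/(data_type)
    conventions from the table above."""
    frame_id = f"{device_name}:{data_type}"
    node_namespace = f"{device_name}/{data_type}"
    return frame_id, node_namespace

# Hypothetical device name, for illustration only.
frame_id, ns = camera_ids("camera", "rgb")
# The full topic is the namespace joined with the topicName.
topic = f"{ns}/image_raw"
```

Keeping every publisher consistent with this convention is what lets a single RViz configuration find all of the streams and the TF tree without per-sensor remapping.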

Adding an RTX Lidar in Standalone#

  1. In one terminal source your ROS workspace and run roscore.

  2. In a new terminal with your ROS environment sourced, run rviz -d <noetic_ws>/src/isaac_tutorials/rviz/rtx_lidar.rviz to start RViz and show the Lidar point cloud.

  3. Run the sample script:

    ./python.sh standalone_examples/api/omni.isaac.ros_bridge/rtx_lidar.py
    
  4. After the scene finishes loading, verify that you see the point cloud for the rotating Lidar sensor being simulated.

RTX Lidar Standalone Script Explained#

While most of the sample code is fairly generic, there are a few specific pieces needed to create and simulate the sensor.

  1. Create the RTX Lidar Sensor.

    _, sensor = omni.kit.commands.execute(
        "IsaacSensorCreateRtxLidar",
        path="/sensor",
        parent=None,
        config="Example_Rotary",
        translation=(0, 0, 1.0),
        orientation=Gf.Quatd(1.0, 0.0, 0.0, 0.0),
    )
    

    Here Example_Rotary defines the configuration for the Lidar sensor. Two generic configuration files (in addition to manufacturer/model-specific configurations) are provided in extsbuild/omni.sensors.nv.common/data/lidar/: Example_Rotary.json and Example_Solid_State.json.

  2. To switch the Lidar to the example solid state configuration replace config="Example_Rotary", with config="Example_Solid_State".

  3. Create a render product and attach this sensor to it:

    hydra_texture = rep.create.render_product(sensor.GetPath(), [1, 1], name="Isaac")
    
  4. Create the post process pipeline that takes the rendered RTX Lidar data and publishes it to ROS:

    writer = rep.writers.get("RtxLidar" + "ROS1PublishPointCloud")
    writer.initialize(topicName="point_cloud", frameId="sim_lidar")
    writer.attach([hydra_texture])
    

Note

You can specify an optional attributes={…} dictionary when calling activate_node_template to set node specific parameters. See the API Documentation for complete usage information.

Summary#

This tutorial covered:

  • Adding an RTX Lidar sensor.

  • Adding RTX Lidar and PointCloud ROS1 publisher nodes.

  • Displaying multiple sensors in RViz.

Next Steps#

Continue on to the next tutorial in our ROS Tutorials series, Transform Trees and Odometry, to learn how to add global and relative transforms to a TF tree.

Further Learning#

Explore the inner workings of RTX Lidar sensors by learning How They Work, the RTX Lidar Nodes that use them, and how to get RTX Lidar Synthetic Data.