RTX Radar Sensor#
RTX Radar sensors are simulated at render time on the GPU with RTX hardware. Their results are then copied to the RtxSensorCpu rendering buffer for use.
How They Work#
Before an RTX Radar can render, there must be a camera asset on the stage with its sensor attributes set appropriately; the IsaacSensorCreateRtxRadar command used below creates and configures such a camera for you. The render product attached to that camera will contain the RtxSensorCpu result.
The following code snippet, when run in the Script Editor, will:

- create an RTX Radar sensor in the scene with the IsaacSensorCreateRtxRadar command
- render its results into a Hydra texture
- create a point cloud from the results using the RtxRadarDebugDrawPointCloud writer
- create a RtxSensorCpuIsaacCreateRTXRadarScanBuffer annotator that can read the data from the radar
import omni.kit.commands
from pxr import Gf
import omni.replicator.core as rep

radar_config = "Example"

# 1. Create The Camera
_, sensor = omni.kit.commands.execute(
    "IsaacSensorCreateRtxRadar",
    path="/sensor",
    parent=None,
    config=radar_config,
    translation=(0, 0, 1.0),
    orientation=Gf.Quatd(1, 0, 0, 0),
)

# 2. Create and Attach a render product to the camera
render_product = rep.create.render_product(sensor.GetPath(), [1, 1]).path
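
# 3. Create an annotator that reads the radar scan data from the render product.
#    The annotator name matches the list above; attaching it by render product path
#    follows the usual omni.replicator annotator pattern.
annotator = rep.AnnotatorRegistry.get_annotator("RtxSensorCpuIsaacCreateRTXRadarScanBuffer")
annotator.attach(render_product)
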
# 4. Create a Replicator Writer that "writes" points into the scene for debug viewing
writer = rep.writers.get("RtxRadarDebugDrawPointCloud")
writer.attach(render_product)
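The command in step 1 creates a camera prim and sets the sensor attributes on it. A quick way to see what was configured is to list the attributes of the sensor prim returned by the command; this is a minimal sketch using the standard USD prim API, run after the snippet above:
for attr in sensor.GetAttributes():
    print(attr.GetName(), attr.Get())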
Note
To observe a result similar to the example above, run the RTX Radar standalone example: ./python.sh standalone_examples/api/omni.isaac.debug_draw/rtx_radar.py
The Action Graph created by these commands hooks up all of the nodes so that everything runs when you press Play. You may want to explore an overview of the Action Graph it created, as well as details on the nodes specifically created for RTX Radar.
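For example, once the annotator from step 3 is attached, you can start the timeline and read one frame of radar data from the Script Editor. This is a minimal sketch; it assumes the snippet above has already run, and the exact layout of the returned data depends on the annotator:
import asyncio
import omni.kit.app
import omni.timeline

async def print_radar_data(num_frames=10):
    # Start the timeline so the sensor begins rendering, then wait a few frames.
    omni.timeline.get_timeline_interface().play()
    app = omni.kit.app.get_app()
    for _ in range(num_frames):
        await app.next_update_async()
    # Read whatever the RtxSensorCpuIsaacCreateRTXRadarScanBuffer annotator collected.
    print(annotator.get_data())

asyncio.ensure_future(print_radar_data())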
The most important parameter to the IsaacSensorCreateRtxRadar command is config, which names the radar configuration to load (see Radar Config Files below).
Radar Config Files#
Currently, only one radar configuration is supported in Isaac Sim: the omni.sensors WpmDmatApproxRadar model. Its configuration parameters are described in the omni.sensors documentation.
Sensor Materials#
The material system for RTX Lidar allows content creators to assign sensor material types to partial material prim names on a USD stage. These materials make the sensor beams react differently when they hit surfaces.
Standalone Examples#
For an example of creating an RTX Radar sensor, see:
./python.sh standalone_examples/api/omni.isaac.debug_draw/rtx_radar.py
Note
See the Isaac Sim Conventions documentation for a complete list of Omniverse Isaac Sim conventions.