
OV Sensors Summary Notes

1. Overall Overview of Sensors

https://docs.isaacsim.omniverse.nvidia.com/latest/sensors/index.html

Isaac Sim’s sensor system is organized into six categories and supports simulation of physics-based, RTX-based, and camera-based sensors, including models of sensors used on real robots. Each category serves the following purposes.

| Sensor Category | Description | Primary Uses |
| --- | --- | --- |
| Camera Sensors | Simulation of visual sensors such as RGB/Depth | Training data collection, Sim-to-Real, SLAM |
| RTX Sensors | RTX-based lidar/radar sensors | Precise detection of distance, velocity, and reflectance |
| Physics-Based Sensors | Sensors based on the physics engine (IMU, contact, etc.) | Robot motion, force, and collision detection |
| PhysX SDK Sensors | Lightweight sensors based on PhysX SDK raycast | Simple distance sensing, performance optimization |
| Camera and Depth Sensors | Real camera models provided as USD | Digital twins of RealSense, ZED, Orbbec, etc. |
| Non-Visual Sensors | USD assets of physics-based sensors | IMU, proximity, contact, etc. |

2. Camera Sensors

https://docs.isaacsim.omniverse.nvidia.com/latest/sensors/isaacsim_sensors_camera.html

The basic RGB/Depth cameras used in Isaac Sim can be configured to behave like real lenses and support a variety of annotator outputs.

What are Annotators?

In Isaac Sim, an annotator automatically generates additional ground-truth data streams beyond the images produced by a sensor.

This data is extremely useful for machine learning training, robot perception, and validation.

Major Annotator Types and Descriptions:

| Annotator | Description | Example Uses |
| --- | --- | --- |
| RGB | Standard color image | Image classification, recognition |
| Depth | Per-pixel distance map | Range perception, SLAM |
| Normals | Surface normal vectors | Pose estimation, material analysis |
| Motion Vectors | Inter-frame motion vectors | Object tracking, motion blur |
| Instance ID | Unique ID per object | Instance segmentation |
| Semantic ID | Class ID (e.g., person=1, box=2) | Semantic segmentation |
| 2D/3D BBox | Automatic bounding boxes | Object detection |
| Optical Flow | Per-pixel velocity vectors | Robot vision, autonomous driving prediction |

In Isaac Sim, this data can be output as .png, .json, .npy, ROS messages, etc., and can be controlled via Python API or Action Graph.
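The ID annotators emit per-pixel integer buffers. A minimal stdlib sketch of consuming such a buffer (the 4×4 array and the ID-to-class mapping are hypothetical stand-ins for annotator output, not the Isaac Sim API):

```python
from collections import Counter

# Hypothetical 4x4 semantic-ID buffer (person=1, box=2, background=0),
# standing in for the integer image a Semantic ID annotator would emit.
semantic_ids = [
    [0, 0, 1, 1],
    [0, 2, 1, 1],
    [2, 2, 0, 0],
    [2, 0, 0, 0],
]

id_to_class = {0: "background", 1: "person", 2: "box"}

# Count pixels per class across the whole buffer.
counts = Counter(pix for row in semantic_ids for pix in row)
per_class = {id_to_class[i]: n for i, n in counts.items()}
print(per_class)  # {'background': 8, 'person': 4, 'box': 4}
```

The same pattern applies to instance-ID buffers, with object IDs in place of class IDs.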

Supported Features:

  • Configure focal length, FOV, resolution, sensor size
  • Apply lens distortion models (pinhole, fisheye, etc.)
  • Integration with render products for rendering
  • Annotator support: RGB, Depth, Normals, Motion Vectors, Instance Segmentation, etc.
  • Creation and control via Python or GUI
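Focal length, sensor size, and FOV are coupled by standard pinhole-camera geometry, which is the relationship behind the configuration options above. A small sketch (the 24 mm / 36 mm values are hypothetical, not Isaac Sim defaults):

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal field of view of an ideal pinhole camera, in degrees."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Hypothetical values: 24 mm focal length on a 36 mm-wide sensor.
fov = horizontal_fov_deg(24.0, 36.0)
print(round(fov, 1))  # 73.7
```

Setting any two of focal length, sensor width, and FOV fixes the third, so a camera prim only needs two of them configured.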

Example Uses:

  • Simulation for object recognition data collection
  • Synthetic data generation for deep-learning-based vision training
  • RGB-D data output for ROS/SLAM integration

3. RTX Sensors

https://docs.isaacsim.omniverse.nvidia.com/latest/sensors/isaacsim_sensors_rtx.html

A family of high-precision sensors for distance/velocity detection using NVIDIA RTX acceleration. It comprises lidar, radar, sensor annotators, and sensor materials.

Subcomponents:

  • RTX Lidar Sensor: Outputs point clouds; adjustable rotation angle, FOV, and ray density
  • RTX Radar Sensor: Extracts distance + velocity (simulates Doppler effect)
  • RTX Sensor Annotators: Output properties for visualization
  • Visual/Non-Visual Materials: Materials for sensor response

Example Uses:

  • Autonomous vehicles, AMRs, and robot ranging
  • Comparing detection capabilities across sensor types
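A lidar point cloud is just the per-beam ranges pushed through each beam's azimuth/elevation angles. A stdlib sketch of that conversion (beam angles and ranges are hypothetical; this is not the Isaac Sim point-cloud API):

```python
import math

def rays_to_points(ranges, azimuths_deg, elevation_deg=0.0):
    """Convert per-beam ranges and angles into 3D points in the lidar frame."""
    el = math.radians(elevation_deg)
    pts = []
    for r, az_deg in zip(ranges, azimuths_deg):
        az = math.radians(az_deg)
        pts.append((
            r * math.cos(el) * math.cos(az),  # forward
            r * math.cos(el) * math.sin(az),  # left
            r * math.sin(el),                 # up
        ))
    return pts

# Hypothetical scan: three beams at 0°, 90°, 180° azimuth, flat elevation.
points = rays_to_points([2.0, 1.0, 3.0], [0.0, 90.0, 180.0])
```

The sensor's FOV and ray density settings determine how many (range, angle) pairs a scan produces.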

Visual Materials (Sensor Materials for Visual Response)

Simulate the visual properties of object surfaces that RTX sensors detect (e.g., reflectance, absorption).

These affect both rendering and sensor response; a total of 21 fixed material types are provided.

| Index | Material Type | Description (Expected) |
| --- | --- | --- |
| 0 | Default | Default material |
| 1 | AsphaltStandard | Standard asphalt road |
| 2 | AsphaltWeathered | Weathered asphalt |
| 3 | VegetationGrass | Grass/vegetation |
| 4 | WaterStandard | Water surface |
| 5 | GlassStandard | Standard glass |
| 6 | FiberGlass | Fiberglass |
| 7 | MetalAlloy | Alloy metal |
| 8 | MetalAluminum | Aluminum |
| 9 | MetalAluminumOxidized | Oxidized aluminum |
| 10 | PlasticStandard | Standard plastic |
| 11 | RetroMarkings | Highly reflective road markings |
| 12 | RetroSign | Highly reflective traffic sign |
| 13 | RubberStandard | Rubber |
| 14 | SoilClay | Clay soil |
| 15 | ConcreteRough | Rough concrete |
| 16 | ConcreteSmooth | Smooth concrete |
| 17 | OakTreeBark | Oak tree bark |
| 18 | FabricStandard | Fabric |
| 19 | PlexiGlassStandard | Plexiglass |
| 20 | MetalSilver | Silver metal |
| 31 | INVALID | Invalid (for exception handling) |

Non-Visual Sensor Material Properties

These properties do not affect rendering, but they control detectability and reflection strength for RTX sensors.

This allows specific objects to be detected or ignored in simulation.

Purposes:

  • Improve accuracy of sensor tests
  • Make certain objects detectable or hidden
  • Keep visual material intact while altering only sensor response

| Property Name | Type | Description |
| --- | --- | --- |
| no_sensor_hit | bool | If True, not detected by RTX sensors (sensor-transparent) |
| sensor_visibility_boost | float | Default 1.0; higher values make the object more detectable (simulates strong reflection) |
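The effect of the two properties can be modeled in a few lines. This is only a toy illustration of the semantics described above; in Isaac Sim the real attributes live on USD materials, and the `SensorMaterial` / `detected_intensity` names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SensorMaterial:
    """Toy model of the two non-visual material properties."""
    no_sensor_hit: bool = False
    sensor_visibility_boost: float = 1.0

def detected_intensity(mat: SensorMaterial, base_return: float) -> float:
    """Return 0 for sensor-transparent materials, else scale the return."""
    if mat.no_sensor_hit:
        return 0.0
    return base_return * mat.sensor_visibility_boost

glass = SensorMaterial(no_sensor_hit=True)            # invisible to RTX sensors
retro = SensorMaterial(sensor_visibility_boost=3.0)   # strongly reflective
print(detected_intensity(glass, 0.5), detected_intensity(retro, 0.5))  # 0.0 1.5
```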

4. Physics-Based Sensors

https://docs.isaacsim.omniverse.nvidia.com/latest/sensors/isaacsim_sensors_physics.html

These sensors are implemented on top of the physics engine (PhysX) and sense internal robot states or external contacts.

Supported Sensors:

  • Articulation Joint Sensor: Full joint state sensing (position/velocity/effort)
  • Contact Sensor: Ground contact, collision detection
  • Effort Sensor: Measures joint torque only
  • IMU Sensor: Inertial data (acceleration, gyro)
  • Proximity Sensor: Simple detection of whether an object exists within a certain range (True/False)

Example Uses:

  • Evaluating robot walking stability
  • Force/torque analysis for manipulators
  • Landing detection, fall detection
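The proximity sensor's True/False semantics reduce to a simple distance check. A stdlib sketch (positions and range are hypothetical; not the Isaac Sim sensor API):

```python
import math

def proximity_hit(sensor_pos, target_pos, max_range: float) -> bool:
    """True if the target lies within the sensor's detection radius."""
    return math.dist(sensor_pos, target_pos) <= max_range

# Hypothetical setup: sensor at the origin with a 2 m detection range.
print(proximity_hit((0, 0, 0), (1.0, 1.0, 0.0), 2.0))  # True  (~1.41 m)
print(proximity_hit((0, 0, 0), (3.0, 0.0, 0.0), 2.0))  # False (3.0 m)
```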

5. PhysX SDK Sensors

https://docs.isaacsim.omniverse.nvidia.com/latest/sensors/isaacsim_sensors_physx.html

Lightweight distance sensors using the PhysX SDK’s raycast functionality.

Supported Items:

  • Generic Sensor: Custom ray-based sensing
  • PhysX Lidar: Fixed lidar simulation (low-cost version)
  • Lightbeam Sensor: One-way detector similar to a laser

Example Uses:

  • Simple distance-based obstacle detection
  • Environment change detection
  • Low-compute-cost sensor simulation
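A raycast distance sensor boils down to a ray-geometry intersection test. A self-contained ray-sphere sketch of the idea (the scene and function are illustrative, not the PhysX raycast API):

```python
import math

def raycast_sphere(origin, direction, center, radius):
    """Distance from origin to a sphere along a unit-length ray, or None on miss."""
    # Solve |o + t*d - c|^2 = r^2 for the smallest t >= 0.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c_term = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c_term
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None

# Hypothetical scene: unit sphere 5 m ahead on the +x axis.
hit = raycast_sphere((0, 0, 0), (1, 0, 0), (5, 0, 0), 1.0)
print(hit)  # 4.0
```

PhysX performs the same kind of query against arbitrary collision geometry, which is why these sensors are cheap compared to RTX ray tracing.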

6. Camera and Depth Sensors (USD Assets)

https://docs.isaacsim.omniverse.nvidia.com/latest/assets/usd_assets_camera_depth_sensors.html

A collection of USD-based camera and depth sensor assets modeled after frequently used real sensors.

Included Sensors:

| Manufacturer | Model (Product Name) | Type |
| --- | --- | --- |
| Leopard Imaging | Hawk Stereo Camera (LI-AR0234CS-STEREO-GMSL2-30) | Stereo RGB + IMU |
| Leopard Imaging | Owl Fisheye Camera (LI-AR0234CS-GMSL2-OWL) | Fisheye RGB camera |
| Sensing | SG2-AR0233C-5200-G2A-H100F1A | HDR mono camera |
| Sensing | SG2-OX03CC-5200-GMSL2-H60YA | HDR camera for ADAS |
| Sensing | SG3-ISX031C-GMSL2F-H190XA | 3MP automotive camera |
| Sensing | SG5-IMX490C-5300-GMSL2-H110SA | 5MP ADAS + surround view |
| Sensing | SG8S-AR0820C-5300-G2A-H30YA | 4K HDR automotive camera |
| Sensing | SG8S-AR0820C-5300-G2A-H60SA | 4K HDR automotive camera |
| Sensing | SG8S-AR0820C-5300-G2A-H120YA | 4K HDR automotive camera |
| Intel | RealSense D455 | RGB + Depth + IMU |
| Orbbec | Gemini 2 | Depth via active stereo IR |
| Orbbec | Femto Mega | RGB + multi-mode depth |
| Orbbec | Gemini 335 | Depth |
| Orbbec | Gemini 335L | Depth |
| Stereolabs | ZED X | Stereo RGB + IMU |

Example Uses:

  • Sim-to-Real consistency testing with specific real sensors
  • Replicating real sensor positions/fields of view
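Depth sensors like those above report a per-pixel depth that can be back-projected into a 3D point with the camera intrinsics. A sketch of that standard deprojection (the intrinsics are hypothetical values loosely shaped like a 640×480 depth stream, not any specific device's calibration):

```python
def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with depth into a 3D camera-frame point."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical pinhole intrinsics for a 640x480 depth image.
point = deproject(u=320, v=240, depth_m=2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(point)  # (0.0, 0.0, 2.0)
```

Running this over every depth pixel yields the point cloud used for ROS/SLAM integration.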

7. Non-Visual Sensors (USD Assets)

https://docs.isaacsim.omniverse.nvidia.com/latest/assets/usd_assets_nonvisual_sensors.html

A collection of assets for digital twins of non-visual sensors (IMU, force, contact, etc.).

Asset Contents:

| Manufacturer | Model | Sensor Type | Notes |
| --- | --- | --- | --- |
| NVIDIA | Debug Rotary | Rotary lidar (for debugging) | No mesh |
| NVIDIA | Example Rotary 2D | 2D rotary lidar | No mesh |
| NVIDIA | Example Rotary | 3D rotary lidar | No mesh |
| NVIDIA | Example Solid State | Solid-state lidar | No mesh |
| NVIDIA | Simple Example Solid State | Simple solid-state lidar | No mesh |
| HESAI | XT32 SD10 | 32-channel 360° spinning lidar | Certified |
| Ouster | OS0 | High-res 3D lidar (short-range) | Multiple configs |
| Ouster | OS1 | High-res 3D lidar (mid-range) | Multiple configs |
| Ouster | OS2 | High-res 3D lidar (long-range) | Multiple configs |
| Velodyne | VLS 128 | Ultra high-res long-range lidar | |
| SICK | microScan3 | 2D safety lidar | Certified |
| SICK | multiScan136 | 3D lidar | Certified |
| SICK | multiScan165 | 3D lidar | Certified |
| SICK | nanoScan3 | Ultra-compact safety lidar | Certified |
| SICK | picoScan150 | 2D industrial lidar | Certified |
| SICK | TiM781 | 2D collision/monitoring lidar | Certified |
| SLAMTEC | RPLIDAR S2E | 2D scanning lidar | Low-cost |
| ZVISION | ML-30s+ | Short-range solid-state automotive lidar | Certified / no mesh |
| ZVISION | ML-Xs | Long-range solid-state automotive lidar | Certified / no mesh |

Example Uses:

  • Extracting internal robot physical quantities (forces, acceleration)
  • Motion estimation, control feedback system development
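For motion estimation, IMU-style rate samples are typically integrated over time. A toy dead-reckoning sketch using simple Euler integration (the sample stream and rates are hypothetical, not output of an Isaac Sim IMU asset):

```python
def integrate_yaw(yaw_rates_dps, dt: float, yaw0: float = 0.0) -> float:
    """Accumulate gyro yaw-rate samples (deg/s) over a fixed time step dt (s)."""
    yaw = yaw0
    for rate in yaw_rates_dps:
        yaw += rate * dt
    return yaw

# 1 s of samples at 100 Hz, constant 90 deg/s turn.
heading = integrate_yaw([90.0] * 100, dt=0.01)
print(round(heading, 6))  # 90.0
```

Real pipelines add bias correction and sensor fusion (e.g., with wheel odometry) on top of this basic integration step.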


Copyright 2025. POLLUX All rights reserved.