This project involved building a mobile service robot in simulation using NVIDIA Isaac Sim, based on a custom CAD model designed in Onshape and a detailed digital twin of the robotics lab at Innov8 Hub. The goal was to validate navigation, perception, and basic mobility before committing to hardware fabrication.
Objectives
Create a simulation-ready version of a custom service robot design
Build a high-fidelity digital twin of the Innov8 Hub robotics lab for accurate environmental testing
Integrate ROS 2 for control, navigation, and teleoperation
Process & Tools
CAD & Asset Import: Converted models from Onshape, Blender, and Unreal Engine into Isaac Sim via URDF and other supported formats (import sketch below)
Physics Setup: Configured joints, rigid body physics, and collision properties for realistic movement (physics sketch below)
OmniGraph Workflow: Used NVIDIA Omniverse’s node-based system to manage simulation behaviors and scene logic (graph sketch below)
ROS 2 Integration: Enabled teleoperation and sensor data streaming through ROS 2 nodes (teleop sketch below)
Sensor Suite: Added and configured virtual LiDAR, cameras, and IMU for perception testing (sensor sketch below)
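The URDF exported from Onshape was brought in through Isaac Sim's URDF importer. Below is a minimal sketch of scripting that step from Isaac Sim's Python environment; the command names and config fields come from the URDF importer extension and can differ between releases, and the file path is a placeholder rather than the project's actual path.

```python
import omni.kit.commands

# Create a default import configuration (command names may vary across Isaac Sim versions)
status, import_config = omni.kit.commands.execute("URDFCreateImportConfig")
import_config.merge_fixed_joints = False   # keep fixed-joint links (sensor mounts) as separate prims
import_config.fix_base = False             # mobile base, so the root link must stay free

# Parse the exported robot description and import it into the open stage
omni.kit.commands.execute(
    "URDFParseAndImportFile",
    urdf_path="/path/to/service_robot.urdf",  # placeholder path to the Onshape-exported URDF
    import_config=import_config,
)
```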
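Physics setup was applied on top of the imported USD. A rough sketch using the standard USD physics schemas is below; the stage file, prim paths, mass, and joint axis are illustrative placeholders, not the project's actual values.

```python
from pxr import Usd, UsdPhysics

stage = Usd.Stage.Open("service_robot.usd")        # placeholder stage file
chassis = stage.GetPrimAtPath("/robot/chassis")    # placeholder prim path

UsdPhysics.RigidBodyAPI.Apply(chassis)             # simulate the chassis as a dynamic rigid body
UsdPhysics.CollisionAPI.Apply(chassis)             # let it collide with the lab digital twin
mass_api = UsdPhysics.MassAPI.Apply(chassis)
mass_api.CreateMassAttr(12.0)                      # placeholder mass in kilograms

# Example revolute joint for a drive wheel; body paths and axis depend on the actual model
joint = UsdPhysics.RevoluteJoint.Define(stage, "/robot/joints/left_wheel_joint")
joint.CreateBody0Rel().SetTargets(["/robot/chassis"])
joint.CreateBody1Rel().SetTargets(["/robot/left_wheel"])
joint.CreateAxisAttr("Y")

stage.Save()
```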
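Scene logic and the ROS 2 bridge were wired together with OmniGraph action graphs. The sketch below shows the general pattern with the controller API, using a simple clock-publishing graph as the example; the node type names match recent Isaac Sim releases but should be verified against the installed version.

```python
import omni.graph.core as og

# Build a small action graph: on every playback tick, read simulation time
# and publish it as a ROS 2 clock message through the ros2_bridge extension.
og.Controller.edit(
    {"graph_path": "/ActionGraph", "evaluator_name": "execution"},
    {
        og.Controller.Keys.CREATE_NODES: [
            ("tick", "omni.graph.action.OnPlaybackTick"),
            ("sim_time", "omni.isaac.core_nodes.IsaacReadSimulationTime"),
            ("pub_clock", "omni.isaac.ros2_bridge.ROS2PublishClock"),
        ],
        og.Controller.Keys.CONNECT: [
            ("tick.outputs:tick", "pub_clock.inputs:execIn"),
            ("sim_time.outputs:simulationTime", "pub_clock.inputs:timeStamp"),
        ],
    },
)
```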
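On the ROS 2 side, teleoperation comes down to consuming velocity commands (for example from teleop_twist_keyboard) and turning them into wheel targets for the simulated base. Below is a minimal sketch assuming a differential-drive base; the topic name, wheel radius, and track width are placeholders rather than the robot's real dimensions.

```python
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

WHEEL_RADIUS = 0.075  # m, placeholder for the CAD value
TRACK_WIDTH = 0.40    # m, placeholder distance between drive wheels


class TeleopBridge(Node):
    """Subscribes to velocity commands and converts them to wheel speeds."""

    def __init__(self):
        super().__init__("teleop_bridge")
        self.create_subscription(Twist, "/cmd_vel", self.on_cmd_vel, 10)

    def on_cmd_vel(self, msg: Twist):
        # Differential-drive kinematics: body twist -> wheel angular velocities (rad/s)
        v, w = msg.linear.x, msg.angular.z
        left = (v - w * TRACK_WIDTH / 2.0) / WHEEL_RADIUS
        right = (v + w * TRACK_WIDTH / 2.0) / WHEEL_RADIUS
        self.get_logger().info(f"wheel targets: left={left:.2f} right={right:.2f}")


def main():
    rclpy.init()
    rclpy.spin(TeleopBridge())
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```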
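Sensors were attached to the imported robot as additional prims. A rough sketch using the omni.isaac.sensor helpers is below; the prim paths and parameters are illustrative, and the LiDAR (added through Isaac Sim's RTX lidar assets) is omitted here because its setup is more version-dependent.

```python
from omni.isaac.sensor import Camera, IMUSensor

# Forward-facing RGB camera mounted on the chassis (path and resolution are placeholders)
camera = Camera(prim_path="/robot/chassis/front_camera", resolution=(1280, 720))
camera.initialize()

# IMU rigidly attached to the base link for odometry and tilt estimation
imu = IMUSensor(prim_path="/robot/chassis/imu")
imu.initialize()

# Read back data while the simulation is stepping
rgb = camera.get_rgba()
imu_reading = imu.get_current_frame()
```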
Results & Recognition
Successfully ran navigation and teleoperation tests in a fully simulated lab environment
Project submission recognized in the LycheeAI Isaac Tournament
Invited to showcase the work in an NVIDIA Omniverse livestream
Conclusion
This project marked a significant step in my simulation journey, allowing me to merge CAD design, robotics simulation, and ROS 2 integration into a single, end-to-end workflow. By building both the robot model and the digital twin of our lab, I created a testbed that accelerates development, minimizes physical prototyping risks, and bridges the gap between design and real-world deployment. The experience not only deepened my understanding of Isaac Sim’s capabilities but also opened doors to new collaborations, including recognition from NVIDIA and participation in global robotics challenges.


