This repo is a versatile tool for seamlessly rendering objects from simulations or sensors onto Mixed Reality (MR) or Augmented Reality (AR) headsets.
Please check this for information on how to set up SimPub with Isaac Sim.
Would you like to integrate your Human-Robot Interaction (HRI) application with MR/VR/AR headsets effortlessly? This repository is for you.
We provide a ROS-style interface that allows you to easily use Python code to project simulation scenes onto MR headsets. Additionally, you can use input data from the headset to control virtual robots and objects within the simulation environment or even real robots.
This repo uses ZeroMQ to communicate with the MR application, featuring automatic device discovery and reconnection.
We also offer a Unity Application that is easy to deploy for projecting and updating simulation scenes in the MR headset.
- Automatic Discovery: SimPub searches for all devices in the subnet and connects to them fully automatically.
- Reconnecting: The headset reconnects to the PC automatically after you shut down and restart the Python script.
- Remote Logger: Logs, including FPS and latency, are sent from the headset to the simulation PC.
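The subnet discovery above can be pictured as a simple UDP broadcast handshake: the PC broadcasts a probe, and each headset replies with its identity. The sketch below is illustrative only; the probe message and port number are hypothetical assumptions, not SimPub's actual protocol.

```python
import json
import socket
import threading
import time

DISCOVERY_PORT = 7720          # hypothetical port, for illustration only
PROBE = b"SIMPUB_DISCOVER"     # hypothetical probe message

def run_responder(device_name, stop_event):
    """Device side: answer discovery probes with the device's name."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", DISCOVERY_PORT))
    sock.settimeout(0.2)
    while not stop_event.is_set():
        try:
            data, addr = sock.recvfrom(1024)
        except socket.timeout:
            continue
        if data == PROBE:
            sock.sendto(json.dumps({"device": device_name}).encode(), addr)
    sock.close()

def discover(timeout=1.0, address="255.255.255.255"):
    """PC side: broadcast a probe and collect (name, ip) replies."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    sock.sendto(PROBE, (address, DISCOVERY_PORT))
    found = []
    try:
        while True:
            data, addr = sock.recvfrom(1024)
            found.append((json.loads(data.decode())["device"], addr[0]))
    except socket.timeout:
        pass
    sock.close()
    return found
```

Reconnection then follows naturally: if the connection drops, the PC simply broadcasts the probe again until the headset answers.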
- Meta Quest 3
- HoloLens 2
- Install the dependencies:

  ```bash
  pip install zmq trimesh
  ```

- Install this repo:

  ```bash
  cd $the_path_to_this_project
  pip install -e .
  ```
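After installation, a quick sanity check can confirm the packages resolved correctly. The snippet below is a generic helper, not part of SimPub; it assumes the import names `zmq` (provided by pyzmq), `trimesh`, and `simpub` (this repo after `pip install -e .`).

```python
import importlib.util

def check(modules):
    """Return {module_name: importable} without actually importing anything."""
    return {m: importlib.util.find_spec(m) is not None for m in modules}

if __name__ == "__main__":
    for name, ok in check(["zmq", "trimesh", "simpub"]).items():
        print(f"{name}: {'OK' if ok else 'MISSING'}")
```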
- Deploy the Unity application to your headset with the device name. Please refer to the website.
- Connect your simulation PC and headset to the same subnet. For example, if your simulation PC's address is `192.168.0.152`, the headset address should share the same prefix, such as `192.168.0.142` or `192.168.0.73`. We recommend using a single WiFi router for PC-headset communication to ensure optimal connectivity. Additionally, connecting your PC via a wired cable can significantly reduce latency.
- Run the usage examples under the folder `/demos/`, then put on the headset, start the Unity application, and enjoy!
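If you are unsure whether the PC and headset actually share a subnet, Python's standard `ipaddress` module can check it directly. This is a generic helper assuming a common /24 home-router netmask, not a SimPub utility:

```python
import ipaddress

def same_subnet(pc_ip, headset_ip, prefix=24):
    """Check whether two IPv4 hosts share the same subnet (default /24)."""
    pc_net = ipaddress.ip_network(f"{pc_ip}/{prefix}", strict=False)
    return ipaddress.ip_address(headset_ip) in pc_net

print(same_subnet("192.168.0.152", "192.168.0.142"))  # True: same prefix
print(same_subnet("192.168.0.152", "192.168.1.10"))   # False: different subnet
```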