NVIDIA is teaming up with Apple in its humanoid robot development: the Apple Vision Pro headset serves as a teleoperation tool for controlling NVIDIA's robots, feeding the MimicGen NIM microservice, which trains humanoid robots from the captured data.
According to 9to5Mac, one common way to capture human demonstration data is teleoperation, but the process is expensive and time-consuming.
A teleoperation reference workflow powered by NVIDIA AI and Omniverse, demonstrated at SIGGRAPH, lets researchers and AI developers generate massive amounts of synthetic motion and perception data from a small number of remotely captured human demonstrations.
Developers use Apple Vision Pro to capture a small number of teleoperated demonstrations, then replay the recordings in NVIDIA Isaac Sim and use the MimicGen NIM microservice to generate synthetic datasets from them.
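The core idea behind this multiplication step is that a few object-centric demonstration segments can be re-targeted to many randomized scene layouts. The following minimal Python sketch illustrates that idea only; every name in it (Demo, Segment, randomize_scene, retarget_segment) is hypothetical and stands in for the actual Isaac Sim and MimicGen NIM APIs.

import random
from dataclasses import dataclass

@dataclass
class Segment:
    object_name: str   # object this motion segment is defined relative to
    rel_poses: list    # end-effector waypoints relative to that object

@dataclass
class Demo:
    segments: list     # one teleoperated demonstration, split into segments

def randomize_scene(rng):
    # Stand-in for a simulator reset that randomizes object poses.
    return {"cube": (rng.uniform(-0.2, 0.2), rng.uniform(-0.2, 0.2))}

def retarget_segment(segment, object_pose):
    # Re-express a relative segment in the new object's frame (simplified 2D).
    ox, oy = object_pose
    return [(ox + dx, oy + dy) for dx, dy in segment.rel_poses]

def generate_synthetic_dataset(source_demos, n_episodes, seed=0):
    rng = random.Random(seed)
    dataset = []
    for _ in range(n_episodes):
        demo = rng.choice(source_demos)   # pick one of the few human demos
        scene = randomize_scene(rng)      # new randomized scene layout
        trajectory = []
        for seg in demo.segments:
            trajectory += retarget_segment(seg, scene[seg.object_name])
        dataset.append(trajectory)        # in practice: replay and verify success
    return dataset

# A single captured demonstration can seed hundreds of synthetic episodes:
demo = Demo(segments=[Segment("cube", [(0.0, 0.05), (0.0, 0.0)])])
print(len(generate_synthetic_dataset([demo], n_episodes=500)))  # -> 500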
Developers then train the Project GR00T humanoid foundation model on the combined real and synthetic data, saving time and cutting costs. They use the Robocasa NIM microservice in Isaac Lab, NVIDIA's robot learning framework, to generate experiences for retraining the robot model.
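For intuition, here is a toy sketch of the two-stage recipe described above: supervised training on a mix of real and synthetic demonstrations, followed by retraining on simulator-generated experience. The train and collect_experience helpers are placeholders invented for this illustration, not the GR00T, Isaac Lab, or Robocasa APIs.

def train(model_state, batch):
    # Stand-in for one supervised update on demonstration data.
    return model_state + 1  # the state here just counts updates

def collect_experience(model_state, n_rollouts):
    # Stand-in for Robocasa-style simulated rollouts of the current policy.
    return [f"rollout-{model_state}-{i}" for i in range(n_rollouts)]

# Stage 1: train the foundation model on real plus synthetic demonstrations.
real_demos = ["teleop-demo-1", "teleop-demo-2"]           # few, expensive
synthetic_demos = [f"synthetic-{i}" for i in range(500)]  # many, cheap
model = 0
for batch in real_demos + synthetic_demos:
    model = train(model, batch)

# Stage 2: generate new experience in simulation and retrain on it.
for _ in range(3):
    for batch in collect_experience(model, n_rollouts=100):
        model = train(model, batch)

The point of the split is economic: the scarce teleoperated recordings bootstrap the model, while the cheap simulated experiences carry most of the retraining load.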