Revealed at CES 2023, Nvidia’s Isaac Sim robotics simulation toolkit can simulate human behavior in industrial environments such as warehouses and manufacturing facilities. The goal is to help collaborative robots (cobots) and autonomous mobile robots (AMRs) understand and identify common behaviors and potential obstacles in the real world.

The use of industrial and commercial robots is growing rapidly, and according to Nvidia, the improvements to the robotics platform will accelerate the development and deployment of autonomous robots. Developing the artificial intelligence (AI) in simulation helps ensure robots operate successfully and safely in a variety of environments.

“Simulation is the critical technology that will allow the development of the complex software systems that will power the coming wave of smarter, more autonomous robots. In simulation, the virtual robots have a proving ground for their complex software stacks and multitude of AI models,” said Gerard Andrews, senior product manager at Nvidia.

Also: The drone wranglers: How the most authentic Old West town in the US is delivering the future of flight

By adding simulations of human behavior and interaction within environments, like picking up and moving items, pushing carts and moving to new locations, it’s possible to test how adding robots to an environment would play out, without endangering people.

Both normal events, like people interacting with industrial settings or moving around a warehouse, and abnormal events, such as unexpected emergencies and other scenarios, can be simulated to help build robots that react appropriately to situations in busy environments.

Also: Nvidia’s latest Studio Laptop series showcases its fastest, most powerful GPUs

To aid this, Isaac Sim uses Nvidia RTX technology to provide improved sensor support, allowing it to render physically accurate sensor data in real time. This includes ray tracing, which delivers more accurate sensor data under various lighting conditions and in response to reflective materials.

This allows simulated worlds to be based on physically accurate sensor models, minimizing the differences between the simulation and the real environment, so robots are trained as accurately as possible.

The new version of Isaac Sim also provides numerous new simulation-ready 3D assets – including warehouse parts and popular robots – so developers and users can quickly start building.

Built on Nvidia Omniverse, the company’s platform for creating and operating metaverse applications, Isaac Sim is accessible via the cloud, giving teams working on robotics projects the ability to collaborate with increased accessibility, agility and scalability when testing and training virtual robots.

“With cloud access and its expansive set of photoreal and physically accurate simulation capabilities, Isaac Sim is set to establish new methodologies for the development of intelligent robots,” said Andrews.
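For a sense of what "quickly start building" looks like in practice, the sketch below shows roughly how a developer might load one of Isaac Sim's bundled warehouse environments and a robot through the toolkit's Python API, then step the simulation. This is a minimal sketch rather than Nvidia's documented workflow: the specific asset paths and the Carter robot are assumptions based on the default asset layout, and module names can shift between Isaac Sim releases.

```python
# Minimal Isaac Sim Python sketch (assumed asset paths): load a warehouse
# environment and a robot from the bundled simulation-ready assets,
# then step the physics/render loop a few times.
from omni.isaac.kit import SimulationApp

# The SimulationApp must be created before importing other omni.isaac modules.
simulation_app = SimulationApp({"headless": True})

from omni.isaac.core import World
from omni.isaac.core.utils.nucleus import get_assets_root_path
from omni.isaac.core.utils.stage import add_reference_to_stage

world = World(stage_units_in_meters=1.0)

# Root path of the Nucleus server hosting the Isaac Sim asset library.
assets_root = get_assets_root_path()

# Reference a warehouse scene and a robot into the USD stage.
# These asset paths are assumptions and may differ between releases.
add_reference_to_stage(
    usd_path=assets_root + "/Isaac/Environments/Simple_Warehouse/warehouse.usd",
    prim_path="/World/Warehouse",
)
add_reference_to_stage(
    usd_path=assets_root + "/Isaac/Robots/Carter/carter_v1.usd",
    prim_path="/World/Carter",
)

world.reset()

# Advance simulation time; rendered frames are what RTX-based sensors consume.
for _ in range(100):
    world.step(render=True)

simulation_app.close()
```

From there, simulated people, sensors and task logic would be layered on top, which is where the human-behavior simulation and RTX sensor features described above come into play.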