March 2, 2021

Unity, which develops a platform for creating and operating real-time 3D content, has released its Object Pose Estimation demonstration, which combines computer vision and simulation technologies aimed at the use of robotics in industrial settings. The new demonstration follows recent releases aimed at supporting the Robot Operating System (ROS) framework, allowing roboticists to safely, quickly, and cost-effectively explore, test, and develop robotic solutions.

“This is a powerful example of a system that learns instead of being programmed, and as it learns from the synthetic data, it is able to capture much more nuanced patterns than any programmer ever could,” said Danny Lange, senior vice president of artificial intelligence at Unity. “Layering our technologies together shows how we are crossing a line, and we are starting to deal with something that is truly AI, and in this case, demonstrating the efficiencies possible in training robots.”

Unity said that simulation technology is effective and advantageous when testing applications in situations that are dangerous, expensive, or rare. Validating applications in simulations before deploying to a robot helps shorten iteration time by revealing any potential issues early. With Unity’s built-in physics engine and the Unity Editor, developers can create endless permutations of virtual environments, enabling objects to be controlled by an approximation of the forces that act on them in the real world.
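The "endless permutations of virtual environments" described above are the core of synthetic data generation: randomizing object placement, orientation, and other scene parameters so a vision model trained in simulation generalizes to the real world. A minimal Python sketch of the idea, with illustrative ranges and a hypothetical `Pose` type (not Unity's actual randomizer API), might look like this:

```python
import random
from dataclasses import dataclass

@dataclass
class Pose:
    # Position in meters on a work surface; yaw rotation in degrees.
    x: float
    y: float
    z: float
    yaw: float

def sample_random_pose(seed=None):
    """Sample one randomized object pose for a synthetic training scene.

    Ranges are purely illustrative, not Unity's settings.
    """
    rng = random.Random(seed)
    return Pose(
        x=rng.uniform(-0.5, 0.5),     # lateral position on the table
        y=rng.uniform(-0.3, 0.3),     # depth position on the table
        z=0.0,                        # object rests on the surface
        yaw=rng.uniform(0.0, 360.0),  # arbitrary in-plane rotation
    )

# Generate a batch of randomized poses; each would be rendered and
# auto-labeled in the simulator to produce training data.
dataset = [sample_random_pose(seed=i) for i in range(1000)]
```

In a real pipeline the simulator renders each randomized scene and records the ground-truth pose as a label, which is what makes synthetic data cheap compared with hand-annotating real photographs.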

The demonstration follows the release of Unity’s URDF Importer, an open-source Unity package for importing a robot into a Unity scene from its URDF file, which takes advantage of enhanced support for articulations for more realistic kinematic simulations. The company also released the ROS-TCP-Connector, which reduces the latency of messages being passed between ROS nodes and Unity, allowing the robot to react in near real-time to its simulated environment. 
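The ROS-TCP-Connector mentioned above passes serialized messages between ROS nodes and Unity over a plain TCP socket. As a rough illustration of that pattern only (not the connector's actual wire format or API), a toy Python endpoint could frame each message with a length prefix and echo it back, standing in for the Unity side:

```python
import socket
import struct
import threading

def send_message(sock, payload: bytes):
    """Frame a serialized message with a 4-byte big-endian length prefix."""
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_message(sock) -> bytes:
    """Read one length-prefixed message from the socket."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return _recv_exact(sock, length)

def _recv_exact(sock, n: int) -> bytes:
    # TCP may deliver partial chunks; loop until n bytes arrive.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

# Loopback demo: an echo server stands in for the simulator endpoint.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)

def echo_once():
    conn, _ = server.accept()
    with conn:
        send_message(conn, recv_message(conn))

t = threading.Thread(target=echo_once)
t.start()

client = socket.socket()
client.connect(server.getsockname())
send_message(client, b'{"topic": "/joint_states", "positions": [0.1, 0.2]}')
reply = recv_message(client)
t.join()
client.close()
server.close()
```

Keeping a persistent socket open like this, rather than re-establishing a connection per message, is one reason a TCP bridge can keep latency low enough for a simulated robot to react in near real-time.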

“With Unity, we have not only democratized data creation, but we’ve also provided access to an interactive system for simulating advanced interactions in a virtual setting,” said Lange. “You can develop the control systems for an autonomous vehicle, for example, or here for highly expensive robotic arms, without the risk of damaging equipment or dramatically increasing cost of industrial installations. To be able to prove the intended applications in a high-fidelity virtual environment will save time and money for the many industries poised to be transformed by robotics combined with AI and machine learning.”

More details are available at Unity’s Robotics website.
