April 22, 2021

The Allen Institute for AI (AI2), which conducts research and engineering in the field of artificial intelligence, has announced the 3.0 release of its embodied AI framework, AI2-THOR. The new version adds active object manipulation to the testing framework, allowing robotics developers to train robots in a virtual environment rather than relying on real-world training methods.

The ManipulaTHOR virtual agent features a highly articulated robot arm with three swivel joints of equal limb length, bringing a more human-like approach to object manipulation. The AI2-THOR framework lets researchers study the problem of object manipulation in more than 100 visually rich, physics-enabled rooms. By enabling the training and evaluation of generalized capabilities in manipulation models, AI2 said, ManipulaTHOR allows for much faster training while also being safer and more cost-effective than real-world training methods.

“Imagine a robot being able to navigate a kitchen, open a refrigerator and pull out a can of soda,” said Oren Etzioni, CEO at AI2. “This is one of the biggest and yet often overlooked challenges in robotics and AI2-THOR is the first to design a benchmark for the task of moving objects to various locations in virtual rooms, enabling reproducibility and measuring progress. After five years of hard work, we can now begin to train robots to perceive and navigate the world more like we do, making real-world usage models more attainable than ever before.”

Despite being an established research area in robotics, the visual reasoning aspect of object manipulation has consistently been one of the biggest hurdles researchers face, AI2 said. The AI2-THOR framework addresses the difficulty robots have in perceiving, navigating, acting and communicating by providing complex simulated testing environments that researchers can use to train robots for eventual activities in the real world.

“In comparison to running an experiment on an actual robot, AI2-THOR is incredibly fast and safe,” said Roozbeh Mottaghi, research manager at AI2. “Over the years, AI2-THOR has enabled research on many different tasks such as navigation, instruction following, multi-agent collaboration, performing household tasks, and reasoning about whether an object can be opened or not. This evolution of AI2-THOR allows researchers and scientists to scale the current limits of embodied AI.”

In addition to the version 3.0 release, the AI2 team is hosting the RoboTHOR Challenge 2021 in collaboration with the Embodied AI Workshop at this year’s Conference on Computer Vision and Pattern Recognition (CVPR). AI2’s challenges will cover RoboTHOR object navigation, ALFRED (instruction-following robots), and Room Rearrangement.

More details are available through the AI2 website.