Facebook AI Research (FAIR) presents an evolution in AI robotics training: Habitat 3.0.
FAIR aims to bridge the gap between AI and the physical world by building AI agents capable of understanding their environment and collaborating with humans.
Habitat 3.0 serves as a virtual training ground for building embodied AI agents, enabling robots and virtual humans to cooperatively complete tasks in a digital environment.
Today we’re announcing Habitat 3.0, Habitat Synthetic Scenes Dataset and HomeRobot — three major advancements in the development of social embodied AI agents that can cooperate with and assist humans in daily tasks.
More details on these announcements ➡️ https://t.co/WGSjkkyQx3
— AI at Meta (@AIatMeta) October 20, 2023
Training robots in the real world can be challenging, time-consuming, and potentially hazardous. Nvidia’s Isaac Sim is an established virtual training system for industrial robotics, but Meta’s Habitat is more focused on domestic environments.
By training in a simulated, non-destructive environment like Habitat 3.0, robots can make errors without real-world consequences.
Under the hood of Habitat 3.0
Previous iterations of Habitat laid the groundwork for robot navigation and interaction within a home-like digital environment.
Habitat 3.0, however, introduces a collaborative model, incorporating both robots and humanoid avatars to simulate real-world human-robot interaction scenarios.
FAIR states that this new platform is not just about movement and interaction – it also factors in the visual and semantic detail of real-world tasks, using humanoid avatars with natural movements and behaviors.
These avatars can be controlled by both pre-set algorithms and actual human input.
FAIR’s new Habitat platform enables:
- Human-robot collaboration in simulated home-like environments. Here, robots can learn to work alongside human avatars, mastering tasks like house cleaning.
- Realistic interactions with human avatars, complete with natural movement and appearance, to mimic real-world interactions.
- Human-in-the-loop evaluations, where real humans can interact and control these avatars via various interfaces, including keyboards, mice, and even VR headsets.
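To make the collaboration idea concrete, here is a deliberately toy sketch of two agents splitting a household cleanup task in a shared environment. It does not use the real habitat-lab API; the grid, policies, and function names are all invented for illustration.

```python
# Illustrative toy only: a "robot" and a "human avatar" cooperatively
# collect objects on a small grid, loosely echoing Habitat 3.0's
# human-robot collaboration setup. Not Meta's actual API.

def step_toward(pos, goal):
    """Move one cell toward goal using a simple Manhattan policy."""
    x, y = pos
    gx, gy = goal
    if x != gx:
        x += 1 if gx > x else -1
    elif y != gy:
        y += 1 if gy > y else -1
    return (x, y)


def cooperative_cleanup(objects, robot, human, max_steps=100):
    """Both agents repeatedly target their nearest remaining object.

    Returns (steps taken, objects left over).
    """
    remaining = set(objects)
    steps = 0
    while remaining and steps < max_steps:
        for agent in ("robot", "human"):
            if not remaining:
                break
            pos = robot if agent == "robot" else human
            # Pick the closest uncollected object (Manhattan distance).
            target = min(
                remaining,
                key=lambda o: abs(o[0] - pos[0]) + abs(o[1] - pos[1]),
            )
            pos = step_toward(pos, target)
            if pos == target:
                remaining.discard(target)
            if agent == "robot":
                robot = pos
            else:
                human = pos
        steps += 1
    return steps, len(remaining)


steps, left = cooperative_cleanup(
    {(0, 4), (4, 0), (2, 2)}, robot=(0, 0), human=(4, 4)
)
print(f"finished in {steps} steps, {left} objects left")
```

The point of the sketch is the control loop: in Habitat 3.0, the "human" side of this loop can be a scripted policy or a real person driving the avatar, while the robot side is the learned agent under evaluation.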
As per Meta, the platform offers several advantages over conventional robot training methods:
- Faster learning for reinforcement learning algorithms, allowing experiments that would take years in real-world scenarios to be completed in days.
- Swift and seamless environment adaptability, removing logistical challenges like physically moving robots.
- A safer testing ground, ensuring that AI models don’t pose threats in real-world scenarios.
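The "years to days" claim comes down to simulation throughput. The arithmetic below uses assumed rates (the step frequencies are illustrative, not figures published by Meta) to show how the compression works.

```python
# Back-of-the-envelope sketch of simulation's speed advantage.
# Both throughput numbers are assumptions for illustration.

SIM_STEPS_PER_SEC = 1_000  # assumed simulator throughput (parallel envs)
REAL_STEPS_PER_SEC = 10    # assumed real-robot control rate


def sim_days_for_real_years(real_years):
    """Wall-clock days of simulation needed to match `real_years`
    of real-world robot experience, at the assumed rates."""
    real_steps = real_years * 365 * 24 * 3600 * REAL_STEPS_PER_SEC
    sim_seconds = real_steps / SIM_STEPS_PER_SEC
    return sim_seconds / (24 * 3600)


print(sim_days_for_real_years(1.0))  # 3.65 days per "year" of experience
```

At these assumed rates, one year of real-world interaction compresses into under four days of wall-clock simulation, and parallelizing environments compresses it further.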
Together with Habitat 3.0, FAIR also released the Habitat Synthetic Scenes Dataset (HSSD-200).
This dataset includes over 18,000 objects and provides robots with a more authentic training environment that closely mirrors real-world scenarios.
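As a rough illustration of why a large object inventory matters for training, the sketch below indexes a scene dataset by object category. The manifest format is invented for this example and is not HSSD-200's actual file layout.

```python
# Hypothetical sketch: counting object categories across scenes in a
# synthetic dataset. The JSON schema here is invented for illustration.
import json
from collections import Counter

manifest = json.loads("""
{
  "scenes": [
    {"id": "apartment_01", "objects": [
      {"category": "sofa"}, {"category": "lamp"}, {"category": "table"}]},
    {"id": "apartment_02", "objects": [
      {"category": "lamp"}, {"category": "bed"}]}
  ]
}
""")

# Category counts indicate how well the training distribution covers
# everyday household items.
counts = Counter(
    obj["category"]
    for scene in manifest["scenes"]
    for obj in scene["objects"]
)
print(counts.most_common())
```

A broad, realistic category distribution is what lets policies trained in simulation generalize to the clutter of real homes.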
FAIR acknowledges that true socially intelligent robots will need to understand the dynamic environments in which humans live.
The next research phase will leverage the capabilities of Habitat 3.0 to further refine AI models for enhanced human-robot collaboration.