
Google's AI aces Minecraft without human guidance: Why it's a major breakthrough
What's the story
Google DeepMind's AI system, Dreamer, has hit a major milestone by successfully finding diamonds in the popular video game, Minecraft.
The complex task was performed without any prior instruction on how to play the game.
The creators of Dreamer believe this marks a major leap toward machines that can transfer knowledge from one domain to new situations - a key goal in artificial intelligence (AI) research.
AI advancement
Dreamer's self-improving capabilities and understanding of environment
Danijar Hafner, a computer scientist at Google DeepMind, describes Dreamer as a major step toward general AI systems.
He says the system can understand its physical environment and enhance itself over time without any human guidance.
This is particularly impressive given that Minecraft generates a new, randomly-built world every time it is played, making each playthrough different.
AI exploration
Dreamer's unique approach to diamond collection in Minecraft
In Minecraft, players explore a 3D world with different terrains, using resources to build objects and collect items like diamonds.
Hafner explains that collecting diamonds is a complex process that requires deep exploration.
Instead of relying on videos of human gameplay or explicit guidance like previous AI systems, Dreamer uses reinforcement learning - a trial-and-error technique in which it identifies rewarding actions, repeats them, and discards the rest.
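The trial-and-error idea can be illustrated with a toy problem. This is a minimal sketch, not Dreamer's actual algorithm: an epsilon-greedy agent tries actions, tracks the average reward each one yields, and increasingly repeats the best one. The reward values and the "digging pays off" framing are hypothetical.

```python
import random

def run_bandit(rewards, steps=2000, epsilon=0.1, seed=0):
    """Trial-and-error learning on a toy problem: try actions, keep a
    running estimate of each action's reward, and mostly repeat the
    action that has paid off best so far (epsilon-greedy)."""
    rng = random.Random(seed)
    n = len(rewards)
    value = [0.0] * n   # running estimate of each action's reward
    count = [0] * n
    for _ in range(steps):
        if rng.random() < epsilon:            # explore: try a random action
            a = rng.randrange(n)
        else:                                 # exploit: repeat the best so far
            a = max(range(n), key=lambda i: value[i])
        r = rewards[a] + rng.gauss(0, 0.1)    # noisy reward signal
        count[a] += 1
        value[a] += (r - value[a]) / count[a]  # incremental average
    return value

# Hypothetical setup: action 2 (say, "dig toward diamonds") has the
# highest true reward; the agent discovers this by trial and error.
estimates = run_bandit([0.1, 0.3, 0.8])
print(max(range(3), key=lambda i: estimates[i]))
```

After a couple of thousand trials, the agent's estimates converge and it settles on the highest-reward action, without ever being told which action is best - the same principle, at a vastly smaller scale, that lets Dreamer discover the long chain of actions leading to diamonds.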
AI prediction
Dreamer's world model guides decision-making in Minecraft
Hafner credits Dreamer's success to its capability to build a model of its surroundings, which then guides its decision-making.
This 'world model' lets Dreamer predict the likely rewards of various actions with far less computation than actually performing those actions in Minecraft would require.
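The planning-inside-a-model idea can be sketched as follows. This is an illustrative toy, not Dreamer's architecture: the `toy_model` function stands in for a learned world model, and the planner "imagines" short action sequences inside it, picking the first action of the best sequence without ever touching the real environment.

```python
def plan_with_model(state, actions, model, horizon=3):
    """Model-based lookahead: roll out each action sequence inside the
    (stand-in) world model and return the first action of the most
    rewarding sequence. The rollouts are pure computation - no real
    environment steps are taken."""
    def imagined_return(s, depth):
        if depth == 0:
            return 0.0
        best = float("-inf")
        for a in actions:
            s2, r = model(s, a)            # predicted next state and reward
            best = max(best, r + imagined_return(s2, depth - 1))
        return best

    scores = {}
    for a in actions:
        s2, r = model(state, a)
        scores[a] = r + imagined_return(s2, horizon - 1)
    return max(scores, key=scores.get)

# Hypothetical stand-in for a learned model: states are depths underground;
# "dig" moves one level deeper and pays off at depth 3 (where diamonds sit).
def toy_model(state, action):
    if action == "dig":
        nxt = state + 1
        return nxt, (1.0 if nxt >= 3 else 0.0)
    return state, 0.0  # "wait" changes nothing

print(plan_with_model(0, ["dig", "wait"], toy_model))  # → "dig"
```

Even though "dig" earns no immediate reward, the imagined rollouts reveal that it leads to a payoff three steps ahead - the kind of cheap, predictive look-ahead the article describes.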
Hafner hopes this capability could eventually help develop robots that can learn how to interact with the real world.