Niantic Uses Meta Llama to Bring Digital Creatures to Life
May 22, 2024

Since the ’90s, virtual pets have captivated our imagination, evolving from simple digital toys to sophisticated companions capable of lifelike interactions. And Niantic, known for its pioneering augmented reality games like Pokémon GO, has taken virtual pets to the next level with its latest mobile AR adventure, Peridot.

A first-of-its-kind AR game, Peridot uses generative AI to create a virtual world of lifelike pets. To drive the virtual world of Peridot, Niantic integrated Meta Llama 2, transforming its adorable creatures, called “Dots,” into responsive AR pets that now exhibit smart behaviors to simulate the unpredictable nature of physical animals. Llama 2 generates each Dot’s reaction in real time, making every interaction dynamic and unique.

“Leveraging LLMs like Llama gives us an opportunity to let generative AI drive Peridot’s gameplay in meaningful and realistic ways,” says Asim Ahmed, Global Marketing Lead for Peridot at Niantic. “Instead of manually programming a limited range of reactions when our creatures encounter various elements in our real world, we’ve used Llama 2 to help determine how the creature might react and select appropriate responses from our vast library of animations.”

Enhancing the Immersive Experience Through Open Source

Niantic prides itself on building games that become a part of players’ daily lives. The Peridot team understood that players were seeking more immersive experiences with their Dots, prompting them to explore new technologies that could help foster deeper connections and enhance the sense of companionship. Niantic chose to leverage LLMs given their ability to “learn” from a given Dot’s current surroundings and respond to prompts with consistency. By using an open source model like Llama 2, Niantic significantly expedited the development process, allowing the Peridot team to forgo lengthy approval processes and begin creating immediately.

As Peridot grew more complex and interactive, the engineers at Niantic needed a system capable of handling dynamic interactions between creature and player to deepen engagement and enhance the game’s interactions with its environment. Llama’s open source approach let the Niantic team quickly prototype and iterate in their own environment to prioritize data privacy. Llama’s framework also offered more flexibility, enabling Niantic to enhance its generative AI capabilities and accelerate the delivery of new, immersive features for Peridot to players in just under three months.

“We are eager to see more models open sourced to enable teams like ours to freely explore their capabilities without being caught in discussions around cost, privacy, and cloud dependencies at the onset of said explorations,” says Ahmed.

Evolving Peridot to Become Even Smarter

Niantic initially integrated Llama 2 into Peridot in November 2023 to allow the Dots to react appropriately to their surroundings. The Peridot team wanted to explore the use of a general-purpose dialogue LLM in tandem with its AR recognition system and its vast library of creature animations without having to fine-tune the LLM itself. When implementing Llama, the team prioritized creativity and response time without overcomplicating their one-shot prompts. The challenge was crafting a prompt that produced expressive, creative reactions for the chosen creature while keeping those reactions in a consistent format. Because the creatures were packed with unique personality traits, the team risked overloading Llama with too much information in a one-shot prompt and slowing response time.
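The trade-off described above can be sketched as a prompt builder that caps how much personality detail goes into each one-shot prompt. This is a minimal illustration, not Niantic’s actual code: the function name, trait list, and animation names are all hypothetical.

```python
# Hypothetical sketch: a compact one-shot prompt for a virtual pet's
# reaction. All names here (build_prompt, ANIMATIONS, the traits) are
# illustrative assumptions, not Niantic's implementation.

ANIMATIONS = ["sniff", "jump_excited", "tilt_head", "hide", "dig"]

def build_prompt(dot_name, traits, stimulus, max_traits=3):
    """Cap the trait list so the one-shot prompt stays short and fast."""
    trait_line = ", ".join(traits[:max_traits])  # limit tokens per request
    return (
        f"You control {dot_name}, a virtual pet that is {trait_line}.\n"
        f"{dot_name} just noticed: {stimulus}.\n"
        f"Reply with exactly one animation name from this list: "
        f"{', '.join(ANIMATIONS)}."
    )

prompt = build_prompt("Momo", ["curious", "timid", "playful", "sleepy"], "a dog")
```

Trimming traits per request is one simple way to keep latency predictable while still letting personality shape the reaction.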

After widespread success and positive feedback from the gaming community, the technology’s application has since evolved. The team at Niantic leaned further into Llama 2’s capabilities to give players the ability to have “conversations” with their Dots, creating more personalized in-game experiences.

Llama’s latest integration allows each Dot to exhibit unpredictable and surprising behaviors instead of relying on a limited set of predetermined actions. Whether a Dot reacts with joy, curiosity, or even mischief, each AI-driven response brings a sense of organic realism that heightens the players’ excitement when interacting with their virtual creatures.

Niantic uses advanced computer vision algorithms to convert physical-world images from a player’s camera into accurate 3D models. This technology lets the Dots interact with their surroundings. Using the Niantic Lightship ARDK, the Dots recognize physical objects like flowers, food, and pets. These observations are then processed by a custom version of Llama 2, which considers each Dot’s unique traits—such as personality and history—to determine how these virtual creatures might react to their discoveries.
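The flow above — AR detections plus a Dot’s persistent traits feeding a single model query — might look roughly like the following. This is a hedged sketch under assumed names; the function, field names, and history window are illustrative, not Niantic’s pipeline.

```python
# Illustrative sketch of the detection-to-reaction flow: fold what the
# Dot "sees" (from an AR object detector) and its traits into one query.
# All names and the three-event history window are assumptions.

def reaction_request(detected_objects, personality, history):
    """Combine AR detections and the Dot's traits into one model query."""
    context = {
        "sees": detected_objects,    # e.g. objects recognized by the AR stack
        "personality": personality,  # persistent traits for this Dot
        "recent": history[-3:],      # only the latest events, to stay fast
    }
    return (
        "Given this context, describe how the pet reacts:\n"
        + "\n".join(f"{k}: {v}" for k, v in context.items())
    )

req = reaction_request(["flower", "dog"], "shy but curious", ["slept", "played", "dug"])
```

Windowing the history is one plausible way to keep the per-frame query small while still giving the model enough of the Dot’s story to react in character.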

Because players could now input any command via voice recognition or text input, there was added latency to consider on top of Llama’s prompt response. The Peridot team needed Llama to produce a consistent reaction that accounted for the player’s input along with additional context, including the creature’s hunger, attention status, and any objects detected in the scene. They addressed this by defining an expected response format in JSON, which immediately improved the quality of the LLM responses.
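Constraining the model to a fixed JSON shape also makes the reply easy to validate before it drives an animation. The sketch below shows one way this could work; the schema, field names, and fallback animation are assumptions for illustration, not Niantic’s actual format.

```python
import json

# Hypothetical sketch: validate a JSON-formatted model reply and fall
# back to a safe animation if the reply is malformed or off-list.
# The schema and fallback name are illustrative assumptions.

def parse_reaction(raw_reply, allowed_animations, fallback="tilt_head"):
    """Parse the model's JSON reply; return (animation, mood)."""
    try:
        data = json.loads(raw_reply)
        if data.get("animation") in allowed_animations:
            return data["animation"], data.get("mood", "neutral")
    except (json.JSONDecodeError, TypeError):
        pass  # malformed reply: fall through to the safe default
    return fallback, "neutral"

anim, mood = parse_reaction('{"animation": "sniff", "mood": "curious"}', {"sniff", "dig"})
```

A fixed schema plus a fallback path means a garbled or overly creative reply never leaves the pet frozen: the game can always play something sensible.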

“We were surprised to learn that by prompting Llama, we could create unpredictable behaviors,” Ahmed adds. “For something like a virtual pet, this really brought life into our characters. We see a wide set of opportunities to keep leveraging Llama in different ways to drive new areas of gameplay more procedurally.”

The Future Gets More Adorable

By moving away from static algorithms and embracing AI-driven spontaneity, Peridot’s virtual pets can now deliver experiences that were previously unimaginable. For example, if the player asks their Dot if they want to go for a walk, the creature might respond with an excited spin to indicate they’re ready to go.

“Peridot’s success with generative AI gives us a glimpse into what’s possible, and we plan to elevate the way players interact with Peridot across devices,” says Ahmed. “We’re excited to keep pushing the boundaries with the Peridot franchise and Llama and truly making these creatures come to life in our world in new and interesting ways.”

