Open Source

ExecuTorch Adoption in Reality Labs: Powering On-Device AI Across Meta Devices

November 21, 2025
10 minute read

At Meta, on-device AI is a cornerstone for delivering fast, private, and intelligent experiences to the people who use our products. ExecuTorch, our open source, lightweight, and efficient inference engine, has been instrumental in enabling these capabilities across our various apps. Today, we’re excited to share that ExecuTorch is now enabling cutting-edge machine learning experiences across Reality Labs’ portfolio of VR headsets and AI glasses. ExecuTorch lets developers easily deploy state-of-the-art machine learning models, unlocking richer, smarter, and more immersive experiences. By streamlining the path from research to applications and putting our work directly into people’s hands, ExecuTorch is setting a new standard for advanced on-device AI.

ExecuTorch Adoption Across Reality Labs Devices

Reality Labs’ goal is to build the next computing platform. To that end, the teams across Reality Labs develop next-generation hardware, including Ray-Ban Meta and Oakley Meta glasses, Meta Ray-Ban Display glasses and the Meta Neural Band, and Meta Quest headsets. These products require AI models to run efficiently on a wide range of hardware — from high-performance systems-on-a-chip (SoCs) to ultra-low-power microcontrollers. ExecuTorch’s modular design, portability, small footprint, and hardware abstraction make it an ideal fit for the diverse product ecosystem at Reality Labs.

On-device AI deployment presents a fundamental challenge: supporting quick experimentation for researchers and engineers while maintaining optimization flows across diverse hardware targets, without sacrificing performance or productivity. Traditional approaches require converting PyTorch models to other formats, which introduces numerical mismatches and costly debug cycles that break the tight iteration loop researchers need. This challenge becomes particularly acute where there's a diverse set of product lines and AI use cases.

ExecuTorch addresses this challenge by eliminating the conversion step and providing an entirely PyTorch-native flow. It uses PyTorch’s export capability to create a portable representation of the model that can execute both on-device and in PyTorch for pre-deployment validation. Developers can then further compile the graph for specific hardware — for example, quantizing the model to optimize for memory constraints or applying compiler passes to optimize for latency on target SoCs. One of the important benefits of ExecuTorch is that most optimizations can still be validated within PyTorch before deployment. This is made possible by tight integration with hardware partners who contribute by adhering to ExecuTorch’s API and design principles.

With this design and its hardware abstraction, ExecuTorch enables Reality Labs to deploy models consistently across different products and chipsets with minimal modifications. Here’s a look at how this works in our Reality Labs hardware.

Meta Quest 3 and Quest 3S

ExecuTorch enables Meta Quest 3 and 3S to run advanced AI workloads — such as depth estimation and scene understanding — directly on the device. This local processing ensures fast, reliable performance, allowing features like Passthrough to seamlessly blend the physical world with virtual content. The result is a natural, realistic interaction with your environment, where the boundaries between virtual and physical spaces fade away. With high-performance AI features that are both responsive and convenient, these headsets deliver a next-level experience.

Real-time AI models enabled by ExecuTorch also drive important features like hand tracking and controller tracking. These capabilities form the foundation for accurate UI controls, gesture recognition, and virtual keyboards, making interactions intuitive and precise.

Another standout feature enabled by ExecuTorch is persistent room memory. Thanks to efficient on-device inference, Meta Quest 3 and 3S can remember up to 15 different rooms, each with its own unique layout and boundaries. This flexibility saves time and ensures a consistent, personalized experience wherever you use your headset.

Meta Ray-Ban Display

ExecuTorch enables complex models to run directly on both our Ray-Ban Meta and Meta Ray-Ban Display glasses to deliver new features like live translation as well as visual captions shown in real time on the glasses’ display. Unlike traditional audio-only translation, this visual approach — made possible by fast, local inference — lets people scroll and review transcripts at their own pace, making conversations in foreign languages or noisy environments far more accessible.

Another breakthrough is the text-in-the-wild capability, which is enabled by on-device AI and egocentric OCR (Optical Character Recognition).

Because the glasses see what you see, they can instantly identify someone’s region of interest in the scene and recognize relevant text, enabling immediate translation, dictation, and contextual actions from documents, menus, and street signs right on the display. Someone wearing the glasses can walk into a restaurant, glance at a menu in another language, and get the translation in seconds, or point at specific items on the menu and ask for more information without ever reaching for their phone.

Meta Ray-Ban Display’s reading assistant feature takes this even further: By simply pointing to physical text, a person can trigger instant translations, definitions, or dictation, all thanks to real-time hand tracking and advanced OCR models running directly on the device. These capabilities make Meta Ray-Ban Display glasses a helpful companion for navigating the world and accessing information in the moment.

Oakley Meta Vanguard

Designed for athletes, Oakley Meta Vanguard glasses leverage cutting-edge AI to deliver real-time performance insights when connected to a Garmin account. Whether tracking a run, bike ride, or hike, Meta AI provides instant feedback on key performance metrics like pace, heart rate, and calories burned — all through simple voice commands.

For example, you can say: “Hey Meta, what’s my heart rate?” or “Hey Meta, how am I doing?” And while cooling down, athletes can receive Meta AI summaries of their workouts.

Expanding ExecuTorch’s Reach

ExecuTorch has rapidly grown beyond Meta into a vibrant community of developers, researchers, and hardware partners who share a vision for open, efficient, and portable on-device AI. From chip vendors and device makers to app developers and AI researchers, contributors across the ecosystem are helping expand ExecuTorch’s reach.

The project’s open source development ensures transparency and collaboration, with contributions from companies like Apple, Arm, Cadence, Intel, MediaTek, NXP Semiconductors, Qualcomm Technologies Inc., and Samsung. ExecuTorch backends for these platforms are publicly available. In addition to hardware partners, ExecuTorch is now an important part of the AI ecosystem. For example, in the recent General Availability release, ExecuTorch demonstrated model export coverage with Hugging Face transformers and Ultralytics libraries, interoperability with Unsloth AI and torchao frameworks, and integration into SDKs such as Liquid AI, NimbleEdge, and React-Native-Executorch. See the Success Stories page for more up-to-date information.

Openness is important because on-device AI spans multiple platforms, chips, and use cases. By working in the open, ExecuTorch lets the entire ecosystem innovate together, ensuring PyTorch models can run efficiently anywhere — from phones and AI apps to future AR glasses and beyond.

Looking Ahead

ExecuTorch is a foundational technology for on-device AI at Meta, and its adoption within Reality Labs is accelerating the delivery of intelligent, private, and responsive experiences across our products. Our close collaboration with silicon partners also ensures that ExecuTorch remains at the forefront of on-device AI innovation. We’re excited to continue this journey and share more updates as we help unlock new capabilities for people around the world. We invite you to contribute to ExecuTorch and share feedback on our GitHub page. You can also join our growing community on the ExecuTorch Discord server.

Join us in the pursuit of what’s possible with AI.