Software Mansion · Norbert Klockiewicz · Nov 14, 2024

Introducing React Native ExecuTorch

Article Summary

Norbert Klockiewicz from Software Mansion just dropped a library that lets you run LLaMA 3.2 models directly on mobile devices. No API calls, no backend costs, no data leaving the device.

React Native ExecuTorch bridges PyTorch's edge AI framework with React Native, making on-device AI accessible to mobile developers without deep ML expertise. Born from building a photo eraser app, the library came about when Software Mansion abstracted its native AI code into a simple JavaScript API that prioritizes privacy and slashes infrastructure costs.

Key Takeaways

Critical Insight

React Native developers can now run large language models locally on mobile with a simple JavaScript API, eliminating backend costs and keeping user data private.

The team notes that the same tech powers their photo eraser app, and that support for bigger models is already in the pipeline.

About This Article

Problem

React Native developers struggled to integrate PyTorch's ExecuTorch framework into mobile apps. They needed deep native code expertise just to export models and run them on edge devices.

Solution

Software Mansion built React Native ExecuTorch around a useLLM hook that hides the native complexity. Developers can load a model and its tokenizer from URLs in three lines of JavaScript.
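A minimal sketch of what that setup might look like inside a React Native component. The hook name `useLLM` and the idea of loading from URLs come from the article; the option names (`modelSource`, `tokenizerSource`), the placeholder URLs, and the `response`/`isReady` fields are illustrative assumptions, not confirmed API.

```typescript
import React from 'react';
import { Text } from 'react-native';
// useLLM is named in the article; the exact options below are assumed
import { useLLM } from 'react-native-executorch';

export function LlamaChat() {
  // Load a pre-exported model and its tokenizer from remote URLs
  // (hypothetical placeholder URLs — the real pre-exported models
  // live on Software Mansion's Hugging Face page)
  const llama = useLLM({
    modelSource: 'https://example.com/llama-3.2-1b.pte',   // assumed option name
    tokenizerSource: 'https://example.com/tokenizer.bin',  // assumed option name
  });

  // Assumed return shape: render the latest generated text once ready
  return <Text>{llama.isReady ? llama.response : 'Loading model…'}</Text>;
}
```

The appeal of a hook-shaped API is that model download, native initialization, and inference state all behave like ordinary React state, so no native code or ML tooling is needed on the JavaScript side.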

Impact

Teams can now run LLaMA 3.2 models directly on devices instead of relying on backend infrastructure or APIs. Pre-exported models on Hugging Face keep library and model versions in sync.