Software Mansion · Jakub Mroz · Nov 5, 2025

Building an AI-Powered Note-Taking App in React Native

Article Summary

Jakub Mroz from Software Mansion just built a note-taking app that understands what you mean, not just what you type. No cloud APIs, no recurring costs, and it works completely offline.

This is Part 1 of a series on building AI-powered mobile apps with on-device models. Mroz walks through adding semantic search to a React Native note app using ExecuTorch and the All-MiniLM-L6-v2 embedding model (just 80 MB). The entire AI stack runs locally on the user's device.

Key Takeaways

Critical Insight

You can now build privacy-first, offline-capable AI features in React Native without backend infrastructure or API costs.

Part 2 promises multimodal search where you can find images using text queries or even other images as input.

About This Article

Problem

Keyword search doesn't understand meaning: a search for 'meeting' won't find a note titled 'team sync'. That makes it hard to rediscover information in unstructured notes.

Solution

Software Mansion added the All-MiniLM-L6-v2 embedding model (80 MB) and integrated it with React Native ExecuTorch and OP-SQLite to generate 384-dimensional semantic vectors. These vectors capture what text means rather than just matching exact words.
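The idea behind semantic search can be sketched in plain TypeScript: once each note has an embedding vector, a query is embedded the same way and notes are ranked by cosine similarity. The `Note` shape and `searchByMeaning` helper below are illustrative assumptions, not the article's actual code; in the real app the vectors would come from the All-MiniLM-L6-v2 model via React Native ExecuTorch and be stored in OP-SQLite.

```typescript
// Cosine similarity between two embedding vectors: 1 means identical
// direction (very similar meaning), 0 means unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Hypothetical note record; in the article's app the embedding would be a
// 384-dimensional vector persisted in an OP-SQLite table.
interface Note {
  id: number;
  title: string;
  embedding: number[];
}

// Rank stored notes against a query embedding, best match first. This is why
// a query for 'meeting' can surface a note titled 'team sync': their vectors
// point in similar directions even though no keywords overlap.
function searchByMeaning(queryEmbedding: number[], notes: Note[], topK = 5): Note[] {
  return [...notes]
    .sort((x, y) =>
      cosineSimilarity(queryEmbedding, y.embedding) -
      cosineSimilarity(queryEmbedding, x.embedding))
    .slice(0, topK);
}
```

A production app would precompute and store the note embeddings, so each search only embeds the query once and scores it against the cached vectors.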

Impact

The app now searches by meaning entirely offline, with no API costs. Embeddings sync automatically whenever notes are created, updated, or deleted, and no backend infrastructure is needed.