Building an AI-Powered Note-Taking App in React Native
Article Summary
Jakub Mroz from Software Mansion just built a note-taking app that understands what you mean, not just what you type. No cloud APIs, no recurring costs, and it works completely offline.
This is Part 1 of a series on building AI-powered mobile apps with on-device models. Mroz walks through adding semantic search to a React Native note app using ExecuTorch and the All-MiniLM-L6-v2 embedding model (just 80 MB). The entire AI stack runs locally on the user's device.
Key Takeaways
- Semantic search finds 'team sync' when you search for 'meeting'
- All-MiniLM-L6-v2 model generates 384-dimensional vectors for meaning-based search
- Uses OP-SQLite as vector store with React Native ExecuTorch
- Notes split into 500-character chunks with 100-character overlap
- Vector store syncs automatically on create, update, and delete
You can now build privacy-first, offline-capable AI features in React Native without backend infrastructure or API costs.
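The chunking strategy above (500-character chunks with 100-character overlap) can be sketched in plain TypeScript. This is an illustrative implementation, not the article's actual code; the function name and defaults are assumptions:

```typescript
// Split note text into overlapping chunks for embedding.
// chunkSize and overlap default to the values the article describes.
function chunkText(text: string, chunkSize = 500, overlap = 100): string[] {
  const chunks: string[] = [];
  const step = chunkSize - overlap; // each chunk starts 400 chars after the last
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // final chunk reached the end
  }
  return chunks;
}
```

The overlap means a sentence that straddles a chunk boundary still appears whole in at least one chunk, so its meaning survives embedding.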
About This Article
Keyword search doesn't understand meaning. If you search for 'meeting', you won't find notes called 'team sync', which makes it hard to rediscover information in unstructured note data.
Software Mansion added the 80 MB All-MiniLM-L6-v2 embedding model, integrating it with React Native ExecuTorch and OP-SQLite to generate 384-dimensional semantic vectors. These vectors capture what text means rather than matching exact words.
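Searching by meaning boils down to comparing those vectors. A common measure is cosine similarity; the sketch below (a generic illustration, not the app's OP-SQLite query) ranks stored chunks against a query embedding:

```typescript
// Cosine similarity between two embedding vectors,
// e.g. the 384-dimensional output of All-MiniLM-L6-v2.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank note chunks by similarity to a query embedding (highest first).
function rankChunks(
  query: number[],
  chunks: { text: string; embedding: number[] }[],
): { text: string; score: number }[] {
  return chunks
    .map((c) => ({ text: c.text, score: cosineSimilarity(query, c.embedding) }))
    .sort((x, y) => y.score - x.score);
}
```

This is why 'meeting' can surface 'team sync': their embeddings point in similar directions even though the strings share no words.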
The app can now search by meaning entirely offline, with no API costs. It automatically syncs embeddings whenever notes are created, updated, or deleted. No backend infrastructure needed.
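The sync-on-CRUD behavior can be sketched as a small wrapper around the vector store. Here the store is an in-memory Map and `embed` is a stand-in for the on-device model; the article uses OP-SQLite and ExecuTorch, and these names are hypothetical:

```typescript
// Keep embeddings in lockstep with note create/update/delete.
type Embedder = (text: string) => number[];

class NoteVectorStore {
  private vectors = new Map<string, number[]>();

  constructor(private embed: Embedder) {}

  // Called on both note creation and update: re-embed the latest text.
  upsert(noteId: string, text: string): void {
    this.vectors.set(noteId, this.embed(text));
  }

  // Called on note deletion so stale embeddings never surface in search.
  remove(noteId: string): void {
    this.vectors.delete(noteId);
  }

  size(): number {
    return this.vectors.size;
  }
}
```

Hooking `upsert` and `remove` into the note CRUD path is what guarantees search results never reference deleted or outdated notes.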