An on-device AI wellness coach built with Expo and @react-native-ai/mlc, running the Llama-3.2-3B-Instruct model entirely on your device.
- 🤖 Real On-Device AI: Uses MLC (Machine Learning Compilation) to run Llama-3.2-3B-Instruct locally
- 🔒 Privacy-First: Your data never leaves the device
- ⚡ Fast Responses: Local inference means quick replies with no network round-trips
- 📱 Native Mobile: Works on iOS and Android (requires Expo Dev Build)
- 💬 Wellness Coach: Get personalized advice for wellness, productivity, and mental health
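To make the architecture concrete, here is a minimal sketch of the surface the rest of the app could program against. It is illustrative only: the names, types, and system prompt below are assumptions, and the real `ai/ai.ts` wrapper around `@react-native-ai/mlc` may look different.

```typescript
// Illustrative sketch — not the actual ai/ai.ts. Nothing here calls
// @react-native-ai/mlc directly; it only describes an assumed engine surface.
export type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Assumed engine interface: prepare() downloads/loads the model on first run,
// ask() runs local inference over a message history and returns the reply text.
export interface OnDeviceCoach {
  prepare(onProgress?: (fractionDone: number) => void): Promise<void>;
  ask(messages: ChatMessage[]): Promise<string>;
}

// Example system prompt a wellness-coach engine could be seeded with.
export const COACH_SYSTEM_PROMPT =
  'You are a supportive wellness coach. Give practical, concise advice on ' +
  'wellness, productivity, and mental health.';
```

Everything behind an interface like this runs locally, which is what keeps prompts and replies on the device.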
- Node.js and npm installed
- Expo CLI installed (`npm install -g expo-cli`)
- For iOS: Xcode and CocoaPods
- For Android: Android Studio and JDK
- Install dependencies: `npm install`
- Prebuild native directories (already done, but you can regenerate): `npx expo prebuild --clean`
- For iOS (fix CocoaPods encoding if needed): `export LANG=en_US.UTF-8`, then `cd ios && pod install && cd ..`
- Start the app: `npx expo start --dev-client`
- Run on device:
  - Android: `npx expo run:android`
  - iOS: `npx expo run:ios`
On first launch, the model will be downloaded and initialized. This may take:
- Download: 10-30 minutes depending on connection (~3GB)
- Preparation: 1-2 minutes
- Subsequent runs: Much faster (model is cached)
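Because that first launch can take a while, it helps to surface progress in the UI. Below is a small hook sketch for doing so; it assumes the engine exposes a prepare step with a progress callback (an assumption, not a documented API).

```typescript
import { useEffect, useState } from 'react';

// Hypothetical prepare step: downloads the ~3GB model on first run, then loads it.
type Prepare = (onProgress: (fractionDone: number) => void) => Promise<void>;

export function useModelReady(prepare: Prepare) {
  const [progress, setProgress] = useState(0); // 0..1 download/initialization progress
  const [ready, setReady] = useState(false);
  const [error, setError] = useState<Error | null>(null);

  useEffect(() => {
    let cancelled = false;
    prepare((p) => {
      if (!cancelled) setProgress(p);
    })
      .then(() => {
        if (!cancelled) setReady(true);
      })
      .catch((e) => {
        if (!cancelled) setError(e as Error);
      });
    return () => {
      cancelled = true;
    };
  }, [prepare]);

  return { progress, ready, error };
}
```

A screen can render `progress` as a percentage and only enable the chat input once `ready` is true.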
- Model: Llama-3.2-3B-Instruct
- Size: ~3GB (downloaded on first use)
- Provider: MLC (Machine Learning Compilation)
- Inference: On-device, no API calls
- React Native New Architecture: Enabled (already configured)
- Increased Memory Limit: Configured via MLC plugin
- Storage: ~3GB free space needed
- Internet: Required for initial model download only
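For reference, the configuration points above might look roughly like the following `app.config.ts` sketch. The plugin entry and its placement are assumptions based on the notes above, not a copy of the project's actual config.

```typescript
import type { ExpoConfig } from 'expo/config';

// Sketch of the configuration points mentioned above; not the project's actual config.
const config: ExpoConfig = {
  name: 'thrive',
  slug: 'thrive',
  newArchEnabled: true, // React Native New Architecture
  plugins: [
    // Assumed: the MLC config plugin that raises the native memory limit.
    '@react-native-ai/mlc',
  ],
};

export default config;
```

Changes here only take effect in the native projects after re-running `npx expo prebuild`.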
thrive/
├── ai/
│   └── ai.ts              # Real MLC on-device AI engine
├── components/
│   ├── CoachInput.tsx     # Input component
│   └── CoachOutput.tsx    # Output component
├── app/
│   └── (tabs)/
│       └── index.tsx      # Main AI coach screen
├── ios/                   # Native iOS (generated by prebuild)
└── android/               # Native Android (generated by prebuild)
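To show how these pieces fit together, here is a hypothetical version of the main coach screen. The component props, the `@/` path alias, and the `askCoach` export are assumptions based on the file names above, not the project's actual code.

```tsx
// Hypothetical wiring of app/(tabs)/index.tsx — props and exports are assumed.
import { useState } from 'react';
import { View } from 'react-native';
import CoachInput from '@/components/CoachInput';
import CoachOutput from '@/components/CoachOutput';
import { askCoach } from '@/ai/ai'; // assumed export: (prompt: string) => Promise<string>

export default function CoachScreen() {
  const [reply, setReply] = useState('');
  const [busy, setBusy] = useState(false);

  const handleSend = async (prompt: string) => {
    setBusy(true);
    try {
      // On-device inference via the MLC engine; no API call leaves the phone.
      setReply(await askCoach(prompt));
    } finally {
      setBusy(false);
    }
  };

  return (
    <View style={{ flex: 1 }}>
      <CoachOutput text={reply} loading={busy} />
      <CoachInput onSend={handleSend} disabled={busy} />
    </View>
  );
}
```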
Since this app uses native modules, you need an Expo Dev Client (not Expo Go):
# Build dev client
npx expo run:android
# or
npx expo run:ios
# Then start development server
npx expo start --dev-client

Model won't download:
- Check internet connection
- Ensure ~3GB free storage
- Check device permissions
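If you want the app to fail fast instead of starting a download that cannot finish, a pre-flight storage check is one option. This sketch uses the classic `expo-file-system` API (`getFreeDiskStorageAsync`); the package is not necessarily already in this project, and on newer SDKs the same function lives under `expo-file-system/legacy`.

```typescript
// Optional pre-flight storage check before triggering the model download.
import * as FileSystem from 'expo-file-system';

const MODEL_BYTES = 3 * 1024 ** 3; // ~3GB, matching the model size noted above

export async function hasRoomForModel(): Promise<boolean> {
  const freeBytes = await FileSystem.getFreeDiskStorageAsync(); // free space in bytes
  return freeBytes > MODEL_BYTES;
}
```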
Build errors:
- Run `npx expo prebuild --clean` again
- For iOS: fix CocoaPods encoding with `export LANG=en_US.UTF-8`
- Clear cache: `npx expo start -c`
Memory issues:
- The MLC plugin already configures an increased memory limit
- Close other apps if needed