A cross-platform AI-powered tutoring assistant built with Electron, React, and Firebase, featuring secure authentication and local AI processing for enhanced privacy and performance.
- Firebase Authentication integration
- Google OAuth support
- Secure user session management
- Protected chat environments
- Local Ollama LLM integration, cutting response latency by roughly 95% compared with calling a hosted API
- Real-time AI responses
- Context-aware conversations
- Educational content optimization
- 100% chat history recovery with Firestore
- Persistent conversation storage
- Cross-device synchronization
- Search through past conversations
- Built with Electron for Windows, macOS, and Linux
- Native desktop experience
- System tray integration
- Offline functionality
| Technology | Purpose | Version |
|---|---|---|
| Electron | Desktop app framework | Latest |
| React | Frontend UI library | 18+ |
| Firebase Auth | User authentication | 9+ |
| Firestore | Database & chat storage | 9+ |
| Ollama | Local LLM processing | Latest |
| Node.js | Backend runtime | 16+ |
- Node.js (v16 or higher)
- npm or yarn
- Ollama installed locally
- Clone the repository

  ```bash
  git clone https://github.com/AtharvMixraw/ai-tutor.git
  cd ai-tutor
  ```

- Install dependencies

  ```bash
  npm install
  # or
  yarn install
  ```

- Configure Firebase

  - Create a Firebase project
  - Enable Authentication and Firestore
  - Add your Firebase config to `.env` (a minimal initialization sketch that reads these values is shown after the installation steps):

    ```
    REACT_APP_FIREBASE_API_KEY=your_api_key
    REACT_APP_FIREBASE_AUTH_DOMAIN=your_auth_domain
    REACT_APP_FIREBASE_PROJECT_ID=your_project_id
    REACT_APP_FIREBASE_STORAGE_BUCKET=your_storage_bucket
    REACT_APP_FIREBASE_MESSAGING_SENDER_ID=your_sender_id
    REACT_APP_FIREBASE_APP_ID=your_app_id
    ```

- Install and set up Ollama

  ```bash
  # Install Ollama (platform specific)
  # Pull a model (e.g., llama2)
  ollama pull llama2
  ```

- Start the application

  ```bash
  npm run electron-dev
  # or
  yarn electron-dev
  ```
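For reference, a minimal sketch of how the `.env` values above could be wired into the app with the Firebase v9 modular SDK is shown below; the file path `src/services/firebase.js` and the exported names are assumptions for illustration, not necessarily what this project uses.

```js
// src/services/firebase.js (illustrative sketch, assumed file path)
import { initializeApp } from "firebase/app";
import { getAuth, GoogleAuthProvider } from "firebase/auth";
import { getFirestore } from "firebase/firestore";

// Create React App exposes REACT_APP_* variables from .env at build time
const firebaseConfig = {
  apiKey: process.env.REACT_APP_FIREBASE_API_KEY,
  authDomain: process.env.REACT_APP_FIREBASE_AUTH_DOMAIN,
  projectId: process.env.REACT_APP_FIREBASE_PROJECT_ID,
  storageBucket: process.env.REACT_APP_FIREBASE_STORAGE_BUCKET,
  messagingSenderId: process.env.REACT_APP_FIREBASE_MESSAGING_SENDER_ID,
  appId: process.env.REACT_APP_FIREBASE_APP_ID,
};

const app = initializeApp(firebaseConfig);

// Shared handles used by the auth and chat services
export const auth = getAuth(app);
export const googleProvider = new GoogleAuthProvider();
export const db = getFirestore(app);
```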
Other available npm scripts:

- `npm start` - Start the React development server
- `npm run electron` - Start the Electron app
- `npm run electron-dev` - Start in development mode
- `npm run build` - Build for production
- `npm run electron-pack` - Package for distribution
Project structure:

```
ai-tutor/
├── public/
│   ├── electron.js        # Electron main process
│   └── index.html
├── src/
│   ├── components/        # React components
│   │   ├── Auth/          # Authentication components
│   │   ├── Chat/          # Chat interface
│   │   └── Common/        # Shared components
│   ├── services/          # Firebase & API services
│   ├── hooks/             # Custom React hooks
│   ├── utils/             # Utility functions
│   └── App.js             # Main App component
├── package.json
└── README.md
```
- Install Ollama on your system

- Pull desired models:

  ```bash
  ollama pull llama2     # For general tutoring
  ollama pull codellama  # For coding assistance
  ollama pull mistral    # Alternative model
  ```

- Configure model selection in `src/services/ollamaService.js` (a minimal sketch of such a module is shown below)
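For reference, a call to Ollama's local HTTP API can look roughly like the sketch below; the `askTutor` helper is hypothetical and not necessarily how `src/services/ollamaService.js` is structured.

```js
// Illustrative sketch of a local Ollama call (askTutor is a hypothetical helper).
// Assumes Ollama is running on its default port, 11434.
const OLLAMA_URL = "http://localhost:11434/api/generate";
const DEFAULT_MODEL = "llama2";

export async function askTutor(prompt, model = DEFAULT_MODEL) {
  const res = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }
  const data = await res.json();
  return data.response; // generated answer text
}
```

With `stream: false` the whole answer arrives in a single JSON payload; streaming mode can be used instead for typewriter-style output.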
- Create a new Firebase project
- Enable Authentication (Email/Password and Google)
- Create a Firestore database
- Add security rules for chat data protection
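To illustrate the kind of chat persistence this setup enables, here is a hedged sketch of saving and loading messages with the v9 modular SDK; the `chats` collection name, field names, and helper functions are assumptions rather than this project's actual schema.

```js
// Illustrative chat persistence sketch (schema and names are assumptions).
import {
  collection,
  addDoc,
  query,
  where,
  orderBy,
  getDocs,
  serverTimestamp,
} from "firebase/firestore";
import { db } from "./firebase";

// Append one message to the signed-in user's history
export async function saveMessage(uid, role, text) {
  await addDoc(collection(db, "chats"), {
    uid,
    role, // "user" or "assistant"
    text,
    createdAt: serverTimestamp(),
  });
}

// Load the user's full history in chronological order
export async function loadHistory(uid) {
  const q = query(
    collection(db, "chats"),
    where("uid", "==", uid),
    orderBy("createdAt", "asc")
  );
  const snapshot = await getDocs(q);
  return snapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() }));
}
```

Note that combining a `where` filter with `orderBy` on a different field requires a composite index, which Firestore offers to create the first time the query runs.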
- Local LLM Processing: Using Ollama eliminates API call overhead
- Efficient Caching: Responses to common queries are cached to avoid repeated model calls (see the sketch after this list)
- Optimized Electron: Minimized main-renderer process communication
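As a minimal sketch of what such response caching could look like (the `cachedAsk` helper below is hypothetical, not code from this repository):

```js
// Hypothetical in-memory cache for repeated prompts; a real implementation
// might persist entries or cap the cache size.
const responseCache = new Map();

export async function cachedAsk(prompt, ask) {
  const key = prompt.trim().toLowerCase(); // normalize the prompt for lookup
  if (responseCache.has(key)) {
    return responseCache.get(key); // common queries skip the model entirely
  }
  const answer = await ask(prompt); // e.g. the Ollama call from the service module
  responseCache.set(key, answer);
  return answer;
}
```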
- 100% Chat Recovery: All conversations stored in Firestore
- Offline Support: Local storage fallback for temporary outages
- Real-time Sync: Immediate cloud backup of conversations
- End-to-End Authentication: Secure Firebase Auth integration
- Local AI Processing: No data sent to external AI services
- Encrypted Storage: Firestore security rules and encryption
- Session Management: Automatic token refresh and validation
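As an illustration of the session handling described above, the app can subscribe to Firebase's token lifecycle roughly as follows; the `watchSession` helper is an assumption, not the project's actual code.

```js
// Illustrative session watcher (watchSession is a hypothetical helper).
// Firebase refreshes ID tokens automatically; this listener keeps the UI in sync.
import { onIdTokenChanged } from "firebase/auth";
import { auth } from "./firebase";

export function watchSession(onSignedIn, onSignedOut) {
  return onIdTokenChanged(auth, async (user) => {
    if (!user) {
      onSignedOut();
      return;
    }
    const token = await user.getIdToken(); // refreshed automatically near expiry
    onSignedIn(user, token);
  });
}
```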
- Student Tutoring: Personalized learning assistance
- Code Review: Programming help and debugging
- Research Support: Academic question answering
- Language Learning: Conversation practice and corrections
- Homework Help: Step-by-step problem solving
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
Atharv Mishra
- GitHub: @AtharvMixraw
- LinkedIn: Atharv Mishra
- Email: antilogatharv@gmail.com
Star this repository if you found it helpful!