A web application that uses an Apple CoreML model to provide AI chat functionality.
This application consists of:
- Backend: A FastAPI server that loads and uses a CoreML model for processing chat messages
- Frontend: A React application that provides a user-friendly chat interface
- Remote Model Server (New): A separate server that hosts the CoreML model and provides prediction endpoints
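For orientation, here is a minimal sketch of the kind of FastAPI endpoint the backend exposes. The route name, request shape, and response fields are illustrative assumptions, not the actual backend code; check the API docs on the running server for the real interface.

```python
# Illustrative only: the real routes and model-loading logic live in backend/.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    message: str

@app.post("/chat")  # hypothetical route; see /docs on the running server for the real API
def chat(req: ChatRequest) -> dict:
    # In the real backend, the loaded CoreML model (BERTSQUADFP16.mlmodel)
    # produces the reply here; this stub just echoes the input.
    return {"reply": f"echo: {req.message}"}
```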
Prerequisites:
- Python 3.8+
- Node.js 14+
- Docker and Docker Compose (optional, for containerized deployment)
Clone the repository:

```bash
git clone <repository-url>
cd <repository-directory>
```

The application requires a CoreML model file (BERTSQUADFP16.mlmodel) to function. You have two options:
Option 1: Download the model from Dropbox. You need to provide a Dropbox direct download link to your model file:
```bash
cd backend
python setup_model.py --dropbox-link "https://www.dropbox.com/your-direct-download-link?dl=1"
```

This script will:
- Update the Dropbox link in the configuration
- Download the model
- Verify that the model works correctly
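As a rough picture of what this step involves (not the actual setup_model.py code), downloading and verifying the model could look like the sketch below. The model file name comes from this repository; the link is a placeholder and the rest is an assumption.

```python
# Rough, illustrative equivalent of the setup step -- not the actual setup_model.py.
import requests
import coremltools as ct

dropbox_link = "https://www.dropbox.com/your-direct-download-link?dl=1"  # placeholder link
model_path = "BERTSQUADFP16.mlmodel"

# Download the model file from the Dropbox direct link.
resp = requests.get(dropbox_link, timeout=300)
resp.raise_for_status()
with open(model_path, "wb") as f:
    f.write(resp.content)

# Loading the file with coremltools confirms it is a valid CoreML model.
model = ct.models.MLModel(model_path)
print(model.get_spec().description)
```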
Option 2: Use the remote model server. This approach offloads model inference to a separate server, avoiding the need to download the model to the main application server.
- Deploy the remote model server (see model_server/DEPLOYMENT.md for instructions)
- Configure the main application to use the remote model server:
```bash
export USE_REMOTE_MODEL_SERVER=true
export REMOTE_MODEL_SERVER_URL="https://your-model-server-url.com"
export REMOTE_MODEL_SERVER_API_KEY="your-api-key"
```
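A hedged sketch of how the backend can pick up these variables and forward a prediction request follows; the /predict path, payload shape, and bearer-token auth are assumptions rather than the remote server's documented API.

```python
# Sketch of reading the remote-server settings and forwarding a request.
import os
import requests

USE_REMOTE = os.getenv("USE_REMOTE_MODEL_SERVER", "false").lower() == "true"
REMOTE_URL = os.getenv("REMOTE_MODEL_SERVER_URL", "")
API_KEY = os.getenv("REMOTE_MODEL_SERVER_API_KEY", "")

def remote_predict(payload: dict) -> dict:
    """Send a prediction request to the remote model server."""
    resp = requests.post(
        f"{REMOTE_URL}/predict",  # hypothetical endpoint; see model_server/DEPLOYMENT.md
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},  # auth scheme is an assumption
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```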
For more details on the remote model server approach, see REMOTE_MODEL_SERVER.md.
To run the backend:

```bash
cd backend
pip install -r requirements.txt
python run.py
```

The backend server will start at http://localhost:8000.
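Once it is running, you can confirm the server responds by hitting the health endpoint (the root URL, as also noted in the troubleshooting section below):

```python
# Simple liveness check against the running backend.
import requests

resp = requests.get("http://localhost:8000/", timeout=5)
print(resp.status_code, resp.text)
```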
To run the frontend:

```bash
cd frontend
npm install
npm start
```

The frontend application will start at http://localhost:3000.
You can also deploy the application using Docker Compose:
```bash
docker-compose up -d
```

This will start both the backend and frontend services in containers.
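The docker-compose.yml shipped with the repository defines these services; for orientation only, its shape is roughly as follows. The service names, build contexts, and ports here are assumptions, so prefer the compose file in the repository.

```yaml
# Illustrative shape only -- use the docker-compose.yml that ships with the repository.
services:
  backend:
    build: ./backend   # assumed build context
    ports:
      - "8000:8000"    # backend on http://localhost:8000
  frontend:
    build: ./frontend  # assumed build context
    ports:
      - "3000:3000"    # frontend on http://localhost:3000
    depends_on:
      - backend
```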
If the model is not loading correctly:
- Verify that the model file exists:

  ```bash
  cd backend
  python verify_model.py
  ```

- Check the backend logs for any errors:

  ```bash
  cd backend
  python run.py
  ```

- Make sure the Dropbox link is a direct download link (ends with ?dl=1)
If the chat functionality is not working:
- Check the browser console for any errors
- Verify that the backend is running and accessible
- Check the backend health endpoint: http://localhost:8000/
- Try restarting both the backend and frontend services
The backend API documentation is available at http://localhost:8000/docs when the server is running.
[Your License Information]