The Intelligent Interface for Indian Ocean ARGO Float Data.
FloatChat-AI explores the depths of the Indian Ocean using real-time data from the ARGO Float Network. It combines a modern interactive dashboard with an AI-powered streaming RAG chatbot to make oceanographic data accessible, understandable, and actionable.
- Key Features
- Technology Stack
- Quick Start
- Dev Container / Codespaces
- AI Configuration
- Project Structure
- API Reference
- Environment Variables
- Scientific Verification
- Contributing
- License
- Acknowledgments
- Real-time Streaming – Tokens arrive word-by-word (like ChatGPT) via Ollama streaming, so you never wait for the full response.
- Context-Aware RAG – Uses FAISS vector embeddings + SentenceTransformers to retrieve relevant float data for every query.
- Local & Cloud AI – Supports Ollama (Llama 3.2, FREE) for local inference or OpenAI GPT-4o-mini for cloud-based reasoning.
- Auto Pre-warming – The Ollama model is pre-loaded into RAM at server startup for instant first responses.
- Chart Intent Detection – Automatically detects when a visualization is needed and renders the right chart.
- Scientifically Verified Data – Only real, confirmed ARGO floats from the Indian Ocean (Arabian Sea, Bay of Bengal, Equatorial, Southern IO).
- Live Map – Interactive Leaflet map showing real-time float positions and drift trajectories.
- Advanced Visualization:
  - Temperature & salinity depth profiles
  - T-S diagrams for water mass analysis
  - Depth-time sections
  - Regional surface statistics (0–10 m)
- Data Pipeline – Automated fetching and validation of float profiles from the Argovis 2.0 API.
- Vector Search – Automatic FAISS embedding generation for new float data.
- SQLite Database – Efficient local storage for metadata and profiles.
- Fallback Mode – Structured data responses even when no AI is configured.
| Layer | Technology | Purpose |
|---|---|---|
| Frontend | React 19, Recharts, React-Leaflet | UI, charts, interactive maps |
| Backend | Flask 3.1, SQLAlchemy, Gunicorn | REST API, ORM, production server |
| AI / RAG | FAISS, SentenceTransformers (all-MiniLM-L6-v2) | Vector search & embeddings |
| LLM | Ollama (Llama 3.2) / OpenAI (GPT-4o-mini) | Conversational AI |
| Data Source | Argovis 2.0 API | Scientific ARGO float observations |
| Database | SQLite | Local persistent storage |
| DevOps | Docker, Dev Containers, GitHub Codespaces | Reproducible environments |
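At its core, the AI/RAG layer ranks stored float descriptions by vector similarity to the query embedding. A dependency-free conceptual sketch (toy 3-d vectors stand in for the 384-d MiniLM embeddings; the real code uses FAISS for the nearest-neighbour search):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, docs, k=2):
    """Return the k document texts most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

# Toy "embeddings" standing in for real SentenceTransformers output
docs = [
    {"text": "Float A: warm Arabian Sea surface water", "vec": [0.9, 0.1, 0.0]},
    {"text": "Float B: cold Southern Ocean profile",    "vec": [0.0, 0.2, 0.9]},
]
print(retrieve([1.0, 0.0, 0.0], docs, k=1))
```

The retrieved texts are injected into the LLM prompt as context, which is what makes answers grounded in actual float data.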
| Tool | Version | Required |
|---|---|---|
| Python | 3.10+ | Yes |
| Node.js | 18+ | Yes |
| Ollama | latest | Recommended (for free AI) |
| Git | any | Yes |
```bash
git clone https://github.com/Rohitgautam02/Floatchat-AI.git
cd Floatchat-AI
chmod +x setup.sh
./setup.sh
```

The interactive script will:
- Install all Python & Node.js dependencies
- Let you choose AI mode (Ollama / OpenAI / Basic)
- Create the SQLite database
- Fetch sample ARGO data from the Indian Ocean
- Show you how to start the app
```bash
# 1. Clone the repository
git clone https://github.com/Rohitgautam02/Floatchat-AI.git
cd Floatchat-AI

# 2. Backend setup
cd backend
pip install -r requirements.txt
cp .env.example .env   # then edit .env with your AI settings

# 3. Initialize database and fetch ocean data
python fetch_argovis.py

# 4. Start the backend (Terminal 1)
python app.py          # runs on http://localhost:5000

# 5. Frontend setup (Terminal 2)
cd ../frontend
npm install
npm start              # opens http://localhost:3000
```

Alternatively, run everything with Docker Compose:

```bash
git clone https://github.com/Rohitgautam02/Floatchat-AI.git
cd Floatchat-AI
cp backend/.env.example backend/.env   # edit .env with your settings
docker compose up --build
```

Open http://localhost:3000 – the frontend proxies API calls to the backend automatically.
On Windows, the setup.sh script requires Git Bash or WSL. Alternatively, follow the manual setup:
```powershell
# PowerShell
git clone https://github.com/Rohitgautam02/Floatchat-AI.git
cd Floatchat-AI

# Backend
cd backend
pip install -r requirements.txt
Copy-Item .env.example .env   # edit .env with Notepad
python fetch_argovis.py
python app.py

# Frontend (new PowerShell window)
cd frontend
npm install
npm start
```

This project includes a full Dev Container configuration for one-click development.
- Go to github.com/Rohitgautam02/Floatchat-AI
- Click Code → Codespaces → Create codespace on main
- Wait for the container to build (~2–3 minutes)
- The post-create script auto-installs all dependencies
- Open two terminals:

  ```bash
  # Terminal 1 – Backend
  cd backend && python app.py

  # Terminal 2 – Frontend
  cd frontend && npm start
  ```

- Codespaces will auto-forward ports 5000 & 3000
Note: Ollama cannot run inside Codespaces. Use `AI_MODE=openai` with an API key in `backend/.env`, or use Basic Mode.
- Install Docker Desktop and the Dev Containers extension
- Open the repo folder in VS Code
- Press
Ctrl+Shift+Pβ Dev Containers: Reopen in Container - All dependencies install automatically via
postCreateCommand - For Ollama support, install Ollama on your host machine (not in the container) and set:
OLLAMA_URL=http://host.docker.internal:11434
FloatChat-AI gives you three options – choose what fits your needs:
| Option | Cost | Speed | Setup |
|---|---|---|---|
| Ollama | FREE forever | ~5 tok/s (CPU) | 5 min local install |
| OpenAI | ~$0.001 / chat | Very fast | 2 min (API key) |
| Basic | Free | Instant | None |
```bash
# Install Ollama (one-time)
# Linux / macOS:
curl -fsSL https://ollama.ai/install.sh | sh
# Windows: download from https://ollama.com/download

# Pull the model (~2 GB download)
ollama pull llama3.2

# Start Ollama (keep it running in the background)
ollama serve
```

In `backend/.env`:

```
AI_MODE=ollama
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=llama3.2
```

Performance tip: The backend pre-warms Ollama at startup (loads the model into RAM). The first chat after a cold start may take a few extra seconds, but all subsequent responses stream in real time.
Get an API key from platform.openai.com/api-keys.
In `backend/.env`:

```
AI_MODE=openai
OPENAI_API_KEY=sk-your-actual-key-here
OPENAI_MODEL=gpt-4o-mini
```

For Basic Mode, no configuration is needed: you get structured data tables, charts, and statistics without AI conversations. Upgrade anytime by editing `.env`.
```
Floatchat-AI/
├── .devcontainer/
│   ├── devcontainer.json      # Dev Container / Codespaces config
│   └── setup.sh               # Post-create setup for Codespaces
├── .vscode/
│   └── settings.json          # VS Code workspace settings
├── backend/
│   ├── app.py                 # Flask application & API routes
│   ├── fetch_argovis.py       # ARGO data fetcher (Argovis 2.0 API)
│   ├── requirements.txt       # Python dependencies
│   ├── .env.example           # Environment variable template
│   ├── services/
│   │   ├── ai_service.py      # LLM integration (Ollama streaming / OpenAI)
│   │   ├── rag_service.py     # FAISS vector search & RAG retrieval
│   │   └── data_service.py    # Database queries, stats & chart data
│   ├── db/
│   │   ├── models.py          # SQLAlchemy models (ArgoRecord, FloatMetadata)
│   │   └── session.py         # Database session factory
│   └── data/
│       ├── argo_indian_ocean.csv
│       └── faiss_index/       # FAISS vector index + documents
├── frontend/
│   ├── package.json
│   ├── public/
│   │   └── index.html
│   └── src/
│       ├── App.jsx            # Main app with routing
│       ├── App.css            # Ocean-themed dark mode styles
│       ├── index.jsx          # React entry point
│       └── components/
│           ├── ChatPanel.jsx  # AI chat with real-time streaming
│           ├── Dashboard.jsx  # Data dashboard with stats cards
│           ├── Charts.jsx     # Recharts visualizations
│           ├── MapView.jsx    # Leaflet interactive map
│           └── Navbar.jsx     # Navigation bar
├── screenshots/               # App screenshots
├── setup.sh                   # Interactive one-click setup
├── docker-compose.yml         # Docker Compose multi-service
├── Dockerfile                 # Multi-stage Docker build
├── CONTRIBUTING.md            # Contribution guidelines
├── LICENSE                    # MIT License
└── README.md                  # This file
```
All endpoints are served from http://localhost:5000.
| Method | Endpoint | Description |
|---|---|---|
| `POST` | `/api/chat` | Send message, get full JSON response |
| `POST` | `/api/chat/stream` | Send message, get streaming NDJSON tokens |
Request body:

```json
{ "message": "What is the average ocean temperature?" }
```

`/api/chat` response:

```json
{
  "reply": "Based on 885,570 measurements...",
  "chart_type": "temperature_profile"
}
```

`/api/chat/stream` response (newline-delimited JSON):

```
{"token": "Based ", "done": false, "chart_type": "temperature_profile"}
{"token": "on ", "done": false, "chart_type": "temperature_profile"}
...
{"token": "", "done": true, "chart_type": "temperature_profile"}
```
| Method | Endpoint | Description |
|---|---|---|
| `GET` | `/` | Health check & setup status |
| `GET` | `/api/data/stats` | Dataset statistics overview |
| `GET` | `/api/data/floats` | All float metadata + positions |
| `GET` | `/api/data/floats/<wmo_id>` | Single float details |
| `GET` | `/api/data/floats/<wmo_id>/trajectory` | Float trajectory data |
| `GET` | `/api/data/charts/temp-profile?platform=X` | Temperature vs depth |
| `GET` | `/api/data/charts/sal-profile?platform=X` | Salinity vs depth |
| `GET` | `/api/data/charts/ts-diagram?platform=X` | T-S diagram data |
| `GET` | `/api/data/charts/depth-time?platform=X` | Depth-time section |
| `POST` | `/api/data/query` | Query records with filters |
| `POST` | `/api/data/export` | Export filtered data as CSV |
All configuration lives in `backend/.env`. Copy from the template:

```bash
cp backend/.env.example backend/.env
```

| Variable | Default | Description |
|---|---|---|
| `AI_MODE` | `ollama` | AI provider: `ollama`, `openai`, or leave unset for basic |
| `OLLAMA_URL` | `http://localhost:11434` | Ollama server URL |
| `OLLAMA_MODEL` | `llama3.2` | Ollama model name |
| `OPENAI_API_KEY` | – | OpenAI API key (only for `AI_MODE=openai`) |
| `OPENAI_MODEL` | `gpt-4o-mini` | OpenAI model name |
| `DATABASE_URL` | `sqlite:///./db/argo_data.db` | SQLite database path |
| `FLASK_ENV` | `development` | `development` or `production` |
| `FLASK_SECRET_KEY` | `your-secret-key-here` | Flask session secret |
| `PORT` | `5000` | Backend server port |
| `FRONTEND_URL` | `http://localhost:3000` | Frontend URL for CORS |
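Inside the backend, these variables reduce to plain environment reads with the documented defaults. A sketch (the actual `app.py` may load them differently, e.g. via python-dotenv):

```python
import os

def load_config(env=None):
    """Resolve FloatChat settings, falling back to the documented defaults."""
    if env is None:
        env = os.environ
    return {
        "ai_mode": env.get("AI_MODE", "ollama"),
        "ollama_url": env.get("OLLAMA_URL", "http://localhost:11434"),
        "ollama_model": env.get("OLLAMA_MODEL", "llama3.2"),
        "database_url": env.get("DATABASE_URL", "sqlite:///./db/argo_data.db"),
        "port": int(env.get("PORT", "5000")),
    }

print(load_config({}))  # empty env -> all defaults
```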
This project adheres to strict oceanographic data quality standards:
- Boundary Checks – Floats drifting outside 20°E–120°E longitude are automatically filtered.
- Surface Statistics – "Surface Temperature" uses only the top 10 m of the water column (scientifically accepted mixed-layer definition).
- Verified Sources – All data from the Argovis 2.0 API, cross-checked against official float indices.
- No Synthetic Data – Every record is a real ARGO float profile in the Indian Ocean.
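The first two checks can be expressed as simple predicates. A hedged sketch with hypothetical helper names (the actual filtering lives in `fetch_argovis.py` and may differ in detail):

```python
def in_indian_ocean(lon: float) -> bool:
    """Boundary check: keep floats within 20°E–120°E longitude."""
    return 20.0 <= lon <= 120.0

def surface_mean(measurements, max_depth=10.0):
    """Surface statistic: average only values from the top 10 m of the column."""
    surface = [m["value"] for m in measurements if m["depth"] <= max_depth]
    return sum(surface) / len(surface) if surface else None

profile = [{"depth": 2.0, "value": 28.4}, {"depth": 8.0, "value": 28.0},
           {"depth": 500.0, "value": 9.1}]
print(in_indian_ocean(75.0), surface_mean(profile))
```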
| Region | Latitude | Longitude |
|---|---|---|
| Arabian Sea | 5°N – 30°N | 45°E – 78°E |
| Bay of Bengal | 5°N – 25°N | 78°E – 100°E |
| Equatorial Indian Ocean | 10°S – 10°N | 40°E – 100°E |
| Southern Indian Ocean | 35°S – 10°S | 20°E – 120°E |
Contributions are welcome! Please see CONTRIBUTING.md for detailed guidelines.
Quick version:
- Fork the repo
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit changes: `git commit -m 'Add amazing feature'`
- Push: `git push origin feature/amazing-feature`
- Open a Pull Request
Distributed under the MIT License. See LICENSE for details.
- Argovis – API for accessing ARGO float data
- CSIRO & INCOIS – Deploying Indian Ocean ARGO floats
- Ollama – Democratizing local AI inference
- FAISS – Efficient vector similarity search
- SentenceTransformers – State-of-the-art text embeddings
Made with 💙 for ocean science