High-performance backend server for the Nvisy document processing platform.
- **High-Performance** - Async HTTP server with Axum and Tokio
- **RAG Pipeline** - Build knowledge bases with document embeddings and semantic search
- **LLM Annotations** - AI-driven document edits via structured annotations
- **Real-Time Updates** - Live collaboration via NATS pub/sub and WebSocket
- **Interactive Docs** - Auto-generated OpenAPI with Scalar UI
| Feature | Description |
|---|---|
| `tls` | HTTPS support with rustls |
| `otel` | OpenTelemetry log filtering |
| `dotenv` | Load config from `.env` files |
| `ollama` | Ollama AI backend |
| `mock` | Mock AI services for testing |
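Feature flags are opt-in at build time via Cargo's standard `--features` flag. The specific combinations below are illustrative sketches, not prescribed configurations:

```
# Build with TLS and OpenTelemetry enabled
cargo build --release --features "tls,otel"

# Local development run against mocked AI services
cargo run --features "mock,dotenv"
```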
```
server/
├── crates/
│   ├── nvisy-cli/       # Server binary with CLI
│   ├── nvisy-core/      # Shared types and AI service traits
│   ├── nvisy-nats/      # NATS client (streams, KV, queues)
│   ├── nvisy-ollama/    # Ollama client (embeddings, OCR, VLM)
│   ├── nvisy-postgres/  # PostgreSQL database layer
│   └── nvisy-server/    # HTTP handlers and middleware
├── migrations/          # PostgreSQL database migrations
└── Cargo.toml           # Workspace configuration
```
```
# Install tools and generate keys
make install-all
make generate-keys

# Generate database migrations
make generate-migrations

# Start the server
cargo run --features ollama,dotenv
```

See `.env.example` for all available environment variables.
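A minimal `.env` might look like the sketch below. The variable names here are purely hypothetical, chosen to match the services the workspace depends on (PostgreSQL, NATS, Ollama); the authoritative list is in `.env.example`:

```
# Hypothetical names for illustration only -- see .env.example for the real ones
DATABASE_URL=postgres://nvisy:nvisy@localhost:5432/nvisy   # PostgreSQL connection
NATS_URL=nats://localhost:4222                             # NATS pub/sub endpoint
OLLAMA_URL=http://localhost:11434                          # Ollama AI backend
SERVER_PORT=8080                                           # HTTP listen port
```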
- Scalar UI: http://localhost:8080/api/scalar
- OpenAPI JSON: http://localhost:8080/api/openapi.json
- Health Check: POST http://localhost:8080/health
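With the server running, the endpoints above can be exercised from the command line. A quick smoke test, assuming the default port 8080 shown in the URLs above:

```
# Fetch the generated OpenAPI document
curl -s http://localhost:8080/api/openapi.json

# Hit the health check endpoint
curl -s -X POST http://localhost:8080/health
```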
```
cd docker

# Development (with hot reload)
docker-compose -f docker-compose.dev.yml up -d

# Production
docker-compose up -d
```

See CHANGELOG.md for release notes and version history.
MIT License - see LICENSE.txt
- Documentation: docs.nvisy.com
- Issues: GitHub Issues
- Email: support@nvisy.com
- API Status: nvisy.openstatus.dev