A demo Retrieval-Augmented Generation (RAG) application with MCP server integration.
- MCP server integration
- Document retrieval using vector search with ChromaDB
- Context-aware prompt generation
- Integration with LLM APIs
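The vector search feature above is handled by ChromaDB in the app, but the core idea — rank documents by similarity between embedding vectors — can be sketched in plain Python. This is an illustrative, hypothetical helper, not the app's actual code, and the toy 3-dimensional embeddings stand in for real embedding-model output:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, docs, k=2):
    # Rank documents by similarity to the query vector, return the top-k texts.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["embedding"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

docs = [
    {"text": "Refund policy: 30 days", "embedding": [0.9, 0.1, 0.0]},
    {"text": "Office hours: 9-5",      "embedding": [0.1, 0.8, 0.1]},
    {"text": "Returns need a receipt", "embedding": [0.8, 0.2, 0.1]},
]
print(retrieve([1.0, 0.0, 0.0], docs, k=2))
# The two refund/returns documents rank highest for this query vector.
```

A vector database like ChromaDB does the same ranking, but with approximate-nearest-neighbor indexing so it stays fast at scale.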
Install dependencies:

pip install -r requirements.txt
Connect to the MCP server with Claude Desktop, Cursor, or your preferred IDE, then use the process_query tool to ask questions about the company.
Set up your environment variables in .env:
OPENAI_API_KEY=your_api_key
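A .env file like the one above has to be read into the process environment at startup. Real projects typically use python-dotenv for this; a minimal stdlib-only sketch (the function name is an assumption, not part of the app):

```python
import os

def load_env(path=".env"):
    # Minimal .env parser: KEY=VALUE lines; blanks and '#' comments ignored.
    # Real projects usually use python-dotenv instead of hand-rolling this.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: values already in the environment win over the file.
            os.environ.setdefault(key.strip(), value.strip())

if os.path.exists(".env"):
    load_env()  # then read os.environ["OPENAI_API_KEY"] where needed
```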
- app/retrieval.py: Document retrieval functionality
- app/context.py: Context management
- app/llm_client.py: LLM API integration
- app/prompt_builder.py: Prompt construction
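These modules separate retrieval from prompt construction. A minimal sketch of the kind of context-aware prompt building app/prompt_builder.py suggests — the function name and template are assumptions for illustration, not the repo's actual API:

```python
def build_prompt(question, chunks):
    # Number the retrieved chunks and join them into one context block,
    # then instruct the model to answer only from that context.
    context = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )

prompt = build_prompt("What is the refund window?", ["Refund policy: 30 days."])
```

Grounding the model in retrieved chunks this way is what makes the generation "retrieval-augmented": the LLM client sends the assembled prompt rather than the bare question.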
MIT
