NTViz is a Streamlit application that performs data analysis and visualization tasks and interacts with Large Language Models (LLMs) through the LIDA framework.
Access NTViz directly in your browser.
# Clone the repository
git clone https://github.com/tramphan748/lida_streamlit_app.git
cd lida_streamlit_app
# Install the required packages
pip install -r requirements.txt
# Run the application
streamlit run streamlit_app.py

Note: The requirements.txt file includes llmx-gemini, a customized version of the llmx library that adds support for the Gemini API. If you installed llmx manually in editable mode, make sure the same package (or the corresponding git+... line) appears in requirements.txt.
Introduction to the project, overview of capabilities, and user interface.
Step-by-step guide to obtaining API keys from the two supported providers:
- Cohere
- Gemini (using our customized llmx-gemini library)
Generate comprehensive data overviews:
- Automatic statistics generation
- Charts and visualizations
- Missing value detection
- Correlation analysis
- Basic exploratory data analysis (EDA)
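The overview features above can be approximated with plain pandas. The sketch below is illustrative only (the function and column names are hypothetical, not NTViz's actual code); it computes summary statistics, missing-value counts, and a correlation matrix:

```python
import pandas as pd

# Hypothetical sketch of an NTViz-style data overview.
def data_overview(df: pd.DataFrame) -> dict:
    numeric = df.select_dtypes("number")
    return {
        "stats": df.describe(include="all"),  # automatic statistics generation
        "missing": df.isna().sum(),           # missing value detection
        "correlation": numeric.corr(),        # correlation analysis
    }

df = pd.DataFrame({"age": [25, 32, None, 41], "income": [40, 55, 61, 72]})
overview = data_overview(df)
print(overview["missing"])
```

In the app, results like these are rendered as interactive Streamlit tables and charts rather than printed.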
Leverage LLMs to automatically generate:
- Data summaries
- Top 5 analytical goals
Create visualizations based on natural language user queries.
Based on the summary, goals, code, and generated visualizations, the system automatically suggests one to five additional charts.
- LIDA Github – Original LIDA framework
- llmx-gemini – Custom Gemini integration for llmx
- YData Profiling – Data profiling library for quick overviews
- Streamlit Documentation – Building interactive data apps in Python
A paper describing LIDA (accepted at ACL 2023, System Demonstrations) is available at the URL in the citation below.
@inproceedings{dibia2023lida,
title = "{LIDA}: A Tool for Automatic Generation of Grammar-Agnostic Visualizations and Infographics using Large Language Models",
author = "Dibia, Victor",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 3: System Demonstrations)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-demo.11",
doi = "10.18653/v1/2023.acl-demo.11",
pages = "113--126",
}




