Live Demo: thedailyledger.nunnarivulabs.in
The Daily Ledger is a full-stack business intelligence dashboard built for a fictional e-commerce store. It's designed from the ground up to handle massive datasets with high performance, providing a clean, intuitive, and real-time interface for monitoring key business metrics.
This project was architected not just to work with a few hundred rows, but to be truly scalable. The local development environment is populated with over 10 million order records to rigorously test and demonstrate the efficiency of the backend queries and frontend rendering strategies.
The public live demo is hosted on Netlify and connected to a free-tier Supabase Postgres database. To respect these resource limits, it has been seeded with a still-impressive 50,000 orders. While this demonstrates all features, the true power of the application's architecture is best seen with the full 10 million record dataset.
I would be happy to provide a live, one-on-one demonstration of the application running against the full 10M+ record database to showcase its performance under heavy load.
- Dynamic KPI Cards: At-a-glance metrics for Revenue, Profit, Orders, and New Users, with contextual comparison against previous periods.
- Interactive Trend Chart: A multi-line chart visualising Sales vs. Profit over any selected date range.
- High-Performance Data Grid: An orders table architected to handle millions of records with a fluid, infinitely scrolling user experience.
- Advanced Data Fetching: A "sparse virtualisation" engine that only fetches the data for the portion of the table the user is currently looking at.
- Intercepted Modal Routes: Seamlessly view order details in a modal that has its own dedicated, shareable URL, showcasing a professional UX pattern.
- Fully Responsive: A clean and adaptive UI that works beautifully on all screen sizes, from mobile sheets to desktop modals.
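For context, the intercepted modal route pattern is built on the Next.js App Router's parallel and intercepting route conventions (a `@modal` slot plus a `(.)` interceptor). The exact folder names below are assumptions, not the repository's actual layout; a typical arrangement looks like:

```text
app/
├── layout.tsx                # renders {children} alongside the {modal} slot
├── @modal/
│   ├── default.tsx           # returns null when no modal is active
│   └── (.)orders/[id]/
│       └── page.tsx          # order details rendered as a modal overlay
└── orders/
    ├── page.tsx              # the orders data grid
    └── [id]/
        └── page.tsx          # full-page order details for direct/shared URLs
```

Navigating from the grid to `/orders/[id]` triggers the interceptor and opens the modal; loading the same URL directly (or sharing it) renders the full page instead.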
The biggest challenge was building the Orders page to handle the dataset of over 10 million records. A standard list would crash any browser. The solution is a custom-built sparse virtualisation engine.
The Problem: You can't render 10 million rows. Even a simple infinite scroll fetching 50 rows at a time would eventually overwhelm the browser's memory.
The Solution: Think of the table as a magic book with 10 million pages. The virtualisation engine is smart enough to only ever render the 20-30 pages you're looking at right now.
- Efficient Rendering: It keeps the DOM footprint incredibly small, ensuring a smooth scrolling experience.
- Sparse Data Fetching: The "sparse" part is the real magic. If you grab the scrollbar and jump to the middle of the list (say, to order #5,000,000), the app is intelligent enough to only fetch the pages for that specific window, without loading all the pages in between.
This is an advanced, production-grade pattern that ensures the application remains blazing fast and responsive at any scale.
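The windowing arithmetic behind this can be sketched as a pure function. The names, page size, and row height below are illustrative, not the app's actual code; the point is that a scrollbar jump to the middle of 10 million rows derives only the handful of page indices in view:

```typescript
const ROW_HEIGHT = 48; // px per row (illustrative)
const PAGE_SIZE = 50;  // rows fetched per page
const OVERSCAN = 1;    // extra pages fetched above/below the viewport

// Given the scroll position, return the page indices that must be in the
// cache for the visible window — and nothing else. Pages between the top
// of the list and this window are never requested.
export function pagesForViewport(
  scrollTop: number,
  viewportHeight: number,
  totalRows: number,
): number[] {
  const firstRow = Math.floor(scrollTop / ROW_HEIGHT);
  const lastRow = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / ROW_HEIGHT) - 1,
  );
  const firstPage = Math.max(0, Math.floor(firstRow / PAGE_SIZE) - OVERSCAN);
  const lastPage = Math.min(
    Math.ceil(totalRows / PAGE_SIZE) - 1,
    Math.floor(lastRow / PAGE_SIZE) + OVERSCAN,
  );
  const pages: number[] = [];
  for (let p = firstPage; p <= lastPage; p++) pages.push(p);
  return pages;
}
```

With these numbers, a jump to around row 5,000,000 resolves to just three page indices near 100,000, which the backend can then serve with paginated queries for only that window.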
*(Demo video: `sparse_loading.mp4`)*
This project is a modern full-stack application built with a focus on performance, type safety, and a great developer experience.
- Frontend: Next.js (App Router), React, TypeScript
- Styling: Tailwind CSS, shadcn/ui
- Data Fetching (Client): TanStack Query
- Data Grid: TanStack Table & TanStack Virtual
- Charting: Recharts
- Database: PostgreSQL (Cloud: Supabase, Local: Docker)
- ORM & Queries: Drizzle ORM
- Deployment: Netlify
To run the full-scale version of this project on your local machine, follow these steps:
- Clone the repository:

  ```bash
  git clone https://github.com/nunnarivu-labs/the-daily-ledger.git
  cd the-daily-ledger
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Set up environment variables: Create a `.env` file in the root of the project and add your local database URL:

  ```
  DATABASE_URL="postgresql://[your_username]:[your_password]@localhost:5432/daily_ledger"
  ```

- Start the local database: Ensure Docker is installed and running.

  ```bash
  docker-compose up -d
  ```

- Apply the database schema: This will create all the necessary tables in your local Postgres instance.

  ```bash
  npm run db:push
  ```

- Seed the database: This will populate your database with over 10 million records. Warning: this is a very long-running process (30-60+ minutes). Modify the constants in `db/seed.ts` for a smaller dataset if desired.

  ```bash
  npm run db:seed
  ```

- Run the development server:

  ```bash
  npm run dev
  ```
Open http://localhost:3000 to view the application.
Once the initial setup is complete, you can use this single, streamlined command for your daily work. It starts both the Postgres Docker container and the Next.js development server in parallel. When you stop the process, it will automatically and cleanly shut down the Docker container as well, so you don't have to manage them separately.
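One possible way such a script is wired up is a shell one-liner in `package.json` that traps exit to tear the container down; this is a hypothetical sketch, and the repository's actual script may differ:

```json
{
  "scripts": {
    "dev:all": "docker-compose up -d && (trap 'docker-compose down' EXIT; next dev)"
  }
}
```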
```bash
npm run dev:all
```

Building this project involved solving several interesting and challenging problems:
- High-Performance SQL: Wrote advanced PostgreSQL queries using `json_agg` to shape complex, nested data directly in the database, dramatically simplifying the API layer and improving performance when fetching order details.
- Conditional Aggregation: Implemented efficient conditional aggregation (`SUM(CASE WHEN ...)`) to calculate KPI values for both the current and the previous period in a single database query.
- Virtualisation Layout: Solved complex CSS layout challenges to build a truly robust virtualised table.
- Database Indexing: Identified performance bottlenecks and added indexes to foreign keys and date columns, learning how critical they are for query performance at scale.
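The current-vs-previous comparison needs two aligned date windows feeding a single conditional-aggregation query. A sketch of the period arithmetic, with an illustrative SQL shape (function, table, and column names here are assumptions, not the app's actual schema):

```typescript
// Given the selected range, derive the immediately preceding range of the
// same length, so both periods can be aggregated in one table scan.
export function previousPeriod(
  start: Date,
  end: Date,
): { start: Date; end: Date } {
  const lengthMs = end.getTime() - start.getTime();
  return {
    start: new Date(start.getTime() - lengthMs),
    end: new Date(start.getTime()),
  };
}

// Both windows then fold into one query via conditional aggregation —
// the SUM(CASE WHEN ...) pattern (illustrative column names):
export const kpiQuery = `
  SELECT
    SUM(CASE WHEN created_at >= $1 AND created_at < $2 THEN total END) AS current_revenue,
    SUM(CASE WHEN created_at >= $3 AND created_at < $4 THEN total END) AS previous_revenue
  FROM orders
  WHERE created_at >= $3 AND created_at < $2
`;
```

Scanning the combined range once and branching inside the aggregate avoids running two separate queries per KPI card.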
This project is licensed under the Apache 2.0 License.