
Extract Reference Data Service from TraderX Monorepo #2

Open
devin-ai-integration[bot] wants to merge 1 commit into main from devin/1750294667-extract-reference-data-service

Conversation

@devin-ai-integration

Extract Reference Data Service from TraderX Monorepo

This PR extracts the reference data service from the COG-GTM/traderXCognitiondemos monorepo into a standalone microservice repository. The service provides REST API access to S&P 500 reference data: ticker symbols and company names.

Changes Made

Service Extraction

  • ✅ Copied complete /reference-data directory structure from monorepo to repository root
  • ✅ Included all source code (src/), tests (test/), and configuration files
  • ✅ Preserved S&P 500 companies CSV data file at /data/s-and-p-500-companies.csv
  • ✅ Maintained OpenAPI specification (openapi.yaml)

Standalone Configuration

  • ✅ Created new standalone Dockerfile with /app working directory
  • ✅ Updated Docker configuration to expose port 18085 and run npm run start
  • ✅ Service runs independently, with no dependencies on other TraderX services or a shared database (a minimal bootstrap sketch follows this list)
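
The actual bootstrap code lives in src/ and is not reproduced in this PR description; the snippet below is only a minimal sketch of how a NestJS entry point typically binds to port 18085. The file name main.ts and the AppModule import are assumptions, not confirmed contents of this repository.

// main.ts (hypothetical sketch; the real entry point in src/ may differ)
import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap(): Promise<void> {
  const app = await NestFactory.create(AppModule);
  // The standalone Dockerfile exposes 18085, so the app listens on that port
  await app.listen(process.env.PORT ?? 18085);
}

bootstrap();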

API Contract Preservation

  • GET /stocks - Returns all securities with ticker and company name
  • GET /stocks/{ticker} - Returns a specific security by ticker, or 404 if not found (see the controller sketch after this list)
  • GET /health - Service health check endpoint
  • GET /api - OpenAPI/Swagger documentation interface
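
As a rough illustration of this contract, the handlers map onto a NestJS controller along the following lines. This is a sketch only: the class and method names (StocksController, StocksService, findByTicker) are hypothetical and may not match the actual code in src/.

// stocks.controller.ts (illustrative sketch, not the repository's actual source)
import { Controller, Get, NotFoundException, Param } from '@nestjs/common';
import { Stock, StocksService } from './stocks.service'; // hypothetical module

@Controller('stocks')
export class StocksController {
  constructor(private readonly stocksService: StocksService) {}

  // GET /stocks - all securities with ticker and company name
  @Get()
  findAll(): Stock[] {
    return this.stocksService.findAll();
  }

  // GET /stocks/:ticker - single security, 404 when the ticker is unknown
  @Get(':ticker')
  findOne(@Param('ticker') ticker: string): Stock {
    const stock = this.stocksService.findByTicker(ticker);
    if (!stock) {
      throw new NotFoundException(`Security ${ticker} not found`);
    }
    return stock;
  }
}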

Documentation

  • ✅ Comprehensive README.md with standalone service documentation
  • ✅ Integration points documented for Trade Service (REFERENCE_DATA_HOST environment variable)
  • ✅ Web Frontend integration instructions included
  • ✅ Docker and Node.js usage examples provided

Testing Results

Local Testing Completed ✅

The service was successfully tested locally with all endpoints working correctly:

# Service startup - successful
npm run build  # ✅ Built successfully
npm run start  # ✅ Started on port 18085

# API endpoint testing - all working
curl http://localhost:18085/stocks        # ✅ Returns full S&P 500 list
curl http://localhost:18085/stocks/AAPL   # ✅ Returns {"ticker":"AAPL","companyName":"Apple"}
curl http://localhost:18085/health        # ✅ Returns {"status":"ok","info":{},"error":{},"details":{}}

Service Verification

  • ✅ CSV data loads correctly on startup (503 S&P 500 companies)
  • ✅ NestJS application initializes all modules successfully
  • ✅ All routes mapped correctly: /stocks, /stocks/:ticker, /health
  • ✅ No external database dependencies - reads from the bundled CSV file (a loading sketch follows this list)
  • ✅ Port 18085 configuration preserved
  • ✅ OpenAPI documentation accessible
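
To make the startup behaviour above concrete, a CSV-backed lookup service could look roughly like the following. The file and symbol names (stocks.service.ts, StocksService) and the assumption that the CSV's first two columns are ticker and company name are illustrative, not verified against the actual implementation.

// stocks.service.ts (hypothetical sketch of the CSV-backed lookup)
import { Injectable, OnModuleInit } from '@nestjs/common';
import { readFileSync } from 'fs';
import { join } from 'path';

export interface Stock {
  ticker: string;
  companyName: string;
}

@Injectable()
export class StocksService implements OnModuleInit {
  private readonly stocks = new Map<string, Stock>();

  onModuleInit(): void {
    // Load the bundled dataset once at startup; no external database is involved
    const csvPath = join(process.cwd(), 'data', 's-and-p-500-companies.csv');
    const rows = readFileSync(csvPath, 'utf8').split('\n').slice(1); // skip header row
    for (const row of rows) {
      const [ticker, companyName] = row.split(',');
      if (ticker && ticker.trim()) {
        this.stocks.set(ticker.trim().toUpperCase(), {
          ticker: ticker.trim(),
          companyName: (companyName ?? '').trim(),
        });
      }
    }
  }

  findAll(): Stock[] {
    return [...this.stocks.values()];
  }

  findByTicker(ticker: string): Stock | undefined {
    return this.stocks.get(ticker.toUpperCase());
  }
}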

Integration Points

For Trade Service

Set the REFERENCE_DATA_HOST environment variable to point to this service:

REFERENCE_DATA_HOST=reference-data-service:18085
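
On the Trade Service side, the lookup against this service then becomes a plain HTTP call built from that variable, roughly as sketched below. The function name and the use of fetch are illustrative; the actual Trade Service client code is not part of this PR.

// Hypothetical Trade Service lookup built from REFERENCE_DATA_HOST
const referenceDataHost = process.env.REFERENCE_DATA_HOST ?? 'localhost:18085';

async function lookupSecurity(ticker: string): Promise<{ ticker: string; companyName: string } | undefined> {
  const response = await fetch(`http://${referenceDataHost}/stocks/${encodeURIComponent(ticker)}`);
  if (response.status === 404) {
    return undefined; // unknown ticker
  }
  return (await response.json()) as { ticker: string; companyName: string };
}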

For Web Frontend

Configure the frontend to make direct REST calls to this service's URL.
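
For example, a browser-side call might look like the following; the base URL constant is an assumed configuration value, not something defined by this PR.

// Hypothetical frontend call; the base URL would come from frontend configuration
const REFERENCE_DATA_URL = 'http://localhost:18085';

async function loadStocks(): Promise<Array<{ ticker: string; companyName: string }>> {
  const response = await fetch(`${REFERENCE_DATA_URL}/stocks`);
  return response.json();
}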

Files Added/Modified

New Files (25 total):

  • Dockerfile - Standalone Docker configuration
  • src/ - Complete NestJS application source code
  • data/s-and-p-500-companies.csv - S&P 500 companies dataset
  • openapi.yaml - API specification
  • package.json - Node.js dependencies and scripts
  • Configuration files: .eslintrc.js, .prettierrc, nest-cli.json, tsconfig.json

Modified Files:

  • README.md - Comprehensive standalone service documentation

Next Steps

After this PR is merged, the original monorepo can be updated to point to this external service, but that will be handled in a separate task.


Link to Devin run: https://app.devin.ai/sessions/048fc560be1843c0bb83357f5e5abfad
Requested by: Samir Chaudhry (samir@cognition.ai)

- Copy complete reference-data service from traderXCognitiondemos monorepo
- Create standalone Dockerfile with /app working directory and port 18085
- Include S&P 500 companies CSV data file
- Preserve API contract: GET /stocks and GET /stocks/{ticker} endpoints
- Update README.md with comprehensive standalone service documentation
- Service tested and verified working with all endpoints responding correctly
- No external dependencies - reads from CSV file on startup
- Integration points documented for Trade Service and Web Frontend

Co-Authored-By: Samir Chaudhry <schaudhry123@gmail.com>
@devin-ai-integration
Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

