Enterprise-grade, production-ready NestJS boilerplate with modern architecture patterns
- Overview
- Architecture
- Key Features
- Tech Stack
- Prerequisites
- Quick Start
- Running the Application
- Database Migrations
- CRUD Scaffolding
- Testing
- API Documentation
- Documentation
- Observability
- Code Quality
- Project Structure
- Advanced Usage
- Contributing
- License
- Contributors
- Support
- Resources
This boilerplate provides a production-ready foundation for building scalable microservices with NestJS. It implements industry-standard architecture patterns including Onion Architecture, Domain-Driven Design (DDD), and Ports and Adapters (Hexagonal Architecture).
Built with enterprise needs in mind, it offers comprehensive features for authentication, authorization, multi-database support, observability, and automated CRUD generation, allowing teams to focus on business logic rather than infrastructure.
This project follows Clean Architecture principles with a focus on maintainability, testability, and scalability.
The codebase is organized in concentric layers where dependencies point inward:
- Core Layer: Contains business entities, use cases, and repository interfaces
- Infrastructure Layer: Implements external concerns (databases, cache, HTTP clients)
- Application Layer: Modules that wire everything together
- Presentation Layer: Controllers and API adapters
- Entities: Business objects with identity (`src/core/*/entity`)
- Use Cases: Application-specific business rules (`src/core/*/use-cases`)
- Repository Pattern: Abstract data access (`src/core/*/repository`)
- Value Objects: Validated domain primitives
- Ports: Abstract interfaces defining contracts
- Adapters: Concrete implementations for external systems
- Dependency Inversion: Core logic independent of frameworks
```
┌───────────────────────────────────────────┐
│ Controllers (Modules)                     │ ← HTTP/API Layer
├───────────────────────────────────────────┤
│ Use Cases (Core)                          │ ← Business Logic
├───────────────────────────────────────────┤
│ Entities & Repositories (Core)            │ ← Domain Layer
├───────────────────────────────────────────┤
│ Infrastructure (DB, Cache, HTTP)          │ ← External Services
└───────────────────────────────────────────┘
```
- JWT-based Authentication
  - Login/Logout endpoints
  - Access token generation
  - Refresh token mechanism
  - Token blacklisting on logout
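Token blacklisting on logout can be sketched as follows. This is a minimal, hypothetical illustration (an in-memory map standing in for the Redis store the boilerplate would actually use): a revoked token ID is remembered only until the token would have expired anyway, so the blacklist stays small.

```typescript
// Hedged sketch of token blacklisting on logout — not the boilerplate's
// actual implementation. A real setup would back this with Redis and TTLs.
type Clock = () => number;

class TokenBlacklist {
  private revoked = new Map<string, number>(); // token id (jti) -> expiry (epoch ms)

  constructor(private now: Clock = Date.now) {}

  // On logout, remember the token until its natural expiry.
  revoke(jti: string, expiresAtMs: number): void {
    this.revoked.set(jti, expiresAtMs);
  }

  // Guards call this before accepting an access token.
  isRevoked(jti: string): boolean {
    const exp = this.revoked.get(jti);
    if (exp === undefined) return false;
    if (exp <= this.now()) {
      this.revoked.delete(jti); // entry outlived the token; safe to drop
      return false;
    }
    return true;
  }
}
```

The key design point is that blacklist entries need to live only as long as the token's remaining lifetime, which is why Redis `SET ... EX` is a natural fit for the real implementation.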
- Password Management
  - Secure password hashing
  - Password change functionality
  - Forgot password flow with email
  - Reset password with token validation
- Role-Based Access Control (RBAC)
  - Dynamic role assignment
  - Granular permission system
  - Endpoint-level authorization
  - Permission inheritance
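The RBAC check above can be sketched in a few lines. This is an illustrative shape only (the boilerplate's actual entities and wildcard rules may differ); it shows a user holding several roles, with access granted when any role carries the required permission, and a hypothetical `resource:*` wildcard standing in for permission inheritance:

```typescript
// Hedged sketch of a granular permission check (hypothetical shapes).
interface Role {
  name: string;
  permissions: string[]; // e.g. ['user:read', 'user:*']
}

function hasPermission(roles: Role[], required: string): boolean {
  const [resource] = required.split(':');
  // Any role granting the exact permission, or a wildcard for the
  // resource, is enough to authorize the request.
  return roles.some((role) =>
    role.permissions.some((p) => p === required || p === `${resource}:*`)
  );
}
```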
- Relational data modeling
- Complex queries and joins
- Transaction support
- Migration system
- Soft delete functionality
- Document-based storage
- 3-node Replica Set for high availability
- Automatic failover and data redundancy
- Flexible schema design
- Built-in pagination
- Text search capabilities
- Migration support
Automatically generate complete CRUD operations in seconds:
- Entity generation with validation
- Use cases (Create, Read, Update, Delete, List)
- Repository implementation
- Controller with routes
- Swagger documentation
- Unit tests (100% coverage)
- Input/Output DTOs
- OpenTelemetry integration
- Zipkin for trace visualization
- HTTP request tracing
- Database query tracing
- Inter-service call tracking
- Custom span creation
- Pino high-performance logger
- Structured JSON logging
- Request/Response logging
- Error tracking with stack traces
- Log aggregation with Loki
- Configurable log levels
- Request duration
- HTTP status codes
- Database query performance
- Cache hit/miss ratio
- Custom business metrics
- Prometheus-compatible format
- Database connectivity
- Cache availability
- Memory usage
- CPU metrics
- Custom health indicators
- Circuit Breaker (Opossum)
  - Automatic failure detection
  - Fallback mechanisms
  - Configurable thresholds
  - Service degradation
- Retry Logic
  - Exponential backoff
  - Configurable retry policies
  - Request timeout handling
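The exponential-backoff idea above can be sketched as follows. This is a minimal illustration under assumed parameter names (`retries`, `baseDelayMs`, `maxDelayMs`), not the boilerplate's actual retry policy: each retry waits twice as long as the previous one, capped at a maximum.

```typescript
// Hedged sketch of exponential backoff (illustrative only).
interface RetryPolicy {
  retries: number;     // maximum retries after the first attempt
  baseDelayMs: number; // delay before the first retry
  maxDelayMs: number;  // cap on any single delay
}

// Delay before retry attempt n (1-based): base * 2^(n-1), capped.
function backoffDelay(policy: RetryPolicy, attempt: number): number {
  const raw = policy.baseDelayMs * 2 ** (attempt - 1);
  return Math.min(raw, policy.maxDelayMs);
}

async function withRetry<T>(fn: () => Promise<T>, policy: RetryPolicy): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= policy.retries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === policy.retries) break; // retries exhausted
      await new Promise((r) => setTimeout(r, backoffDelay(policy, attempt + 1)));
    }
  }
  throw lastError;
}
```

In production, adding random jitter to each delay is common to avoid thundering-herd retries from many clients at once.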
- Multi-language support
- Dynamic language switching
- Validation message translation
- API response localization
- Supported languages: English, Portuguese
- Redis for distributed caching
- NodeCache for in-memory caching
- TTL configuration
- Cache invalidation
- Cache-aside pattern
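The cache-aside pattern listed above works in three steps: try the cache, fall back to the database on a miss, then populate the cache with a TTL. A minimal sketch, assuming a hypothetical `CachePort` interface (the boilerplate wires this through its Redis adapter instead):

```typescript
// Hedged sketch of the cache-aside pattern (hypothetical interfaces).
interface CachePort {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
  del(key: string): Promise<void>;
}

class UserReader {
  constructor(
    private cache: CachePort,
    private loadFromDb: (id: string) => Promise<{ id: string; name: string }>
  ) {}

  async getById(id: string) {
    const key = `user:${id}`;
    const hit = await this.cache.get(key);               // 1. try the cache first
    if (hit) return JSON.parse(hit);
    const user = await this.loadFromDb(id);              // 2. on miss, load from the DB
    await this.cache.set(key, JSON.stringify(user), 60); // 3. populate with a TTL
    return user;
  }
}
```

Cache invalidation then becomes a `del(key)` call in the update/delete use cases, so stale entries never outlive a write by more than the TTL.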
- Helmet.js for HTTP headers
- CORS configuration
- Request rate limiting
- Input validation with Zod
- SQL injection prevention
- XSS protection
- CSRF protection ready
- Handlebars templates
- Welcome emails
- Password reset emails
- SMTP configuration
- HTML/Plain text support
- NestJS 11.x - Progressive Node.js framework
- TypeScript 5.9.3 - Type-safe development
- Node.js 22.x - Runtime environment
- PostgreSQL with TypeORM - Relational database
- MongoDB with Mongoose - Document database
- Redis - Caching and sessions
- OpenTelemetry - Distributed tracing
- Pino - High-performance logging
- Zipkin - Trace visualization
- Prometheus - Metrics collection
- Jest - Testing framework
- Supertest - HTTP assertions
- Testcontainers - Integration testing with Docker
- ESLint - Linting
- Prettier - Code formatting
- Husky - Git hooks
- Commitlint - Commit message linting
- Lint-staged - Staged files linting
- Docker - Containerization
- Docker Compose - Multi-container orchestration
- PM2 - Process management
- Artillery - Load testing
- Zod - Schema validation
- Axios - HTTP client
- JWT - Token management
- Nodemailer - Email sending
- Swagger - API documentation
Before you begin, ensure you have the following installed:
- Node.js >= 22.0.0 (Download)
- npm >= 9.x or yarn >= 1.22.x
- Docker >= 20.x (Download)
- Docker Compose >= 2.x
- NVM (Node Version Manager) - Recommended
- OS: Linux, macOS, or Windows (with WSL2)
- Memory: Minimum 4GB RAM (8GB recommended)
- Disk Space: 2GB free space
```bash
git clone https://github.com/mikemajesty/nestjs-microservice-boilerplate-api.git
cd nestjs-microservice-boilerplate-api
```

```bash
# Install the required Node.js version
nvm install

# Use the installed version
nvm use
```

```bash
npm install
```

Start all required services (PostgreSQL, MongoDB, Redis, Zipkin, etc.):

```bash
npm run setup
```

This command will:
- Stop and remove existing containers
- Clean up volumes
- Start fresh containers for all services
- Wait for services to be ready
Migrations run automatically on application start, but you can run them manually:
```bash
# Run all migrations
npm run migration:run
```

```bash
# Development mode with hot-reload
npm run start:dev

# Debug mode
npm run start:debug

# Production mode
npm run start
```

The API will be available at: http://localhost:5000
Open your browser and navigate to:
http://localhost:5000/api-docs
Login with default credentials:
```bash
curl -X 'POST' \
  'http://localhost:5000/api/v1/login' \
  -H 'accept: application/json' \
  -H 'Content-Type: application/json' \
  -d '{
    "email": "admin@admin.com",
    "password": "admin"
  }'
```

- `nest-cli.json` - NestJS CLI configuration
- `tsconfig.json` - TypeScript compiler options
- `jest.config.ts` - Testing configuration
- `eslint.config.mjs` - Linting rules
- `.prettierrc` - Code formatting rules
- `docker-compose-infra.yml` - Infrastructure services
After running npm run setup, the following services will be available:
| Service | URL | Credentials | Description |
|---|---|---|---|
| PostgreSQL | `localhost:5432` | User: `admin`<br>Password: `admin` | Primary relational database |
| MongoDB Replica Set | `localhost:27017` (Primary)<br>`localhost:27018` (Secondary)<br>`localhost:27019` (Tertiary) | User: `admin`<br>Password: `admin123` | 3-node MongoDB replica set for high availability |
| Redis | `localhost:6379` | Password: `redis123` | In-memory cache and session store |
| Service | URL | Credentials | Description |
|---|---|---|---|
| PgAdmin | http://localhost:16543 | Email: `pgadmin@gmail.com`<br>Password: `PgAdmin2019!` | PostgreSQL administration |
| Mongo Express | http://localhost:8081 | - | MongoDB web interface |
| Service | URL | Credentials | Description |
|---|---|---|---|
| Zipkin | http://localhost:9411 | - | Distributed tracing UI |
| Prometheus | http://localhost:9090 | - | Metrics collection and querying |
| Grafana | http://localhost:3000 | User: `admin`<br>Password: `grafana123` | Metrics visualization and dashboards |
| Loki | http://localhost:3100 | - | Log aggregation system |
| AlertManager | http://localhost:9093 | - | Alert management and routing |
| Service | Ports | Description |
|---|---|---|
| OpenTelemetry Collector | 4317 (gRPC)<br>4318 (HTTP)<br>9464 (Prometheus) | Receives, processes, and exports telemetry data |
| Promtail | - | Ships logs to Loki |
Hot-reload enabled for rapid development:
```bash
npm run start:dev
```

Attach a debugger to inspect and debug:

```bash
npm run start:debug
```

Then attach your IDE debugger to port 9229.
Optimized build for production:
```bash
# Build the application
npm run build

# Start production server
npm run start
```

Run the entire application stack with Docker Compose:

```bash
# Start all services
docker-compose up -d

# View logs
docker-compose logs -f microservice-api

# Stop services
docker-compose down
```

PM2 provides process management and monitoring:
```bash
# Start with PM2
npm run start

# Monitor processes
pm2 monit

# View logs
pm2 logs

# Restart application
pm2 restart ecosystem.config.js

# Stop application
pm2 stop ecosystem.config.js
```

```bash
npm run migration-postgres:create
```

This creates a new migration file in `src/infra/database/postgres/migrations/`.

```bash
npm run migration-postgres:run
npm run migration-postgres:undo
```

```bash
npm run migration-mongo:create
```

Provide a name when prompted (e.g., `createUsersCollection`).

```bash
npm run migration-mongo:run
npm run migration-mongo:undo
```

Run both PostgreSQL and MongoDB migrations concurrently:
```bash
npm run migration:run
```

Generate a complete, production-ready CRUD module in seconds!

```bash
npm run scaffold
```

1. Choose Database
   - `POSTGRES:CRUD` - Generate CRUD for PostgreSQL
   - `MONGO:CRUD` - Generate CRUD for MongoDB
   - `LIB` - Generate a library module
   - `INFRA` - Generate an infrastructure component
   - `MODULE` - Generate a custom module
   - `CORE` - Generate core domain logic
2. Enter Module Name
   - Use the singular form (e.g., `product`, `order`, `customer`)
   - Follow camelCase or kebab-case naming
For a module named product:
```
src/
├── core/
│   └── product/
│       ├── entity/
│       │   └── product.ts              # Domain entity with validation
│       ├── repository/
│       │   └── product.ts              # Repository interface
│       └── use-cases/
│           ├── product-create.ts       # Create use case
│           ├── product-update.ts       # Update use case
│           ├── product-delete.ts       # Delete use case (soft)
│           ├── product-get-by-id.ts    # Find by ID use case
│           ├── product-list.ts         # List with pagination/search
│           └── __tests__/              # Unit tests (100% coverage)
│               ├── product-create.spec.ts
│               ├── product-update.spec.ts
│               ├── product-delete.spec.ts
│               ├── product-get-by-id.spec.ts
│               └── product-list.spec.ts
├── modules/
│   └── product/
│       ├── adapter.ts                  # Use case adapters
│       ├── controller.ts               # REST controller
│       ├── module.ts                   # NestJS module
│       ├── repository.ts               # Repository implementation
│       └── swagger.ts                  # API documentation
└── infra/
    └── database/
        └── [postgres|mongo]/
            └── schemas/
                └── product.ts          # Database schema
```
Each CRUD module includes:
- ✅ Entity Validation - Zod schemas for type-safe validation
- ✅ Pagination - Offset/limit based pagination
- ✅ Search - Full-text search capabilities
- ✅ Sorting - Multi-field sorting
- ✅ Soft Delete - Logical deletion with `deletedAt`
- ✅ Filtering - Dynamic query filters
- ✅ Swagger Docs - Auto-generated API documentation
- ✅ Unit Tests - 100% test coverage
- ✅ Type Safety - Full TypeScript support
- ✅ Error Handling - Consistent error responses
After generation, follow the instructions in the CLI output to:
1. Import the module in `app.module.ts`
2. Run migrations if a database schema was created
3. Access the new endpoints in Swagger
This project maintains 100% code coverage with comprehensive test suites.
```
test/
├── initialization.ts          # Global test setup
└── **/*.spec.ts               # Unit tests

src/
└── core/
    └── */use-cases/__tests__/ # Use case tests
```
```bash
npm run test
```

```bash
npm run test:cov
```

Coverage reports are generated in:

- `coverage/` - HTML report
- `coverage/lcov.info` - LCOV format
- Badges automatically updated in README

```bash
npm run test:debug
```

Then attach your debugger to the Node process.

```bash
npm run test -- --watch
```

- Use Case Tests: Business logic validation
- Entity Tests: Domain model validation
- Service Tests: Infrastructure service testing
- API Tests: End-to-end API testing with Supertest
- Database Tests: Using Testcontainers for real databases
- Cache Tests: Redis integration testing
Located in src/utils/tests.ts:
- Mock factories
- Test data generators
- Common assertions
- Setup/teardown helpers
Automatically spins up isolated Docker containers for integration tests:
- PostgreSQL container
- MongoDB container
- Redis container
Ensures tests run in isolation with clean state.
This project follows best practices for test data generation using ZodMockSchema to automatically generate type-safe mock data from Zod schemas.
Use ZodMockSchema to generate test data automatically from your entity schemas:
```typescript
import { ZodMockSchema } from '@mikemajesty/zod-mock-schema';
import { CatEntitySchema } from '../../entity/cat';

describe('CatCreateUseCase', () => {
  let useCase: CatCreateUseCase;
  let repository: jest.Mocked<ICatRepository>;

  beforeEach(() => {
    repository = createMockRepository();
    useCase = new CatCreateUseCase(repository);
  });

  // Generate mock data automatically from schema
  const mock = new ZodMockSchema(CatEntitySchema);
  const input = mock.generate();

  it('should create a cat successfully', async () => {
    repository.create = jest.fn().mockResolvedValue(input);

    const result = await useCase.execute(input, mockTracing);

    expect(result).toEqual(input);
    expect(repository.create).toHaveBeenCalledWith(
      expect.objectContaining(input)
    );
  });
});
```

Why use ZodMockSchema?
- ✅ Type Safety: Generates data that matches your Zod schemas
- ✅ Automatic Updates: Mock data updates when the schema changes
- ✅ Consistency: Same data structure across all tests
- ✅ Less Boilerplate: No need to manually create test fixtures
- ✅ Valid Data: Generated data always passes schema validation
Documentation: For advanced usage and customization, see ZodMockSchema Documentation
For specific test cases where you need custom data:
```typescript
it('should create a product successfully', async () => {
  const input = { name: 'Test Product', price: 99.99 };

  const result = await useCase.execute(input);

  expect(result).toMatchObject(input);
  expect(repository.create).toHaveBeenCalledWith(
    expect.objectContaining(input)
  );
});
```

Coverage badges are automatically generated and updated in the README after running `npm run test:cov`.
This project uses TypeSpec to generate OpenAPI specifications, providing interactive API documentation via Swagger UI.
Open your browser and navigate to:
http://localhost:5000/api-docs
The Swagger UI provides:
- Interactive endpoint testing
- Request/response schemas
- Authentication flows
- Real-time API exploration
The OpenAPI 3.0 specification is auto-generated from TypeSpec definitions and available at:
`docs/tsp-output/@typespec/openapi3/openapi.api.1.0.yaml`

All API endpoints follow a versioned structure: `/api/{version}/resource`
| Method | Endpoint | Description |
|---|---|---|
| POST | `/api/v1/login` | User login |
| POST | `/api/v1/logout` | User logout |
| POST | `/api/v1/forgot-password` | Request password reset |
| POST | `/api/v1/reset-password` | Reset password with token |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/users` | List users (paginated) |
| GET | `/api/v1/users/:id` | Get user by ID |
| POST | `/api/v1/users` | Create new user |
| PUT | `/api/v1/users/:id` | Update user |
| DELETE | `/api/v1/users/:id` | Delete user (soft) |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/roles` | List roles |
| GET | `/api/v1/roles/:id` | Get role by ID |
| POST | `/api/v1/roles` | Create role |
| PUT | `/api/v1/roles/:id` | Update role |
| DELETE | `/api/v1/roles/:id` | Delete role |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/v1/permissions` | List permissions |
| GET | `/api/v1/permissions/:id` | Get permission by ID |
| POST | `/api/v1/permissions` | Create permission |
| PUT | `/api/v1/permissions/:id` | Update permission |
| DELETE | `/api/v1/permissions/:id` | Delete permission |
| Method | Endpoint | Description |
|---|---|---|
| GET | `/health` | Application health check |
| GET | `/health/database` | Database connectivity |
| GET | `/health/cache` | Cache availability |
```bash
curl -X 'POST' \
  'http://localhost:5000/api/v1/login' \
  -H 'Content-Type: application/json' \
  -d '{
    "email": "admin@admin.com",
    "password": "admin"
  }'
```

```bash
curl -X 'GET' \
  'http://localhost:5000/api/v1/users?limit=10&offset=0&sort=createdAt:desc' \
  -H 'Authorization: Bearer YOUR_TOKEN'
```

```bash
curl -X 'POST' \
  'http://localhost:5000/api/v1/users' \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer YOUR_TOKEN' \
  -d '{
    "name": "John Doe",
    "email": "john@example.com",
    "password": "securePassword123",
    "roles": ["user"]
  }'
```

Success responses return the data directly from use cases without additional wrapping:
```json
{
  "id": "123e4567-e89b-12d3-a456-426614174000",
  "name": "John Doe",
  "email": "john@example.com",
  "createdAt": "2024-12-08T00:00:00.000Z"
}
```

For list endpoints with pagination:
```json
{
  "docs": [
    {
      "id": "123e4567-e89b-12d3-a456-426614174000",
      "name": "John Doe",
      "email": "john@example.com"
    }
  ],
  "page": 1,
  "limit": 10,
  "total": 100
}
```

Error responses follow a standardized structure:
```json
{
  "error": {
    "code": 400,
    "traceid": "abc-def-123",
    "context": "UserModule",
    "message": [
      "email: Invalid email format",
      "name: String must contain at least 1 character(s)"
    ],
    "timestamp": "08/12/2024 10:30:45",
    "path": "/api/v1/users"
  }
}
```

Error Response Fields:

- `code` - HTTP status code
- `traceid` - Request trace ID for debugging
- `context` - Module/context where the error occurred
- `message` - Array of error messages
- `timestamp` - Error timestamp in the configured format
- `path` - Request path where the error occurred
This project uses TypeSpec as a modern, type-safe way to define API contracts and generate OpenAPI specifications.
TypeSpec is a language for describing cloud service APIs and generating other API description languages, client and service code, documentation, and other assets. It provides excellent IDE support with auto-completion and type checking.
```
docs/
├── README.md                  # Documentation overview
├── package.json               # TypeSpec dependencies
├── tspconfig.yaml             # TypeSpec configuration
├── docker-compose.yml         # Documentation services
├── src/
│   ├── main.tsp               # Main TypeSpec entry point
│   ├── modules/               # API module specifications
│   │   ├── cat/               # Example: Cat module
│   │   │   ├── controller.tsp
│   │   │   ├── model.tsp
│   │   │   └── exception.tsp
│   │   ├── user/
│   │   ├── role/
│   │   ├── permission/
│   │   ├── login/
│   │   ├── logout/
│   │   ├── reset-password/
│   │   └── health/
│   └── utils/
│       ├── exceptions.tsp     # Common exceptions
│       ├── model.tsp          # Common models
│       └── versioning.tsp     # API versioning
└── tsp-output/                # Generated OpenAPI specs
    └── @typespec/
        └── openapi3/
            └── openapi.api.1.0.yaml
```
```bash
cd docs
npm install
# or
yarn doc:install
```

This starts a live-reload documentation server:

```bash
yarn start
```

This will:

- Compile TypeSpec files on changes
- Serve Swagger UI with live reload
- Watch for file changes automatically

To manually compile TypeSpec specifications:

```bash
cd docs
yarn doc:compiler
```

Generated files will be in `docs/tsp-output/@typespec/openapi3/`
For a new module (e.g., product), create these files in docs/src/modules/product/:
controller.tsp - Defines the API endpoints
```typespec
import "@typespec/http";
import "@typespec/rest";
import "@typespec/openapi3";
import "../../utils/model.tsp";
import "./model.tsp";
import "./exception.tsp";

using TypeSpec.Http;
using Utils.Model;

namespace api.Product;

@tag("Product")
@route("api/{version}/products")
@useAuth(BearerAuth)
interface ProductController {
  @post
  @doc("Create product")
  @returnsDoc("Product created successfully")
  create(
    ...VersionParams,
    @body body: CreateInput
  ): CreateOutput | CreateValidationException;

  @get
  @doc("List products")
  @returnsDoc("Products retrieved successfully")
  list(
    ...VersionParams,
    ...ListQueryInput
  ): ListOutput;
}
```

`model.tsp` - Defines data models
```typespec
import "../../utils/model.tsp";

using Utils.Model;

namespace api.Product;

model CreateInput {
  name: string;
  price: decimal;
  description?: string;
}

model CreateOutput {
  id: string;
  name: string;
  price: decimal;
  description?: string;
  createdAt: utcDateTime;
}

model ListOutput is PaginatedResponse<CreateOutput>;
```

`exception.tsp` - Defines error responses
```typespec
import "../../utils/exceptions.tsp";

using Utils.Exceptions;

namespace api.Product;

model CreateValidationException is Exception<400, "Validation failed">;
model NotFoundException is Exception<404, "Product not found">;
```

Add your module to `docs/src/main.tsp`:

```typespec
import "./modules/product/controller.tsp";
```

```bash
cd docs
yarn doc:compiler
```

Check the generated OpenAPI spec in `docs/tsp-output/@typespec/openapi3/openapi.api.1.0.yaml`
- Follow the existing module structure
- Use consistent naming conventions (PascalCase for models, camelCase for properties)
- Leverage common models from `utils/model.tsp`
- Reuse exception definitions from `utils/exceptions.tsp`

- Define explicit types for all properties
- Use TypeSpec's built-in types (`string`, `int32`, `decimal`, `utcDateTime`)
- Leverage models for request/response validation
- Use unions for multiple possible responses

- Add the `@doc` decorator for endpoint descriptions
- Use `@returnsDoc` for response descriptions
- Include `@example` for complex models
- Add `@summary` for concise endpoint summaries

- All endpoints use the `{version}` path parameter
- Leverage the `VersionParams` model from `utils/versioning.tsp`
- Follow semantic versioning for API changes
Why TypeSpec instead of NestJS Swagger decorators?
- ✅ Type Safety: TypeSpec provides compile-time type checking
- ✅ Separation of Concerns: API contracts separated from implementation
- ✅ Better DX: Superior IDE support with IntelliSense
- ✅ Reusability: Shared models and types across endpoints
- ✅ Tooling: Auto-generation of clients, mocks, and documentation
- ✅ Standard: OpenAPI 3.0-compliant output
- ✅ Maintainability: Single source of truth for API contracts
- TypeSpec Official Documentation
- TypeSpec Playground
- OpenAPI 3.0 Specification
- TypeSpec REST Library
- TypeSpec HTTP Library
- TypeSpec Versioning
Access Zipkin for distributed tracing:
http://localhost:9411
- Automatic Instrumentation: HTTP requests, database queries
- Custom Spans: Create application-specific spans
- Context Propagation: Trace requests across services
- Performance Analysis: Identify bottlenecks
See TRACING.md for detailed documentation.
Basic example:
```typescript
// In your use case
async execute(input: Input, httpService: IHttpAdapter): Promise<Output> {
  const http = httpService.instance();
  const span = httpService.tracing.createSpan('external-api-call');

  try {
    span.setTag(httpService.tracing.tags.PEER_SERVICE, 'external-api');
    const result = await http.get('https://api.example.com/data');
    span.finish();
    return result;
  } catch (error) {
    span.setTag(httpService.tracing.tags.ERROR, true);
    span.setTag('message', error.message);
    span.finish();
    throw error;
  }
}
```

- fatal: System is unusable
- error: Error events
- warn: Warning messages
- info: Informational messages
- debug: Debug messages
- trace: Very detailed trace messages
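Pino maps each of these levels to a number (trace 10, debug 20, info 30, warn 40, error 50, fatal 60) and only emits records at or above the configured threshold. A minimal sketch of that filtering rule:

```typescript
// Sketch of pino's numeric level filtering (the numbers match pino's
// default level values; the function itself is illustrative).
const LEVELS = { trace: 10, debug: 20, info: 30, warn: 40, error: 50, fatal: 60 } as const;
type Level = keyof typeof LEVELS;

// A record is emitted only when its level is at or above the configured one.
function shouldLog(configured: Level, record: Level): boolean {
  return LEVELS[record] >= LEVELS[configured];
}
```

So a service configured at `info` silently drops `debug` and `trace` records, which is why log level is usually raised to `debug` only while diagnosing an issue.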
All logs are JSON-formatted for easy parsing:
```json
{
  "level": "info",
  "time": 1702000000000,
  "msg": "User created",
  "userId": "123e4567-e89b-12d3-a456-426614174000",
  "correlationId": "abc-def-ghi",
  "service": "microservice-api"
}
```

Automatic logging of all HTTP requests:
- Request method and URL
- Request headers
- Request body
- Response status
- Response time
- User information
```bash
# Application logs
tail -f logs/app.log

# Error logs only
tail -f logs/error.log

# With PM2
pm2 logs microservice-api

# With Docker
docker-compose logs -f microservice-api
```

Logs are automatically shipped to Loki for centralized log aggregation:
http://localhost:3100
View logs in Grafana with Loki data source pre-configured.
Metrics are exposed in Prometheus format:
http://localhost:5000/metrics
Access Prometheus for metrics visualization:
http://localhost:9090
Access Grafana for advanced monitoring dashboards:
http://localhost:3000
Default credentials:
- Username: `admin`
- Password: `grafana123`
- HTTP Metrics
  - Request count
  - Request duration (histogram)
  - Response status codes
  - Active requests
- Database Metrics
  - Query count
  - Query duration
  - Connection pool status
- Cache Metrics
  - Hit/miss ratio
  - Cache size
  - Eviction count
- Application Metrics
  - Memory usage
  - CPU usage
  - Event loop lag
```bash
curl http://localhost:5000/health
```

Response:

```json
{
  "status": "ok",
  "info": {
    "database": { "status": "up" },
    "cache": { "status": "up" }
  },
  "details": {
    "database": { "status": "up" },
    "cache": { "status": "up" }
  }
}
```

Checks if the application is ready to receive traffic:
- Database connectivity
- Cache availability
- Required migrations applied
```bash
npm run lint
```

```bash
npm run lint -- --fix
```

Located in `eslint.config.mjs`:
- TypeScript rules
- NestJS best practices
- Security rules
- Import sorting
- No lodash/underscore (prefer native JS)
```bash
npm run prettier
```

Located in `.prettierrc`:
- Single quotes
- No semicolons
- 100 character line width
- 2 space indentation
Automatically runs on git commit:
- Lints staged files
- Runs type checking
- Validates commit message format
Only lints files staged for commit:
```json
{
  "*.{ts,js}": ["eslint --fix"],
  "*.json": ["prettier --write"]
}
```

This project uses a hybrid approach based on Conventional Commits with mandatory scopes enforced by Commitlint.
```
<type>(<scope>): <subject>

<body>

<footer>
```

Important: The `<scope>` is mandatory and must be one of the allowed values.
- `feat`: New feature
- `fix`: Bug fix
- `docs`: Documentation changes
- `style`: Code style changes (formatting)
- `refactor`: Code refactoring
- `perf`: Performance improvements
- `test`: Adding or updating tests
- `build`: Build system changes
- `ci`: CI/CD changes
- `chore`: Other changes
Scopes are dynamically generated from the project structure (src/ directories) plus the following fixed scopes:
Project Structure Scopes: Automatically includes all directory names from:
- `src/core/*` - Core domain modules (user, role, permission, cat, etc.)
- `src/infra/*` - Infrastructure components (database, cache, http, email, etc.)
- `src/libs/*` - Shared libraries (token, event, i18n, etc.)
- `src/modules/*` - Application modules
- `src/utils/*` - Utility directories
Fixed Scopes:
- `remove` - Removing files or features
- `revert` - Reverting previous commits
- `conflict` - Resolving merge conflicts
- `config` - Configuration changes
- `entity` - Entity-related changes
- `utils` - Utility functions
- `deps` - Dependency updates
- `modules` - Module-level changes
- `test` - Test files
- `migration` - Database migrations
- `core` - Core layer changes
- `swagger` - API documentation
- `usecases` - Use case implementations
The scope is validated on commit using Commitlint configuration in commitlint.config.js. Invalid scopes will reject the commit.
To see all available scopes, check the project structure or run:
```bash
node -e "console.log(require('./commitlint.config.js').rules['scope-enum'][2])"
```

```
feat(user): add email verification feature
fix(auth): resolve token refresh issue
docs(readme): update API documentation section
test(user): add unit tests for user creation use case
refactor(database): optimize connection pooling
chore(deps): update nestjs to version 11.x
```

```
# Missing scope
feat: add new feature

# Invalid scope
feat(invalid-scope): add new feature

# These will be rejected by Commitlint
```

Automatic versioning and changelog generation based on commit messages:
- Analyzes commits
- Determines version bump (major/minor/patch)
- Generates changelog
- Creates GitHub release
- Publishes to npm (if configured)
```
nestjs-microservice-boilerplate-api/
├── .artillery/                        # Load testing configuration
├── .docker/                           # Docker-related files
├── .github/                           # GitHub Actions workflows
├── .husky/                            # Git hooks
├── .vscode/                           # VS Code settings
├── docs/                              # Additional documentation
├── scripts/                           # Utility scripts
├── src/
│   ├── core/                          # Business Logic Layer
│   │   ├── cat/                       # Example domain: Cat
│   │   │   ├── entity/                # Domain entities
│   │   │   │   └── cat.ts
│   │   │   ├── repository/            # Repository interfaces
│   │   │   │   └── cat.ts
│   │   │   └── use-cases/             # Business use cases
│   │   │       ├── cat-create.ts
│   │   │       ├── cat-update.ts
│   │   │       ├── cat-delete.ts
│   │   │       ├── cat-get-by-id.ts
│   │   │       ├── cat-list.ts
│   │   │       └── __tests__/         # Use case tests
│   │   ├── user/                      # User domain
│   │   ├── role/                      # Role domain
│   │   ├── permission/                # Permission domain
│   │   └── reset-password/            # Password reset domain
│   │
│   ├── infra/                         # Infrastructure Layer
│   │   ├── database/                  # Database implementations
│   │   │   ├── mongo/                 # MongoDB setup
│   │   │   │   ├── config.ts
│   │   │   │   ├── migrations/
│   │   │   │   ├── schemas/
│   │   │   │   └── service.ts
│   │   │   └── postgres/              # PostgreSQL setup
│   │   │       ├── config.ts
│   │   │       ├── migrations/
│   │   │       ├── schemas/
│   │   │       └── service.ts
│   │   ├── cache/                     # Cache implementations
│   │   │   ├── redis/
│   │   │   └── memory/
│   │   ├── http/                      # HTTP client service
│   │   ├── email/                     # Email service
│   │   │   ├── service.ts
│   │   │   └── templates/             # Email templates
│   │   ├── logger/                    # Logging service
│   │   ├── secrets/                   # Secrets management
│   │   └── repository/                # Generic repository implementations
│   │
│   ├── libs/                          # Shared Libraries
│   │   ├── token/                     # JWT token service
│   │   ├── event/                     # Event emitter
│   │   └── i18n/                      # Internationalization
│   │       ├── languages/
│   │       │   ├── en/
│   │       │   └── pt/
│   │       └── service.ts
│   │
│   ├── modules/                       # Application Modules
│   │   ├── cat/                       # Cat module
│   │   │   ├── adapter.ts             # Use case adapters
│   │   │   ├── controller.ts          # REST controller
│   │   │   ├── module.ts              # NestJS module
│   │   │   ├── repository.ts          # Repository implementation
│   │   │   └── swagger.ts             # Swagger documentation
│   │   ├── user/                      # User module
│   │   ├── role/                      # Role module
│   │   ├── permission/                # Permission module
│   │   ├── login/                     # Login module
│   │   ├── logout/                    # Logout module
│   │   ├── reset-password/            # Password reset module
│   │   └── health/                    # Health check module
│   │
│   ├── observables/                   # Cross-cutting Concerns
│   │   ├── filters/                   # Exception filters
│   │   │   └── http-exception.filter.ts
│   │   ├── guards/                    # Route guards
│   │   │   └── auth.guard.ts
│   │   ├── interceptors/              # Request/Response interceptors
│   │   │   ├── http-logger.interceptor.ts
│   │   │   ├── tracing.interceptor.ts
│   │   │   ├── metrics.interceptor.ts
│   │   │   └── request-timeout.interceptor.ts
│   │   └── middlewares/               # Express middlewares
│   │       └── authentication.middleware.ts
│   │
│   ├── utils/                         # Utility Functions
│   │   ├── decorators/                # Custom decorators
│   │   │   ├── role.decorator.ts
│   │   │   ├── validate-schema.decorator.ts
│   │   │   └── request-timeout.decorator.ts
│   │   ├── docs/                      # Documentation utilities
│   │   │   ├── swagger.ts
│   │   │   └── data/                  # Swagger example data
│   │   ├── entity.ts                  # Base entity class
│   │   ├── exception.ts               # Custom exceptions
│   │   ├── pagination.ts              # Pagination utilities
│   │   ├── search.ts                  # Search utilities
│   │   ├── sort.ts                    # Sort utilities
│   │   ├── tracing.ts                 # Tracing utilities
│   │   ├── tests.ts                   # Test utilities
│   │   └── validator.ts               # Zod validation helpers
│   │
│   ├── app.module.ts                  # Root application module
│   └── main.ts                        # Application entry point
│
├── test/                              # Test Configuration
│   └── initialization.ts              # Global test setup
├── .env                               # Environment variables
├── .nvmrc                             # Node version
├── commitlint.config.js               # Commit lint configuration
├── docker-compose.yml                 # Main docker compose
├── docker-compose-infra.yml           # Infrastructure services
├── Dockerfile                         # Application dockerfile
├── ecosystem.config.js                # PM2 configuration
├── eslint.config.mjs                  # ESLint configuration
├── jest.config.ts                     # Jest configuration
├── nest-cli.json                      # NestJS CLI configuration
├── package.json                       # Dependencies and scripts
├── tsconfig.json                      # TypeScript configuration
└── README.md                          # This file
```
**Core Layer** (`src/core`): Contains the business logic, independent of frameworks and external services. Includes:
- **Entities**: Domain models with business rules
- **Use Cases**: Application-specific business operations
- **Repository Interfaces**: Abstract data access contracts

**Infrastructure Layer** (`src/infra`): Implements external concerns and technical details:
- Database connections and schemas
- Third-party API clients
- Caching mechanisms
- Email services
- Logging infrastructure

**Libraries**: Reusable, framework-agnostic building blocks:
- Token management
- Event handling
- Internationalization

**Modules**: NestJS modules that wire everything together:
- Controllers for HTTP endpoints
- Dependency injection configuration
- Route definitions
- Swagger documentation

**Middlewares and Interceptors**: Cross-cutting concerns applied to the entire application:
- Authentication and authorization
- Request/response logging
- Error handling
- Performance monitoring
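The dependency rule described above can be sketched in a few lines of TypeScript. All names here are hypothetical illustrations, not the boilerplate's actual classes: the use case in the core layer depends only on a repository port, and any adapter (here an in-memory one) can satisfy it.

```typescript
// Core layer: a domain entity and a port (abstract contract, no framework imports)
class User {
  constructor(public readonly id: string, public readonly email: string) {}
}

interface UserRepository {
  findById(id: string): Promise<User | null>;
}

// Core layer: the use case depends only on the port, never on a concrete database
class GetUserUseCase {
  constructor(private readonly repository: UserRepository) {}

  async execute(id: string): Promise<User> {
    const user = await this.repository.findById(id);
    if (!user) throw new Error(`User ${id} not found`);
    return user;
  }
}

// Infrastructure layer: an adapter implementing the port
class InMemoryUserRepository implements UserRepository {
  private readonly users = new Map<string, User>();

  constructor(seed: User[] = []) {
    seed.forEach((u) => this.users.set(u.id, u));
  }

  async findById(id: string): Promise<User | null> {
    return this.users.get(id) ?? null;
  }
}
```

Because the use case only sees the interface, swapping the in-memory adapter for a PostgreSQL or MongoDB implementation requires no change to core code, which is what keeps the business logic testable in isolation.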
Restrict access to endpoints by role with the `@Roles` decorator:

```typescript
import { Roles } from '@/utils/decorators';

@Controller('admin')
export class AdminController {
  @Get('users')
  @Roles('admin', 'superadmin')
  async listUsers() {
    // Only accessible by admin and superadmin roles
  }
}
```

Override the default timeout for long-running endpoints with `@RequestTimeout`:

```typescript
import { RequestTimeout } from '@/utils/decorators';

@Controller('data')
export class DataController {
  @Get('export')
  @RequestTimeout(60000) // 60 seconds
  async exportLargeDataset() {
    // Long-running operation
  }
}
```

Validate request bodies against Zod schemas with `@ValidateSchema`:

```typescript
import { ValidateSchema } from '@/utils/decorators';
import { z } from 'zod';

const CreateProductSchema = z.object({
  name: z.string().min(1).max(200),
  price: z.number().positive(),
  description: z.string().optional()
});

@Controller('products')
export class ProductController {
  @Post()
  @ValidateSchema(CreateProductSchema)
  async create(@Body() data: z.infer<typeof CreateProductSchema>) {
    // Data is validated and type-safe
  }
}
```

Protect your services from cascading failures with a circuit breaker:
```typescript
import CircuitBreaker from 'opossum';

const options = {
  timeout: 3000, // If the call takes longer than 3 seconds, count it as a failure
  errorThresholdPercentage: 50, // Open the circuit if 50% of requests fail
  resetTimeout: 30000 // Try again after 30 seconds
};

const breaker = new CircuitBreaker(asyncFunction, options);
breaker.fire()
  .then(result => console.log(result))
  .catch(err => console.error(err));
```

Use the event emitter for decoupled communication:
```typescript
// Emit an event
this.eventEmitter.emit('user.created', {
  userId: user.id,
  email: user.email
});

// Listen to an event
@OnEvent('user.created')
handleUserCreated(payload: { userId: string; email: string }) {
  // Send welcome email
  this.emailService.sendWelcome(payload.email);
}
```

Run load tests with Artillery:
```bash
npm run test:load
```

Configure tests in `.artillery/config.yaml`.
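A minimal Artillery configuration might look like the following sketch; the target URL, phase durations, and endpoint are assumed values, not the project's actual defaults:

```yaml
# .artillery/config.yaml (illustrative values)
config:
  target: "http://localhost:3000"
  phases:
    - duration: 60      # run for 60 seconds
      arrivalRate: 10   # 10 new virtual users per second
scenarios:
  - flow:
      - get:
          url: "/health"
```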
PostgreSQL pool configuration:
```typescript
// In the PostgreSQL configuration
{
  host: process.env.POSTGRES_HOST,
  port: parseInt(process.env.POSTGRES_PORT, 10),
  poolSize: 20,
  maxQueryExecutionTime: 1000,
  extra: {
    max: 20, // Maximum pool size
    min: 5, // Minimum pool size
    idleTimeoutMillis: 30000
  }
}
```

Read through the cache first, falling back to the database on a miss:

```typescript
async getUser(id: string): Promise<User> {
  // Try cache first
  const cached = await this.cache.get(`user:${id}`);
  if (cached) return cached;

  // Cache miss - fetch from database
  const user = await this.repository.findById(id);

  // Update cache (TTL of 3600 seconds)
  await this.cache.set(`user:${id}`, user, 3600);
  return user;
}
```

Invalidate the cached entry on writes so stale data is never served:

```typescript
async updateUser(id: string, data: UpdateUserDTO): Promise<User> {
  const user = await this.repository.update(id, data);

  // Invalidate cache
  await this.cache.del(`user:${id}`);
  return user;
}
```

Access secrets securely through the secrets adapter:

```typescript
import { ISecretsAdapter } from '@/infra/secrets';

constructor(private readonly secrets: ISecretsAdapter) {}

async someMethod() {
  const apiKey = await this.secrets.get('EXTERNAL_API_KEY');
  // Use the secret
}
```

Contributions are welcome! Please follow these guidelines:
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Make your changes
4. Run tests: `npm run test`
5. Run the linter: `npm run lint`
6. Commit your changes: `git commit -m 'feat: add amazing feature'`
7. Push to the branch: `git push origin feature/amazing-feature`
8. Open a Pull Request
Follow Conventional Commits:
- Use `feat:` for new features
- Use `fix:` for bug fixes
- Use `docs:` for documentation
- Use `test:` for tests
- Use `refactor:` for code refactoring
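The convention above can be checked mechanically; the sketch below mirrors what a commitlint rule does with a deliberately simplified pattern (the regex and messages are illustrative, not the project's actual commitlint configuration):

```shell
# Validate a commit message against a simplified Conventional Commits pattern
check_msg() {
  echo "$1" | grep -qE '^(feat|fix|docs|test|refactor)(\([a-z0-9-]+\))?: .+' \
    && echo "ok: $1" || echo "rejected: $1"
}

check_msg "feat: add amazing feature"     # → ok: feat: add amazing feature
check_msg "fix(auth): handle expired tokens"  # → ok: fix(auth): handle expired tokens
check_msg "Added amazing feature"         # → rejected: Added amazing feature
```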
- Follow the existing code style
- Use TypeScript strict mode
- Write unit tests for new features
- Update documentation as needed
- Maintain 100% test coverage
- Update the README.md with details of changes if applicable
- Update the CHANGELOG.md following Keep a Changelog format
- Ensure all tests pass
- Request review from maintainers
- Squash commits before merging
This project is licensed under the MIT License - see the LICENSE file for details.
Copyright (c) 2024
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
Thanks to all contributors who have helped make this project better!
| Mike Lima 💻 🚧 |
- Documentation: docs/README.md
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Built with ❤️ by Mike Lima


