This project is a CI/CD system designed to work on repositories written in any programming language. It has been tested on Python, Java, and JavaScript repositories.
PyPI package: t4-cicd
This README is divided into two main sections:
- User Section: for those who want to use the application, this section provides setup instructions, requirements for external services, and basic usage commands.
- Developer Section: for contributors or developers working on this project, this section includes instructions for setting up the development environment, managing dependencies, running tests, and generating documentation.
The t4-cicd CLI application requires the following external services:
- MongoDB service
- Docker Engine service
- AWS S3 Service (user must use their own AWS account)
Install Docker Desktop from the official website; Docker Engine is included in the Docker Desktop installation. The t4-cicd CLI uses the Docker Engine to perform its operations.
- On macOS or Windows: ensure Docker Desktop is open and running before using the t4-cicd CLI application.
- On Linux: install Docker Engine directly via your package manager and ensure the Docker service (dockerd) is running.
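Before running the CLI, you can quickly confirm that the engine is reachable. A minimal check, assuming the docker client is on your PATH:

```bash
# Verify the Docker daemon is up and reachable
docker info

# Optional end-to-end check using a throwaway container
docker run --rm hello-world
```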
You can use a local or remote MongoDB service. Save the URL of the MongoDB service in a .env file in your project folder, or as an environment variable in your shell configuration file, e.g., ~/.bashrc or ~/.zshrc. The example below shows how to set MONGO_DB_URL for a local and a remote MongoDB service.
```bash
# Local
export MONGO_DB_URL="mongodb://localhost:27017/"

# Remote; replace <db_username>, <db_password>, and <cluster_address>
export MONGO_DB_URL="mongodb+srv://<db_username>:<db_password>@<cluster_address>/"
```

Follow these instructions to set up a local MongoDB service. Download the MongoDB Community Server from https://www.mongodb.com/try/download/community-kubernetes-operator and follow the instructions for self-managed deployments at https://www.mongodb.com/docs/manual/administration/install-community/. To view the database, use MongoDB Compass (included with the MongoDB installation).
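Once MONGO_DB_URL is set, you can verify connectivity before running the CLI. A minimal check, assuming you have mongosh installed:

```bash
# Ping the MongoDB deployment referenced by MONGO_DB_URL
mongosh "$MONGO_DB_URL" --eval "db.runCommand({ ping: 1 })"
```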
S3 bucket names must be globally unique across all AWS accounts. For the Upload Artifact feature to work, the user/developer needs to specify a unique name in their configuration file: artifact_upload_path: "<UNIQUE_BUCKET_NAME>".
```yaml
global:
  pipeline_name: "cicd_pipeline"
  docker: #...
  artifact_upload_path: <UNIQUE_BUCKET_NAME>
```

You need to copy and paste the following AWS credentials into your ~/.aws/credentials file.
The credentials you provide must be able to create a bucket and upload to S3.
```ini
aws_access_key_id=<your_id>
aws_secret_access_key=<your_secret_access_key>
aws_session_token=<your_session_token> # if applicable
```

You also need to set the target region in your .env file or set it as an environment variable.
```bash
DEFAULT_S3_LOC=<AWS_REGION>
# Example: set to us-west-2
DEFAULT_S3_LOC='us-west-2'
```
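Optionally, you can sanity-check the credentials and region with the AWS CLI (a separate install, used here only as a convenience check):

```bash
# Confirm the credentials resolve to a valid identity
aws sts get-caller-identity

# Confirm S3 is reachable with these credentials
aws s3 ls
```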
You can install directly from PyPI using pip. It is recommended to install it in a virtual environment; see the Developer Setup section for how to activate one.

```bash
pip install t4-cicd
```

There are two options for running the application:
- Run the command in a Git repository’s root directory to target that repository.
- Run the command in an empty directory and specify the target repository, which can be either local or remote.
Note: the target repository must contain a .cicd-pipelines folder with a pipelines.yml file. This file will be used as the default configuration if one is not explicitly specified in the command.
Check here for the syntax required to write the YAML file, and here for a sample pipeline configuration.
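If the target repository does not have this layout yet, a minimal way to scaffold it (the file contents must still follow the syntax documentation linked above):

```bash
# Create the expected configuration layout in the repository root
mkdir -p .cicd-pipelines
touch .cicd-pipelines/pipelines.yml  # populate using the syntax docs above
```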
The main commands of the CLI you can run are listed below. Use the --help flag for detailed information about each command.
```bash
# Check the pipeline configuration file;
# only works within a git repository
cid config

# Set up a repository
cid config set-repo

# Get repository info
cid config get-repo

# Override pipeline configuration values in the Datastore (MongoDB);
# only works within a git repository
cid config override

# Run a pipeline
cid pipeline run

# Retrieve a pipeline report
cid pipeline report
```
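As an illustrative end-to-end flow using only the commands above (the order is an assumption about typical usage, not a required sequence):

```bash
cd /path/to/your-repo   # root of a git repository with .cicd-pipelines/pipelines.yml
cid config              # validate the pipeline configuration file
cid pipeline run        # execute the pipeline
cid pipeline report     # retrieve the report for the run
```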
- The program will install three packages named cli, controller, and util. Ensure there is no naming conflict with your installed Python packages, or with names in the repository you are working on. Installing in a virtual environment is recommended for testing.
- When saving data into the Datastore, the ID of the current user session is used to identify the user. The recommendation is to run the program from a user account instead of the root account, so that the user's session details are uniquely identified and stored.
- The program needs write access to the parent folder of the directory where the command is executed, in order to write the debug log. For example, if the command is executed in /temp/t4-cicd, the program needs to write a debug.log at /temp.
- When errors occur, the program's stdout and stderr will give a brief summary of the error message.
- For error details to help in debugging, look at the debug.log file. By default, it is created in the parent directory of where the command was executed; i.e., if the command is executed in /temp/t4-cicd, the debug.log will be at /temp.
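For example, if the command was run from /temp/t4-cicd, the log can be inspected with:

```bash
# Show the most recent entries of the debug log written one level up
tail -n 50 /temp/debug.log
```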
First, follow the User Setup Instructions to set up the Docker Engine service, MongoDB service, and AWS S3 service.
Reference for Poetry and Click: here.
- Install Python 3.12 from the official website.
- Install pipx and poetry:

```bash
# Install pipx
python -m pip install --user pipx
python -m pipx ensurepath
# Then check the PATH entry added to your environment variables (on Windows),
# update lowercase to uppercase if necessary, and restart your terminal/OS.

# Install poetry using pipx
pipx install poetry
```
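To confirm both tools are available on your PATH after restarting the terminal:

```bash
pipx --version
poetry --version
```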
- Activate a virtual environment:

```bash
# Navigate to the project directory.
# First, create your virtual environment using venv.
# Name it something like myenv; if you use another name, add it to the .gitignore file.
# venv comes with the standard Python distribution.
python -m venv myenv

# Activate your virtual environment (bash terminal):
source <your_venv_dir>/Scripts/activate

# Example when you name the venv myenv:
# For Windows
source myenv/Scripts/activate
# For Mac / Unix
source myenv/bin/activate
```

The alternative way to activate the virtual environment is to run poetry shell. This is less recommended: the downside of this approach is that any command not starting with poetry will run using the packages of your base environment instead of the virtual environment.
```bash
poetry shell
```

```bash
# This will install all dependencies, and the project itself, into your venv
poetry install

# Run pylint for the src directory; the --fail-under score is 9.5
poetry run pylint ./src --fail-under=9.5

# Run pytest with coverage; note the current passing coverage threshold is set at 50%
poetry run pytest

# Run pydoctor to generate the API documentation. This will generate a folder
# named 'apidocs/' in the repository.
# This requires administrative rights (to link the files).
poetry run pydoctor

# To test that you can run the command
poetry run cid --help
# Or
cid --help
```

Reference for dependency management: please check the pyproject.toml file to ensure the dependency has been added/removed.
```bash
# Add a new dependency (general)
poetry add <dependency/lib/package>

# Add a new dependency for the development environment only
poetry add <dependency/lib/package> --group dev

# Remove a dependency
poetry remove <dependency/lib/package>

# Remove a dependency from the development environment
poetry remove <dependency/lib/package> --group dev
```

- Use the get_logger() function from util.common_utils.
- Provide arguments to change the logging level and/or directory.
- By default, a debug.log file will be created in the parent directory of where you run the command. You can change its location; i.e., if you run the command under the directory /temp/t4-cicd, the debug.log will be at /temp.
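A minimal smoke test of the logger, run from the project venv; this assumes get_logger() returns a standard Python logger object, which is not spelled out above:

```bash
# Writes an entry through the project logger; by default the debug.log
# lands in the parent directory, per the note above.
poetry run python -c "from util.common_utils import get_logger; get_logger().info('logger smoke test')"
```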
Check out the documents in the dev-docs/design-docs folder of this GitHub repository for the following:
- CLI Documentation
- Component Design
- Configuration File Documentation
- Data Store Documentation
- System Design
- Testing Documentation
For API documentation, look for the artifact named api-documentations generated by the last GitHub workflow action step. Backup API documentation is available at the shared location here; open index.html to start.
Please fork the repository, create a branch for your changes, and submit a pull request.