This repository is meant as an experimental and demonstration space for some elements of my Python development workflow. This is all highly opinionated and probably not optimal - it's just the result of building up habits and occasionally changing tooling over time.
I'm not releasing it as open source right now for the above reason. Current license for the whole thing is CC-BY 4.0, though if you're just pulling pieces from the code and config you are welcome to do so freely and without attribution.
I use pyenv and pyenv-virtualenv (install both with pyenv-installer). Hence all repositories have a `.python-version` file locally. I don't commit that file to shared repositories, so as not to add unnecessary files for those who use different environment management tooling.
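As a sketch of that setup (the Python version and environment name here are placeholders, not a recommendation):

```shell
# install an interpreter and create a named virtualenv from it
pyenv install 3.12.4
pyenv virtualenv 3.12.4 myproject-3.12

# inside the repository: write .python-version so the environment
# activates automatically on cd (this is the file I don't commit)
cd myproject
pyenv local myproject-3.12
```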
- In the past I've generally preferred to omit a `src` folder if there is only one base module in the repository, but I think the `src` layout is winning me over now.
- `pyproject.toml` for packaging. `setuptools` for build.
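A minimal sketch of what that combination looks like (the package name, versions, and dependencies here are illustrative, not from a real project):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"

[project]
name = "myproject"          # hypothetical package name
version = "0.1.0"
requires-python = ">=3.9"
dependencies = [
    "click>=8.0",           # unpinned, minimum version where known
]

[project.optional-dependencies]
dev = ["pytest", "flake8", "black"]

[tool.setuptools.packages.find]
where = ["src"]             # the src layout mentioned above
```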
Use semantic versioning. Depending on the collaboration workflow, either use a `__version__` attribute in the top-level `__init__.py` to declare the version, or use the git tag via setuptools-scm.
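For the `__version__` route, a small sketch (the package and helper are invented for illustration):

```python
# myproject/__init__.py (hypothetical package)
__version__ = "1.4.2"  # MAJOR.MINOR.PATCH per semantic versioning

def version_info(version=__version__):
    """Parse a semver string into a comparable (major, minor, patch) tuple."""
    return tuple(int(part) for part in version.split(".")[:3])
```

Tuple comparison then gives callers cheap version checks, e.g. `version_info() >= (1, 4, 0)`.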
When it matures, validate-pyproject looks like a useful tool.
I use pip and requirements.txt (sometimes multiple requirements files for different purposes, such as a separate requirements-dev.txt).
List unpinned dependencies (with minimum versions where known) in pyproject.toml, and then use pip-compile (part of pip-tools) or pip freeze (for informal things with no pyproject.toml) to generate pinned requirements.txt files.
Using pip-compile (assuming reqs are in pyproject.toml):
```shell
pip-compile -o requirements.txt pyproject.toml
pip-compile --extra dev -o requirements-dev.txt pyproject.toml
```
If you want to install pinned versions, pip install from the requirements files before (or after) installing the package. If you don't care, just pip install the package itself.
Note - might want to switch over to .in files instead of listing requirements directly in pyproject.toml - would enable better control e.g. https://github.com/jazzband/pip-tools?tab=readme-ov-file#workflow-for-layered-requirements
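The layered setup from that link looks roughly like this (filenames follow the pip-tools docs; contents are illustrative):

```
# requirements.in
click>=8.0

# dev-requirements.in
-c requirements.txt   # constrain dev pins to match the runtime pins
pytest
```

Compile `requirements.in` first, then `dev-requirements.in`, so the constraint file exists when the dev layer is resolved.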
I always use flake8, and for collaborative projects I always use black. I use the minimal black-compatibility config for flake8.
Note that black has editor integrations.
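That minimal compatibility config is something like the following (check black's current documentation before copying — the recommended ignore list has shifted over time):

```ini
# .flake8
[flake8]
max-line-length = 88
extend-ignore = E203   # whitespace before ':', conflicts with black's slice style
```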
pytest with liberal use of fixtures
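A sketch of the style (the fixture and helper here are made up for illustration):

```python
import pytest

def make_user(name="alice"):
    # plain helper so the construction logic is reusable outside tests too
    return {"name": name, "active": True}

@pytest.fixture
def user():
    # injected into any test that names 'user' as a parameter
    return make_user()

def test_new_user_is_active(user):
    assert user["active"]
```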
I use nox both to facilitate testing/building across multiple Python versions and to standardise test, lint, deploy, etc config across development environments and CI/CD.
Alternatives:
- tox: very old, still fairly common in the Python world, but I prefer nox's flexibility due to being configured in Python code
- just: haven't used this one but it looks very handy if you want a simple make-style way to save sets of commands that isn't Python-specific
Pre-commit is an amazing tool that uses git pre-commit hooks to automatically call whatever checks you want on your staged changes when you make a git commit. I love pre-commit!
Pre-commit hooks are configured in this repository for black and flake8 as well as a few miscellaneous checks for things like trailing whitespace. See .pre-commit-config.yaml. Note that if any of the hooks fail their checks of your changes or make changes themselves, you will need to re-stage the changed files and re-run the commit.
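For reference, a config of that shape looks roughly like this (the repository's actual .pre-commit-config.yaml is authoritative; the `rev` values here are placeholders — pin real tags in practice):

```yaml
# .pre-commit-config.yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0        # placeholder rev
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
  - repo: https://github.com/psf/black
    rev: 24.3.0        # placeholder rev
    hooks:
      - id: black
  - repo: https://github.com/pycqa/flake8
    rev: 7.0.0         # placeholder rev
    hooks:
      - id: flake8
```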
Don't forget to run `pre-commit install` on first checkout or on any changes to the pre-commit config.
My fave pre-commit hook isn't actually in this repository - nbstripout is a lifesaver when using Jupyter Notebooks.
Here are some of my faves:
- Click for command-line applications, though argparse is fine for a very simple interface or an informal script
- direnv/dotenv
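A tiny Click sketch (the command and option names are invented):

```python
import click

@click.command()
@click.option("--name", default="world", help="Who to greet.")
@click.option("--shout", is_flag=True, help="Uppercase the greeting.")
def greet(name, shout):
    """Print a friendly greeting."""
    message = f"Hello, {name}!"
    click.echo(message.upper() if shout else message)
```

Wired up as a console-script entry point (or invoked via `greet()` in a `__main__` block), Click generates `--help` output from the decorators and docstring for free.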
Type hints are essential for anything meant for use as a library.
Still playing with tools/systems - maybe mypy or pyright?
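The sort of thing a checker verifies — a small annotated function (illustrative names):

```python
from typing import Optional

def find_port(config: dict[str, int], service: str) -> Optional[int]:
    # mypy/pyright will flag callers that treat the result as a plain int
    # without handling the None case
    return config.get(service)
```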
Small scale: handful of markdown files.
Large scale: Sphinx.
Both mermaid.js and draw.io are invaluable for diagrams. For draw.io, commit the SVG to git; for mermaid, the diagram code lives directly in your markdown files.
... add this later