Merged
Changes from all commits
42 commits
35fb8ab  add bare workflow to publish image to ghcr (JunAishima, Sep 11, 2025)
f8d53b5  update branch to main, rename to same as other repos (JunAishima, Sep 16, 2025)
b1f46ee  add Dockerfile, pixi files, prefect.yaml file (JunAishima, Sep 17, 2025)
3e6f5e3  remove extra line (JunAishima, Oct 1, 2025)
4e80ae8  use Secrets class for retrieving API key (JunAishima, Oct 1, 2025)
b78e244  remove ENV var to pass in API key (JunAishima, Oct 1, 2025)
960edf3  set version number, remove Tiled API key (JunAishima, Oct 1, 2025)
26cc1c3  use prefect3-pixi-upgrade for now (JunAishima, Oct 1, 2025)
2bbf8da  rename build/push branch to current dev branch (JunAishima, Oct 1, 2025)
fc989c6  add missing file (JunAishima, Oct 1, 2025)
d81c147  remove unnecessary line (JunAishima, Oct 2, 2025)
a4c25de  add script for testing beamline (JunAishima, Oct 3, 2025)
2378fcc  remove posting message to Slack channel (JunAishima, Oct 3, 2025)
5d0d1ea  fix deployment to run to tst (JunAishima, Oct 3, 2025)
afb2435  fix name of API key - do not add full prefix (JunAishima, Oct 3, 2025)
f0dcb04  put Tiled API key into environment variable (JunAishima, Oct 3, 2025)
0897e36  update dependencies (JunAishima, Oct 3, 2025)
f57461e  update requirements (JunAishima, Oct 3, 2025)
d6437d9  remove API key from environment (JunAishima, Oct 3, 2025)
268b730  create utils to reduce number of Tiled clients (JunAishima, Oct 3, 2025)
2fa6388  fix parentheses (JunAishima, Oct 3, 2025)
ed6c13d  use load function (JunAishima, Oct 3, 2025)
1d302c1  refactor to get beamline/endstation-specific Tiled client (JunAishima, Oct 3, 2025)
1d2dba6  create another task to test getting the client from utils (JunAishima, Oct 3, 2025)
dbd40ee  missing import, use run logging (JunAishima, Oct 3, 2025)
0811677  make new clients each time get_tiled_client() is called (JunAishima, Oct 3, 2025)
9c01be5  refactor to remove raw from get_tiled_client() (JunAishima, Oct 6, 2025)
a573d8d  go back to using from_profile(..., api_key=api_key) (JunAishima, Oct 7, 2025)
7a40d29  fix bad import (JunAishima, Oct 7, 2025)
cf9ba09  add improt (JunAishima, Oct 8, 2025)
772daf3  give a dummy uid (JunAishima, Oct 8, 2025)
8d41d16  update to add tzdata to honor timezone (JunAishima, Oct 9, 2025)
acd6d16  fix punctuation (JunAishima, Oct 9, 2025)
a950aea  clean up syntax - no extra call to apt-get (JunAishima, Oct 9, 2025)
033dd70  fix pixi version in Dockerfile (JunAishima, Jan 13, 2026)
14c41e1  fix to publish on main, remove extra spaces (JunAishima, Jan 13, 2026)
17a0c17  update pixi dependencies, store pixi.lock, update prefect.yaml (JunAishima, Jan 13, 2026)
1a6432a  update test.py to default.py, update Dockerfile (JunAishima, Jan 13, 2026)
5f94bab  update deployment file to match others (JunAishima, Feb 4, 2026)
27f09b8  add linting and pre-commit files (JunAishima, Feb 4, 2026)
3602d36  LINT: linting fixes (JunAishima, Feb 4, 2026)
c260efc  LINT: fix more issues (JunAishima, Feb 4, 2026)
14 changes: 14 additions & 0 deletions .github/workflows/linting.yml
@@ -0,0 +1,14 @@
name: pre-commit

on:
  pull_request:
  push:
    branches: [main]

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4.3.0
      - uses: actions/setup-python@v5.6.0
      - uses: pre-commit/action@v3.0.1
59 changes: 59 additions & 0 deletions .github/workflows/publish-ghcr.yml
@@ -0,0 +1,59 @@
#
name: Create and publish a Docker image

# Configures this workflow to run every time a change is pushed to the branch called `release`.
on:
  push:
    branches: ["main"]

# Defines two custom environment variables for the workflow. These are used for the Container registry domain, and a name for the Docker image that this workflow builds.
env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

# There is a single job in this workflow. It's configured to run on the latest available version of Ubuntu.
jobs:
  build-and-push-image:
    runs-on: ubuntu-latest
    # Sets the permissions granted to the `GITHUB_TOKEN` for the actions in this job.
    permissions:
      contents: read
      packages: write
      attestations: write
      id-token: write
    #
    steps:
      - name: Checkout repository
        uses: actions/checkout@v5
      # Uses the `docker/login-action` action to log in to the Container registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
      - name: Log in to the Container registry
        uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      # This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
      - name: Extract metadata (tags, labels) for Docker
        id: meta
        uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
      # This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
      # It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage) in the README of the `docker/build-push-action` repository.
      # It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
      - name: Build and push Docker image
        id: push
        uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

      # This step generates an artifact attestation for the image, which is an unforgeable statement about where and how it was built. It increases supply chain security for people who consume the image. For more information, see [Using artifact attestations to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
      - name: Generate artifact attestation
        uses: actions/attest-build-provenance@v3
        with:
          subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME}}
          subject-digest: ${{ steps.push.outputs.digest }}
          push-to-registry: true
1 change: 0 additions & 1 deletion .gitignore
@@ -80,4 +80,3 @@ target/

#Ipython Notebook
.ipynb_checkpoints

92 changes: 92 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,92 @@
ci:
  autoupdate_commit_msg: "chore: update pre-commit hooks"
  autofix_commit_msg: "style: pre-commit fixes"

exclude: ^.cruft.json|.copier-answers.yml$

repos:
  - repo: https://github.com/adamchainz/blacken-docs
    rev: "1.19.1"
    hooks:
      - id: blacken-docs
        additional_dependencies: [black==24.*]

  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: "v5.0.0"
    hooks:
      - id: check-added-large-files
      - id: check-case-conflict
      - id: check-merge-conflict
      - id: check-symlinks
      - id: check-yaml
      - id: debug-statements
      - id: end-of-file-fixer
      - id: mixed-line-ending
      - id: name-tests-test
        args: ["--pytest-test-first"]
      - id: requirements-txt-fixer
      - id: trailing-whitespace

  - repo: https://github.com/pre-commit/pygrep-hooks
    rev: "v1.10.0"
    hooks:
      - id: rst-backticks
      - id: rst-directive-colons
      - id: rst-inline-touching-normal

  - repo: https://github.com/rbubley/mirrors-prettier
    rev: "v3.5.3"
    hooks:
      - id: prettier
        types_or: [yaml, markdown, html, css, scss, javascript, json]
        args: [--prose-wrap=always]

  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: "v0.11.11"
    hooks:
      - id: ruff
        args: ["--fix", "--show-fixes"]
      - id: ruff-format

  - repo: https://github.com/pre-commit/mirrors-mypy
    rev: "v1.15.0"
    hooks:
      - id: mypy
        files: src|tests
        args: []
        additional_dependencies:
          - pytest

  - repo: https://github.com/codespell-project/codespell
    rev: "v2.4.1"
    hooks:
      - id: codespell
        additional_dependencies:
          - tomli; python_version<'3.11'
        exclude: "pixi.*"

  - repo: https://github.com/shellcheck-py/shellcheck-py
    rev: "v0.10.0.1"
    hooks:
      - id: shellcheck

  - repo: local
    hooks:
      - id: disallow-caps
        name: Disallow improper capitalization
        language: pygrep
        entry: PyBind|Numpy|Cmake|CCache|Github|PyTest
        exclude: .pre-commit-config.yaml

  - repo: https://github.com/abravalheri/validate-pyproject
    rev: "v0.24.1"
    hooks:
      - id: validate-pyproject
        additional_dependencies: ["validate-pyproject-schema-store[all]"]

  - repo: https://github.com/python-jsonschema/check-jsonschema
    rev: "0.33.0"
    hooks:
      - id: check-dependabot
      - id: check-github-workflows
      - id: check-readthedocs
27 changes: 27 additions & 0 deletions Dockerfile
@@ -0,0 +1,27 @@
FROM ghcr.io/prefix-dev/pixi:0.57.0

ENV TZ="America/New_York"

RUN apt-get -y update && \
apt-get -y install git tzdata

COPY pixi.toml .
COPY pixi.lock .
# use `--locked` to ensure the lockfile is up to date with pixi.toml
RUN pixi install --locked
# create the shell-hook bash script to activate the environment
RUN pixi shell-hook -s bash > /shell-hook

ENV PYTHONUNBUFFERED=1

COPY default.py .

RUN mkdir /etc/tiled
RUN mkdir /.prefect -m 0777
RUN mkdir /repo -m 0777

RUN /bin/bash /shell-hook

#now reapply deployment to push the image that is being created
ENTRYPOINT ["pixi", "run"]
CMD ["python", "-m", "default", "arg"]
1 change: 0 additions & 1 deletion LICENSE
@@ -27,4 +27,3 @@ SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

1 change: 0 additions & 1 deletion README.md
@@ -1,4 +1,3 @@
# Workflows

Repository of workflows for the TST beamline.

6 changes: 4 additions & 2 deletions data_validation.py
@@ -1,13 +1,15 @@
 from prefect import task, flow, get_run_logger
+from prefect.blocks.system import Secret
 import time as ttime
 from tiled.client import from_profile


 @task(retries=2, retry_delay_seconds=10)
 def read_all_streams(uid, beamline_acronym):
     logger = get_run_logger()
-    tiled_client = from_profile("nsls2")
-    run = tiled_client[beamline_acronym]["raw"][uid]
+    api_key = Secret.load("tiled-tst-api-key").get()
+    cl = from_profile("nsls2", api_key=api_key)
+    run = cl["tst"]["raw"][uid]
     logger.info(f"Validating uid {run.start['uid']}")
     start_time = ttime.monotonic()
     for stream in run:
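Several commits in this PR (268b730, 1d302c1, 0811677, 9c01be5, a573d8d) describe a shared get_tiled_client() helper in a utils module that is not part of the diffs shown in this section. A minimal sketch of what such a helper might look like, assuming the same Secret block naming and "nsls2" Tiled profile used in data_validation.py; the module name, function signature, and default beamline are assumptions, not the actual file:

# utils.py (hypothetical sketch, not part of this diff)
from prefect.blocks.system import Secret
from tiled.client import from_profile


def get_tiled_client(beamline_acronym="tst"):
    """Return a fresh Tiled client scoped to one beamline's containers."""
    # The API key comes from a Prefect Secret block rather than an environment
    # variable, matching the pattern in data_validation.py.
    api_key = Secret.load(f"tiled-{beamline_acronym}-api-key").get()
    # A new client is created on every call (per commit 0811677).
    client = from_profile("nsls2", api_key=api_key)
    # The "raw" container is not selected here (per commit 9c01be5);
    # callers pick the container they need.
    return client[beamline_acronym]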
28 changes: 28 additions & 0 deletions default.py
@@ -0,0 +1,28 @@
from __future__ import annotations

import prefect
import subprocess
import sys
import tiled


def print_argument(argument_to_print=""):
    if argument_to_print:
        print(f"argument to print: {argument_to_print}")  # noqa: T201
    else:
        print("argument to print: EMPTY")  # noqa: T201


def info():
    print(f"Prefect info: {prefect.__version_info__}")
    print(f"Tiled info: {tiled.__version__}")
    output = subprocess.check_output(["pixi", "--version"])
    print(f"Pixi info: {output.decode().strip()}")


if __name__ == "__main__":
    info()
    if len(sys.argv) > 1:
        print_argument(sys.argv[1])
    else:
        print_argument()
6 changes: 3 additions & 3 deletions end_of_run_workflow.py
@@ -1,6 +1,6 @@
 from prefect import task, flow, get_run_logger
 from data_validation import data_validation
-from prefect2_test_flow import hello_world
+from test_extra_client import get_other_docs
 # from long_flow import long_flow


@@ -13,8 +13,8 @@ def log_completion():
 @flow
 def end_of_run_workflow(stop_doc):
     uid = stop_doc["run_start"]
-    hello_world()
+    # hello_world()
     data_validation(uid, return_state=True)
+    get_other_docs(uid)
     # long_flow(iterations=100, sleep_length=10)
     log_completion()

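end_of_run_workflow.py now imports get_other_docs from test_extra_client, another module not shown in this section; commit 1d2dba6 describes it as a task added to test getting the client from utils. A minimal sketch under that assumption, reusing the hypothetical get_tiled_client() helper above; the task body and log message are illustrative only:

# test_extra_client.py (hypothetical sketch; the real file is not shown here)
from prefect import task, get_run_logger

from utils import get_tiled_client  # assumed helper, see the sketch above


@task
def get_other_docs(uid):
    logger = get_run_logger()
    client = get_tiled_client()
    # Select the raw container and look the run up by its start-document uid.
    run = client["raw"][uid]
    logger.info(f"Run {uid} start document keys: {list(run.start)}")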
1 change: 0 additions & 1 deletion long_flow.py
@@ -1,4 +1,3 @@
import numpy as np
from prefect import task, flow
import time as ttime
