
Conversation

@dominic-r
Member

Closes: #5471

@dominic-r dominic-r added this to the Release 2025.12 milestone Dec 9, 2025
@dominic-r dominic-r self-assigned this Dec 9, 2025
@dominic-r dominic-r requested review from a team as code owners December 9, 2025 01:24
@dominic-r dominic-r added the area:frontend and area:docs labels Dec 9, 2025
@netlify

netlify bot commented Dec 9, 2025

Deploy Preview for authentik-storybook ready!

🔨 Latest commit: b83daa5
🔍 Latest deploy log: https://app.netlify.com/projects/authentik-storybook/deploys/6948a16a40e3c200086f6300
😎 Deploy Preview: https://deploy-preview-18686--authentik-storybook.netlify.app

@netlify

netlify bot commented Dec 9, 2025

Deploy Preview for authentik-integrations ready!

🔨 Latest commit: b83daa5
🔍 Latest deploy log: https://app.netlify.com/projects/authentik-integrations/deploys/6948a16a3dc4a10008439a43
😎 Deploy Preview: https://deploy-preview-18686--authentik-integrations.netlify.app

@netlify

netlify bot commented Dec 9, 2025

Deploy Preview for authentik-docs ready!

🔨 Latest commit: b83daa5
🔍 Latest deploy log: https://app.netlify.com/projects/authentik-docs/deploys/6948a16a561fc4000810a179
😎 Deploy Preview: https://deploy-preview-18686--authentik-docs.netlify.app

…hrough an exception

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
Signed-off-by: Dominic R <dominic@sdko.org>
@codecov

codecov bot commented Dec 9, 2025

❌ 1 Tests Failed:

Tests completed: 2872 | Failed: 1 | Passed: 2871 | Skipped: 2
View the top 1 failed test(s) by shortest run time
authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks::test_valid_disabled
Stack Traces | 2.41s run time
self = <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)
ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        # Raise a warning during app initialization (stored_app_configs is only
        # ever set during testing).
        if not apps.ready and not apps.stored_app_configs:
            warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning)
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)

.venv/lib/python3.13.../db/backends/utils.py:105: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adefdd0>
args = ('INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "...ovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...))
kwargs = {}

    def execute(self, *args, **kwargs):
        execute_total.labels(alias, vendor).inc()
        with (
            query_duration_seconds.labels(**labels).time(),
            ExceptionCounterByType(errors_total, extra_labels=labels),
        ):
>           return super().execute(*args, **kwargs)

.venv/lib/python3.13.../django_prometheus/db/common.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adefdd0>
query = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)

    def execute(
        self,
        query: Query,
        params: Params | None = None,
        *,
        prepare: bool | None = None,
        binary: bool | None = None,
    ) -> Self:
        """
        Execute a query or command to the database.
        """
        try:
            with self._conn.lock:
                self._conn.wait(
                    self._execute_gen(query, params, prepare=prepare, binary=binary)
                )
        except e._NO_TRACEBACK as ex:
>           raise ex.with_traceback(None)
E           psycopg.errors.ForeignKeyViolation: insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_task_task_id_a82f0835_fk_authentik"
E           DETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".

.venv/lib/python3.13............/site-packages/psycopg/cursor.py:97: ForeignKeyViolation

The above exception was the direct cause of the following exception:

instance_pk = UUID('8b456399-3fd0-43ea-998c-62b8ff97fd5a')

    @actor(description=_("Apply single blueprint."))
    def apply_blueprint(instance_pk: UUID):
        try:
            self = CurrentTask.get_task()
        except CurrentTaskNotFound:
            self = Task()
        self.set_uid(str(instance_pk))
        instance: BlueprintInstance | None = None
        try:
            instance: BlueprintInstance = BlueprintInstance.objects.filter(pk=instance_pk).first()
            if not instance:
                self.warning(f"Could not find blueprint {instance_pk}, skipping")
                return
            self.set_uid(slugify(instance.name))
            if not instance.enabled:
>               self.info(f"Blueprint {instance.name} is disabled, skipping")

.../blueprints/v1/tasks.py:207: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <Task: 2920b1f0-d466-4bfc-a904-6283e5d6be66>
message = 'Blueprint Vpuo4I9qLlTOvfVq8z5gWbna0YdNepcYslxTbU7f is disabled, skipping'
attributes = {}

    def info(self, message: str | Exception, **attributes) -> None:
>       self.log(self.uid, TaskStatus.INFO, message, **attributes)

authentik/tasks/models.py:138: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <Task: 2920b1f0-d466-4bfc-a904-6283e5d6be66>
logger = 'authentik.events.tasks.event_trigger_dispatch:vpuo4i9qlltovfvq8z5gwbna0ydnepcyslxtbu7f'
log_level = TaskStatus.INFO
message = 'Blueprint Vpuo4I9qLlTOvfVq8z5gWbna0YdNepcYslxTbU7f is disabled, skipping'
attributes = {}

    def log(
        self,
        logger: str,
        log_level: TaskStatus,
        message: str | Exception,
        **attributes,
    ) -> None:
>       TaskLog.create_from_log_event(
            self,
            self._make_log(
                logger,
                log_level,
                message,
                **attributes,
            ),
        )

authentik/tasks/models.py:127: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

cls = <class 'authentik.tasks.models.TaskLog'>
task = <Task: 2920b1f0-d466-4bfc-a904-6283e5d6be66>
log_event = LogEvent(event='Blueprint Vpuo4I9qLlTOvfVq8z5gWbna0YdNepcYslxTbU7f is disabled, skipping', log_level='info', logger='a...cyslxtbu7f', timestamp=datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), attributes={})

    @classmethod
    def create_from_log_event(cls, task: Task, log_event: LogEvent) -> Self | None:
        if not task.message:
            return None
>       return cls.objects.create(
            task=task,
            event=log_event.event,
            log_level=log_event.log_level,
            logger=log_event.logger,
            timestamp=log_event.timestamp,
            attributes=sanitize_item(log_event.attributes),
        )

authentik/tasks/models.py:172: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.models.manager.Manager object at 0x7f1b7490f750>, args = ()
kwargs = {'attributes': {}, 'event': 'Blueprint Vpuo4I9qLlTOvfVq8z5gWbna0YdNepcYslxTbU7f is disabled, skipping', 'log_level': 'info', 'logger': 'authentik.events.tasks.event_trigger_dispatch:vpuo4i9qlltovfvq8z5gwbna0ydnepcyslxtbu7f', ...}

    @wraps(method)
    def manager_method(self, *args, **kwargs):
>       return getattr(self.get_queryset(), name)(*args, **kwargs)

.venv/lib/python3.13.../db/models/manager.py:87: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <QuerySet []>
kwargs = {'attributes': {}, 'event': 'Blueprint Vpuo4I9qLlTOvfVq8z5gWbna0YdNepcYslxTbU7f is disabled, skipping', 'log_level': 'info', 'logger': 'authentik.events.tasks.event_trigger_dispatch:vpuo4i9qlltovfvq8z5gwbna0ydnepcyslxtbu7f', ...}
reverse_one_to_one_fields = frozenset()
obj = <TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>

    def create(self, **kwargs):
        """
        Create a new object with the given kwargs, saving it to the database
        and returning the created object.
        """
        reverse_one_to_one_fields = frozenset(kwargs).intersection(
            self.model._meta._reverse_one_to_one_field_names
        )
        if reverse_one_to_one_fields:
            raise ValueError(
                "The following fields do not exist in this model: %s"
                % ", ".join(reverse_one_to_one_fields)
            )
    
        obj = self.model(**kwargs)
        self._for_write = True
>       obj.save(force_insert=True, using=self.db)

.venv/lib/python3.13.../db/models/query.py:665: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>, force_insert = True
force_update = False, using = 'default', update_fields = None

    def save(
        self,
        *args,
        force_insert=False,
        force_update=False,
        using=None,
        update_fields=None,
    ):
        """
        Save the current instance. Override this in a subclass if you want to
        control the saving process.
    
        The 'force_insert' and 'force_update' parameters can be used to insist
        that the "save" must be an SQL insert or update (or equivalent for
        non-SQL backends), respectively. Normally, they should not be set.
        """
        # RemovedInDjango60Warning.
        if args:
            force_insert, force_update, using, update_fields = self._parse_save_params(
                *args,
                method_name="save",
                force_insert=force_insert,
                force_update=force_update,
                using=using,
                update_fields=update_fields,
            )
    
        self._prepare_related_fields_for_save(operation_name="save")
    
        using = using or router.db_for_write(self.__class__, instance=self)
        if force_insert and (force_update or update_fields):
            raise ValueError("Cannot force both insert and updating in model saving.")
    
        deferred_non_generated_fields = {
            f.attname
            for f in self._meta.concrete_fields
            if f.attname not in self.__dict__ and f.generated is False
        }
        if update_fields is not None:
            # If update_fields is empty, skip the save. We do also check for
            # no-op saves later on for inheritance cases. This bailout is
            # still needed for skipping signal sending.
            if not update_fields:
                return
    
            update_fields = frozenset(update_fields)
            field_names = self._meta._non_pk_concrete_field_names
            not_updatable_fields = update_fields.difference(field_names)
    
            if not_updatable_fields:
                raise ValueError(
                    "The following fields do not exist in this model, are m2m "
                    "fields, primary keys, or are non-concrete fields: %s"
                    % ", ".join(not_updatable_fields)
                )
    
        # If saving to the same database, and this model is deferred, then
        # automatically do an "update_fields" save on the loaded fields.
        elif (
            not force_insert
            and deferred_non_generated_fields
            and using == self._state.db
        ):
            field_names = set()
            pk_fields = self._meta.pk_fields
            for field in self._meta.concrete_fields:
                if field not in pk_fields and not hasattr(field, "through"):
                    field_names.add(field.attname)
            loaded_fields = field_names.difference(deferred_non_generated_fields)
            if loaded_fields:
                update_fields = frozenset(loaded_fields)
    
>       self.save_base(
            using=using,
            force_insert=force_insert,
            force_update=force_update,
            update_fields=update_fields,
        )

.venv/lib/python3.13.../db/models/base.py:902: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>, raw = False
force_insert = (<class 'authentik.tasks.models.TaskLog'>,), force_update = False
using = 'default', update_fields = None

    def save_base(
        self,
        raw=False,
        force_insert=False,
        force_update=False,
        using=None,
        update_fields=None,
    ):
        """
        Handle the parts of saving which should be done only once per save,
        yet need to be done in raw saves, too. This includes some sanity
        checks and signal sending.
    
        The 'raw' argument is telling save_base not to save any parent
        models and not to do any changes to the values before save. This
        is used by fixture loading.
        """
        using = using or router.db_for_write(self.__class__, instance=self)
        assert not (force_insert and (force_update or update_fields))
        assert update_fields is None or update_fields
        cls = origin = self.__class__
        # Skip proxies, but keep the origin as the proxy model.
        if cls._meta.proxy:
            cls = cls._meta.concrete_model
        meta = cls._meta
        if not meta.auto_created:
            pre_save.send(
                sender=origin,
                instance=self,
                raw=raw,
                using=using,
                update_fields=update_fields,
            )
        # A transaction isn't needed if one query is issued.
        if meta.parents:
            context_manager = transaction.atomic(using=using, savepoint=False)
        else:
            context_manager = transaction.mark_for_rollback_on_error(using=using)
        with context_manager:
            parent_inserted = False
            if not raw:
                # Validate force insert only when parents are inserted.
                force_insert = self._validate_force_insert(force_insert)
                parent_inserted = self._save_parents(
                    cls, using, update_fields, force_insert
                )
>           updated = self._save_table(
                raw,
                cls,
                force_insert or parent_inserted,
                force_update,
                using,
                update_fields,
            )

.venv/lib/python3.13.../db/models/base.py:1008: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>, raw = False
cls = <class 'authentik.tasks.models.TaskLog'>
force_insert = (<class 'authentik.tasks.models.TaskLog'>,), force_update = False
using = 'default', update_fields = None

    def _save_table(
        self,
        raw=False,
        cls=None,
        force_insert=False,
        force_update=False,
        using=None,
        update_fields=None,
    ):
        """
        Do the heavy-lifting involved in saving. Update or insert the data
        for a single table.
        """
        meta = cls._meta
        pk_fields = meta.pk_fields
        non_pks_non_generated = [
            f
            for f in meta.local_concrete_fields
            if f not in pk_fields and not f.generated
        ]
    
        if update_fields:
            non_pks_non_generated = [
                f
                for f in non_pks_non_generated
                if f.name in update_fields or f.attname in update_fields
            ]
    
        if not self._is_pk_set(meta):
            pk_val = meta.pk.get_pk_value_on_save(self)
            setattr(self, meta.pk.attname, pk_val)
        pk_set = self._is_pk_set(meta)
        if not pk_set and (force_update or update_fields):
            raise ValueError("Cannot force an update in save() with no primary key.")
        updated = False
        # Skip an UPDATE when adding an instance and primary key has a default.
        if (
            not raw
            and not force_insert
            and not force_update
            and self._state.adding
            and all(f.has_default() or f.has_db_default() for f in meta.pk_fields)
        ):
            force_insert = True
        # If possible, try an UPDATE. If that doesn't update anything, do an INSERT.
        if pk_set and not force_insert:
            base_qs = cls._base_manager.using(using)
            values = [
                (
                    f,
                    None,
                    (getattr(self, f.attname) if raw else f.pre_save(self, False)),
                )
                for f in non_pks_non_generated
            ]
            forced_update = update_fields or force_update
            pk_val = self._get_pk_val(meta)
            updated = self._do_update(
                base_qs, using, pk_val, values, update_fields, forced_update
            )
            if force_update and not updated:
                raise DatabaseError("Forced update did not affect any rows.")
            if update_fields and not updated:
                raise DatabaseError("Save with update_fields did not affect any rows.")
        if not updated:
            if meta.order_with_respect_to:
                # If this is a model with an order_with_respect_to
                # autopopulate the _order field
                field = meta.order_with_respect_to
                filter_args = field.get_filter_kwargs_for_object(self)
                self._order = (
                    cls._base_manager.using(using)
                    .filter(**filter_args)
                    .aggregate(
                        _order__max=Coalesce(
                            ExpressionWrapper(
                                Max("_order") + Value(1), output_field=IntegerField()
                            ),
                            Value(0),
                        ),
                    )["_order__max"]
                )
            fields = [
                f
                for f in meta.local_concrete_fields
                if not f.generated and (pk_set or f is not meta.auto_field)
            ]
            returning_fields = meta.db_returning_fields
>           results = self._do_insert(
                cls._base_manager, using, fields, returning_fields, raw
            )

.venv/lib/python3.13.../db/models/base.py:1169: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>
manager = <django.db.models.manager.Manager object at 0x7f1b748f73d0>
using = 'default'
fields = [<django.db.models.fields.UUIDField: id>, <django.db.models.fields.related.ForeignKey: task>, <django.db.models.fields...ield: log_level>, <django.db.models.fields.TextField: logger>, <django.db.models.fields.DateTimeField: timestamp>, ...]
returning_fields = [], raw = False

    def _do_insert(self, manager, using, fields, returning_fields, raw):
        """
        Do an INSERT. If returning_fields is defined then this method should
        return the newly created data for the model.
        """
>       return manager._insert(
            [self],
            fields=fields,
            returning_fields=returning_fields,
            using=using,
            raw=raw,
        )

.venv/lib/python3.13.../db/models/base.py:1210: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.models.manager.Manager object at 0x7f1b748f73d0>
args = ([<TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>],)
kwargs = {'fields': [<django.db.models.fields.UUIDField: id>, <django.db.models.fields.related.ForeignKey: task>, <django.db.mo...r>, <django.db.models.fields.DateTimeField: timestamp>, ...], 'raw': False, 'returning_fields': [], 'using': 'default'}

    @wraps(method)
    def manager_method(self, *args, **kwargs):
>       return getattr(self.get_queryset(), name)(*args, **kwargs)

.venv/lib/python3.13.../db/models/manager.py:87: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <QuerySet []>, objs = [<TaskLog: 6306e120-c24e-4480-9032-7fdf6b845e04>]
fields = [<django.db.models.fields.UUIDField: id>, <django.db.models.fields.related.ForeignKey: task>, <django.db.models.fields...ield: log_level>, <django.db.models.fields.TextField: logger>, <django.db.models.fields.DateTimeField: timestamp>, ...]
returning_fields = [], raw = False, using = 'default', on_conflict = None
update_fields = None, unique_fields = None

    def _insert(
        self,
        objs,
        fields,
        returning_fields=None,
        raw=False,
        using=None,
        on_conflict=None,
        update_fields=None,
        unique_fields=None,
    ):
        """
        Insert a new record for the given model. This provides an interface to
        the InsertQuery class and is how Model.save() is implemented.
        """
        self._for_write = True
        if using is None:
            using = self.db
        query = sql.InsertQuery(
            self.model,
            on_conflict=on_conflict,
            update_fields=update_fields,
            unique_fields=unique_fields,
        )
        query.insert_values(fields, objs, raw=raw)
>       return query.get_compiler(using=using).execute_sql(returning_fields)

.venv/lib/python3.13.../db/models/query.py:1873: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <SQLInsertCompiler model=TaskLog connection=<DatabaseWrapper vendor='postgresql' alias='default'> using='default'>
returning_fields = []

    def execute_sql(self, returning_fields=None):
        assert not (
            returning_fields
            and len(self.query.objs) != 1
            and not self.connection.features.can_return_rows_from_bulk_insert
        )
        opts = self.query.get_meta()
        self.returning_fields = returning_fields
        cols = []
        with self.connection.cursor() as cursor:
            for sql, params in self.as_sql():
>               cursor.execute(sql, params)

.venv/lib/python3.13.../models/sql/compiler.py:1882: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (<django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>, 'INSERT INTO "authentik_tasks_tasklog" ("id", "tas...ovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...))
kwargs = {}

    def runner(*args: "P.args", **kwargs: "P.kwargs"):
        # type: (...) -> R
        if sentry_sdk.get_client().get_integration(integration) is None:
            return original_function(*args, **kwargs)
    
>       return sentry_patched_function(*args, **kwargs)

.venv/lib/python3.13....../site-packages/sentry_sdk/utils.py:1816: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)

    @ensure_integration_enabled(DjangoIntegration, real_execute)
    def execute(self, sql, params=None):
        # type: (CursorWrapper, Any, Optional[Any]) -> Any
        with record_sql_queries(
            cursor=self.cursor,
            query=sql,
            params_list=params,
            paramstyle="format",
            executemany=False,
            span_origin=DjangoIntegration.origin_db,
        ) as span:
            _set_db_data(span, self)
>           result = real_execute(self, sql, params)

.venv/lib/python3.13.../integrations/django/__init__.py:651: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)

    def execute(self, sql, params=None):
>       return self._execute_with_wrappers(
            sql, params, many=False, executor=self._execute
        )

.venv/lib/python3.13.../db/backends/utils.py:79: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)
many = False
executor = <bound method CursorWrapper._execute of <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>>

    def _execute_with_wrappers(self, sql, params, many, executor):
        context = {"connection": self.db, "cursor": self}
        for wrapper in reversed(self.db.execute_wrappers):
            executor = functools.partial(wrapper, executor)
>       return executor(sql, params, many, context)

.venv/lib/python3.13.../db/backends/utils.py:92: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)
ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        # Raise a warning during app initialization (stored_app_configs is only
        # ever set during testing).
        if not apps.ready and not apps.stored_app_configs:
            warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning)
        self.db.validate_no_broken_transaction()
>       with self.db.wrap_database_errors:

.venv/lib/python3.13.../db/backends/utils.py:100: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.utils.DatabaseErrorWrapper object at 0x7f1b80b8e7b0>
exc_type = <class 'psycopg.errors.ForeignKeyViolation'>
exc_value = ForeignKeyViolation('insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_ta...entik"\nDETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".')
traceback = <traceback object at 0x7f1b2249b000>

    def __exit__(self, exc_type, exc_value, traceback):
        if exc_type is None:
            return
        for dj_exc_type in (
            DataError,
            OperationalError,
            IntegrityError,
            InternalError,
            ProgrammingError,
            NotSupportedError,
            DatabaseError,
            InterfaceError,
            Error,
        ):
            db_exc_type = getattr(self.wrapper.Database, dj_exc_type.__name__)
            if issubclass(exc_type, db_exc_type):
                dj_exc_value = dj_exc_type(*exc_value.args)
                # Only set the 'errors_occurred' flag for errors that may make
                # the connection unusable.
                if dj_exc_type not in (DataError, IntegrityError):
                    self.wrapper.errors_occurred = True
>               raise dj_exc_value.with_traceback(traceback) from exc_value

.venv/lib/python3.13.../django/db/utils.py:91: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)
ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f1b6a4b2b10>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        # Raise a warning during app initialization (stored_app_configs is only
        # ever set during testing).
        if not apps.ready and not apps.stored_app_configs:
            warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning)
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)

.venv/lib/python3.13.../db/backends/utils.py:105: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adefdd0>
args = ('INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "...ovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...))
kwargs = {}

    def execute(self, *args, **kwargs):
        execute_total.labels(alias, vendor).inc()
        with (
            query_duration_seconds.labels(**labels).time(),
            ExceptionCounterByType(errors_total, extra_labels=labels),
        ):
>           return super().execute(*args, **kwargs)

.venv/lib/python3.13.../django_prometheus/db/common.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adefdd0>
query = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('6306e120-c24e-4480-9032-7fdf6b845e04'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'Blueprint Vpuo4I9qLlTOvf...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 709981, tzinfo=datetime.timezone.utc), ...)

    def execute(
        self,
        query: Query,
        params: Params | None = None,
        *,
        prepare: bool | None = None,
        binary: bool | None = None,
    ) -> Self:
        """
        Execute a query or command to the database.
        """
        try:
            with self._conn.lock:
                self._conn.wait(
                    self._execute_gen(query, params, prepare=prepare, binary=binary)
                )
        except e._NO_TRACEBACK as ex:
>           raise ex.with_traceback(None)
E           django.db.utils.IntegrityError: insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_task_task_id_a82f0835_fk_authentik"
E           DETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".

.venv/lib/python3.13............/site-packages/psycopg/cursor.py:97: IntegrityError

During handling of the above exception, another exception occurred:

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)
ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        # Raise a warning during app initialization (stored_app_configs is only
        # ever set during testing).
        if not apps.ready and not apps.stored_app_configs:
            warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning)
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)

.venv/lib/python3.13.../db/backends/utils.py:105: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adee750>
args = ('INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "...ovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...))
kwargs = {}

    def execute(self, *args, **kwargs):
        execute_total.labels(alias, vendor).inc()
        with (
            query_duration_seconds.labels(**labels).time(),
            ExceptionCounterByType(errors_total, extra_labels=labels),
        ):
>           return super().execute(*args, **kwargs)

.venv/lib/python3.13.../django_prometheus/db/common.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adee750>
query = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)

    def execute(
        self,
        query: Query,
        params: Params | None = None,
        *,
        prepare: bool | None = None,
        binary: bool | None = None,
    ) -> Self:
        """
        Execute a query or command to the database.
        """
        try:
            with self._conn.lock:
                self._conn.wait(
                    self._execute_gen(query, params, prepare=prepare, binary=binary)
                )
        except e._NO_TRACEBACK as ex:
>           raise ex.with_traceback(None)
E           psycopg.errors.ForeignKeyViolation: insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_task_task_id_a82f0835_fk_authentik"
E           DETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".

.venv/lib/python3.13............/site-packages/psycopg/cursor.py:97: ForeignKeyViolation

The above exception was the direct cause of the following exception:

self = <unittest.case._Outcome object at 0x7f1b74227d20>
test_case = <authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks testMethod=test_valid_disabled>
subTest = False

    @contextlib.contextmanager
    def testPartExecutor(self, test_case, subTest=False):
        old_success = self.success
        self.success = True
        try:
>           yield

.../hostedtoolcache/Python/3.13.11............/x64/lib/python3.13/unittest/case.py:58: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks testMethod=test_valid_disabled>
result = <TestCaseFunction test_valid_disabled>

    def run(self, result=None):
        if result is None:
            result = self.defaultTestResult()
            startTestRun = getattr(result, 'startTestRun', None)
            stopTestRun = getattr(result, 'stopTestRun', None)
            if startTestRun is not None:
                startTestRun()
        else:
            stopTestRun = None
    
        result.startTest(self)
        try:
            testMethod = getattr(self, self._testMethodName)
            if (getattr(self.__class__, "__unittest_skip__", False) or
                getattr(testMethod, "__unittest_skip__", False)):
                # If the class or method was skipped.
                skip_why = (getattr(self.__class__, '__unittest_skip_why__', '')
                            or getattr(testMethod, '__unittest_skip_why__', ''))
                _addSkip(result, self, skip_why)
                return result
    
            expecting_failure = (
                getattr(self, "__unittest_expecting_failure__", False) or
                getattr(testMethod, "__unittest_expecting_failure__", False)
            )
            outcome = _Outcome(result)
            start_time = time.perf_counter()
            try:
                self._outcome = outcome
    
                with outcome.testPartExecutor(self):
                    self._callSetUp()
                if outcome.success:
                    outcome.expecting_failure = expecting_failure
                    with outcome.testPartExecutor(self):
>                       self._callTestMethod(testMethod)

.../hostedtoolcache/Python/3.13.11............/x64/lib/python3.13/unittest/case.py:651: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks testMethod=test_valid_disabled>
method = <bound method TestBlueprintsV1Tasks.test_valid_disabled of <authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks testMethod=test_valid_disabled>>

    def _callTestMethod(self, method):
>       if method() is not None:

.../hostedtoolcache/Python/3.13.11............/x64/lib/python3.13/unittest/case.py:606: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (<authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks testMethod=test_valid_disabled>,)
kwds = {}

    @wraps(func)
    def inner(*args, **kwds):
        with self._recreate_cm():
>           return func(*args, **kwds)

.../hostedtoolcache/Python/3.13.11............/x64/lib/python3.13/contextlib.py:85: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <authentik.blueprints.tests.test_v1_tasks.TestBlueprintsV1Tasks testMethod=test_valid_disabled>

    @CONFIG.patch("blueprints_dir", TMP)
    def test_valid_disabled(self):
        """Test valid file"""
        with NamedTemporaryFile(mode="w+", suffix=".yaml", dir=TMP) as file:
            file.write(
                dump(
                    {
                        "version": 1,
                        "entries": [],
                    }
                )
            )
            file.flush()
            instance: BlueprintInstance = BlueprintInstance.objects.create(
                name=generate_id(),
                path=file.name,
                enabled=False,
                status=BlueprintInstanceStatus.UNKNOWN,
            )
            instance.refresh_from_db()
            self.assertEqual(instance.last_applied_hash, "")
            self.assertEqual(
                instance.status,
                BlueprintInstanceStatus.UNKNOWN,
            )
>           apply_blueprint(instance.pk)

.../blueprints/tests/test_v1_tasks.py:152: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = Actor(<function apply_blueprint at 0x7f1b803bac00>, queue_name='default', actor_name='authentik.blueprints.v1.tasks.apply_blueprint')
args = (UUID('8b456399-3fd0-43ea-998c-62b8ff97fd5a'),), kwargs = {}
start = 876.952727505, delta = 0.014526233999959004

    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> Any | R | Awaitable[R]:
        """Synchronously call this actor.
    
        Parameters:
          *args: Positional arguments to send to the actor.
          **kwargs: Keyword arguments to send to the actor.
    
        Returns:
          Whatever the underlying function backing this actor returns.
        """
        try:
            self.logger.debug("Received args=%r kwargs=%r.", args, kwargs)
            start = time.perf_counter()
>           return self.fn(*args, **kwargs)

.venv/lib/python3.13.../site-packages/dramatiq/actor.py:185: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

instance_pk = UUID('8b456399-3fd0-43ea-998c-62b8ff97fd5a')

    @actor(description=_("Apply single blueprint."))
    def apply_blueprint(instance_pk: UUID):
        try:
            self = CurrentTask.get_task()
        except CurrentTaskNotFound:
            self = Task()
        self.set_uid(str(instance_pk))
        instance: BlueprintInstance | None = None
        try:
            instance: BlueprintInstance = BlueprintInstance.objects.filter(pk=instance_pk).first()
            if not instance:
                self.warning(f"Could not find blueprint {instance_pk}, skipping")
                return
            self.set_uid(slugify(instance.name))
            if not instance.enabled:
                self.info(f"Blueprint {instance.name} is disabled, skipping")
                return
            blueprint_content = instance.retrieve()
            file_hash = sha512(blueprint_content.encode()).hexdigest()
            importer = Importer.from_string(blueprint_content, instance.context)
            if importer.blueprint.metadata:
                instance.metadata = asdict(importer.blueprint.metadata)
            valid, logs = importer.validate()
            if not valid:
                instance.status = BlueprintInstanceStatus.ERROR
                instance.save()
                self.logs(logs)
                return
            with capture_logs() as logs:
                applied = importer.apply()
                if not applied:
                    instance.status = BlueprintInstanceStatus.ERROR
                    instance.save()
                    self.logs(logs)
                    return
            instance.status = BlueprintInstanceStatus.SUCCESSFUL
            instance.last_applied_hash = file_hash
            instance.last_applied = now()
        except (
            OSError,
            DatabaseError,
            ProgrammingError,
            InternalError,
            BlueprintRetrievalFailed,
            EntryInvalidError,
        ) as exc:
            if instance:
                instance.status = BlueprintInstanceStatus.ERROR
>           self.error(exc)

.../blueprints/v1/tasks.py:240: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <Task: 2920b1f0-d466-4bfc-a904-6283e5d6be66>
message = IntegrityError('insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_t...entik"\nDETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".')
attributes = {}

    def error(self, message: str | Exception, **attributes) -> None:
>       self.log(self.uid, TaskStatus.ERROR, message, **attributes)

authentik/tasks/models.py:144: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <Task: 2920b1f0-d466-4bfc-a904-6283e5d6be66>
logger = 'authentik.events.tasks.event_trigger_dispatch:vpuo4i9qlltovfvq8z5gwbna0ydnepcyslxtbu7f'
log_level = TaskStatus.ERROR
message = IntegrityError('insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_t...entik"\nDETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".')
attributes = {}

    def log(
        self,
        logger: str,
        log_level: TaskStatus,
        message: str | Exception,
        **attributes,
    ) -> None:
>       TaskLog.create_from_log_event(
            self,
            self._make_log(
                logger,
                log_level,
                message,
                **attributes,
            ),
        )

authentik/tasks/models.py:127: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

cls = <class 'authentik.tasks.models.TaskLog'>
task = <Task: 2920b1f0-d466-4bfc-a904-6283e5d6be66>
log_event = LogEvent(event='insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_t...python3.13............/site-packages/psycopg/cursor.py', 'lineno': 97, 'name': 'execute'}], 'is_group': False, 'exceptions': []}]})

    @classmethod
    def create_from_log_event(cls, task: Task, log_event: LogEvent) -> Self | None:
        if not task.message:
            return None
>       return cls.objects.create(
            task=task,
            event=log_event.event,
            log_level=log_event.log_level,
            logger=log_event.logger,
            timestamp=log_event.timestamp,
            attributes=sanitize_item(log_event.attributes),
        )

authentik/tasks/models.py:172: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.models.manager.Manager object at 0x7f1b7490f750>, args = ()
kwargs = {'attributes': {'exception': [{'exc_notes': [], 'exc_type': 'IntegrityError', 'exc_value': 'insert or update on table ...vel': 'error', 'logger': 'authentik.events.tasks.event_trigger_dispatch:vpuo4i9qlltovfvq8z5gwbna0ydnepcyslxtbu7f', ...}

    @wraps(method)
    def manager_method(self, *args, **kwargs):
>       return getattr(self.get_queryset(), name)(*args, **kwargs)

.venv/lib/python3.13.../db/models/manager.py:87: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <QuerySet []>
kwargs = {'attributes': {'exception': [{'exc_notes': [], 'exc_type': 'IntegrityError', 'exc_value': 'insert or update on table ...vel': 'error', 'logger': 'authentik.events.tasks.event_trigger_dispatch:vpuo4i9qlltovfvq8z5gwbna0ydnepcyslxtbu7f', ...}
reverse_one_to_one_fields = frozenset()
obj = <TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>

    def create(self, **kwargs):
        """
        Create a new object with the given kwargs, saving it to the database
        and returning the created object.
        """
        reverse_one_to_one_fields = frozenset(kwargs).intersection(
            self.model._meta._reverse_one_to_one_field_names
        )
        if reverse_one_to_one_fields:
            raise ValueError(
                "The following fields do not exist in this model: %s"
                % ", ".join(reverse_one_to_one_fields)
            )
    
        obj = self.model(**kwargs)
        self._for_write = True
>       obj.save(force_insert=True, using=self.db)

.venv/lib/python3.13.../db/models/query.py:665: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>, force_insert = True
force_update = False, using = 'default', update_fields = None

    def save(
        self,
        *args,
        force_insert=False,
        force_update=False,
        using=None,
        update_fields=None,
    ):
        """
        Save the current instance. Override this in a subclass if you want to
        control the saving process.
    
        The 'force_insert' and 'force_update' parameters can be used to insist
        that the "save" must be an SQL insert or update (or equivalent for
        non-SQL backends), respectively. Normally, they should not be set.
        """
        # RemovedInDjango60Warning.
        if args:
            force_insert, force_update, using, update_fields = self._parse_save_params(
                *args,
                method_name="save",
                force_insert=force_insert,
                force_update=force_update,
                using=using,
                update_fields=update_fields,
            )
    
        self._prepare_related_fields_for_save(operation_name="save")
    
        using = using or router.db_for_write(self.__class__, instance=self)
        if force_insert and (force_update or update_fields):
            raise ValueError("Cannot force both insert and updating in model saving.")
    
        deferred_non_generated_fields = {
            f.attname
            for f in self._meta.concrete_fields
            if f.attname not in self.__dict__ and f.generated is False
        }
        if update_fields is not None:
            # If update_fields is empty, skip the save. We do also check for
            # no-op saves later on for inheritance cases. This bailout is
            # still needed for skipping signal sending.
            if not update_fields:
                return
    
            update_fields = frozenset(update_fields)
            field_names = self._meta._non_pk_concrete_field_names
            not_updatable_fields = update_fields.difference(field_names)
    
            if not_updatable_fields:
                raise ValueError(
                    "The following fields do not exist in this model, are m2m "
                    "fields, primary keys, or are non-concrete fields: %s"
                    % ", ".join(not_updatable_fields)
                )
    
        # If saving to the same database, and this model is deferred, then
        # automatically do an "update_fields" save on the loaded fields.
        elif (
            not force_insert
            and deferred_non_generated_fields
            and using == self._state.db
        ):
            field_names = set()
            pk_fields = self._meta.pk_fields
            for field in self._meta.concrete_fields:
                if field not in pk_fields and not hasattr(field, "through"):
                    field_names.add(field.attname)
            loaded_fields = field_names.difference(deferred_non_generated_fields)
            if loaded_fields:
                update_fields = frozenset(loaded_fields)
    
>       self.save_base(
            using=using,
            force_insert=force_insert,
            force_update=force_update,
            update_fields=update_fields,
        )

.venv/lib/python3.13.../db/models/base.py:902: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>, raw = False
force_insert = (<class 'authentik.tasks.models.TaskLog'>,), force_update = False
using = 'default', update_fields = None

    def save_base(
        self,
        raw=False,
        force_insert=False,
        force_update=False,
        using=None,
        update_fields=None,
    ):
        """
        Handle the parts of saving which should be done only once per save,
        yet need to be done in raw saves, too. This includes some sanity
        checks and signal sending.
    
        The 'raw' argument is telling save_base not to save any parent
        models and not to do any changes to the values before save. This
        is used by fixture loading.
        """
        using = using or router.db_for_write(self.__class__, instance=self)
        assert not (force_insert and (force_update or update_fields))
        assert update_fields is None or update_fields
        cls = origin = self.__class__
        # Skip proxies, but keep the origin as the proxy model.
        if cls._meta.proxy:
            cls = cls._meta.concrete_model
        meta = cls._meta
        if not meta.auto_created:
            pre_save.send(
                sender=origin,
                instance=self,
                raw=raw,
                using=using,
                update_fields=update_fields,
            )
        # A transaction isn't needed if one query is issued.
        if meta.parents:
            context_manager = transaction.atomic(using=using, savepoint=False)
        else:
            context_manager = transaction.mark_for_rollback_on_error(using=using)
        with context_manager:
            parent_inserted = False
            if not raw:
                # Validate force insert only when parents are inserted.
                force_insert = self._validate_force_insert(force_insert)
                parent_inserted = self._save_parents(
                    cls, using, update_fields, force_insert
                )
>           updated = self._save_table(
                raw,
                cls,
                force_insert or parent_inserted,
                force_update,
                using,
                update_fields,
            )

.venv/lib/python3.13.../db/models/base.py:1008: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>, raw = False
cls = <class 'authentik.tasks.models.TaskLog'>
force_insert = (<class 'authentik.tasks.models.TaskLog'>,), force_update = False
using = 'default', update_fields = None

    def _save_table(
        self,
        raw=False,
        cls=None,
        force_insert=False,
        force_update=False,
        using=None,
        update_fields=None,
    ):
        """
        Do the heavy-lifting involved in saving. Update or insert the data
        for a single table.
        """
        meta = cls._meta
        pk_fields = meta.pk_fields
        non_pks_non_generated = [
            f
            for f in meta.local_concrete_fields
            if f not in pk_fields and not f.generated
        ]
    
        if update_fields:
            non_pks_non_generated = [
                f
                for f in non_pks_non_generated
                if f.name in update_fields or f.attname in update_fields
            ]
    
        if not self._is_pk_set(meta):
            pk_val = meta.pk.get_pk_value_on_save(self)
            setattr(self, meta.pk.attname, pk_val)
        pk_set = self._is_pk_set(meta)
        if not pk_set and (force_update or update_fields):
            raise ValueError("Cannot force an update in save() with no primary key.")
        updated = False
        # Skip an UPDATE when adding an instance and primary key has a default.
        if (
            not raw
            and not force_insert
            and not force_update
            and self._state.adding
            and all(f.has_default() or f.has_db_default() for f in meta.pk_fields)
        ):
            force_insert = True
        # If possible, try an UPDATE. If that doesn't update anything, do an INSERT.
        if pk_set and not force_insert:
            base_qs = cls._base_manager.using(using)
            values = [
                (
                    f,
                    None,
                    (getattr(self, f.attname) if raw else f.pre_save(self, False)),
                )
                for f in non_pks_non_generated
            ]
            forced_update = update_fields or force_update
            pk_val = self._get_pk_val(meta)
            updated = self._do_update(
                base_qs, using, pk_val, values, update_fields, forced_update
            )
            if force_update and not updated:
                raise DatabaseError("Forced update did not affect any rows.")
            if update_fields and not updated:
                raise DatabaseError("Save with update_fields did not affect any rows.")
        if not updated:
            if meta.order_with_respect_to:
                # If this is a model with an order_with_respect_to
                # autopopulate the _order field
                field = meta.order_with_respect_to
                filter_args = field.get_filter_kwargs_for_object(self)
                self._order = (
                    cls._base_manager.using(using)
                    .filter(**filter_args)
                    .aggregate(
                        _order__max=Coalesce(
                            ExpressionWrapper(
                                Max("_order") + Value(1), output_field=IntegerField()
                            ),
                            Value(0),
                        ),
                    )["_order__max"]
                )
            fields = [
                f
                for f in meta.local_concrete_fields
                if not f.generated and (pk_set or f is not meta.auto_field)
            ]
            returning_fields = meta.db_returning_fields
>           results = self._do_insert(
                cls._base_manager, using, fields, returning_fields, raw
            )

.venv/lib/python3.13.../db/models/base.py:1169: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>
manager = <django.db.models.manager.Manager object at 0x7f1b748f73d0>
using = 'default'
fields = [<django.db.models.fields.UUIDField: id>, <django.db.models.fields.related.ForeignKey: task>, <django.db.models.fields...ield: log_level>, <django.db.models.fields.TextField: logger>, <django.db.models.fields.DateTimeField: timestamp>, ...]
returning_fields = [], raw = False

    def _do_insert(self, manager, using, fields, returning_fields, raw):
        """
        Do an INSERT. If returning_fields is defined then this method should
        return the newly created data for the model.
        """
>       return manager._insert(
            [self],
            fields=fields,
            returning_fields=returning_fields,
            using=using,
            raw=raw,
        )

.venv/lib/python3.13.../db/models/base.py:1210: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.models.manager.Manager object at 0x7f1b748f73d0>
args = ([<TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>],)
kwargs = {'fields': [<django.db.models.fields.UUIDField: id>, <django.db.models.fields.related.ForeignKey: task>, <django.db.mo...r>, <django.db.models.fields.DateTimeField: timestamp>, ...], 'raw': False, 'returning_fields': [], 'using': 'default'}

    @wraps(method)
    def manager_method(self, *args, **kwargs):
>       return getattr(self.get_queryset(), name)(*args, **kwargs)

.venv/lib/python3.13.../db/models/manager.py:87: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <QuerySet []>, objs = [<TaskLog: 22e43a37-7446-4a3e-98c9-9fd0d29f269e>]
fields = [<django.db.models.fields.UUIDField: id>, <django.db.models.fields.related.ForeignKey: task>, <django.db.models.fields...ield: log_level>, <django.db.models.fields.TextField: logger>, <django.db.models.fields.DateTimeField: timestamp>, ...]
returning_fields = [], raw = False, using = 'default', on_conflict = None
update_fields = None, unique_fields = None

    def _insert(
        self,
        objs,
        fields,
        returning_fields=None,
        raw=False,
        using=None,
        on_conflict=None,
        update_fields=None,
        unique_fields=None,
    ):
        """
        Insert a new record for the given model. This provides an interface to
        the InsertQuery class and is how Model.save() is implemented.
        """
        self._for_write = True
        if using is None:
            using = self.db
        query = sql.InsertQuery(
            self.model,
            on_conflict=on_conflict,
            update_fields=update_fields,
            unique_fields=unique_fields,
        )
        query.insert_values(fields, objs, raw=raw)
>       return query.get_compiler(using=using).execute_sql(returning_fields)

.venv/lib/python3.13.../db/models/query.py:1873: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <SQLInsertCompiler model=TaskLog connection=<DatabaseWrapper vendor='postgresql' alias='default'> using='default'>
returning_fields = []

    def execute_sql(self, returning_fields=None):
        assert not (
            returning_fields
            and len(self.query.objs) != 1
            and not self.connection.features.can_return_rows_from_bulk_insert
        )
        opts = self.query.get_meta()
        self.returning_fields = returning_fields
        cols = []
        with self.connection.cursor() as cursor:
            for sql, params in self.as_sql():
>               cursor.execute(sql, params)

.venv/lib/python3.13.../models/sql/compiler.py:1882: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

args = (<django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>, 'INSERT INTO "authentik_tasks_tasklog" ("id", "tas...ovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...))
kwargs = {}

    def runner(*args: "P.args", **kwargs: "P.kwargs"):
        # type: (...) -> R
        if sentry_sdk.get_client().get_integration(integration) is None:
            return original_function(*args, **kwargs)
    
>       return sentry_patched_function(*args, **kwargs)

.venv/lib/python3.13....../site-packages/sentry_sdk/utils.py:1816: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)

    @ensure_integration_enabled(DjangoIntegration, real_execute)
    def execute(self, sql, params=None):
        # type: (CursorWrapper, Any, Optional[Any]) -> Any
        with record_sql_queries(
            cursor=self.cursor,
            query=sql,
            params_list=params,
            paramstyle="format",
            executemany=False,
            span_origin=DjangoIntegration.origin_db,
        ) as span:
            _set_db_data(span, self)
>           result = real_execute(self, sql, params)

.venv/lib/python3.13.../integrations/django/__init__.py:651: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)

    def execute(self, sql, params=None):
>       return self._execute_with_wrappers(
            sql, params, many=False, executor=self._execute
        )

.venv/lib/python3.13.../db/backends/utils.py:79: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)
many = False
executor = <bound method CursorWrapper._execute of <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>>

    def _execute_with_wrappers(self, sql, params, many, executor):
        context = {"connection": self.db, "cursor": self}
        for wrapper in reversed(self.db.execute_wrappers):
            executor = functools.partial(wrapper, executor)
>       return executor(sql, params, many, context)

.venv/lib/python3.13.../db/backends/utils.py:92: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)
ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        # Raise a warning during app initialization (stored_app_configs is only
        # ever set during testing).
        if not apps.ready and not apps.stored_app_configs:
            warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning)
        self.db.validate_no_broken_transaction()
>       with self.db.wrap_database_errors:

.venv/lib/python3.13.../db/backends/utils.py:100: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.utils.DatabaseErrorWrapper object at 0x7f1b80b8e7b0>
exc_type = <class 'psycopg.errors.ForeignKeyViolation'>
exc_value = ForeignKeyViolation('insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_ta...entik"\nDETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".')
traceback = <traceback object at 0x7f1b2a8a8740>

    def __exit__(self, exc_type, exc_value, traceback):
        if exc_type is None:
            return
        for dj_exc_type in (
            DataError,
            OperationalError,
            IntegrityError,
            InternalError,
            ProgrammingError,
            NotSupportedError,
            DatabaseError,
            InterfaceError,
            Error,
        ):
            db_exc_type = getattr(self.wrapper.Database, dj_exc_type.__name__)
            if issubclass(exc_type, db_exc_type):
                dj_exc_value = dj_exc_type(*exc_value.args)
                # Only set the 'errors_occurred' flag for errors that may make
                # the connection unusable.
                if dj_exc_type not in (DataError, IntegrityError):
                    self.wrapper.errors_occurred = True
>               raise dj_exc_value.with_traceback(traceback) from exc_value

.venv/lib/python3.13.../django/db/utils.py:91: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>
sql = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)
ignored_wrapper_args = (False, {'connection': <DatabaseWrapper vendor='postgresql' alias='default'>, 'cursor': <django.db.backends.utils.CursorWrapper object at 0x7f1b64ca1cd0>})

    def _execute(self, sql, params, *ignored_wrapper_args):
        # Raise a warning during app initialization (stored_app_configs is only
        # ever set during testing).
        if not apps.ready and not apps.stored_app_configs:
            warnings.warn(self.APPS_NOT_READY_WARNING_MSG, category=RuntimeWarning)
        self.db.validate_no_broken_transaction()
        with self.db.wrap_database_errors:
            if params is None:
                # params default might be backend specific.
                return self.cursor.execute(sql)
            else:
>               return self.cursor.execute(sql, params)

.venv/lib/python3.13.../db/backends/utils.py:105: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adee750>
args = ('INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "...ovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...))
kwargs = {}

    def execute(self, *args, **kwargs):
        execute_total.labels(alias, vendor).inc()
        with (
            query_duration_seconds.labels(**labels).time(),
            ExceptionCounterByType(errors_total, extra_labels=labels),
        ):
>           return super().execute(*args, **kwargs)

.venv/lib/python3.13.../django_prometheus/db/common.py:69: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <django_prometheus.db.common.ExportingCursorWrapper.<locals>.CursorWrapper [closed] [BAD] at 0x7f1b2adee750>
query = 'INSERT INTO "authentik_tasks_tasklog" ("id", "task_id", "event", "log_level", "logger", "timestamp", "attributes", "previous") VALUES (%s, %s, %s, %s, %s, %s, %s, %s)'
params = (UUID('22e43a37-7446-4a3e-98c9-9fd0d29f269e'), UUID('2920b1f0-d466-4bfc-a904-6283e5d6be66'), 'insert or update on tabl...tovfvq8z5gwbna0ydnepcyslxtbu7f', datetime.datetime(2025, 12, 22, 2, 11, 38, 715516, tzinfo=datetime.timezone.utc), ...)

    def execute(
        self,
        query: Query,
        params: Params | None = None,
        *,
        prepare: bool | None = None,
        binary: bool | None = None,
    ) -> Self:
        """
        Execute a query or command to the database.
        """
        try:
            with self._conn.lock:
                self._conn.wait(
                    self._execute_gen(query, params, prepare=prepare, binary=binary)
                )
        except e._NO_TRACEBACK as ex:
>           raise ex.with_traceback(None)
E           django.db.utils.IntegrityError: insert or update on table "authentik_tasks_tasklog" violates foreign key constraint "authentik_tasks_task_task_id_a82f0835_fk_authentik"
E           DETAIL:  Key (task_id)=(2920b1f0-d466-4bfc-a904-6283e5d6be66) is not present in table "authentik_tasks_task".

.venv/lib/python3.13............/site-packages/psycopg/cursor.py:97: IntegrityError


@github-actions
Copy link
Contributor

github-actions bot commented Dec 9, 2025

authentik PR Installation instructions

Instructions for docker-compose

Add the following block to your .env file:

AUTHENTIK_IMAGE=ghcr.io/goauthentik/dev-server
AUTHENTIK_TAG=gh-93d50ddb823b1b4205ce75241437a07cb01b8682
AUTHENTIK_OUTPOSTS__CONTAINER_IMAGE_BASE=ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s

Afterwards, run the upgrade commands from the latest release notes.
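For a typical docker-compose deployment, the upgrade step usually amounts to pulling the preview image referenced by AUTHENTIK_IMAGE/AUTHENTIK_TAG and recreating the containers. A minimal sketch only; the commands in the latest release notes remain authoritative:

docker compose pull
docker compose up -d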

Instructions for Kubernetes

Add the following block to your values.yml file:

authentik:
    outposts:
        container_image_base: ghcr.io/goauthentik/dev-%(type)s:gh-%(build_hash)s
global:
    image:
        repository: ghcr.io/goauthentik/dev-server
        tag: gh-93d50ddb823b1b4205ce75241437a07cb01b8682

Afterwards, run the upgrade commands from the latest release notes.
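For Helm-based installs, the equivalent step is typically a helm upgrade with the updated values. A minimal sketch, assuming the chart is installed as release "authentik" from a repository added as "authentik" (both names are assumptions; adjust to your setup and follow the release notes):

helm repo update
helm upgrade authentik authentik/authentik -f values.yml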

Signed-off-by: Dominic R <dominic@sdko.org>
Signed-off-by: Dominic R <dominic@sdko.org>
@dominic-r dominic-r marked this pull request as draft December 9, 2025 12:11
@rissson rissson changed the title bootstrap: ability to use hashed password core: ability to use hashed password in users API Dec 9, 2025
@rissson rissson changed the title core: ability to use hashed password in users API core: support hashed password in users API Dec 9, 2025
@dominic-r dominic-r marked this pull request as ready for review December 22, 2025 01:37
@dominic-r dominic-r removed the request for review from gergosimonyi December 22, 2025 01:40

Labels

area:docs area:frontend Features or issues related to the browser, TypeScript, Node.js, etc

Projects

Status: Todo

Development

Successfully merging this pull request may close these issues.

Allow passing a password hash instead of a plaintext password when bootstrapping akadmin account

2 participants