Why is the batch norm layer frozen? #11

@gynjimmy0306

Description

For a batch norm layer, setting `layer.trainable = False` freezes the layer, i.e. its internal state will not change during training: its trainable weights will not be updated during `fit()` or `train_on_batch()`, and its state updates (the moving mean and variance) will not be run. Why is batch norm frozen in the training code?
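To illustrate the behavior being asked about, here is a minimal NumPy sketch of batch norm semantics (an illustration only, not Keras' actual implementation): when the layer is frozen, calling it with `training=True` neither updates the trainable scale/offset nor the moving statistics, and the layer normalizes with its stored moving mean and variance.

```python
import numpy as np

class BatchNorm:
    """Minimal batch-norm sketch (illustration only, not Keras' code)."""

    def __init__(self, dim, momentum=0.99, eps=1e-3):
        self.gamma = np.ones(dim)         # trainable scale
        self.beta = np.zeros(dim)         # trainable offset
        self.moving_mean = np.zeros(dim)  # non-trainable state
        self.moving_var = np.ones(dim)    # non-trainable state
        self.momentum = momentum
        self.eps = eps
        self.trainable = True

    def __call__(self, x, training=False):
        # Frozen BN behaves as if training=False: batch statistics are
        # not used and the moving statistics are never updated.
        if training and self.trainable:
            mean, var = x.mean(axis=0), x.var(axis=0)
            m = self.momentum
            self.moving_mean = m * self.moving_mean + (1 - m) * mean
            self.moving_var = m * self.moving_var + (1 - m) * var
        else:
            mean, var = self.moving_mean, self.moving_var
        return self.gamma * (x - mean) / np.sqrt(var + self.eps) + self.beta

bn = BatchNorm(3)
bn.trainable = False
before = bn.moving_mean.copy()
_ = bn(np.random.randn(8, 3), training=True)
print(np.allclose(bn.moving_mean, before))  # frozen: state unchanged
```

Unfreezing the layer (`bn.trainable = True`) and calling it again with `training=True` would move `moving_mean` toward the batch mean, which is exactly the state update that freezing suppresses.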
