In a batch norm layer, setting `layer.trainable = False` freezes the layer, i.e. its internal state will not change during training: its trainable weights will not be updated during `fit()` or `train_on_batch()`, and its state updates (the moving mean and variance) will not be run. Why would a batch norm layer be frozen in training code?
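To make the quoted behavior concrete, here is a minimal toy sketch (pure Python, not Keras itself) of what "frozen" means for a batch norm layer: with `trainable = False`, a training call neither updates the trainable weights (gamma, beta) nor the moving statistics, and normalization uses the stored moving mean/variance instead of the batch statistics. The class and attribute names here are illustrative, not the Keras API.

```python
class ToyBatchNorm:
    """Toy illustration of freezing a batch norm layer (not Keras)."""

    def __init__(self, momentum=0.9, eps=1e-5):
        self.gamma, self.beta = 1.0, 0.0              # trainable weights
        self.moving_mean, self.moving_var = 0.0, 1.0  # non-trainable state
        self.trainable = True
        self.momentum, self.eps = momentum, eps

    def __call__(self, batch, training=False):
        if training and self.trainable:
            # Unfrozen training: use batch statistics and update moving averages.
            mean = sum(batch) / len(batch)
            var = sum((x - mean) ** 2 for x in batch) / len(batch)
            m = self.momentum
            self.moving_mean = m * self.moving_mean + (1 - m) * mean
            self.moving_var = m * self.moving_var + (1 - m) * var
        else:
            # Frozen (or inference): use stored statistics, run no updates.
            mean, var = self.moving_mean, self.moving_var
        return [self.gamma * (x - mean) / (var + self.eps) ** 0.5 + self.beta
                for x in batch]

bn = ToyBatchNorm()
bn.trainable = False
state_before = (bn.moving_mean, bn.moving_var)
bn([10.0, 12.0, 14.0], training=True)  # a training call, but the layer is frozen
state_after = (bn.moving_mean, bn.moving_var)
print(state_before == state_after)  # → True: the frozen layer's state did not change
```

With `bn.trainable = True` the same training call would update `moving_mean` and `moving_var`, which is the difference the documentation text is describing.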