
Conversation

@Satyajitv

Capability to unfreeze layers in the LLM using the --unfreeze_text_layers param.

Based on the number (x) passed, the code will unfreeze the last x layers of the LLM used.
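For reference, a minimal sketch of the unfreezing this PR describes, assuming a HuggingFace LLaMA-style model whose transformer blocks live in model.model.layers; the helper name and the attribute path are illustrative assumptions, not this PR's actual code:

    import torch.nn as nn

    def unfreeze_last_text_layers(model: nn.Module, num_layers: int) -> None:
        """Freeze the whole LM, then re-enable grads on the last `num_layers` blocks."""
        for param in model.parameters():
            param.requires_grad = False      # freeze everything first
        if not num_layers:                   # nothing requested: leave all frozen
            return
        blocks = model.model.layers          # LLaMA-style block list (assumption)
        for block in blocks[-num_layers:]:
            for param in block.parameters():
                param.requires_grad = True   # train only the last N blocks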

    unfreeze_vision_layers: Optional[int] = field(default=None)

    def __post_init__(self):
Owner


@Satyajitv what is the purpose of the __post_init__ function?


    def __post_init__(self):
        if self.unfreeze_text_layers or self.unfreeze_vision_layers:
            self.freeze_backbone = False
Owner


looks like freeze_backbone is not really used for training; see haotian-liu#723
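For context on the question and the snippet above: __post_init__ is a hook that Python's dataclasses module calls automatically right after the generated __init__, so it can derive one field from the others. A minimal, self-contained sketch; the field names mirror the PR, while the class name and the defaults are illustrative:

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class TrainingArguments:
        freeze_backbone: bool = field(default=True)  # default is illustrative
        unfreeze_text_layers: Optional[int] = field(default=None)
        unfreeze_vision_layers: Optional[int] = field(default=None)

        def __post_init__(self):
            # Runs right after the generated __init__: if the user asked to
            # unfreeze any layers, the backbone cannot stay fully frozen.
            if self.unfreeze_text_layers or self.unfreeze_vision_layers:
                self.freeze_backbone = False

    args = TrainingArguments(unfreeze_text_layers=4)
    assert args.freeze_backbone is False  # __post_init__ flipped the flag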

