
Bug fixes in vqvae module #6

Open
JonathanColetti wants to merge 2 commits into KatarinaYuan:main from JonathanColetti:main

Conversation

@JonathanColetti

Bug fixes in the vqvae module: add a bias to the LayerNorm in VanillaMultiHeadAttention.__init__, fix the quantizer's indices2embedding, which misuses nn.Embedding, and make VanillaTransformerStack.forward enforce required arguments such as affine and affine_mask.
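The class and argument names below come from the PR title and description; the constructor signatures, layer internals, and the `layernorm_bias` flag are a hypothetical sketch of what the three fixes could look like, not the repository's actual code. It assumes PyTorch 2.1+ for the `bias` argument of `nn.LayerNorm`.

```python
import torch
import torch.nn as nn


class VanillaMultiHeadAttention(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int, layernorm_bias: bool = True):
        super().__init__()
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Fix 1: build the LayerNorm with a learnable bias (exposed as a flag here).
        self.norm = nn.LayerNorm(embed_dim, bias=layernorm_bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.attn(x, x, x)
        return self.norm(x + out)


class Quantizer(nn.Module):
    def __init__(self, num_codes: int, code_dim: int):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)

    def indices2embedding(self, indices: torch.Tensor) -> torch.Tensor:
        # Fix 2: look codes up by calling the nn.Embedding module on integer
        # indices, rather than indexing the module object itself.
        return self.codebook(indices)


class VanillaTransformerStack(nn.Module):
    def __init__(self, embed_dim: int, num_heads: int, depth: int):
        super().__init__()
        self.layers = nn.ModuleList(
            VanillaMultiHeadAttention(embed_dim, num_heads) for _ in range(depth)
        )

    def forward(
        self,
        x: torch.Tensor,
        *,
        affine: torch.Tensor,
        affine_mask: torch.Tensor,
    ) -> torch.Tensor:
        # Fix 3: affine and affine_mask are keyword-only with no defaults, so
        # callers must supply them explicitly. (They would be consumed by the
        # real layers in the actual module; this sketch only enforces the call.)
        for layer in self.layers:
            x = layer(x)
        return x
```

With this signature, calling `stack(x)` without `affine` and `affine_mask` raises a TypeError, which is the enforcement the description refers to.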

Anonymous and others added 2 commits July 28, 2025 23:13
…adAttention:__init__, quantizer misuses and not forcing required arguments such as affine, affine_mask
