
Conversation

@iskbaga (Collaborator) commented Jun 22, 2025

No description provided.

"experiment_name": "letter_data",
"best_metric": "validation/ndcg@20",
"experiment_name": "letter_tiger",
"best_metric": "loss",

Why loss here?

inputs[self._output_prefix] = loss.cpu().item()

return loss


Two blank lines here

if self._output_prefix is not None:
    inputs[self._output_prefix] = loss.cpu().item()
return loss


Two blank lines here
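Presumably this refers to PEP 8's rule of two blank lines around top-level definitions; a minimal sketch with hypothetical function names:

def compute_loss(outputs, targets):
    ...


def log_loss(inputs, loss):  # separated from the previous definition by two blank lines, per PEP 8
    ...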


Create a tests folder next to modeling
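A minimal smoke-test sketch for such a folder; the file name and the check are assumptions, presuming only that modeling is an importable package:

# tests/test_smoke.py — hypothetical file name, placed next to the modeling/ package
import importlib


def test_modeling_imports():
    # Bare-minimum check: the modeling package imports without errors.
    assert importlib.import_module("modeling") is not None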

codebook_embeddings = self._get_position_embeddings(
    lengths, mask, codebook_lambda, self._codebook_embeddings
)
last_output = decoder_output[:, -1:, :] # (batch_size, 1, embedding_dim)

Here you can do decoder_output[:, -1, :]

logits = torch.matmul(
    last_output,
    weights.t()
).squeeze(1)  # (batch_size, codebook_size)

If you do what I wrote above, you don't need the squeeze here
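A minimal sketch of the suggested simplification; the toy sizes and the weights tensor here are assumptions, not the PR's actual values:

import torch

batch_size, seq_len, embedding_dim, codebook_size = 2, 5, 8, 16
decoder_output = torch.randn(batch_size, seq_len, embedding_dim)
weights = torch.randn(codebook_size, embedding_dim)

# Current version: slicing with -1: keeps the sequence dim as size 1,
# so a squeeze is needed after the matmul.
logits_sliced = torch.matmul(decoder_output[:, -1:, :], weights.t()).squeeze(1)

# Suggested version: indexing with -1 drops the sequence dim up front,
# so the matmul already yields (batch_size, codebook_size).
logits_indexed = torch.matmul(decoder_output[:, -1, :], weights.t())

assert torch.allclose(logits_sliced, logits_indexed)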
