LM's read_batch and predict_proba should accept variable length input batches #3

@emanjavacas

Description

This could be done with torch.nn.utils.rnn.pack_padded_sequence (and the related padding utilities), so the RNN skips padded positions instead of requiring every batch to have a fixed length.
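
A minimal sketch of what this could look like, assuming an RNN-based LM. The `lm.embeddings`, `lm.rnn`, and `lm.decoder` attributes and the `predict_proba` signature are hypothetical stand-ins for whatever the LM class actually exposes, not the repository's current API:

```python
import torch
import torch.nn.functional as F
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence, pad_sequence


def predict_proba(lm, batch, lengths):
    # batch: (max_len, batch_size) LongTensor of token ids, padded to max_len
    # lengths: true sequence lengths, sorted in decreasing order
    emb = lm.embeddings(batch)                    # (max_len, batch, emb_dim)
    packed = pack_padded_sequence(emb, lengths)   # padding timesteps are dropped
    hidden, _ = lm.rnn(packed)                    # RNN only sees real tokens
    hidden, _ = pad_packed_sequence(hidden)       # back to (max_len, batch, hid_dim)
    logits = lm.decoder(hidden)                   # (max_len, batch, vocab_size)
    return F.softmax(logits, dim=-1)


# read_batch-style preparation: pad a list of variable-length sequences
# and keep their original lengths around for packing.
seqs = [torch.randint(0, 100, (n,)) for n in (7, 5, 3)]  # already sorted by length
lengths = [len(s) for s in seqs]
batch = pad_sequence(seqs)                               # (max_len, batch_size)
# probs = predict_proba(lm, batch, lengths)
```

With this split, read_batch only needs to return the padded batch together with the original lengths, and predict_proba can accept batches of mixed sequence lengths.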
