True batch #634

@BenjaminScholtens

Description

Currently we have a simple queue that sends a separate LLM prompt for each cell. We want to try out a prompting system that batches content based on context. Step one is to move to a batching system that actually puts a range of cells in the same prompt. Step two is to auto-suggest batch sizes based on similar context; we could build something like the search algorithm to find content in the source that should be translated together. A rough sketch of both steps is below.
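
A minimal sketch of what this could look like, assuming a TypeScript codebase; all names here (Cell, buildBatches, suggestBatches, the token-overlap similarity) are hypothetical stand-ins, not the existing queue code:

```typescript
// Hypothetical shapes, not the existing queue types.
interface Cell {
  id: string;
  sourceText: string;
}

interface TranslationBatch {
  cells: Cell[];
  prompt: string;
}

// Step one: put a range of cells into one prompt instead of one prompt per cell.
function buildBatches(cells: Cell[], batchSize: number): TranslationBatch[] {
  const batches: TranslationBatch[] = [];
  for (let i = 0; i < cells.length; i += batchSize) {
    const slice = cells.slice(i, i + batchSize);
    const prompt = [
      "Translate the following cells. Return one translation per cell, in order:",
      ...slice.map((cell, idx) => `${idx + 1}. ${cell.sourceText}`),
    ].join("\n");
    batches.push({ cells: slice, prompt });
  }
  return batches;
}

// Step two (sketch): suggest batch boundaries by cutting where adjacent cells
// stop sharing context. Token overlap is only a placeholder for the real
// search-style algorithm.
function similarity(a: Cell, b: Cell): number {
  const tokensA = new Set(a.sourceText.toLowerCase().split(/\s+/));
  const tokensB = new Set(b.sourceText.toLowerCase().split(/\s+/));
  const shared = [...tokensA].filter((t) => tokensB.has(t)).length;
  return shared / Math.max(tokensA.size, tokensB.size, 1);
}

function suggestBatches(cells: Cell[], threshold = 0.2, maxSize = 10): Cell[][] {
  const groups: Cell[][] = [];
  let current: Cell[] = [];
  for (const cell of cells) {
    const prev = current[current.length - 1];
    // Close the current group when it is full or the next cell's context diverges.
    if (current.length > 0 && (current.length >= maxSize || similarity(prev, cell) < threshold)) {
      groups.push(current);
      current = [];
    }
    current.push(cell);
  }
  if (current.length > 0) groups.push(current);
  return groups;
}
```

The token-overlap check is just to make the boundary logic concrete; the piece we actually care about is where batches get cut, which is what the real context-search algorithm would decide.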
