Description
Currently we have a simple queue that issues a separate LLM prompt for each cell. We want to try out a prompting system that batches content based on context. Step one is to move to a batching system that actually sends a range of cells in the same prompt (a sketch is below). Step two is to auto-suggest batch sizes based on similar context; we could build an algorithm, along the lines of the search algorithm, that finds content in the source that should be translated together.
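
A minimal sketch of step one, assuming hypothetical names (`Cell`, `chunkCells`, `buildBatchPrompt`) rather than the project's actual cell type or prompt builder. The idea is just to group a range of cells into one prompt and keep the cell ids so responses can be mapped back; the fixed batch size stands in for the context-aware sizing described in step two.

```ts
// Hypothetical cell shape; the real project type may differ.
interface Cell {
  id: string;
  sourceText: string;
}

// Split the cell list into fixed-size batches. Step two would replace this
// with context-aware sizing (grouping cells that should be translated together).
function chunkCells(cells: Cell[], batchSize: number): Cell[][] {
  const batches: Cell[][] = [];
  for (let i = 0; i < cells.length; i += batchSize) {
    batches.push(cells.slice(i, i + batchSize));
  }
  return batches;
}

// Build one prompt covering every cell in the batch, labelling each cell with
// its id so the model's output can be mapped back to the right cell.
function buildBatchPrompt(batch: Cell[], targetLanguage: string): string {
  const numbered = batch
    .map((cell) => `[${cell.id}]\n${cell.sourceText}`)
    .join("\n\n");
  return (
    `Translate each of the following cells into ${targetLanguage}. ` +
    `Return the translations under the same [id] headers.\n\n${numbered}`
  );
}

// Example: three cells become a single prompt instead of three queued prompts.
const cells: Cell[] = [
  { id: "cell-1", sourceText: "First sentence of the source." },
  { id: "cell-2", sourceText: "Second sentence of the source." },
  { id: "cell-3", sourceText: "Third sentence of the source." },
];
for (const batch of chunkCells(cells, 3)) {
  console.log(buildBatchPrompt(batch, "Spanish"));
}
```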
