
Add support for Deno Deploy #3

@hellojwilde

Background

There was a fantastic idea from a Redditor about making EnergeticAI run well in Cloudflare Workers, so you can have super-fast inference at the edge without having to distribute your model weights alongside your code:

https://www.reddit.com/r/tensorflow/comments/1493uoq/comment/jo6axc9/?utm_source=reddit&utm_medium=web2x&context=3

Goal

This task is to add support for Deno Deploy to EnergeticAI.

Approach

Given that Deno Deploy has even tighter bundle-size limits than AWS Lambda, I suspect the way to do this is to distribute sharded model weights in Deno KV, and then fetch them in parallel on function invocation. On paper, at least, KV values should be colocated with the functions closely enough for this to be fast.
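A minimal sketch of what the parallel fetch could look like. The key layout (`["model", "shard", i]`), the shard count, and the function names here are all assumptions for illustration, not EnergeticAI's actual format; the only hard constraint from Deno KV is its per-value size limit, which is why the weights would need to be split into shards at upload time.

```typescript
// Hypothetical sketch: reassemble sharded model weights from Deno KV.
// Key layout and shard count are assumptions, not an existing format.

const SHARD_COUNT = 8; // assumed; fixed when the weights are uploaded

// Pure helper: stitch the shards back into one contiguous buffer.
export function concatShards(shards: Uint8Array[]): Uint8Array {
  const total = shards.reduce((n, s) => n + s.byteLength, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const s of shards) {
    out.set(s, offset);
    offset += s.byteLength;
  }
  return out;
}

// On invocation, read every shard concurrently and reassemble the weights.
// Accessed via globalThis so the sketch also type-checks outside Deno.
export async function loadWeights(): Promise<Uint8Array> {
  const kv = await (globalThis as any).Deno.openKv();
  const entries = await Promise.all(
    Array.from({ length: SHARD_COUNT }, (_, i) =>
      kv.get(["model", "shard", i])
    ),
  );
  const shards = entries.map((e: { value: Uint8Array | null }, i: number) => {
    if (e.value === null) throw new Error(`missing model shard ${i}`);
    return e.value;
  });
  return concatShards(shards);
}
```

The interesting design question is whether the parallel `kv.get` calls actually beat one larger fetch once colocation kicks in; that would need benchmarking on Deno Deploy itself.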
