diff --git a/.changeset/tame-turkeys-compete.md b/.changeset/tame-turkeys-compete.md
new file mode 100644
index 00000000..40b76d83
--- /dev/null
+++ b/.changeset/tame-turkeys-compete.md
@@ -0,0 +1,5 @@
+---
+"js-tiktoken": patch
+---
+
+Update guidance on lite
diff --git a/js/README.md b/js/README.md
index 47541102..6cf35d69 100644
--- a/js/README.md
+++ b/js/README.md
@@ -9,9 +9,36 @@ Install the library from NPM:
 npm install js-tiktoken
 ```
 
-## Usage
+## Lite
 
-Basic usage follows, which includes all the OpenAI encoders and ranks:
+You can load only the ranks you need, which significantly reduces the bundle size:
+
+```typescript
+import { Tiktoken } from "js-tiktoken/lite";
+import o200k_base from "js-tiktoken/ranks/o200k_base";
+
+const enc = new Tiktoken(o200k_base);
+assert(enc.decode(enc.encode("hello world")) === "hello world");
+```
+
+Alternatively, encodings can be loaded dynamically from our CDN hosted on Cloudflare Pages:
+
+```typescript
+import { Tiktoken } from "js-tiktoken/lite";
+
+const res = await fetch(`https://tiktoken.pages.dev/js/o200k_base.json`);
+const o200k_base = await res.json();
+
+const enc = new Tiktoken(o200k_base);
+assert(enc.decode(enc.encode("hello world")) === "hello world");
+```
+
+## Full usage
+
+If you need all the OpenAI tokenizers, you can import the entire library:
+
+> [!CAUTION]
+> This will include all the OpenAI tokenizers, which may significantly increase the bundle size. See the [Lite](#lite) section above.
 
 ```typescript
 import assert from "node:assert";
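
For reference, a minimal sketch of the full-usage path the new `## Full usage` section describes, assuming js-tiktoken's `getEncoding` export and the `o200k_base` encoding name; the README's actual example continues beyond this hunk:

```typescript
import assert from "node:assert";
// Importing from the package root bundles every encoding, which is why the
// README now points bundle-size-sensitive users at the "js-tiktoken/lite" entry.
import { getEncoding } from "js-tiktoken";

const enc = getEncoding("o200k_base");
assert(enc.decode(enc.encode("hello world")) === "hello world");
```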