feat(zai): add Z.AI/GLM support via happy zai command #138
+492
−636
Summary
Add support for the GLM (z.ai / BigModel.cn) API through a new `happy zai` subcommand, similar to the existing `happy gemini` and `happy codex` commands.

What It Does
- `happy zai` - Start Claude with the GLM API endpoint (redirects to https://open.bigmodel.cn/api/anthropic)
- `happy zai token set <key>` - Save the GLM API key to `~/.zai/config.json`
- `happy zai token get` - Show the current API token
- `happy zai model set <model>` - Set the GLM model (glm-4.7, glm-4-plus, glm-4-flash, glm-4-air, glm-4-flashx)
- `happy zai model get` - Show the current model
- `happy zai base-url set <url>` - Set a custom API endpoint

How It Works
GLM is Anthropic-compatible, so this implementation sets environment variables before launching Claude:
- `ANTHROPIC_BASE_URL` → https://open.bigmodel.cn/api/anthropic
- `ANTHROPIC_AUTH_TOKEN` → GLM API key
- `ANTHROPIC_MODEL` → glm-4.7 (or the specified model)

Configuration is stored in `~/.zai/config.json`, with support for environment variable overrides (`ZAI_AUTH_TOKEN`, `ZAI_BASE_URL`, `ZAI_MODEL`).
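
In effect, running `happy zai` is roughly equivalent to exporting these variables yourself before starting Claude. The sketch below is illustrative rather than the actual `runZai.ts` code path: the `claude` invocation stands in for however `happy` hands off to Claude, and the key value is a placeholder.

```sh
# Roughly what `happy zai` sets up before launching Claude.
# Values come from ~/.zai/config.json; ZAI_AUTH_TOKEN, ZAI_BASE_URL and
# ZAI_MODEL take precedence over the file when they are set.
export ANTHROPIC_BASE_URL="https://open.bigmodel.cn/api/anthropic"
export ANTHROPIC_AUTH_TOKEN="<your-glm-api-key>"
export ANTHROPIC_MODEL="glm-4.7"
claude
```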

Files Changed
- `src/zai/runZai.ts` - New module for Z.AI/GLM support
- `src/index.ts` - Added the zai subcommand handler with config subcommands

Usage Example
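A typical first run might combine the subcommands above like this; the API key is a placeholder, and setting a model is optional (glm-4.7 is used when none is specified, per the environment mapping above).

```sh
# One-time setup: store the GLM API key and (optionally) pick a model
happy zai token set <your-glm-api-key>
happy zai model set glm-4.7

# Check what is currently configured
happy zai token get
happy zai model get

# Launch Claude against the GLM endpoint
happy zai
```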