
Conversation


@am-will commented Jan 7, 2026

Summary

  • add Z.AI GLM-4.7 model/provider support
  • add GLM model presets + provider routing updates
  • add Z.AI reference docs

Usage

To use GLM-4.7, export your API key before running Code:

```bash
export Z_AI_API_KEY="ZAI-API-KEY"
```

GLM-4.7 is routed through Z.AI's OpenAI-compatible endpoint. Streaming is not currently supported.
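For a quick sanity check outside of Code, a non-streaming request against an OpenAI-compatible chat completions endpoint could look like the sketch below. The base URL is an assumption (confirm it against the Z.AI reference docs added in this PR); it uses the reqwest (blocking + json features) and serde_json crates.

```rust
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Same environment variable the provider routing reads.
    let api_key = std::env::var("Z_AI_API_KEY")?;

    let body = json!({
        "model": "glm-4.7",
        // Streaming is not supported per the notes above.
        "stream": false,
        "messages": [{ "role": "user", "content": "Hello" }],
    });

    let resp = reqwest::blocking::Client::new()
        // Illustrative URL: check the Z.AI docs for the real endpoint.
        .post("https://api.z.ai/api/paas/v4/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()?
        .error_for_status()?;

    println!("{}", resp.text()?);
    Ok(())
}
```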

Copy link

@chatgpt-codex-connector (bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 71c86f3d42

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review"

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

Comment on lines +1098 to +1102
```rust
} else if model_provider_id.eq_ignore_ascii_case("zai") {
    model_provider = model_providers
        .get("openai")
        .ok_or_else(|| {
            std::io::Error::new(
```


P2: Preserve non-OpenAI provider after leaving glm-4.7

This branch forces model_provider_id to openai whenever the current provider is zai and the model is not glm-4.7. That means a user who normally runs with another provider (e.g., oss/openrouter) and briefly switches to glm-4.7 will be silently switched to OpenAI when they select any other model, and that provider choice gets persisted. For models that are only available on the original provider, requests will now be routed to the wrong backend and can fail. Consider restoring the previous provider (or the config default) instead of hardcoding OpenAI here, or only reverting when the provider was auto-set for glm-4.7.
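A minimal sketch of one way to implement that suggestion, assuming the provider in effect before the auto-switch is recorded when glm-4.7 is selected. The function and parameter names here are hypothetical, not the PR's actual code:

```rust
use std::collections::HashMap;
use std::io;

/// Pick the provider to revert to when the user moves off glm-4.7.
/// `previous_provider_id` would be captured at the moment the provider was
/// auto-set to "zai"; `default_provider_id` is the config default.
fn revert_provider<'a, P>(
    model_providers: &'a HashMap<String, P>,
    previous_provider_id: Option<&str>,
    default_provider_id: &str,
) -> io::Result<(&'a P, String)> {
    // Prefer the provider the user was actually running before the
    // auto-switch; fall back to the config default rather than
    // hardcoding "openai".
    let fallback = previous_provider_id.unwrap_or(default_provider_id);
    let provider = model_providers.get(fallback).ok_or_else(|| {
        io::Error::new(
            io::ErrorKind::NotFound,
            format!("model provider `{fallback}` not found"),
        )
    })?;
    Ok((provider, fallback.to_string()))
}
```

This keeps a user on oss/openrouter (or any other provider) after they try glm-4.7, and only falls back to the config default when no prior provider was recorded.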

Useful? React with 👍 / 👎.
