
Conversation

Contributor

Copilot AI commented Dec 9, 2025

Documents the approach for supporting non-Azure OpenAI models in azsdk-cli, analyzing PR #12747's implementation that switched from AzureOpenAIClient to the generic OpenAIClient.

Proposal Contents

  • Overview: Background on the implementation changes from Azure-specific to flexible OpenAI client
  • Technical Implementation: Package changes, authentication priority (API key → Entra ID fallback), endpoint configuration strategy
  • Benefits: Provider flexibility, local development support, backward compatibility preserved, simplified configuration, modern SDK usage
  • Decision Matrix: Guidelines for when to use Microsoft Foundry vs. external providers based on data sensitivity, compliance requirements, and use case scenarios (including non-Azure SDK workflows and model availability)
  • Potential Issues and Considerations:
    • Authentication scope hardcoding concerns
    • Security considerations for API key exposure
    • Confidential data leakage risk when using non-Microsoft model providers, with mitigation strategies including explicit opt-in mechanisms
  • Scope and Focus Analysis:
    • Azure SDK CLI/MCP primary purpose and Azure SDK focus
    • Assessment of non-Azure SDK support value and applicability (noting OpenAI .NET SDK is TypeSpec-based, enabling significant workflow overlap)
    • Cost-effectiveness analysis favoring shared tooling due to TypeSpec commonality

Location

tools/azsdk-cli/docs/specs/non-azure-models-support-proposal.md

Original prompt

write a complete proposal for the support for non-azure models implemented in #12747 and list the benefits and the issues the approach may cause



Co-authored-by: deyaaeldeen <6074665+deyaaeldeen@users.noreply.github.com>
Copilot AI changed the title from "[WIP] Add support for non-Azure models in SDK tools" to "Add proposal document for non-Azure model support (PR #12747)" Dec 9, 2025
Copilot AI requested a review from deyaaeldeen December 9, 2025 22:47
Copilot AI requested a review from deyaaeldeen December 10, 2025 21:34
@deyaaeldeen deyaaeldeen marked this pull request as ready for review December 11, 2025 18:29
@deyaaeldeen deyaaeldeen requested a review from a team as a code owner December 11, 2025 18:29
Copilot AI review requested due to automatic review settings December 11, 2025 18:29
Contributor

Copilot AI left a comment


Pull request overview

This PR adds a comprehensive proposal document for supporting non-Azure OpenAI models in the Azure SDK CLI tool, documenting the implementation approach from PR #12747 that switched from AzureOpenAIClient to the generic OpenAIClient.

Key Changes:

  • Documents the technical implementation including package changes (from Azure.AI.OpenAI to OpenAI package), authentication priority (API key → Entra ID fallback), and endpoint configuration strategy
  • Provides decision matrix for choosing between Microsoft Foundry vs. external providers based on data sensitivity and compliance requirements
  • Addresses potential security and confidentiality concerns with mitigation strategies
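The endpoint configuration strategy summarized above resolves the base URL in a fixed priority order. The following is a minimal, hypothetical C# sketch of that order; the method name `ResolveEndpoint` and the parameter names are illustrative and mirror the documented environment variables, not the actual azsdk-cli source:

```csharp
using System;

static class EndpointResolution
{
    // Sketch of the documented resolution order. A null return means the
    // OpenAI client should fall back to its own default endpoint.
    public static Uri? ResolveEndpoint(
        string? openAiBaseUrl,        // OPENAI_BASE_URL
        string? azureOpenAiEndpoint,  // AZURE_OPENAI_ENDPOINT
        string? openAiApiKey)         // OPENAI_API_KEY
    {
        // Priority 1: OPENAI_BASE_URL is used exactly as provided.
        if (!string.IsNullOrWhiteSpace(openAiBaseUrl))
            return new Uri(openAiBaseUrl);

        // Priority 2: AZURE_OPENAI_ENDPOINT gets the /openai/v1 path appended.
        if (!string.IsNullOrWhiteSpace(azureOpenAiEndpoint))
            return new Uri(new Uri(azureOpenAiEndpoint), "/openai/v1");

        // Priority 3: no endpoint variables and no API key ->
        // default to the shared Azure OpenAI deployment.
        if (string.IsNullOrWhiteSpace(openAiApiKey))
            return new Uri("https://openai-shared.openai.azure.com/openai/v1");

        // Priority 4: API key only -> no custom endpoint; the OpenAI
        // client targets the standard OpenAI API.
        return null;
    }
}
```

This sketch is only meant to make the priority table concrete; the real implementation may structure the checks differently.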

Comment on lines +117 to +124
| Scenario | Recommended Provider | Rationale |
|----------|---------------------|-----------|
| **Working with Microsoft confidential data** (e.g., unannounced APIs, internal specs) | Microsoft Foundry | Data stays within Microsoft's infrastructure; complies with confidentiality requirements |
| **Production Azure SDK development** | Microsoft Foundry | Enterprise-grade security, compliance, and support; consistent with Azure ecosystem |
| **CI/CD pipelines in Azure DevOps** | Microsoft Foundry | Seamless integration with Azure services; Entra ID authentication |
| **Local development with non-sensitive data** | External Provider (optional) | Flexibility to use local LLM servers; no Azure credentials needed |
| **Applying workflows to non-Azure SDKs** (e.g., OpenAI .NET SDK) | External Provider | Enables using Azure SDK CLI/MCP tools with SDKs outside the Azure ecosystem |
| **Accessing models not available in Microsoft Foundry** | External Provider | Access to specialized or newer models that may not yet be deployed in Microsoft Foundry |

Copilot AI Dec 11, 2025


The table describing the decision matrix has a formatting issue. The first pipe character at the start of each row should be removed for proper markdown table formatting. Proper markdown tables don't require a leading pipe.

Suggested change
| Scenario | Recommended Provider | Rationale |
|----------|---------------------|-----------|
| **Working with Microsoft confidential data** (e.g., unannounced APIs, internal specs) | Microsoft Foundry | Data stays within Microsoft's infrastructure; complies with confidentiality requirements |
| **Production Azure SDK development** | Microsoft Foundry | Enterprise-grade security, compliance, and support; consistent with Azure ecosystem |
| **CI/CD pipelines in Azure DevOps** | Microsoft Foundry | Seamless integration with Azure services; Entra ID authentication |
| **Local development with non-sensitive data** | External Provider (optional) | Flexibility to use local LLM servers; no Azure credentials needed |
| **Applying workflows to non-Azure SDKs** (e.g., OpenAI .NET SDK) | External Provider | Enables using Azure SDK CLI/MCP tools with SDKs outside the Azure ecosystem |
| **Accessing models not available in Microsoft Foundry** | External Provider | Access to specialized or newer models that may not yet be deployed in Microsoft Foundry |
Scenario | Recommended Provider | Rationale
----------|---------------------|-----------
**Working with Microsoft confidential data** (e.g., unannounced APIs, internal specs) | Microsoft Foundry | Data stays within Microsoft's infrastructure; complies with confidentiality requirements
**Production Azure SDK development** | Microsoft Foundry | Enterprise-grade security, compliance, and support; consistent with Azure ecosystem
**CI/CD pipelines in Azure DevOps** | Microsoft Foundry | Seamless integration with Azure services; Entra ID authentication
**Local development with non-sensitive data** | External Provider (optional) | Flexibility to use local LLM servers; no Azure credentials needed
**Applying workflows to non-Azure SDKs** (e.g., OpenAI .NET SDK) | External Provider | Enables using Azure SDK CLI/MCP tools with SDKs outside the Azure ecosystem
**Accessing models not available in Microsoft Foundry** | External Provider | Access to specialized or newer models that may not yet be deployed in Microsoft Foundry

Comment on lines +177 to +184
| Aspect | Azure SDKs | Non-Azure SDKs (e.g., OpenAI .NET) |
|--------|-----------|-----------------------------------|
| TypeSpec/API generation | ✅ Full support | ✅ Applicable (TypeSpec-based) |
| Azure-specific reviews | ✅ Full support | ❌ Not applicable |
| Generic code quality tools | ✅ Full support | ✅ Applicable |
| Documentation workflows | ✅ Full support | ✅ Applicable |
| Testing utilities | ✅ Full support | ⚠️ Partial applicability |


Copilot AI Dec 11, 2025


The table comparing Azure SDKs and non-Azure SDKs has a formatting issue. The first pipe character at the start of each row should be removed for proper markdown table formatting. Proper markdown tables don't require a leading pipe.

Comment on lines +191 to +197
| Factor | Shared Tooling (Current Approach) | Dedicated OpenAI Tooling |
|--------|----------------------------------|-------------------------|
| **Development cost** | Lower (reuses existing infrastructure) | Higher (new tool development) |
| **Maintenance cost** | Lower (shared TypeSpec tooling) | Higher (separate maintenance) |
| **Feature fit** | High overlap due to TypeSpec | Purpose-built but duplicative |
| **User experience** | Familiar for TypeSpec users | Separate learning curve |
| **Long-term scalability** | Benefits from Azure SDK investments | Independent evolution |

Copilot AI Dec 11, 2025


The cost-effectiveness analysis table has a formatting issue. The first pipe character at the start of each row should be removed for proper markdown table formatting. Proper markdown tables don't require a leading pipe.

Suggested change
| Factor | Shared Tooling (Current Approach) | Dedicated OpenAI Tooling |
|--------|----------------------------------|-------------------------|
| **Development cost** | Lower (reuses existing infrastructure) | Higher (new tool development) |
| **Maintenance cost** | Lower (shared TypeSpec tooling) | Higher (separate maintenance) |
| **Feature fit** | High overlap due to TypeSpec | Purpose-built but duplicative |
| **User experience** | Familiar for TypeSpec users | Separate learning curve |
| **Long-term scalability** | Benefits from Azure SDK investments | Independent evolution |
Factor | Shared Tooling (Current Approach) | Dedicated OpenAI Tooling
--------|----------------------------------|-------------------------
**Development cost** | Lower (reuses existing infrastructure) | Higher (new tool development)
**Maintenance cost** | Lower (shared TypeSpec tooling) | Higher (separate maintenance)
**Feature fit** | High overlap due to TypeSpec | Purpose-built but duplicative
**User experience** | Familiar for TypeSpec users | Separate learning curve
**Long-term scalability** | Benefits from Azure SDK investments | Independent evolution

Comment on lines +142 to +144
- **Risk**: This scope is specific to Azure Cognitive Services and may not be appropriate for all scenarios
- **Consideration**: Future enhancements might need to make this configurable


Copilot AI Dec 11, 2025


The documentation mentions "hardcoded" authentication scope as an issue but doesn't explain why this is problematic or provide specific examples of scenarios where this would fail. Consider adding more detail about which non-Azure scenarios would be affected by this hardcoded scope.

Suggested change
- **Risk**: This scope is specific to Azure Cognitive Services and may not be appropriate for all scenarios
- **Consideration**: Future enhancements might need to make this configurable
- **Risk**: This scope is specific to Azure Cognitive Services and may not be appropriate for all scenarios. For example, when connecting to non-Azure OpenAI-compatible endpoints (such as OpenAI's own API or other third-party providers), the authentication flow may require a different scope or may not use scopes at all. Attempting to use the Azure-specific scope in these cases can result in authentication failures, rejected tokens, or inability to access the desired models.
- **Example**: If a user configures the CLI to use OpenAI's public API endpoint (`https://api.openai.com/v1`), the Azure scope will not be recognized, and authentication will fail. Similarly, other providers may require their own scopes or API key-based authentication, making the hardcoded Azure scope incompatible.
- **Consideration**: Future enhancements should make the authentication scope configurable to support a wider range of providers and avoid integration failures in non-Azure scenarios.

var baseEndpoint = azureOpenAiEndpoint.TrimEnd('/') + "/openai/v1";
endpoint = new Uri(baseEndpoint);
}
// Priority 3: If no OPENAI_API_KEY but no Azure endpoint, use openai-shared

Copilot AI Dec 11, 2025


The code comment references Priority 3 but the actual scenario is when both OPENAI_API_KEY and endpoint variables are not set. The comment should also clarify that this scenario occurs when using Entra ID authentication without explicit endpoint configuration.

Suggested change
// Priority 3: If no OPENAI_API_KEY but no Azure endpoint, use openai-shared
// Priority 3: If both OPENAI_API_KEY and endpoint variables (OPENAI_BASE_URL, AZURE_OPENAI_ENDPOINT) are not set,
// and Entra ID authentication is used without explicit endpoint configuration, use the default openai-shared endpoint.

@@ -0,0 +1,216 @@
# Proposal: Support for Non-Azure OpenAI Models

Copilot AI Dec 11, 2025


The document has inconsistent spacing at the beginning - there are extra spaces before the heading. The heading should start at the beginning of the line without leading whitespace.

Comment on lines +63 to +68
| Priority | Environment Variable | Behavior |
|----------|---------------------|----------|
| 1 | `OPENAI_BASE_URL` | Used as-is (no modifications) |
| 2 | `AZURE_OPENAI_ENDPOINT` | Automatically appends `/openai/v1` if missing |
| 3 | Neither set, `OPENAI_API_KEY` not set | Default to `https://openai-shared.openai.azure.com/openai/v1` |
| 4 | `OPENAI_API_KEY` only (no endpoint variables) | Use standard OpenAI API (no custom endpoint) |

Copilot AI Dec 11, 2025


The table describing endpoint configuration priority has a formatting issue. The first pipe character at the start of each row should be removed for proper markdown table formatting. Proper markdown tables don't require a leading pipe.

Suggested change
| Priority | Environment Variable | Behavior |
|----------|---------------------|----------|
| 1 | `OPENAI_BASE_URL` | Used as-is (no modifications) |
| 2 | `AZURE_OPENAI_ENDPOINT` | Automatically appends `/openai/v1` if missing |
| 3 | Neither set, `OPENAI_API_KEY` not set | Default to `https://openai-shared.openai.azure.com/openai/v1` |
| 4 | `OPENAI_API_KEY` only (no endpoint variables) | Use standard OpenAI API (no custom endpoint) |
Priority | Environment Variable | Behavior
----------|---------------------|----------
1 | `OPENAI_BASE_URL` | Used as-is (no modifications)
2 | `AZURE_OPENAI_ENDPOINT` | Automatically appends `/openai/v1` if missing
3 | Neither set, `OPENAI_API_KEY` not set | Default to `https://openai-shared.openai.azure.com/openai/v1`
4 | `OPENAI_API_KEY` only (no endpoint variables) | Use standard OpenAI API (no custom endpoint)

Comment on lines +55 to +56
BearerTokenPolicy tokenPolicy = new(credential, "https://cognitiveservices.azure.com/.default");
options.AddPolicy(tokenPolicy, PipelinePosition.BeforeTransport);

Copilot AI Dec 11, 2025


The documented fallback from API key to BearerTokenPolicy for Entra ID authentication means that bearer tokens scoped to https://cognitiveservices.azure.com/.default will be attached to all OpenAI requests when OPENAI_API_KEY is unset, regardless of which endpoint was resolved (including OPENAI_BASE_URL pointing at non‑Microsoft hosts). A malicious or compromised external provider could capture these Azure access tokens and use them to call Azure Cognitive Services APIs as the caller, leading to unauthorized access and data exfiltration. Restrict Entra ID authentication to trusted Azure endpoints only (e.g., hostnames under Azure OpenAI/Foundry) and avoid sending Azure bearer tokens to arbitrary OPENAI_BASE_URL values, or require API-key-based auth for external providers instead of this fallback behavior.

Suggested change
BearerTokenPolicy tokenPolicy = new(credential, "https://cognitiveservices.azure.com/.default");
options.AddPolicy(tokenPolicy, PipelinePosition.BeforeTransport);
// Only attach BearerTokenPolicy if endpoint is a trusted Azure domain
var endpoint = /* resolved endpoint URL as string */;
var uri = new Uri(endpoint);
if (uri.Host.EndsWith(".azure.com", StringComparison.OrdinalIgnoreCase))
{
BearerTokenPolicy tokenPolicy = new(credential, "https://cognitiveservices.azure.com/.default");
options.AddPolicy(tokenPolicy, PipelinePosition.BeforeTransport);
}
else
{
throw new InvalidOperationException("Entra ID authentication is only supported for Azure endpoints. Please use API key authentication for non-Azure endpoints.");
}


## Background

The Azure SDK CLI (`azsdk`) is used as both a command-line interface and an MCP (Model Context Protocol) server for AI-assisted development workflows. Previously, the tool was tightly coupled to Azure OpenAI through the `Azure.AI.OpenAI` package, which limited its use to Azure-hosted models only.
Member


To be pedantic, the Azure.AI.OpenAI package shouldn't be used even for the Azure OpenAI context; both Azure and non-Azure scenarios should use the base OpenAI package, so this change is necessary regardless of the desire (or lack of desire) for non-Azure support.

<PackageReference Include="Azure.AI.OpenAI" Version="2.3.0-beta.2" />

<!-- Added -->
<PackageReference Include="OpenAI" Version="2.6.0" />
Member


Suggested change
<PackageReference Include="OpenAI" Version="2.6.0" />
<PackageReference Include="OpenAI" Version="2.8.0" />

This should move to the latest release.


// Fall back to bearer token (Entra ID) authentication
BearerTokenPolicy tokenPolicy = new(credential, "https://cognitiveservices.azure.com/.default");
options.AddPolicy(tokenPolicy, PipelinePosition.BeforeTransport);
Member


This is unnecessary. There's a constructor overload that accepts the policy directly and much more simply:

var options = new OpenAIClientOptions
{
    Endpoint = new Uri("<< SOME ENDPOINT >>")
};

var policy = new BearerTokenPolicy(<< SOME CREDENTIAL >>, "<< SOME AUTH SCOPE >>");
var client = new OpenAIClient(policy, options);

// Priority 2: Use AZURE_OPENAI_ENDPOINT with /openai/v1 postfix if it exists
else if (!string.IsNullOrWhiteSpace(azureOpenAiEndpoint))
{
var baseEndpoint = azureOpenAiEndpoint.TrimEnd('/') + "/openai/v1";
Member


I'd suggest using Uri or UriBuilder rather than the concatenation approach. That'll handle your trim issues automatically and ensure that you've got a valid URI.
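The suggestion above could look like the following sketch (illustrative only; the sample endpoint value is hypothetical, and behavior differs from plain concatenation if the base endpoint already carries a path segment):

```csharp
using System;

// Assumed to hold the raw AZURE_OPENAI_ENDPOINT value, with or
// without a trailing slash.
var azureOpenAiEndpoint = "https://example.openai.azure.com/";

// Uri(baseUri, relativeUri) normalizes the slashes between the two
// parts, removing the need for manual TrimEnd('/') handling, and
// throws UriFormatException early if the configured value is invalid.
var endpoint = new Uri(new Uri(azureOpenAiEndpoint), "openai/v1");

Console.WriteLine(endpoint); // https://example.openai.azure.com/openai/v1
```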


### Alternative: Dedicated OpenAI SDK Tooling

Given that the OpenAI .NET SDK shares the TypeSpec foundation with Azure SDKs, **leveraging existing Azure SDK CLI/MCP infrastructure** may be more efficient than building dedicated tooling.
Member


The Azure OpenAI library is being phased out. It should not be used for new applications. We should be using the official OpenAI library only.


#### Considerations

1. **Shared Technology Stack**: The OpenAI .NET SDK is TypeSpec-based, similar to Azure SDKs, which means many Azure SDK CLI/MCP workflows (TypeSpec generation, code generation patterns) could be applicable
Member


How the library is generated is neither a value nor a risk. It is simply an implementation detail. The burden of being fully compatible is on the Azure Foundry endpoints and is not impacted by whether or how the client library is generated.


1. **Shared Technology Stack**: The OpenAI .NET SDK is TypeSpec-based, similar to Azure SDKs, which means many Azure SDK CLI/MCP workflows (TypeSpec generation, code generation patterns) could be applicable
2. **Ownership Context**: Microsoft owns the OpenAI .NET SDK, which creates a unique situation where Azure SDK tooling concepts naturally align
3. **Workflow Overlap**: Due to the shared TypeSpec foundation, a significant subset of workflows (code generation, API reviews, documentation generation, testing patterns) could be applicable to the OpenAI .NET SDK
Member


This is not an accurate statement. The process, design, and workflow for OpenAI are very, very different from the ones used by the Azure SDK repositories.


### Support for Non-Azure SDKs

While the non-Azure model support enables technical flexibility, the question arises whether supporting non-Azure SDKs (such as the OpenAI .NET SDK) is desirable or meaningful within this tool.
Member


It has already been decided that supporting a full third-party (specifically, the OpenAI repository) is not in scope for the Azure CLI tooling due to a non-trivial amount of hard assumptions across the tooling infrastructure for Azure-specific conventions that the OpenAI repository does not support and will not adopt.


#### Cost-Effectiveness Analysis

| Factor | Shared Tooling (Current Approach) | Dedicated OpenAI Tooling |
Member


I'm not sure what the value of this section is. The tooling does not support a full third-party scenario by design and has many deep assumptions about Azure conventions needed to make it work. It is not something that is possible to use with OpenAI in its current form. Further, since there is a single library maintained by our team, there's not enough value offered by the tooling that would justify removing those assumptions so that it would be usable.


For non-Azure SDKs like the OpenAI .NET SDK:

1. **Short-term**: The current flexible model support combined with shared TypeSpec infrastructure provides a cost-effective way to support OpenAI SDK workflows
Member

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

This is inaccurate with respect to the OpenAI library and workflow.

@jsquire
Member

jsquire commented Dec 12, 2025

It feels as if we're conflating "support for non-Azure models" and "use the Azure CLI tooling in non-Azure repositories" in this document. The second is predicated on the first, but the inverse is not true. That said, based on conversations, costs, and goals - I do not see the Azure CLI tooling being adapted for true third-party use nor a strong value proposition in doing so. As a result, I'm not sure that support for non-Azure models is a feature that would get real-world use.

That said, the baseline conversion to use the official OpenAI library is absolutely needed, regardless due to the direction and guidance from the Azure Foundry services. Personally, I'd remove the support for API keys and the default endpoints until such time as we have a real-world use case.

@deyaaeldeen
Member

It feels as if we're conflating "support for non-Azure models" and "use the Azure CLI tooling in non-Azure repositories" in this document.

@jsquire The intent was to keep these concerns separate, and the document tries to do that explicitly by carving out the latter as its own section. If it would be clearer, we could certainly split them into separate documents. That said, I think I have a much better understanding of your perspective now.

Member

@maririos maririos left a comment


After reviewing the document and the comments from Jesse, I don’t think the content in the PR is ready for review.

The structure and technical framing need more clarity and depth, and some of the assumptions don’t align with our team’s guidelines. Specifically, that we do not use or endorse LLMs that fall outside the Azure Foundry ecosystem.

I’m blocking the PR for now. You could update it to explain why we need to use the OpenAI package and how it is configured in the environment, or close it altogether, since the change is ultimately driven by the Azure OpenAI library being phased out.
