If you've spent more than ten minutes wrestling with Cursor's Azure settings, you already know the frustration. The dropdowns don't save, the API version throws mismatches, and the authentication silently fails with zero useful error messages. A Cloudflare Worker Proxy is the clean, permanent fix — and it takes about 3 minutes to deploy. Instead of forcing Cursor to speak Azure's dialect, you build a tiny free middleman that does the translation invisibly. Cursor sends a standard request to your proxy, the proxy reformats it perfectly for Azure, and the response comes back as if nothing unusual happened. No more fighting broken settings. In this guide, you'll set up a working Azure OpenAI Cursor integration using a free Cloudflare Worker that handles all the messy header and URL conversion for you.
A Cloudflare Worker Proxy is a lightweight, serverless JavaScript function that runs on Cloudflare's global edge network, intercepting HTTP requests and modifying them — such as swapping authentication headers or rewriting URLs — before forwarding them to their destination. It requires no server, no hosting fees, and deploys in seconds.
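For a sense of scale, here is a complete Worker in the same shape as Cloudflare's Hello World template. This is a minimal sketch of the handler pattern, not the proxy itself:

```javascript
// A complete Cloudflare Worker: an object with an async fetch handler.
// The runtime calls fetch() once per incoming HTTP request.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    return new Response(`Hello from ${url.pathname}`, {
      headers: { 'Content-Type': 'text/plain' }
    });
  }
};

export default worker; // Cloudflare looks for this default export
```

Everything the proxy does later is just a longer version of this one function: inspect the incoming `Request`, return a `Response`.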
Before jumping into the fix, it helps to understand why the problem exists in the first place. Cursor is architected around the standard OpenAI API. That standard uses:
- An `Authorization: Bearer <token>` header for authentication
- A single fixed endpoint: `https://api.openai.com/v1/chat/completions`

Azure OpenAI uses an entirely different contract:
- An `api-key` header instead of `Authorization: Bearer`
- Deployment-scoped paths: `/openai/deployments/{deploymentName}/chat/completions`
- A required `api-version` query parameter (e.g., `?api-version=2024-05-01-preview`)

Cursor's built-in Azure section tries to bridge this gap internally, but it does so unreliably. The result is a broken Azure AI coding assistant experience that varies unpredictably between versions of Cursor. The root cause isn't user error — it's a structural mismatch between two different API standards. As the official Azure OpenAI Service documentation shows, Azure's authentication format simply differs from the OpenAI SDK defaults that Cursor relies on.
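To make the mismatch concrete, here is the same chat-completion call described under both contracts. The resource name, deployment name, and keys below are placeholders, not real endpoints:

```javascript
// OpenAI contract: one fixed URL, Bearer-token auth.
const openaiRequest = {
  url: 'https://api.openai.com/v1/chat/completions',
  headers: { 'Authorization': 'Bearer YOUR-KEY' }
};

// Azure contract: deployment-scoped path, api-key header, api-version param.
const azureRequest = {
  url: 'https://my-resource.cognitiveservices.azure.com' +
       '/openai/deployments/my-deployment/chat/completions' +
       '?api-version=2024-05-01-preview',
  headers: { 'api-key': 'YOUR-KEY' }
};
```

Same model, same request body, but different URL shape, different header name, and an extra query parameter. That is the entire translation problem.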
The Cloudflare Worker Proxy acts as a reverse proxy — a middleman that lives between Cursor and Azure. Here's the exact translation it performs on every request:
- Swaps the incoming `Authorization` header for Azure's `api-key` header
- Rewrites the URL to Azure's deployment-scoped path
- Appends the required `api-version` query parameter

Cursor never knows Azure is involved. It sees exactly what it expects to see — a standard OpenAI-compatible endpoint — because that's exactly what your Cloudflare Worker Proxy presents to it.
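In code, the header half of that translation is only a few lines. This helper is illustrative (the name `toAzureHeaders` is mine; the full proxy script inlines the same logic rather than using a separate function):

```javascript
// Sketch of the header translation the proxy performs on every request:
// OpenAI-style "Authorization: Bearer <key>" becomes Azure's "api-key: <key>".
function toAzureHeaders(incoming) {
  const out = new Headers(incoming);
  const auth = out.get('Authorization');
  if (auth && auth.startsWith('Bearer ')) {
    out.set('api-key', auth.slice('Bearer '.length)); // strip the Bearer prefix
    out.delete('Authorization'); // Azure key auth doesn't use this header
  }
  return out;
}
```

The URL rewrite is simpler still: string interpolation of the endpoint, deployment name, and API version, which you'll see in the full script below.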
Head to dash.cloudflare.com and log in or create a free account. In the left sidebar, click Workers & Pages, then click Create. On the next screen, select Start with Hello World!. Name your Worker something like azure-proxy and click Deploy.
Once deployed, click Edit code to open the built-in code editor.
Delete every line of the default Hello World code. Replace it entirely with the script below. Update the two placeholder values with your actual Azure resource details before deploying.
```js
export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // Handle CORS preflight so Cursor doesn't block the connection
    if (request.method === 'OPTIONS') {
      return new Response(null, {
        headers: {
          'Access-Control-Allow-Origin': '*',
          'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
          'Access-Control-Allow-Headers': '*',
        }
      });
    }

    // Return a mock model list so Cursor's verification check passes
    if (url.pathname.endsWith('/v1/models')) {
      return new Response(JSON.stringify({
        object: "list",
        data: [{
          id: "YOUR-DEPLOYMENT-NAME",
          object: "model",
          created: Math.floor(Date.now() / 1000), // OpenAI model objects use Unix seconds
          owned_by: "azure"
        }]
      }), {
        headers: { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' }
      });
    }

    // Only process chat completion requests
    if (!url.pathname.endsWith('/v1/chat/completions')) {
      return new Response('Proxy is running. Point Cursor to this URL + /v1', { status: 200 });
    }

    // ── UPDATE THESE TWO VALUES ──────────────────────────────────
    const azureEndpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com";
    const deploymentName = "YOUR-DEPLOYMENT-NAME";
    // ─────────────────────────────────────────────────────────────

    const apiVersion = "2024-05-01-preview";
    const azureUrl = `${azureEndpoint}/openai/deployments/${deploymentName}/chat/completions?api-version=${apiVersion}`;

    // Swap OpenAI Bearer token → Azure api-key header
    const newHeaders = new Headers(request.headers);
    const authHeader = newHeaders.get('Authorization');
    if (authHeader && authHeader.startsWith('Bearer ')) {
      const apiKey = authHeader.slice('Bearer '.length);
      newHeaders.set('api-key', apiKey);
      newHeaders.delete('Authorization');
    }

    // Forward to Azure and stream the response back to Cursor
    const response = await fetch(azureUrl, {
      method: request.method,
      headers: newHeaders,
      body: request.body
    });

    const newResponse = new Response(response.body, response);
    newResponse.headers.set('Access-Control-Allow-Origin', '*');
    return newResponse;
  }
}
```
Click Deploy in the top-right corner. Cloudflare will give you a live URL in the format: https://azure-proxy.your-username.workers.dev
Copy that URL — you'll need it in the next step.
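If you want to sanity-check the routing before involving Cursor, note that the OPTIONS and /v1/models branches never contact Azure, so they can be exercised in any fetch-compatible runtime (Node 18+, for instance). The snippet below is a trimmed, self-contained copy of just those two branches, purely for illustration — it shows what Cursor's verification check will receive:

```javascript
// Trimmed copy of the Worker's two Azure-free branches, so the mock
// model list and CORS behavior can be inspected without deploying.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    if (request.method === 'OPTIONS') {
      return new Response(null, {
        headers: { 'Access-Control-Allow-Origin': '*' }
      });
    }
    if (url.pathname.endsWith('/v1/models')) {
      return new Response(
        JSON.stringify({ object: 'list', data: [{ id: 'YOUR-DEPLOYMENT-NAME', object: 'model' }] }),
        { headers: { 'Content-Type': 'application/json' } }
      );
    }
    return new Response('Proxy is running.', { status: 200 });
  }
};

const res = await worker.fetch(new Request('https://example.com/v1/models'));
const body = await res.json();
// body.data[0].id is the model name Cursor will show in its list
```

Hitting your deployed Worker's /v1/models URL in a browser should return the same JSON shape.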
This is where the fix for the Cursor Azure bug becomes visible. Because your Worker perfectly mimics a standard OpenAI endpoint, you configure Cursor entirely through the OpenAI section — not the Azure section.
Follow these steps in order:
1. Open Cursor's settings (Cmd + , on Mac / Ctrl + , on Windows) and go to the Models section.
2. Enable the OpenAI API Key option and paste in your Azure API key.
3. Check Override OpenAI Base URL and enter your Worker URL with /v1 appended: https://azure-proxy.your-username.workers.dev/v1

[Internal Link: how to find your Azure API key -> hasnainayaz.com/azure-api-key-guide]
Scroll to Model Names in Cursor's settings. If your deployment name isn't listed, click Add model and type it exactly as it appears in your Azure deployment (e.g., Kimi-K2.5). Toggle it on. You can toggle off unused models like gpt-4o to keep the dropdown clean.
Open a new chat (Cmd + L on Mac / Ctrl + L on Windows), select your model, and send a test message. If the setup is correct, you'll get a response routed invisibly through Azure via your Cloudflare Worker Proxy.
If you want a cleaner Cursor OpenAI base URL — for example https://ai.yourdomain.com/v1 instead of the default workers.dev URL — you can attach a custom subdomain inside Cloudflare:
1. Add your domain to Cloudflare, if it isn't already managed there.
2. Open your Worker's Settings and find the custom domains section (Domains & Routes on current dashboards).
3. Add the subdomain you want (e.g., ai.yourdomain.com) and confirm. Cloudflare creates the DNS record and certificate automatically.

This is optional but highly recommended if you're sharing the proxy with a team or want a professional-looking setup.
Here's a quick comparison of what you get with the Cloudflare Worker Proxy vs. Cursor's built-in Azure integration:
Cloudflare Worker Proxy:

- Free to run on Cloudflare's free tier, with no server to maintain
- Works with any Azure deployment and model name
- Survives Cursor updates, because Cursor only ever sees a standard OpenAI endpoint
- Roughly 3 minutes of one-time setup

Cursor's Native Azure Settings:

- Dropdowns that don't reliably save
- API version mismatches
- Authentication that fails silently with no useful error messages
- Behavior that varies unpredictably between Cursor versions
The free serverless proxy AI approach wins on every dimension for individual developers and small teams.
Cursor's Azure OpenAI integration has frustrated developers for a long time, but the solution is simpler than most people expect. A Cloudflare Worker Proxy acts as a perfect bridge between Cursor's OpenAI-native architecture and Azure's deployment-specific API format — and the entire setup takes under 3 minutes on the free tier. You get full access to your Azure models, including powerful options like Kimi-K2.5, without touching Cursor's broken Azure panel ever again. All of Cursor's advanced features — Agent mode, Composer, inline completions — work without any additional configuration because they all flow through the same OpenAI-compatible interface your Worker provides.
The next time Cursor releases an update that breaks Azure settings for hundreds of developers, your setup won't even flinch.