Cloudflare Worker Proxy: Fix Cursor Azure Bug Fast [2026]

Written by Hasnain Ayaz
Category: AI/ML
Date: 2026-02-25

Introduction

If you've spent more than ten minutes wrestling with Cursor's Azure settings, you already know the frustration. The dropdowns don't save, the API version throws mismatches, and the authentication silently fails with zero useful error messages. A Cloudflare Worker Proxy is the clean, permanent fix — and it takes about 3 minutes to deploy. Instead of forcing Cursor to speak Azure's dialect, you build a tiny free middleman that does the translation invisibly. Cursor sends a standard request to your proxy, the proxy reformats it perfectly for Azure, and the response comes back as if nothing unusual happened. No more fighting broken settings. In this guide, you'll set up a working Azure OpenAI Cursor integration using a free Cloudflare Worker that handles all the messy header and URL conversion for you.

What is a Cloudflare Worker Proxy?

A Cloudflare Worker Proxy is a lightweight, serverless JavaScript function that runs on Cloudflare's global edge network, intercepting HTTP requests and modifying them — such as swapping authentication headers or rewriting URLs — before forwarding them to their destination. It requires no server, no hosting fees, and deploys in seconds.
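As a minimal sketch of that idea (illustrative names only, not the full proxy built later in this guide), a Worker is just an object with a fetch handler that receives a Request and returns a Response:

```javascript
// Minimal Worker sketch: inspect the incoming request and answer directly.
// In a real Worker file you would write `export default worker;`.
const worker = {
  async fetch(request) {
    const auth = request.headers.get("Authorization") ?? "none";
    // A real proxy would rewrite headers here and forward with fetch();
    // this sketch just echoes what it received.
    return new Response(`method=${request.method} auth=${auth}`);
  },
};
```

The same object shape, with header rewriting and forwarding added, becomes the Azure proxy deployed below.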

Why Cursor's Azure OpenAI Settings Are Broken

Before jumping into the fix, it helps to understand why the problem exists in the first place. Cursor is architected around the standard OpenAI API. That standard uses:

  • An Authorization: Bearer <token> header for authentication
  • A simple base URL like https://api.openai.com/v1/chat/completions
  • No deployment-specific routing in the URL path

Azure OpenAI uses an entirely different contract:

  • An api-key header instead of Authorization: Bearer
  • A deployment-specific URL: /openai/deployments/{deploymentName}/chat/completions
  • A mandatory api-version query parameter (e.g., ?api-version=2024-05-01-preview)
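The mismatch is easy to see side by side. Here is a sketch of the same chat-completion call expressed under each contract (the resource name, deployment name, and keys are placeholders):

```javascript
// The same request, once per API contract (placeholder values, not real keys).
const openaiStyle = new Request("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: { Authorization: "Bearer sk-PLACEHOLDER" },
});

const azureStyle = new Request(
  "https://YOUR-RESOURCE.cognitiveservices.azure.com" +
    "/openai/deployments/YOUR-DEPLOYMENT-NAME/chat/completions" +
    "?api-version=2024-05-01-preview",
  {
    method: "POST",
    headers: { "api-key": "PLACEHOLDER" }, // no "Bearer " prefix on Azure
  },
);
```

Different header name, different URL shape, plus a query parameter Azure refuses to work without.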

Cursor's built-in Azure section tries to bridge this gap internally, but it does so unreliably. The result is a broken Azure AI coding assistant experience that varies unpredictably between versions of Cursor. The root cause isn't user error — it's a structural mismatch between two different API standards. According to the official Azure OpenAI Service documentation, the authentication format is fundamentally incompatible with the OpenAI SDK defaults that Cursor relies on.

How the Cloudflare Worker Proxy Solves It

The Cloudflare Worker Proxy acts as a reverse proxy — a middleman that lives between Cursor and Azure. Here's the exact translation it performs on every request:

  1. Receives a standard OpenAI-formatted request from Cursor
  2. Extracts the Bearer token from the Authorization header
  3. Replaces it with Azure's required api-key header
  4. Rewrites the URL to match your specific Azure deployment path
  5. Appends the correct api-version query parameter
  6. Forwards the translated request to Azure
  7. Returns the Azure response directly back to Cursor

Cursor never knows Azure is involved. It sees exactly what it expects to see — a standard OpenAI-compatible endpoint — because that's exactly what your Cloudflare Worker Proxy presents to it.
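The header swap and URL rewrite at the core of that translation can be sketched as a small pure function (the names here are illustrative, not the exact script you will deploy):

```javascript
// Sketch: translate an OpenAI-style request's auth header and target URL
// into Azure's contract. Returns plain data rather than forwarding anything.
function toAzure(request, endpoint, deployment, apiVersion) {
  const headers = new Headers(request.headers);
  const auth = headers.get("Authorization");
  if (auth && auth.startsWith("Bearer ")) {
    headers.set("api-key", auth.slice("Bearer ".length)); // Azure's header
    headers.delete("Authorization");                      // OpenAI's header
  }
  const url =
    `${endpoint}/openai/deployments/${deployment}` +
    `/chat/completions?api-version=${apiVersion}`;
  return { url, headers };
}
```

Everything else in the Worker is plumbing around this one transformation.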

Step-by-Step: Deploy Your Cloudflare Worker Proxy

Step 1 — Create Your Worker

Head to dash.cloudflare.com and log in or create a free account. In the left sidebar, click Workers & Pages, then click Create. On the next screen, select Start with Hello World!. Name your Worker something like azure-proxy and click Deploy.

Once deployed, click Edit code to open the built-in code editor.

Step 2 — Paste the Proxy Script

Delete every line of the default Hello World code. Replace it entirely with the script below. Update the two placeholder values with your actual Azure resource details before deploying.

export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // Handle CORS preflight so Cursor doesn't block the connection
    if (request.method === 'OPTIONS') {
      return new Response(null, {
        headers: {
          'Access-Control-Allow-Origin': '*',
          'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
          'Access-Control-Allow-Headers': '*',
        }
      });
    }

    // Return a mock model list so Cursor's verification check passes
    if (url.pathname.endsWith('/v1/models')) {
      return new Response(JSON.stringify({
        object: "list",
        data: [{ id: "YOUR-DEPLOYMENT-NAME", object: "model", created: Math.floor(Date.now() / 1000), owned_by: "azure" }]
      }), {
        headers: { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' }
      });
    }

    // Only process chat completion requests
    if (!url.pathname.endsWith('/v1/chat/completions')) {
      return new Response('Proxy is running. Point Cursor to this URL + /v1', { status: 200 });
    }

    // ── UPDATE THESE TWO VALUES ──────────────────────────────────
    const azureEndpoint   = "https://YOUR-RESOURCE.cognitiveservices.azure.com";
    const deploymentName  = "YOUR-DEPLOYMENT-NAME";
    // ─────────────────────────────────────────────────────────────
    const apiVersion = "2024-05-01-preview";

    const azureUrl = `${azureEndpoint}/openai/deployments/${deploymentName}/chat/completions?api-version=${apiVersion}`;

    // Swap OpenAI Bearer token → Azure api-key header
    const newHeaders = new Headers(request.headers);
    const authHeader = newHeaders.get('Authorization');
    if (authHeader && authHeader.startsWith('Bearer ')) {
      const apiKey = authHeader.replace('Bearer ', '');
      newHeaders.set('api-key', apiKey);
      newHeaders.delete('Authorization');
    }

    // Forward to Azure and pass the response back to Cursor
    const response = await fetch(azureUrl, {
      method: request.method,
      headers: newHeaders,
      body: request.body
    });

    const newResponse = new Response(response.body, response);
    newResponse.headers.set('Access-Control-Allow-Origin', '*');
    return newResponse;
  }
}

Click Deploy in the top-right corner. Cloudflare will give you a live URL in the format: https://azure-proxy.your-username.workers.dev

Copy that URL — you'll need it in the next step.
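If you want to sanity-check what Cursor's Verify button will see, the mock /v1/models route in the script simply returns an OpenAI-style model list. This local Node sketch (not part of the Worker) reproduces that body:

```javascript
// Local sketch of the mock model-list body the Worker serves at /v1/models.
// Cursor's verification check just needs an OpenAI-style "list" of models.
function modelsBody(deploymentName) {
  return {
    object: "list",
    data: [{
      id: deploymentName,
      object: "model",
      created: Math.floor(Date.now() / 1000), // OpenAI timestamps are Unix seconds
      owned_by: "azure",
    }],
  };
}

console.log(JSON.stringify(modelsBody("YOUR-DEPLOYMENT-NAME")));
```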

Step 3 — Configure Cursor to Use the Proxy

This is where the Cursor Azure bug fix becomes visible. Because your Worker perfectly mimics a standard OpenAI endpoint, you configure Cursor entirely through the OpenAI section, not the Azure section.

Follow these steps in order:

  1. Open Cursor Settings (Cmd + , on Mac / Ctrl + , on Windows)
  2. Navigate to the Models tab
  3. Scroll to the Azure section and toggle it completely OFF
  4. Scroll back up to the OpenAI section
  5. Find Override OpenAI Base URL and paste your Worker URL with /v1 appended: https://azure-proxy.your-username.workers.dev/v1
  6. In the OpenAI API Key field, paste your actual Azure API key
  7. Click Verify — it should pass immediately

For help locating your key, see the guide on how to find your Azure API key at hasnainayaz.com/azure-api-key-guide.

Step 4 — Add Your Model and Test

Scroll to Model Names in Cursor's settings. If your deployment name isn't listed, click Add model and type it exactly as it appears in your Azure deployment (e.g., Kimi-K2.5). Toggle it on. You can toggle off unused models like gpt-4o to keep the dropdown clean.

Open a new chat (Cmd + L), select your model, and send a test message. If the setup is correct, you'll get a response routed invisibly through Azure via your Cloudflare Worker Proxy.

Setting Up the Cloudflare Worker Proxy with a Custom Domain

If you want a cleaner Cursor OpenAI base URL — for example https://ai.yourdomain.com/v1 instead of the default workers.dev URL — you can attach a custom subdomain inside Cloudflare:

  • Go to Workers & Pages → select your Worker
  • Click Settings → Triggers → Custom Domains
  • Add a subdomain you own (e.g., ai.yourdomain.com)
  • Cloudflare provisions the SSL certificate and routing automatically

This is optional but highly recommended if you're sharing the proxy with a team or want a professional-looking setup.
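If you manage the Worker with the Wrangler CLI instead of the dashboard, the same custom domain can be declared in wrangler.toml. This is a sketch; the Worker name, entry path, and domain are placeholders for your own values:

```toml
name = "azure-proxy"
main = "src/index.js"
compatibility_date = "2024-05-01"

# Attach the Worker to a subdomain you own; Cloudflare provisions DNS and TLS.
routes = [
  { pattern = "ai.yourdomain.com", custom_domain = true }
]
```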

Why This Approach Beats Native Azure Settings in Cursor

Here's a quick comparison of what you get with the Cloudflare Worker Proxy vs. Cursor's built-in Azure integration:

Cloudflare Worker Proxy:

  • Works reliably across all Cursor versions
  • Free up to 100,000 requests per day
  • All Cursor features work (Agent, Composer, inline suggestions)
  • Update deployment or API version by editing one line
  • Your Azure key travels as a standard Bearer token; the Worker converts it to the api-key header in transit and never stores it

Cursor's Native Azure Settings:

  • Inconsistent across Cursor updates
  • No control over API version conflicts
  • Authentication failures with no clear error messages
  • Requires Cursor-specific Azure configuration knowledge

The free serverless proxy approach wins on every dimension for individual developers and small teams.

Conclusion

Cursor's Azure OpenAI integration has frustrated developers for a long time, but the solution is simpler than most people expect. A Cloudflare Worker Proxy acts as a perfect bridge between Cursor's OpenAI-native architecture and Azure's deployment-specific API format — and the entire setup takes under 3 minutes on the free tier. You get full access to your Azure models, including powerful options like Kimi-K2.5, without touching Cursor's broken Azure panel ever again. All of Cursor's advanced features — Agent mode, Composer, inline completions — work without any additional configuration because they all flow through the same OpenAI-compatible interface your Worker provides.

The next time Cursor releases an update that breaks Azure settings for hundreds of developers, your setup won't even flinch.
