How to Use Azure AI in Cursor with Cloudflare Worker Proxy

  • Written by: Hasnain Ayaz
  • Category: AI/ML
  • Date: 2026-02-25

If you've ever tried to connect Cursor IDE to an Azure OpenAI deployment, you've probably run into a wall. The Azure settings inside Cursor are inconsistent, poorly documented, and often throw cryptic API-version conflicts or authentication errors that seem impossible to debug. You're not alone — it's one of the most common frustrations in the AI developer community.

But here's the good news: there's a clean, elegant solution that takes about 3 minutes to set up, costs absolutely nothing, and makes the whole problem disappear. The trick is building a tiny reverse proxy using Cloudflare Workers — a free, serverless platform — that sits between Cursor and Azure, silently translating requests so everything works seamlessly.

In this guide, you'll learn exactly how to do it, step by step.

Why Cursor Struggles with Azure OpenAI

Cursor is built and optimized around the standard OpenAI API format. When you send a request to OpenAI, it uses an Authorization: Bearer <token> header and a simple URL structure. Azure OpenAI, however, uses a completely different format:

  • It requires an api-key header instead of Authorization: Bearer
  • It uses a deployment-specific URL (/openai/deployments/{deploymentName}/chat/completions)
  • It requires an api-version query parameter that changes based on the preview or stable release you're targeting
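As a sketch, here's what that translation amounts to in JavaScript. The resource name, deployment name, and API version below are placeholders, not real values:

```javascript
// Translate an OpenAI-style bearer token into the Azure request shape.
// Placeholder values: adjust the endpoint, deployment, and api-version to yours.
function toAzureRequest(bearerToken) {
  const endpoint = "https://YOUR-RESOURCE-NAME.cognitiveservices.azure.com";
  const deployment = "Kimi-K2.5";
  const apiVersion = "2024-05-01-preview";

  return {
    // Deployment-specific path plus the mandatory api-version query parameter
    url: `${endpoint}/openai/deployments/${deployment}/chat/completions?api-version=${apiVersion}`,
    // Azure wants api-key, not Authorization: Bearer
    headers: { "api-key": bearerToken.replace(/^Bearer\s+/, "") },
  };
}
```

This is exactly the translation the Worker in Step 2 performs on every request.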

Cursor's Azure settings panel tries to handle this translation internally, but it does so unreliably. API version mismatches, header conflicts, and routing bugs are all common. Rather than fighting that broken interface, the smarter approach is to bypass it entirely.

The Reverse Proxy Solution: How It Works

Instead of using Cursor's Azure settings, you'll deploy a lightweight reverse proxy — essentially a middleman server — that intercepts Cursor's standard OpenAI-formatted requests and translates them on the fly into Azure's required format before forwarding them to your Azure endpoint.

Here's the flow:

Cursor → [Cloudflare Worker Proxy] → Azure AI (Kimi-K2.5)

Cursor sends a completely standard OpenAI request to your proxy URL. The Worker grabs that request, swaps the Authorization: Bearer header for an api-key header, rewrites the URL to match your specific Azure deployment, and forwards it along. Azure responds, the Worker passes the response back to Cursor, and Cursor has no idea Azure was ever involved. It thinks it's talking to a regular OpenAI endpoint the entire time.

This approach completely sidesteps Cursor's buggy Azure settings panel.

What You'll Need Before You Start

Before diving in, make sure you have the following ready:

  • A Cloudflare account (free tier is all you need — sign up at cloudflare.com)
  • Your Azure AI resource endpoint (e.g., https://ai-yourname-resource.cognitiveservices.azure.com)
  • Your Azure deployment name (e.g., Kimi-K2.5)
  • Your Azure API key (found in your Azure AI resource under "Keys and Endpoint")
  • Cursor IDE installed with an active subscription

Step 1: Create a Cloudflare Worker

Start by logging into your Cloudflare dashboard at dash.cloudflare.com.

In the left sidebar, click on Workers & Pages. Then click Create. Cloudflare's updated dashboard now shows a few starter options. Click Start with Hello World! to initialize a basic Worker. When prompted, give your Worker a descriptive name — something like azure-proxy works well. Then click the blue Deploy button to create it.

Once it says "Successfully deployed," click Edit code to open the built-in code editor.

Step 2: Add the Proxy Script

You'll see a simple Hello World script already in the editor. Delete every line of it, and replace it with the following JavaScript code:

export default {
  async fetch(request, env) {
    // Your Azure details: replace these three values before deploying
    const azureEndpoint = "https://YOUR-RESOURCE-NAME.cognitiveservices.azure.com";
    const deploymentName = "YOUR-DEPLOYMENT-NAME";
    const apiVersion = "2024-05-01-preview";

    const url = new URL(request.url);

    // Handle CORS preflight so Cursor doesn't block the request
    if (request.method === 'OPTIONS') {
      return new Response(null, {
        headers: {
          'Access-Control-Allow-Origin': '*',
          'Access-Control-Allow-Methods': 'GET, POST, OPTIONS',
          'Access-Control-Allow-Headers': '*',
        }
      });
    }

    // Fake the models list (Cursor verifies the connection by checking this)
    if (url.pathname.endsWith('/v1/models')) {
      return new Response(JSON.stringify({
        object: "list",
        data: [{ id: deploymentName, object: "model", created: Date.now(), owned_by: "azure" }]
      }), {
        headers: { 'Content-Type': 'application/json', 'Access-Control-Allow-Origin': '*' }
      });
    }

    // Only process chat completion requests
    if (!url.pathname.endsWith('/v1/chat/completions')) {
      return new Response('Proxy is running. Point Cursor to this URL + /v1', { status: 200 });
    }

    // Build the deployment-specific Azure URL, including the api-version parameter
    const azureUrl = `${azureEndpoint}/openai/deployments/${deploymentName}/chat/completions?api-version=${apiVersion}`;

    // Swap the OpenAI-style Bearer token for Azure's api-key header
    const newHeaders = new Headers(request.headers);
    const authHeader = newHeaders.get('Authorization');
    if (authHeader && authHeader.startsWith('Bearer ')) {
      const apiKey = authHeader.replace('Bearer ', '');
      newHeaders.set('api-key', apiKey);
      newHeaders.delete('Authorization');
    }

    // Forward to Azure and stream the response back to Cursor
    const response = await fetch(azureUrl, {
      method: request.method,
      headers: newHeaders,
      body: request.body
    });

    const newResponse = new Response(response.body, response);
    newResponse.headers.set('Access-Control-Allow-Origin', '*');

    return newResponse;
  }
}

Before you deploy, replace the placeholder values:

  • https://YOUR-RESOURCE-NAME.cognitiveservices.azure.com → your actual Azure endpoint
  • YOUR-DEPLOYMENT-NAME → your actual deployment name (e.g., Kimi-K2.5)

Once you've updated those two lines, click the blue Deploy button in the top right corner of the editor. Cloudflare will give you a live Worker URL that looks something like https://azure-proxy.your-username.workers.dev.

Copy that URL — you'll need it in the next step.
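Before wiring up Cursor, you can optionally sanity-check the Worker from Node 18+ (which ships a global fetch). The base URL below is a placeholder for your own Worker address; the small helper just pulls model ids out of the mocked /v1/models response:

```javascript
// Sanity-check the deployed Worker (the URL is a placeholder for yours).
const PROXY_BASE = "https://azure-proxy.your-username.workers.dev";

// Pull the model ids out of an OpenAI-style models-list response
function extractModelIds(modelsResponse) {
  return modelsResponse.data.map((model) => model.id);
}

// The /v1/models route needs no API key; the Worker answers it itself.
async function checkProxy() {
  const res = await fetch(`${PROXY_BASE}/v1/models`);
  return extractModelIds(await res.json());
}

// Run it with, e.g.:
//   checkProxy().then((ids) => console.log("Proxy serves:", ids));
```

If the returned list contains your deployment name, the Worker is live and responding in OpenAI format.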

Step 3: Configure Cursor to Use Your Proxy

Now that your proxy is live, you'll configure Cursor to treat it like a standard OpenAI endpoint. This is the key insight: because your Worker mimics the OpenAI API format perfectly, Cursor will never know it's talking to Azure.

Open Cursor and go to Settings > Models (or press Cmd + , on Mac and navigate to Models).

First, scroll down to the Azure section and make sure it is completely toggled OFF. You don't need it, and leaving it on can interfere.

Next, scroll back up to the OpenAI section and find the Override OpenAI Base URL field. Paste your Cloudflare Worker URL into this box, and add /v1 to the end of it. It should look like this:

https://azure-proxy.your-username.workers.dev/v1

In the OpenAI API Key field directly below it, paste your actual Azure API key. The Worker will grab this key from Cursor's standard Authorization: Bearer header and convert it into Azure's required api-key format automatically and securely.

Step 4: Add and Enable Your Model

Scroll down to the Model Names section in Cursor's settings. If Kimi-K2.5 (or whichever deployment you're using) isn't already listed, click Add model and type the exact deployment name. Make sure it is toggled on.

You can optionally toggle off standard models like gpt-4o to keep your model dropdown clean and uncluttered.

Step 5: Test the Connection

Click Verify in Cursor's model settings to confirm the proxy is responding correctly. Because the Worker includes a /v1/models endpoint that returns a mocked model list, Cursor's connection check should pass without any errors.

After verification, open a new chat (Cmd + L), select your model from the dropdown, and say hello. If everything is working, you'll get a response from your Azure deployment routed seamlessly through Cloudflare.
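If you'd rather test end to end outside Cursor, the same check can be scripted from Node 18+. The proxy URL is a placeholder, and the key you pass is your Azure API key, sent exactly as Cursor sends it, as a standard Bearer token:

```javascript
// End-to-end smoke test of the proxy (placeholder URL; pass your Azure key).
const PROXY_URL = "https://azure-proxy.your-username.workers.dev/v1";

// Build the standard OpenAI-style chat payload that Cursor would send
function buildChatPayload(model, userMessage) {
  return { model, messages: [{ role: "user", content: userMessage }] };
}

async function smokeTest(apiKey) {
  const res = await fetch(`${PROXY_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // The Worker swaps this Bearer token for Azure's api-key header
      "Authorization": `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildChatPayload("Kimi-K2.5", "Say hello in one sentence.")),
  });
  return res.json();
}

// Run it with, e.g.:
//   smokeTest("YOUR-AZURE-API-KEY").then((reply) => console.log(reply));
```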

Optional: Attach a Custom Subdomain

If you'd like a cleaner-looking base URL in your Cursor settings (for example, https://ai.yourdomain.com/v1 instead of the generic Workers URL), you can easily attach a custom subdomain to your Worker inside Cloudflare's dashboard under Workers & Pages > your Worker > Settings > Triggers > Custom Domains.

This is entirely optional but makes the setup look more polished, especially if you're sharing the proxy with a team.

Why This Approach Is Better

This reverse proxy method is superior to fighting Cursor's native Azure settings for several reasons:

  • It's completely free. Cloudflare Workers' free tier handles up to 100,000 requests per day, which is more than enough for personal and small-team use.
  • It's transparent. All of Cursor's features, including Agent mode, Composer, and inline suggestions, work without modification because they all use the same underlying OpenAI-compatible API.
  • It's future-proof. If Azure changes its API version or you want to swap to a different deployment, you update a single line in your Worker and redeploy in seconds.
  • Your API key is never exposed in Cursor's configuration files in Azure's format. It flows as a standard Bearer token through Cloudflare's encrypted infrastructure.

Conclusion

Cursor's Azure integration is frustrating, but it doesn't have to be a dead end. By deploying a free Cloudflare Worker as a reverse proxy, you can connect any Azure OpenAI deployment — including powerful models like Kimi-K2.5 — to Cursor in just a few minutes, with zero ongoing cost and zero ongoing maintenance. Once it's running, it just works, quietly doing its job in the background every time you open Cursor to write code.

If you found this guide helpful, consider bookmarking it and sharing it with other developers who are hitting the same Azure configuration walls. And if you want to explore further, look into adding environment variables to your Cloudflare Worker to store your Azure endpoint and deployment name securely, rather than hardcoding them directly in the script.
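As a starting point for that environment-variable approach, here's a hedged sketch. The variable names AZURE_ENDPOINT, AZURE_DEPLOYMENT, and AZURE_API_VERSION are illustrative choices of my own, set under the Worker's Settings > Variables in the dashboard (or with `wrangler secret put` for values you want encrypted):

```javascript
// Sketch: build the Azure URL from Worker environment variables instead of
// hardcoded constants. The variable names below are illustrative, not required.
function buildAzureUrl(env) {
  return `${env.AZURE_ENDPOINT}/openai/deployments/${env.AZURE_DEPLOYMENT}` +
         `/chat/completions?api-version=${env.AZURE_API_VERSION}`;
}

// Inside the Worker's fetch handler, replace the three hardcoded constants with:
//   const azureUrl = buildAzureUrl(env);
```

With that change, rotating an API version or pointing at a new deployment becomes a dashboard edit rather than a code redeploy.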
