Building a Remote MCP Server for Azure DevOps with Azure Functions

February 16, 2026 · 14 min read
šŸ‘ļø Loading...

Building a secure MCP Server with Azure Functions + EasyAuth

Tell an AI assistant to "check why the build failed" and it'll politely explain that it doesn't have access to your CI/CD system. Fair enough.

The Model Context Protocol (MCP) gives agents a standard way to call external tools — but most of the examples out there assume your tools are either running locally or sitting behind a public API with a simple API key.

That's not how it works in most organisations. Your CI/CD lives behind Entra ID. Your pipelines need per-user or per-service permissions. You can't just hand out a PAT and call it a day.

This post walks through how I built a remote MCP server that exposes Azure DevOps pipeline operations to AI assistants, with authentication handled entirely by Entra ID and EasyAuth. The Function App's Managed Identity authenticates directly to Azure DevOps with least-privilege access across multiple projects. No PATs. No shared service accounts. No client secrets anywhere in the stack.

The goal of this guide is to show you how to:

  • Build your own MCP server secured with EasyAuth
  • Host it on Azure, deployed entirely with IaC
  • Run it with no hardcoded secrets to rotate, following the principle of least privilege

What we're building

Four MCP tools that let an AI assistant interact with Azure DevOps Pipelines:

Tool                     What it does
list_pipeline_runs       List recent builds with statuses, branches, and durations
get_run_failure_logs     Inspect a failed run — task names, error messages, log tails
list_deployments         List Classic Release deployments by environment and status
trigger_pipeline_run     Queue a new pipeline run with optional branch and parameters

All tools accept an optional project parameter — the server supports multiple Azure DevOps projects, validated against a configured allowlist.
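
For context, here's roughly what a tool invocation looks like on the wire once a client has authenticated — an MCP tools/call request over JSON-RPC. The values are illustrative and shown as a Python dict; the Functions MCP extension handles this protocol layer for you:

# Roughly the shape of an MCP "tools/call" request (illustrative values)
tool_call = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_pipeline_runs",
        "arguments": {"project": "Contoso", "status": "completed", "top": 10},
    },
}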

The compute layer is an Azure Function App (Flex Consumption, Python 3.12). Authentication is handled by EasyAuth with Protected Resource Metadata (PRM). Identity is Entra ID with group-based access control. Everything is deployed via Terraform.

Architecture

The architecture is deliberately simple. EasyAuth sits in front of the Function App as the authentication layer. MCP clients discover the auth requirements via the /.well-known/oauth-protected-resource endpoint (exposed by PRM), authenticate the user through Entra ID's standard OAuth 2.0 PKCE flow, and then call the MCP endpoint with a bearer token. EasyAuth validates the token before the request ever reaches your function code.

MCP Client                      EasyAuth                        Entra ID
│                               │                               │
│ GET /.well-known/             │                               │
│ oauth-protected-resource      │                               │
│──────────────────────────────►│                               │
│◄──────────────────────────────│ (auth metadata + scopes)      │
│                               │                               │
│ OAuth 2.0 PKCE flow           │                               │
│──────────────────────────────────────────────────────────────►│
│◄──────────────────────────────────────────────────────────────│
│ (access_token)                │                               │
│                               │                               │
│ POST /runtime/webhooks/mcp    │                               │
│ Authorization: Bearer {token} │                               │
│──────────────────────────────►│                               │
│                               │ Validates token               │
│                               │ (audience, issuer, expiry)    │
│                               │                               │
│                               │──► Function App               │
│                               │    (MCP tool execution)       │
│                               │                               │
│◄──────────────────────────────│ (tool result)                 │
│                               │                               │
ā–¼                               ā–¼                               ā–¼

A few things worth noting about this approach:

EasyAuth + PRM replaces a full OAuth gateway. The WEBSITE_AUTH_PRM_DEFAULT_WITH_SCOPES app setting tells EasyAuth to expose a /.well-known/oauth-protected-resource endpoint. MCP clients that support remote servers (VS Code, for example) discover this automatically and handle the entire PKCE flow themselves. No custom OAuth endpoints, no token caching policies, no session encryption.
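
You can see what clients discover by fetching the metadata document yourself. A quick sketch — the hostname is a placeholder, and the exact fields follow RFC 9728, so expect roughly resource, authorization_servers, and scopes_supported:

import requests

# The PRM document is unauthenticated by design — it's how clients learn where to go.
resp = requests.get(
    "https://<FUNCTION_APP_NAME>.azurewebsites.net/.well-known/oauth-protected-resource"
)
print(resp.json())
# Expect roughly: {"resource": "...", "authorization_servers": ["https://login.microsoftonline.com/..."],
#  "scopes_supported": ["api://<client-id>/access_as_user"], ...}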

unauthenticated_action = "Return401" is critical. The default EasyAuth behaviour redirects unauthenticated requests to a login page — which breaks MCP clients expecting a 401 response to trigger their OAuth flow. Setting this to Return401 ensures the correct HTTP semantics.
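
A quick way to verify the behaviour (hostname is a placeholder): an unauthenticated POST should come back as a bare 401 rather than a redirect.

import requests

resp = requests.post(
    "https://<FUNCTION_APP_NAME>.azurewebsites.net/runtime/webhooks/mcp",
    allow_redirects=False,
)
print(resp.status_code)                      # expect 401, not 302
print(resp.headers.get("WWW-Authenticate"))  # should reference the protected resource metadata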

The Managed Identity authenticates directly to Azure DevOps. The Function App's User-Assigned MI is registered as a Stakeholder in the ADO organisation and added to each project's Contributors group. The MI calls the ADO REST API under its own identity with scoped permissions — no token exchange, no token juggling.

The extension bundle challenge

The Azure Functions MCP extension1 is what makes a Function App "speak MCP": it handles the JSON-RPC protocol, tool discovery, and SSE transport. But the current experimental extension bundle has a constraint that shaped the authentication approach: the extension doesn't pass HTTP headers to your function code.

When your function receives a tool invocation, the context object contains name, arguments, and _meta. That's it. No Authorization header. No cookies. No ambient user identity.
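
Concretely, the parsed context looks something like this (illustrative values):

# Illustrative shape of a parsed tool-invocation context — note there's no
# Authorization header, no cookies, and no user identity anywhere in here.
ctx = {
    "name": "list_pipeline_runs",
    "arguments": {"project": "Contoso", "top": 10},
    "_meta": {},
}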

The MCP extension v1.1.0 (released November 2025) added a transport field to the context that includes all HTTP request headers. However, the experimental extension bundle v4.5.0 (released September 2025) predates this — so the bundle doesn't include the newer extension version. Until the bundle catches up, the user's bearer token is invisible to function code.

I tried installing the MCP extension explicitly (bypassing the bundle) by creating an extensions.csproj with Microsoft.Azure.Functions.Extensions.Mcp v1.1.0. It built locally, but Flex Consumption's remote build (Kudu/Oryx) strips the bin/ directory during deployment. Deploying with --no-build preserved the extensions but skipped pip install, resulting in zero functions loaded. A catch-22.

The workaround: the MI authenticates directly to Azure DevOps. EasyAuth still gates access to the Function App — only users with valid Entra ID tokens for this specific app registration can reach it — but the downstream ADO calls use the MI's own credentials rather than a user-delegated token.

The Function App

Each MCP tool is an Azure Function using the mcpToolTrigger binding from the experimental extension bundle (the full trigger definition is shown below).

The MCP server is stateless by design: each MCP tool invocation is a self-contained function execution with no server-side session state.

EasyAuth validates the bearer token per-request — no session cookies, no server-side token cache. The MCP client holds the token and sends it with every request.

The MI acquires an ADO token per-request — DefaultAzureCredential handles token caching internally (within the process lifetime), but there's nothing persisted between function invocations.

No database, no cache, no shared state — each function call gets a context with name, arguments, and _meta, does its ADO API call, and returns the result. Nothing is written anywhere.

This is a big win compared to the APIM approach I've used in the past. That architecture is stateful — it stores client registrations, encrypted session keys, and Entra tokens in APIM's internal cache. When the cache cleared (which happens on every policy update or restart on the Developer SKU), everything broke: login loops, "Client Not Found" errors, dead sessions.

import json
import os

import azure.functions as func

app = func.FunctionApp()


@app.generic_trigger(
    arg_name="context",
    type="mcpToolTrigger",
    toolName="list_pipeline_runs",
    description="List recent pipeline runs. Returns build IDs, statuses, branches, and durations.",
    toolProperties=json.dumps([
        {
            "propertyName": "project",
            "propertyType": "string",
            "description": "Azure DevOps project name. Defaults to the configured project."
        },
        {
            "propertyName": "pipeline_id",
            "propertyType": "integer",
            "description": "Filter to a specific pipeline ID."
        },
        {
            "propertyName": "status",
            "propertyType": "string",
            "description": "Filter by status: completed, inProgress, cancelling, notStarted."
        },
        {
            "propertyName": "top",
            "propertyType": "integer",
            "description": "Number of results to return (default 20, max 50)."
        },
    ]),
)
async def list_pipeline_runs(context: str) -> str:
    ctx = json.loads(context)
    args = ctx.get("arguments", {})
    project = _resolve_project(args)
    bearer_token = _get_devops_token(os.environ.get("AZURE_MI_CLIENT_ID"))
    client = get_devops_client()
    # ... validate inputs, call Azure DevOps REST API ...

The project parameter lets the AI assistant target different Azure DevOps projects — validated against a configured allowlist so it can't wander into projects the MI doesn't have access to.
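
The allowlist check itself can be very small. A minimal sketch, assuming the AZURE_DEVOPS_PROJECT / AZURE_DEVOPS_PROJECTS settings from the Terraform section below — the real implementation may differ in the details:

import os

def _resolve_project(args: dict) -> str:
    # Comma-separated allowlist configured as an app setting
    allowed = [p.strip() for p in os.environ.get("AZURE_DEVOPS_PROJECTS", "").split(",") if p.strip()]
    # Fall back to the default project when the assistant doesn't pass one
    requested = (args.get("project") or os.environ.get("AZURE_DEVOPS_PROJECT", "")).strip()
    if requested not in allowed:
        raise ValueError(f"Project '{requested}' is not in the configured allowlist")
    return requested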

The host.json opts into the experimental extension bundle and configures the MCP server metadata:

{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Experimental",
    "version": "[4.*, 5.0.0)"
  },
  "extensions": {
    "mcp": {
      "serverName": "azure-devops-pipelines-mcp",
      "serverVersion": "1.0.0",
      "system": {
        "webhookAuthorizationLevel": "Anonymous"
      }
    }
  }
}

The webhookAuthorizationLevel is set to Anonymous because EasyAuth handles authentication at the platform level — the function-level auth key isn't needed when every request has already been validated by EasyAuth.

The token acquisition has a two-tier fallback:

from azure.identity import AzureCliCredential, DefaultAzureCredential

_DEVOPS_SCOPE = "499b84ac-1321-427f-aa17-267ca6975798/.default"

def _get_devops_token(mi_client_id: str | None) -> str:
    if mi_client_id:
        # Azure: MI authenticates directly to ADO
        cred = DefaultAzureCredential(managed_identity_client_id=mi_client_id)
        return cred.get_token(_DEVOPS_SCOPE).token

    # Local dev: use az login session
    return AzureCliCredential().get_token(_DEVOPS_SCOPE).token

On Azure, AZURE_MI_CLIENT_ID is set and the MI authenticates directly to ADO. Locally, it falls back to AzureCliCredential from your az login session — zero infrastructure dependencies.

That magic string 499b84ac-1321-427f-aa17-267ca6975798 is the Azure DevOps first-party application ID. Every Azure DevOps token request uses it as the resource scope.

Input validation

All tool inputs are validated before any ADO API calls are made. Integer parameters (pipeline_id, build_id, top) are type-checked. top is range-enforced between 1 and 50. Status filters are validated against known ADO enum values. The project parameter is checked against the configured allowlist. Branch names are length-limited, and pipeline parameters are size-capped at 10KB.
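
As a sketch of what those checks look like in practice (helper names are illustrative, not necessarily the ones in the repo):

_VALID_STATUSES = {"completed", "inProgress", "cancelling", "notStarted"}

def _validate_top(value, default: int = 20, maximum: int = 50) -> int:
    # Missing value falls back to the default; anything else must be a real integer
    if value is None:
        return default
    if not isinstance(value, int) or isinstance(value, bool):
        raise ValueError("'top' must be an integer")
    return max(1, min(value, maximum))

def _validate_status(value):
    if value is not None and value not in _VALID_STATUSES:
        raise ValueError(f"'status' must be one of: {', '.join(sorted(_VALID_STATUSES))}")
    return value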

Error messages are sanitised to strip URLs that might contain tokens — if an ADO API returns an error containing a URL with auth parameters, it's redacted before being sent back to the MCP client.

import re

def _sanitise_error_message(message: str) -> str:
    return re.sub(r"https?://\S+", "[URL redacted]", message)

The Azure DevOps REST client is thin by design — retry logic for 429 rate limits on all request types (including log fetches), and that's about it:

import time

import requests

class AzureDevOpsClient:
    def _request_with_retry(self, method, url, *, params, json_body, bearer_token):
        # Bearer token comes from the MI (or az login locally)
        headers = {"Authorization": f"Bearer {bearer_token}"}
        for attempt in range(1, self._retry_attempts + 1):
            resp = requests.request(method, url, headers=headers, params=params, json=json_body)
            if resp.status_code == 429 and attempt < self._retry_attempts:
                # Honour Retry-After if ADO sends it, otherwise fall back to a fixed delay
                retry_after = float(resp.headers.get("Retry-After", self._retry_delay))
                time.sleep(retry_after)
                continue
            resp.raise_for_status()
            return resp.json()

Infrastructure as Code

The entire stack is Terraform. No portal clicking, no post-deployment scripts, no "just run this one az command manually."

Entra ID

The app registration is the identity anchor. It defines the OAuth scopes, redirect URIs, required permissions, and pre-authorised clients:

resource "azuread_application" "mcp_server" {
  display_name     = "${var.project_name}-${random_id.suffix.hex}"
  sign_in_audience = "AzureADMyOrg"

  api {
    requested_access_token_version = 2

    oauth2_permission_scope {
      id      = random_uuid.scope_id.result
      value   = "access_as_user"
      type    = "User"
      enabled = true
    }
  }
}

VS Code and Azure CLI are pre-authorised as clients — Entra ID doesn't support Dynamic Client Registration, so MCP clients must be explicitly configured:

resource "azuread_application_pre_authorized" "vscode" {
  application_id       = azuread_application.mcp_server.id
  authorized_client_id = "aebc6443-996d-45c2-90f0-388ff96faa56" # VS Code
  permission_ids       = [random_uuid.scope_id.result]
}

Group-based access control

The service principal has app_role_assignment_required = true, meaning only explicitly assigned users or groups can obtain tokens. A security group gates access — if you're not in the group, Entra ID blocks you at token issuance with AADSTS50105:

resource "azuread_group" "mcp_users" {
  display_name     = "${var.project_name}-users-${random_id.suffix.hex}"
  security_enabled = true
  description      = "Users allowed to access the MCP server"
}

resource "azuread_group_member" "mcp_users" {
  for_each         = toset(var.admin_user_object_ids)
  group_object_id  = azuread_group.mcp_users.object_id
  member_object_id = each.value
}

resource "azuread_app_role_assignment" "mcp_users_group" {
  app_role_id         = "00000000-0000-0000-0000-000000000000" # Default Access
  principal_object_id = azuread_group.mcp_users.object_id
  resource_object_id  = azuread_service_principal.mcp_server.object_id
}

Users are managed via the admin_user_object_ids Terraform variable — add or remove an object ID, run terraform apply, and access is updated. One thing to note: removing a user from the group doesn't invalidate existing access tokens. JWTs are valid until their exp claim. The user needs to sign out from VS Code's Accounts menu to force a new token acquisition, at which point Entra ID will reject the request.

Managed Identity in Azure DevOps

The MI needs its own identity within Azure DevOps. Terraform handles both the organisation-level entitlement and per-project group membership:

resource "azuredevops_service_principal_entitlement" "mcp_mi" {
  origin               = "aad"
  origin_id            = azurerm_user_assigned_identity.mcp.principal_id
  account_license_type = "stakeholder"
}

data "azuredevops_project" "target" {
  for_each = toset(var.azure_devops_projects)
  name     = each.value
}

data "azuredevops_group" "contributors" {
  for_each   = toset(var.azure_devops_projects)
  project_id = data.azuredevops_project.target[each.key].id
  name       = "Contributors"
}

resource "azuredevops_group_membership" "mcp_mi_contributors" {
  for_each = toset(var.azure_devops_projects)
  group    = data.azuredevops_group.contributors[each.key].descriptor
  mode     = "add"
  members  = [azuredevops_service_principal_entitlement.mcp_mi.descriptor]
}

The for_each iterates over a list of project names — add a new project to azure_devops_projects in your terraform.tfvars, run terraform apply, and the MI gets Contributors access to it automatically. The mode = "add" ensures the MI is added to the Contributors group without overwriting the group's existing membership. Stakeholder is the cheapest licence tier (free), and Contributors grants access to pipelines, builds, repos, and boards — enough for the MCP tools to list runs, fetch logs, and trigger builds.

One gotcha: the azuredevops_service_principal_entitlement resource requires Project Collection Administrator permissions on the ADO organisation. If you haven't granted yourself that role, the terraform apply will fail with a 401.

Function App

Flex Consumption (FC1 SKU) with MI-authenticated storage — no connection strings anywhere:

resource "azurerm_function_app_flex_consumption" "mcp" {
  name            = "${var.project_name}-func-${random_id.suffix.hex}"
  service_plan_id = azurerm_service_plan.mcp.id

  storage_container_type            = "blobContainer"
  storage_authentication_type       = "UserAssignedIdentity"
  storage_user_assigned_identity_id = azurerm_user_assigned_identity.mcp.id

  runtime_name    = "python"
  runtime_version = "3.12"

  app_settings = {
    "AZURE_DEVOPS_ORG"      = var.azure_devops_org
    "AZURE_DEVOPS_PROJECT"  = var.azure_devops_projects[0]
    "AZURE_DEVOPS_PROJECTS" = join(",", var.azure_devops_projects)
    "AZURE_MI_CLIENT_ID"    = azurerm_user_assigned_identity.mcp.client_id

    "WEBSITE_AUTH_PRM_DEFAULT_WITH_SCOPES" = "api://${azuread_application.mcp_server.client_id}/${var.oauth2_scope_name}"
  }

  auth_settings_v2 {
    auth_enabled           = true
    require_authentication = true
    unauthenticated_action = "Return401"

    active_directory_v2 {
      client_id            = azuread_application.mcp_server.client_id
      tenant_auth_endpoint = "https://login.microsoftonline.com/${data.azuread_client_config.current.tenant_id}/v2.0"
      allowed_audiences = [
        "api://${azuread_application.mcp_server.client_id}",
        azuread_application.mcp_server.client_id,
      ]
    }

    login {
      token_store_enabled = true
    }
  }
}

The WEBSITE_AUTH_PRM_DEFAULT_WITH_SCOPES setting is what exposes the /.well-known/oauth-protected-resource endpoint. MCP clients discover this, learn which scopes to request, and handle the full PKCE flow without any custom OAuth endpoints on our side.

AZURE_DEVOPS_PROJECTS is a comma-separated allowlist — the function code validates that any project parameter from the AI assistant matches this list before making ADO API calls.

Local development

For local dev, the MI path is bypassed entirely. When AZURE_MI_CLIENT_ID isn't set (which it won't be on your laptop), the function falls back to AzureCliCredential:

# Login to Azure with DevOps permissions
az login

# Start the Function App locally
func start

The MCP extension starts an endpoint on localhost. Point your MCP client at it and you're calling Azure DevOps as yourself, with zero infrastructure dependencies.

Deployment

# Deploy infrastructure
cd infra
terraform init && terraform apply

# Deploy function code
func azure functionapp publish $(terraform -chdir=infra output -raw function_app_name)

Then add the server to VS Code's MCP settings:

{
  "servers": {
    "azure-devops-pipelines": {
      "type": "http",
      "url": "https://<FUNCTION_APP_NAME>.azurewebsites.net/runtime/webhooks/mcp"
    }
  }
}

The URL must end with /runtime/webhooks/mcp — pointing at the root URL will return the default "App is running" landing page and the MCP client will hang waiting for an initialize response that never comes.

On first use, VS Code opens a browser window for Entra ID sign-in. After that, the token is cached and re-authentication is handled transparently.

Trade-offs and honest simplifications

A few things I'd change or note before running this in production at scale:

  • Managed Identity acts as itself, not as the user. Every ADO API call runs under the MI's identity and permissions, not the calling user's. This is fine for the tools we've built but means you lose per-user audit trails in Azure DevOps. When the extension bundle ships transport headers and the user's bearer token becomes available to function code, On-Behalf-Of token exchange could restore user-level identity.
  • Extension bundle is experimental. The Microsoft.Azure.Functions.ExtensionBundle.Experimental bundle contains the MCP trigger bindings. It works, but the API surface may change before GA.
  • Token revocation isn't instant. Removing a user from the Entra ID group blocks future token issuance, but existing JWTs remain valid until expiry. For immediate revocation, you'd need to revoke the user's sessions via the Graph API (sketched after this list) and have them sign out of their MCP client.
  • Flex Consumption and explicit extensions don't mix. If you need a newer version of the MCP extension than what the bundle provides, you're stuck until the bundle updates. The remote build process strips custom bin/ directories, and --no-build skips Python dependency installation.
  • No per-client rate limiting. Azure Functions Flex Consumption has instance-level scaling limits but no per-user or per-client throttling. A misbehaving MCP client could exhaust your ADO API rate limits.
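
For the revocation point above, a hedged sketch of what that Graph call looks like — it requires an admin-consented permission such as User.RevokeSessions.All, and the object ID is a placeholder:

import requests
from azure.identity import AzureCliCredential

# Invalidates the user's refresh tokens and session cookies; already-issued access
# tokens still live until their exp claim, which is why the client must re-authenticate.
token = AzureCliCredential().get_token("https://graph.microsoft.com/.default").token
user_object_id = "<USER_OBJECT_ID>"
resp = requests.post(
    f"https://graph.microsoft.com/v1.0/users/{user_object_id}/revokeSignInSessions",
    headers={"Authorization": f"Bearer {token}"},
)
resp.raise_for_status()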

Wrapping up

EasyAuth with Protected Resource Metadata handles the entire OAuth flow that MCP clients need — no custom OAuth endpoints, no token caching layer, no session encryption. The Function App stays focused on business logic. Entra ID handles identity and access control. Managed Identity eliminates secrets. Terraform makes it reproducible.

Multi-project support means one server can serve tools across multiple Azure DevOps projects — add a project name to your Terraform variables and the MI gets the right permissions automatically. Input validation ensures the AI assistant can't send garbage to your ADO APIs, and error messages are sanitised to prevent token leakage.

The entire codebase — infrastructure and function code — is available in the repository.


Footnotes

  1. The Azure Functions MCP extension (mcpToolTrigger) is part of the experimental extension bundle. It adds MCP protocol support — JSON-RPC, tool discovery, SSE transport — as a Functions binding type. ↩
