Host Your MCP Server on Azure Functions: One Config File and You're Live
The Challenge
The Model Context Protocol has gone from "interesting spec" to "everyone's building with it" remarkably quickly. MCP gives AI agents a standardised way to discover and call external tools — browse a database, query an API, interact with a file system — without hardcoding every integration. It's model-agnostic, it's open, and it works.
But there's a gap between building an MCP server locally and running one in production. Locally, you spin up a Node.js process and point your agent at localhost. In production, you need scaling, monitoring, authentication, and cost management. Most teams solve this by containerising the server and deploying to something like Azure Container Apps or AKS. That works, but it comes with operational overhead that doesn't always match the simplicity of the MCP server itself.
Azure Functions now supports hosting Node.js MCP servers built with the official Anthropic SDK. The setup is genuinely minimal — a single host.json file — and you get serverless scaling, pay-per-use pricing, and built-in monitoring with Application Insights. If you've been looking for the fastest path from MCP prototype to production, this is it.
What's Changed
Azure Functions has added support for MCP servers as custom handlers. In practice, this means you can take your existing Node.js MCP server — the one using Express, Fastify, or any HTTP framework with the @modelcontextprotocol/sdk package — and deploy it to Functions with almost no code changes.
The entire setup is one file. Create a host.json at your project root:
{
  "version": "2.0",
  "configurationProfile": "mcp-custom-handler",
  "customHandler": {
    "description": {
      "defaultExecutablePath": "node",
      "arguments": ["lib/server.js"]
    },
    "http": {
      "DefaultAuthorizationLevel": "anonymous"
    },
    "port": "3000"
  }
}
Point the arguments array at your compiled entry point. That's the configuration. No function bindings, no trigger definitions, no Azure-specific SDK dependencies in your application code. The Functions runtime treats your MCP server as a custom handler and routes HTTP requests to it directly.
This is a different approach from the existing Azure Functions MCP bindings, which require you to define tools using Azure Functions-specific decorators. The custom handler approach lets you keep your standard MCP SDK code untouched — the same code runs locally and in production.
A practical note on protocol support: this currently works with Streamable HTTP (the StreamableHTTPServerTransport from the MCP SDK). The legacy SSE transport isn't supported because it requires persistent stateful connections, which don't fit the serverless model. For most production use cases, Streamable HTTP is the better choice anyway: it's more scalable and each request is self-contained.
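To make the stateless point concrete: over Streamable HTTP, each client call arrives as a self-contained JSON-RPC message in a single POST, which is exactly what a serverless runtime wants. For illustration, an initialize request body looks roughly like this (the values, including the protocolVersion, are examples rather than taken from the article):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "1.0.0" }
  }
}
```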
Getting Started
The fastest way to try this is with the minimal starter template. Clone it, run npm install, and you have a working MCP server ready for Functions deployment.
For a more complete example, Microsoft's burger ordering system demonstrates a full AI agent that uses an MCP server deployed on Functions. It includes nine tools — browsing menus, placing orders, viewing history — wired up through the standard MCP SDK.
Here's the core pattern for an MCP server using Express:
import express from 'express';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import { getMcpServer } from './mcp.js';

const app = express();
app.use(express.json());

app.all('/mcp', async (req, res) => {
  // Stateless mode: a fresh transport and server per request,
  // so no session state survives between invocations.
  const transport = new StreamableHTTPServerTransport({
    sessionIdGenerator: undefined,
  });
  const server = getMcpServer();

  // Register cleanup before handling the request, so resources are
  // released even if the client disconnects mid-response.
  res.on('close', async () => {
    await transport.close();
    await server.close();
  });

  await server.connect(transport);
  await transport.handleRequest(req, res, req.body);
});

const PORT = process.env.PORT || 3000;
app.listen(PORT);
To deploy, use the Azure Developer CLI:
azd auth login
azd up
That provisions the Functions app, configures Application Insights, and deploys your code. The CLI outputs the production URL for your MCP endpoint.
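azd up also expects an azure.yaml describing the service at the project root. A minimal sketch might look like the following; the service name and layout are assumptions, so check the starter template for the real file:

```yaml
# Sketch of an azure.yaml for azd (names illustrative).
name: my-mcp-server
services:
  api:
    project: .
    language: js
    host: function
```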
Once deployed, point any MCP-compatible client at it. For GitHub Copilot, add this to .vscode/mcp.json:
{
  "servers": {
    "my-mcp-server": {
      "type": "http",
      "url": "https://your-function-app.azurewebsites.net/mcp"
    }
  }
}
Testing locally? The MCP Inspector is the fastest debugging tool — set the transport to Streamable HTTP, connect to http://localhost:3000/mcp, and browse your tools.
On cost: Azure Functions Flex Consumption includes a monthly free grant of 1 million requests and 400,000 GB-seconds of execution time. For most MCP servers handling agent requests, development and testing will likely stay within the free tier, with production costs scaling in proportion to actual usage.
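To get a feel for those numbers, here's a back-of-the-envelope check. It's a sketch: the 512 MB instance size and 200 ms average duration are assumptions, and real Flex Consumption billing has more dimensions (such as always-ready instances).

```typescript
// Free grant figures from the article; usage assumptions are illustrative.
const FREE_REQUESTS = 1_000_000;
const FREE_GB_SECONDS = 400_000;

// GB-seconds consumed: requests x memory (in GB) x average duration (in seconds).
function gbSecondsUsed(requests: number, memoryMb: number, avgSeconds: number): number {
  return requests * (memoryMb / 1024) * avgSeconds;
}

function withinFreeGrant(requests: number, memoryMb: number, avgSeconds: number): boolean {
  return (
    requests <= FREE_REQUESTS &&
    gbSecondsUsed(requests, memoryMb, avgSeconds) <= FREE_GB_SECONDS
  );
}

// 500k tool calls at 512 MB and 200 ms each is 50,000 GB-s,
// comfortably inside the monthly grant.
console.log(withinFreeGrant(500_000, 512, 0.2));
```

Short, bursty tool calls are the best case here: execution time, not request count, is usually the first limit you'd approach.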
What This Means
MCP is becoming the standard interface between AI agents and the tools they use. But standards only win when deployment is easy. Azure Functions removing the infrastructure friction from MCP hosting is a meaningful step — it means a single developer can go from a working MCP server to a production endpoint in minutes, not days.
For enterprise teams, this opens up a practical architecture: wrap your existing internal APIs as MCP tools, deploy them on Functions, and let any MCP-compatible agent — GitHub Copilot, Claude, custom LangChain agents — discover and use them through a single protocol. The function-per-tool pattern fits naturally with serverless, and the cost model works for the bursty, event-driven nature of agent tool calls.
The limitation to watch is the stateless requirement. If your MCP server needs session state across requests, you'll need a different hosting option. But for the vast majority of tool-calling use cases — API wrappers, database queries, document retrieval — stateless is exactly right.
Leon Godwin, Principal Cloud Evangelist at Cloud Direct