Keboola dropped SSE support on April 1st. Atlassian's deadline is June 30th. If you've been running MCP servers with HTTP+SSE transport and haven't looked at your migration path yet, here's what you need to know and exactly how to make the switch.
This isn't a protocol rewrite. Your tool logic doesn't change. But the transport layer does, and getting it wrong means broken integrations in production.
What SSE Actually Does (and Why It's a Problem)
SSE transport was MCP's original solution for remote servers. It works, but it requires two separate HTTP endpoints. The client establishes a persistent GET connection to receive a server-sent event stream, and separately POSTs messages to the server. Your Express app ends up with something like this:
```typescript
// The old SSE pattern: two routes for one logical connection
app.get('/sse', async (req, res) => {
  const transport = new SSEServerTransport('/messages', res);
  await server.connect(transport);
});

app.post('/messages', async (req, res) => {
  // Look up the transport created in the GET handler
  // (session tracking elided here for brevity)
  await transport.handlePostMessage(req, res);
});
```

This split creates real operational problems:
Load balancer confusion. Most load balancers treat GET and POST requests independently. Your SSE stream might land on server A while subsequent POSTs go to server B. Sticky sessions help but aren't reliable across all infrastructure.
Proxy timeouts. Long-lived GET connections get killed by proxies, API gateways, and CDNs with connection timeout settings. Nginx's default keepalive timeout is 75 seconds. AWS ALB closes idle connections after 60 seconds by default. Your SSE stream silently drops.
CORS headaches. Two different routes need consistent CORS headers. Get this wrong and the browser blocks the connection or the POST, which shows up as a cryptic error on the client side.
No standard session management. SSE ties session state to the connection. If the connection drops, you've lost the session context. Reconnection requires the full initialization handshake again.
Streamable HTTP fixes all of this with one endpoint.
How Streamable HTTP Works
The core change is that client and server communicate over a single endpoint. The client sends a POST request; the server decides whether the response needs streaming and answers with either a standard JSON body (for simple responses) or an SSE stream (for multi-event or long-running operations).
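That dispatch can be pictured as a tiny function. This is an illustrative sketch only -- `pickResponseMode` is our name, not SDK API; the SDK's transport encapsulates this logic:

```typescript
// Illustrative sketch of the transport's dispatch, not the SDK's actual code.
// A response streams only when the handler produces multiple events AND the
// client advertised SSE support in its Accept header.
function pickResponseMode(
  acceptHeader: string,
  needsStreaming: boolean
): 'application/json' | 'text/event-stream' {
  if (needsStreaming && acceptHeader.includes('text/event-stream')) {
    return 'text/event-stream'; // open an SSE stream on the same POST
  }
  return 'application/json'; // plain single-body JSON response
}
```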
The new session management works through headers. On initialization, the server includes Mcp-Session-Id in the response. The client attaches it to every subsequent request. If the session expires or the server restarts, the server returns 404 and the client reinitializes cleanly.
This works naturally with every piece of infrastructure that handles standard HTTP. No sticky sessions required. Proxies don't need special configuration. CORS is one route.
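The client-side bookkeeping this implies is small. A sketch of the rules -- the function and type names are ours, and `StreamableHTTPClientTransport` implements all of this for you:

```typescript
// What a client does with the Mcp-Session-Id header and a 404 --
// illustrative; the SDK transport handles this internally.
type SessionAction =
  | { kind: 'store'; id: string } // first response: remember the session id
  | { kind: 'reinitialize' }      // session gone: redo the init handshake
  | { kind: 'proceed' };          // normal case: keep using the stored id

function nextSessionAction(
  status: number,
  sessionIdHeader: string | null,
  haveSession: boolean
): SessionAction {
  if (status === 404 && haveSession) return { kind: 'reinitialize' };
  if (!haveSession && sessionIdHeader) return { kind: 'store', id: sessionIdHeader };
  return { kind: 'proceed' };
}
```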
Migration: The Phased Approach
Don't cut over all at once. The safest migration runs both transports simultaneously, moves clients over, then retires the old routes.
Step 1: Add the Streamable HTTP endpoint alongside your existing SSE routes
Install the latest MCP TypeScript SDK if you haven't:
```shell
npm install @modelcontextprotocol/sdk@latest
```

Then add the new transport while keeping the old one:
```typescript
import express from 'express';
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import { SSEServerTransport } from '@modelcontextprotocol/sdk/server/sse.js';
import { randomUUID } from 'crypto';

const app = express();
app.use(express.json());

function createServer() {
  const server = new McpServer({ name: 'my-tools', version: '1.0.0' });
  // Register your tools here, same as before
  return server;
}

// --- NEW: Streamable HTTP endpoint ---
const sessions = new Map<string, StreamableHTTPServerTransport>();

app.post('/mcp', async (req, res) => {
  const sessionId = req.headers['mcp-session-id'] as string | undefined;
  let transport: StreamableHTTPServerTransport;

  if (sessionId && sessions.has(sessionId)) {
    // Existing session: reuse its transport
    transport = sessions.get(sessionId)!;
  } else {
    // New session
    transport = new StreamableHTTPServerTransport({
      sessionIdGenerator: () => randomUUID(),
      onsessioninitialized: (id) => {
        sessions.set(id, transport);
      },
    });
    transport.onclose = () => {
      if (transport.sessionId) sessions.delete(transport.sessionId);
    };
    const server = createServer();
    await server.connect(transport);
  }

  await transport.handleRequest(req, res, req.body);
});

// --- LEGACY: Keep SSE routes for existing clients ---
const sseTransports = new Map<string, SSEServerTransport>();

app.get('/sse', async (req, res) => {
  const transport = new SSEServerTransport('/messages', res);
  sseTransports.set(transport.sessionId, transport);
  transport.onclose = () => sseTransports.delete(transport.sessionId);
  const server = createServer();
  await server.connect(transport);
});

app.post('/messages', async (req, res) => {
  const sessionId = req.query.sessionId as string;
  const transport = sseTransports.get(sessionId);
  if (!transport) {
    res.status(404).json({ error: 'Session not found' });
    return;
  }
  // Pass the already-parsed body, since express.json() consumed the stream
  await transport.handlePostMessage(req, res, req.body);
});

app.listen(3000);
```

Now you have both transports running. New clients connect to /mcp. Existing clients continue using /sse and /messages.
Step 2: Update your clients
On the client side, swap SSEClientTransport for StreamableHTTPClientTransport:
```typescript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

const client = new Client({ name: 'my-agent', version: '1.0.0' });
const transport = new StreamableHTTPClientTransport(
  new URL('http://localhost:3000/mcp')
);
await client.connect(transport);

// From here, tool calls work exactly as before
const tools = await client.listTools();
```

The client API is identical. listTools(), callTool(), listResources() -- all the same. Only the transport constructor changes.
Step 3: Test both paths
Run your full test suite against both endpoints. Focus especially on:
- Tools that return large responses (verify they stream correctly)
- Long-running tool operations (session persistence over time)
- Error conditions (what happens when the tool throws)
- Concurrent calls (session isolation)
The MCP Inspector is useful here. Point it at http://localhost:3000/mcp and exercise your tools manually before running automated tests.
Step 4: Update infrastructure configuration
If you're running behind a load balancer or reverse proxy, remove any sticky session configuration you added for SSE. With Streamable HTTP you don't need it -- the session ID is in the header and any server instance can handle any request.
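For example, an nginx upstream that needed `ip_hash` (or a cookie-based sticky module) for SSE can go back to plain round-robin. A sketch, assuming a hypothetical two-instance upstream:

```nginx
upstream mcp_backend {
    # ip_hash;  <- no longer needed: any instance can serve any request
    server 10.0.0.1:3000;
    server 10.0.0.2:3000;
}
```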
Update CORS config to the single endpoint:
```typescript
import cors from 'cors';

app.use('/mcp', cors({
  origin: process.env.ALLOWED_ORIGINS?.split(',') ?? '*',
  exposedHeaders: ['Mcp-Session-Id'],
}));
```

The Mcp-Session-Id response header needs to be in exposedHeaders so browser-based clients can read it.
Step 5: Retire the SSE routes
Once you've confirmed all clients are on Streamable HTTP, remove the SSE routes. Set a sunset date, communicate it to any consumers of your server, and then delete the code.
What Changes at the Protocol Level
For most teams, the migration is purely operational. But it's worth understanding what's different at the protocol level so you can debug issues when they come up.
Response format. A simple tool call returns Content-Type: application/json with a single JSON body, same as a REST API. A streaming response (or a request sent with Accept: text/event-stream) returns Content-Type: text/event-stream. Your tool handlers don't need to know which format is being used -- the transport handles that automatically.
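For debugging, it helps to know what the streaming framing looks like on the wire: each JSON-RPC message arrives on a `data:` line, and a blank line ends the frame. A minimal parser sketch (`parseSseFrame` is our own helper, not SDK API -- the SDK client does this for you):

```typescript
// Parse the data lines of one SSE frame into JSON-RPC messages.
// Illustrative only -- the SDK's transport handles this internally.
function parseSseFrame(frame: string): unknown[] {
  return frame
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => JSON.parse(line.slice('data: '.length)));
}
```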
Session lifecycle. With SSE, the session lived as long as the GET connection. With Streamable HTTP, sessions are decoupled from individual connections. A session persists until the client sends a DELETE request to the /mcp endpoint with the session ID, or until the server evicts it. This means you can reconnect to an existing session after a client restart without losing context.
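If you manage the session map yourself (as in Step 1), eviction is your job. A sketch of an idle-timeout store -- the class and TTL policy are ours, not part of the SDK:

```typescript
// Session store with idle-timeout eviction. A missing or expired id
// is what makes the server answer 404 and the client reinitialize.
class SessionStore<T> {
  private entries = new Map<string, { value: T; lastSeen: number }>();

  constructor(private ttlMs: number) {}

  set(id: string, value: T): void {
    this.entries.set(id, { value, lastSeen: Date.now() });
  }

  // Returns undefined for unknown or idle-expired sessions.
  get(id: string): T | undefined {
    const entry = this.entries.get(id);
    if (!entry) return undefined;
    if (Date.now() - entry.lastSeen > this.ttlMs) {
      this.entries.delete(id);
      return undefined;
    }
    entry.lastSeen = Date.now(); // any activity keeps the session alive
    return entry.value;
  }

  // Explicit termination, e.g. on a client DELETE request.
  delete(id: string): boolean {
    return this.entries.delete(id);
  }
}
```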
Error handling. SSE connection failures were often silent -- the client just stopped receiving events. Streamable HTTP errors come back as HTTP status codes, which are easier to catch, log, and alert on.
The Chanl MCP Runtime
If you're deploying AI agents for customer experience, the MCP runtime in Chanl handles the transport layer for you. Streamable HTTP is the default, with session management, auth middleware, and audit logging built in. You register your tools once and connect them to any agent deployment -- voice, chat, or messaging -- without managing transport configuration separately.
This matters especially when you're connecting tools to production agents: the transport security and session hygiene that Streamable HTTP enables are prerequisites for the monitoring you need to debug tool failures after they happen.
After the Migration
Moving from SSE to Streamable HTTP is one part of the broader MCP 2026 upgrade path. The other changes coming in the spec -- formal authorization scopes, audit trail APIs, and the governance structure under the Agentic AI Foundation -- all build on a stable transport layer.
The fragmentation problem that drove MCP adoption in the first place is solved. The protocol is now stable enough to build production infrastructure on. The migration from SSE to Streamable HTTP is the last significant compatibility break -- get it done before the platform deadlines force you to do it under pressure.
If you want to go deeper on the architecture, we walked through how MCP, A2A, and WebMCP form a three-layer protocol stack. The transport layer covered here is the foundation everything else builds on.
Co-founder
Building the platform for AI agents at Chanl — tools, testing, and observability for customer experience.