
Everyone Shows What MCP Does — But Nobody Tells You What It Abstracts
Last Updated on April 15, 2025 by Editorial Team
Author(s): Sanjay Krishna Anbalagan
Originally published on Towards AI.
Let’s Build This from Scratch — Developer Style
You’ve probably seen this diagram in blog posts:

But here’s the problem: Everyone shows you how to use MCP.
Nobody explains what it abstracts, or why it suddenly matters now. So let’s stop theorizing and start coding.
1. Traditional Web UI: The Button Era
Let’s say we’re building a web UI with two buttons:
<button id="btn-create-user">Create User</button>
<button id="btn-get-revenue">Get Revenue</button>
Here’s the JavaScript behind it:
document.getElementById("btn-create-user").addEventListener("click", () => {
  fetch("/api/createUser")
    .then(res => res.json())
    .then(data => renderUserCard(data));
});

document.getElementById("btn-get-revenue").addEventListener("click", () => {
  fetch("/api/getRevenue")
    .then(res => res.json())
    .then(data => renderGraph(data));
});
What are we doing here?
- We’re writing fetching logic
- We’re writing routing logic (which API to call)
- We’re writing post-processing logic to fit the UI
Every action is hardcoded. The output needs to match the UI structure exactly.
2. Let’s DRY It Up — One Event Handler
function handleClick(actionName) {
  switch (actionName) {
    case "createUser":
      fetch("/api/createUser")
        .then(res => res.json())
        .then(data => renderUserCard(data));
      break;
    case "getRevenue":
      fetch("/api/getRevenue")
        .then(res => res.json())
        .then(data => renderGraph(data));
      break;
  }
}
This is the pattern we’ve all used:
- Switch statements
- Per-action logic
- Tight coupling with UI shape
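The same switch can be sketched as a lookup table, which is the shape this pattern keeps evolving toward. In this sketch, `renderUserCard` and `renderGraph` are hypothetical stand-ins for the real UI renderers, and `fetchJson` is injected so the dispatch logic can run without a live server:

```javascript
// Hypothetical sketch: the per-action switch as a lookup table.
// renderUserCard / renderGraph are stand-ins for the real UI renderers.
const renderUserCard = (data) => `card for ${data.name}`;
const renderGraph = (data) => `graph of ${data.total}`;

const actions = {
  createUser: { url: "/api/createUser", render: renderUserCard },
  getRevenue: { url: "/api/getRevenue", render: renderGraph },
};

// fetchJson is injectable so the routing can be shown without a server
function handleClick(actionName, fetchJson) {
  const action = actions[actionName];
  if (!action) return Promise.reject(new Error(`Unknown action: ${actionName}`));
  return fetchJson(action.url).then((data) => action.render(data));
}
```

Either way, the coupling is the same: each action still knows which endpoint to hit and which UI shape to produce.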


3. Now Enter Chat-Based UIs: Text In, Text Out
The UI is no longer buttons and charts.
It’s natural language input, powered by an LLM. So now:
- Your frontend says: “Get me revenue of User A”
- The LLM extracts a tool (getRevenue) and a toolInput ({ "name": "User A" })
So you replace your earlier routing logic with:
invokeTool("getRevenue", { name: "User A" });
No more post-processing needed. The LLM just wants a JSON string:
{ "result": [ { "productName": "XyZ", "yearly": "34B" }, { "productName": "Z", "yearly": "32B" } ] }
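The glue between the chat input and invokeTool is just parsing the model’s structured reply. A minimal sketch, assuming the LLM has been prompted to answer with a JSON tool call (real LLM APIs each have their own tool-call format, so this parser is hypothetical):

```javascript
// Hypothetical: the model was asked to reply with {"tool": ..., "toolInput": ...}
function parseToolCall(llmReply) {
  const { tool, toolInput } = JSON.parse(llmReply);
  if (!tool) throw new Error("LLM reply contained no tool name");
  return { tool, toolInput };
}

// e.g. the model answered:
const reply = '{"tool": "getRevenue", "toolInput": {"name": "User A"}}';
const call = parseToolCall(reply);
// ...and the app would then run invokeTool(call.tool, call.toolInput)
```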

4. One Function to Handle All Tool Invocations
So let’s write a wrapper:
function invokeTool(toolName, toolInput) {
  switch (toolName) {
    case "createUser":
      return fetch("/api/createUser", {
        method: "POST",
        body: JSON.stringify(toolInput)
      }).then(res => res.text());
    case "getRevenue":
      return fetch("/api/getRevenue", {
        method: "POST",
        body: JSON.stringify(toolInput)
      }).then(res => res.text());
  }
}

// Wrapper function
function getToolDataForLLM(toolName, toolInput) {
  return invokeTool(toolName, toolInput);
}
Boom — one function handles all tool invocations.
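Since every case does the same POST-then-text dance, the switch can collapse further into a name-to-endpoint map. This is a sketch of that refactor, with `doFetch` injectable so it runs without a server:

```javascript
// Hypothetical refactor: adding a tool becomes one line of config, not a new case
const toolEndpoints = {
  createUser: "/api/createUser",
  getRevenue: "/api/getRevenue",
};

function invokeTool(toolName, toolInput, doFetch = fetch) {
  const url = toolEndpoints[toolName];
  if (!url) return Promise.reject(new Error(`Unknown tool: ${toolName}`));
  return doFetch(url, {
    method: "POST",
    body: JSON.stringify(toolInput),
  }).then((res) => res.text());
}
```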
5. Let’s Expose It as a Service
Now that this works so well, we think: “Hey, let’s expose this wrapper as a service for other teams in our org.” So we do:
// Internal Org URL
POST /invoke
{
"toolName": "getRevenue",
"toolInput": { "month": "January" }
}
Our wrapper becomes the backend of this internal tool invocation API. Everyone wins — no one else needs to write their own logic.
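Wiring the wrapper behind that /invoke endpoint is only a few lines. A sketch of the handler, written framework-agnostically (invokeTool is stubbed here so the piece is self-contained; with Express it would be mounted on a POST route):

```javascript
// Stub for the wrapper from the previous section
const invokeTool = (toolName, toolInput) =>
  Promise.resolve(`stub output for ${toolName}`);

// Factory so the handler can be exercised with a plain request/response object
function makeInvokeHandler(invoke) {
  return async (req, res) => {
    const { toolName, toolInput } = req.body;
    const toolOutput = await invoke(toolName, toolInput);
    res.json({ toolOutput });
  };
}

// With Express this would be mounted roughly as:
// app.post("/invoke", express.json(), makeInvokeHandler(invokeTool));
```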

6. Then… a New Tool Is Added
Backend team adds a new endpoint: getTopCustomers. Now, we have to:
case "getTopCustomers":
  return fetch("/api/getTopCustomers", …)
- Update our switch block
- Deploy again
- Inform everyone we changed behavior
Suddenly this beautiful abstraction becomes tech debt.
7. Wait — Why Are We Doing This Again?
“Why are we maintaining this logic if the tools are owned by the data provider?”
We’ve already proven the UI doesn’t care about data shape.
The LLM just wants a string. So why are we sitting in the middle?
Previously, you had to — because the UI expected different shaped data. But now?
- The UI doesn’t care about shape
- The LLM doesn’t care about endpoint
- You care only about string output
8. So We Move the Wrapper to the Data Provider
We move the invokeTool logic to the data provider.

// External Data Provider URL
POST /invoke
{
"toolName": "getRevenue",
"toolInput": { "month": "January" }
}
The backend does:
- Routing
- Fetching
- Output formatting
And we get back a JSON string:
// JSON string
{
"toolOutput": "Revenue for January is $8000"
}
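From the consumer’s side, everything now shrinks to one generic POST at the provider’s URL. A sketch, with a placeholder hostname and an injectable `doFetch` so it can be exercised without the network:

```javascript
// Hypothetical consumer-side call; the provider URL is a placeholder
function invokeRemoteTool(toolName, toolInput, doFetch = fetch) {
  return doFetch("https://data-provider.example.com/invoke", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ toolName, toolInput }),
  })
    .then((res) => res.json())
    .then((data) => data.toolOutput);
}
```

Note there is no per-tool logic left on the consumer: no switch, no endpoint map, no formatting.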
9. That Wrapper You Just Moved to the Backend?
That’s the MCP Server.

10. Build It with the MCP SDK
Backend uses:
// Pseudo SDK code
const { registerTool, startMCPServer } = require("mcp-sdk");

registerTool("getRevenue", async (input) => {
  const data = await fetchRevenue(input);
  return `Revenue for ${input.month} is $${data.total}`;
});

registerTool("createUser", async (input) => {
  await createUser(input);
  return `User created`;
});

startMCPServer();
Done.
- No manual routes.
- No client-side logic.
- No UI wiring.
And the MCP Client? Just One Line
// PSEUDO MCP client code
await invokeTool("getRevenue", { month: "January" });
// Returns: "Revenue for January is $8000"
MCP Client handles:
- Transport
- Payload structure
- Tool invocation
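Under the hood, that one line rests on a registry the server builds from registerTool calls. A toy sketch of the dispatch side (not the real mcp-sdk; the names here are illustrative):

```javascript
// Toy registry: registerTool fills a map, dispatch routes by name
const tools = new Map();

function registerTool(name, handler) {
  tools.set(name, handler);
}

async function dispatch(toolName, toolInput) {
  const handler = tools.get(toolName);
  if (!handler) throw new Error(`Tool not registered: ${toolName}`);
  return handler(toolInput);
}

// Register a tool, mirroring the pseudo SDK code above
registerTool("getRevenue", async (input) => `Revenue for ${input.month} is $8000`);
```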

Summary
Why Wasn’t This Possible Before?
Because the UI used to demand data in a specific shape, while an LLM is happy with a plain string. That flexibility is what made the MCP abstraction possible.
What started as a simple switch block became a reusable wrapper. Then we exposed it as a service. Then we handed it off to the data provider. That journey is what leads to the MCP Server.
- It’s not a framework.
- It’s not a transport spec.
- It’s a shift in ownership.
It’s about moving routing and formatting logic from consumer to provider. If you’re building LLM-based applications, you don’t need to reinvent glue code for every tool.
You need an interface.
You need a protocol.
You need MCP.