
Everyone Shows What MCP Does — But Nobody Tells You What It Abstracts

Last Updated on April 15, 2025 by Editorial Team

Author(s): Sanjay Krishna Anbalagan

Originally published on Towards AI.

Let’s Build This from Scratch — Developer Style

You’ve probably seen this diagram in blog posts:

[Image: the MCP diagram everyone shares]

But here’s the problem: Everyone shows you how to use MCP.
Nobody explains what it abstracts — and why it suddenly matters now. So let’s stop theorizing and start coding.

1. Traditional Web UI: The Button Era

Let’s say we’re building a web UI with two buttons:

<button id="btn-create-user">Create User</button>
<button id="btn-get-revenue">Get Revenue</button>

Here’s the JavaScript behind it:

document.getElementById("btn-create-user").addEventListener("click", () => {
  fetch("/api/createUser")
    .then(res => res.json())
    .then(data => renderUserCard(data));
});

document.getElementById("btn-get-revenue").addEventListener("click", () => {
  fetch("/api/getRevenue")
    .then(res => res.json())
    .then(data => renderGraph(data));
});

What are we doing here?

  1. We’re writing fetching logic
  2. We’re writing routing logic (which API to call)
  3. We’re writing post-processing logic to fit the UI

Every action is hardcoded. The output needs to match the UI structure exactly.
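To make that coupling concrete, here is a rough guess at what the two render helpers could look like. Their exact shapes aren’t shown in the post, so treat the fields below as placeholders:

// Hypothetical render helpers: each one only works if the API returns
// exactly the fields it expects.
function renderUserCard(data) {
  const card = document.createElement("div");
  card.className = "user-card";
  card.textContent = `${data.name} <${data.email}>`; // assumes { name, email }
  document.body.appendChild(card);
}

function renderGraph(data) {
  const list = document.createElement("ul");
  for (const point of data.points) { // assumes { points: [{ label, value }] }
    const item = document.createElement("li");
    item.textContent = `${point.label}: ${point.value}`;
    list.appendChild(item);
  }
  document.body.appendChild(list);
}

Change one endpoint’s response and the matching helper breaks. That is exactly the coupling we want to get rid of.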

2. Let’s DRY It Up — One Event Handler

function handleClick(actionName) {
  switch (actionName) {
    case "createUser":
      fetch("/api/createUser")
        .then(res => res.json())
        .then(data => renderUserCard(data));
      break;
    case "getRevenue":
      fetch("/api/getRevenue")
        .then(res => res.json())
        .then(data => renderGraph(data));
      break;
  }
}

This is the pattern we’ve all used:

  1. Switch statements
  2. Per-action logic
  3. Tight coupling with UI shape
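One way to wire it up, assuming each button carries a data-action attribute (not part of the earlier markup):

// Reuse one handler for every button by tagging it with a data-action attribute:
// <button data-action="createUser">Create User</button>
// <button data-action="getRevenue">Get Revenue</button>
document.querySelectorAll("button[data-action]").forEach((button) => {
  button.addEventListener("click", () => handleClick(button.dataset.action));
});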

3. Now Enter Chat-Based UIs: Text In, Text Out

The UI is no longer buttons and charts.
It’s natural language input, powered by an LLM. So now:

  1. The frontend says: “Get me revenue of User A”
  2. The LLM extracts a tool (getRevenue) and a toolInput (User A)

So you replace your earlier routing logic with:

invokeTool("getRevenue", { name: "User A" });

No more post-processing needed. The LLM just wants a JSON string:

{ "result": [ { "productName": "XyZ", "yearly": "34B" }, { "productName": "Z", "yearly": "32B" } ] }
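Putting the pieces together, the consumer-side flow might look something like this. askLLMForToolCall and askLLMForFinalAnswer are hypothetical helpers standing in for whatever tool-selection and completion API your LLM provider exposes:

// Sketch of the chat-driven dispatch loop.
async function handleChatMessage(userText) {
  // 1. The LLM turns free text into a tool name plus structured input,
  //    e.g. { tool: "getRevenue", toolInput: { name: "User A" } }.
  const { tool, toolInput } = await askLLMForToolCall(userText);

  // 2. Invoke the tool; all we need back is a string.
  const toolOutput = await invokeTool(tool, toolInput);

  // 3. Let the LLM compose the final answer from the raw tool output.
  return askLLMForFinalAnswer(userText, toolOutput);
}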

4. One Function to Handle All Tool Invocations

So Let’s Write a Wrapper

function invokeTool(toolName, toolInput) {
  switch (toolName) {
    case "createUser":
      return fetch("/api/createUser", {
        method: "POST",
        body: JSON.stringify(toolInput)
      }).then(res => res.text());
    case "getRevenue":
      return fetch("/api/getRevenue", {
        method: "POST",
        body: JSON.stringify(toolInput)
      }).then(res => res.text());
  }
}

// Wrapper Function
function getToolDataForLLM(toolName, toolInput) {
  return invokeTool(toolName, toolInput);
}

Boom — one function handles all tool invocations.

5. Let’s Expose It as a Service

Now that this works so well, we think: “Hey, let’s expose this wrapper as a service for other teams in our org.” So we do:

// Internal Org URL
POST /invoke
{
  "toolName": "getRevenue",
  "toolInput": { "month": "January" }
}

Our wrapper becomes the backend of this internal tool invocation API. Everyone wins — no one else needs to write their own logic.
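As a sketch, that internal service could be little more than a thin HTTP layer over invokeTool. Express and the exact route shape here are assumptions, not something this post prescribes:

// Sketch: exposing the wrapper as an internal service.
const express = require("express");
const app = express();
app.use(express.json());

app.post("/invoke", async (req, res) => {
  const { toolName, toolInput } = req.body;
  const result = invokeTool(toolName, toolInput); // the wrapper from above
  if (result === undefined) {
    // The switch in invokeTool has no case for this tool name.
    return res.status(400).json({ error: `Unknown tool: ${toolName}` });
  }
  res.json({ toolOutput: await result });
});

app.listen(3000);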

6. Then… a New Tool is Added

The backend team adds a new endpoint: getTopCustomers. Now we have to extend the switch:

case "getTopCustomers":
  return fetch("/api/getTopCustomers", …)

Which means we:

  1. Update our switch block
  2. Deploy again
  3. Inform everyone that the behavior changed

Suddenly this beautiful abstraction becomes tech debt.

7. Wait — Why Are We Doing This Again?

Why are we maintaining this logic if the tools are owned by the data provider?

We’ve already proven the UI doesn’t care about data shape.
The LLM just wants a string. So why are we sitting in the middle?

Previously, you had to — because the UI expected differently shaped data. But now?

  1. The UI doesn’t care about shape
  2. The LLM doesn’t care about endpoint
  3. You care only about string output

8. So We Move the Wrapper to the Data Provider

We move the invokeTool logic to the data provider.

// External Data Provider URL
POST /invoke
{
  "toolName": "getRevenue",
  "toolInput": { "month": "January" }
}

The backend does:

  1. Routing
  2. Fetching
  3. Output formatting

And we get back a JSON string:

// JSON string
{
  "toolOutput": "Revenue for January is $8000"
}
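On our side, the entire “tool layer” shrinks to one generic call. PROVIDER_URL below is a placeholder for the data provider’s base URL:

// Sketch: the consumer after the move. No routing, no per-tool logic.
async function invokeRemoteTool(toolName, toolInput) {
  const res = await fetch(`${PROVIDER_URL}/invoke`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ toolName, toolInput })
  });
  const { toolOutput } = await res.json();
  return toolOutput; // a plain string, ready to hand straight to the LLM
}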

9. That Wrapper You Just Moved to the Backend?

That’s the MCP Server.

10. Build It with the MCP SDK

The backend uses:

// Pseudo SDK code
const { registerTool, startMCPServer } = require("mcp-sdk");

registerTool("getRevenue", async (input) => {
  const data = await fetchRevenue(input);
  return `Revenue for ${input.month} is $${data.total}`;
});

registerTool("createUser", async (input) => {
  await createUser(input);
  return "User created";
});

startMCPServer();

Done.

  1. No manual routes.
  2. No client-side logic.
  3. No UI wiring.

And the MCP Client? Just One Line

// PSEUDO MCP client code
await invokeTool("getRevenue", { month: "January" });
// Returns: "Revenue for January is $8000"

MCP Client handles:

  1. Transport
  2. Payload structure
  3. Tool invocation

Summary

Why Wasn’t This Possible Before?

Because the UI used to demand data in a specific shape: a user card needs certain fields, a graph needs a certain series format, so routing and post-processing had to live with the consumer. With an LLM in front, the consumer only needs a plain string back. That flexibility is what made the MCP abstraction possible.

What started as a simple switch block became a reusable wrapper. Then we exposed it as a service. Then we handed it off to the data provider. That journey is what leads to the MCP Server.

  1. It’s not a framework.
  2. It’s not a transport spec.
  3. It’s a shift in ownership.

It’s about moving routing and formatting logic from consumer to provider. If you’re building LLM-based applications, you don’t need to reinvent glue code for every tool.

You need an interface.
You need a protocol.
You need MCP.


Published via Towards AI



Note: Content contains the views of the contributing authors and not Towards AI.