Anthropic's Model Context Protocol (MCP): The 'USB-C' Moment for AI Integration

Anthropic's new open standard, MCP, eliminates the 'MxN' integration problem. With early adoption by Replit and Cursor, it promises to be the 'USB-C' that connects every AI to every database.

Anthropic's Model Context Protocol (MCP) connecting AI to multiple data sources

The artificial intelligence industry has been facing a silent but growing crisis: the “MxN problem.”

Every time a developer wants to connect an AI model (M) to a new data source or tool (N), they have to build a custom, brittle integration. To connect Claude to Google Drive? One API. To connect GPT-4 to Slack? A completely different API. To connect Llama-3 to a Postgres database? Yet another custom connector.

This patchwork is a massive tax on innovation. It turns AI developers into plumbers, spending most of their time fixing pipes instead of building intelligence.

Enter Anthropic, who have just unveiled what might be the most important infrastructure update of 2025: the Model Context Protocol (MCP).

The “USB-C” Analogy

Think about life before USB-C. You had a drawer full of proprietary chargers—one for your Nokia, one for your camera, one for your laptop. It was a mess. USB-C standardized the physical connection, allowing power and data to flow between almost any two devices regardless of manufacturer.

MCP does the same for AI context.

It defines a universal standard for how an AI model asks for data and how a system provides it. “It’s a standard way for AI to plug into the world,” says Anthropic’s product lead.

Under the Hood: JSON-RPC and “Capabilities”

At its core, MCP isn’t magic; it’s a rigorous implementation of JSON-RPC 2.0. It uses a Client-Host-Server architecture that decouples the “brain” (AI) from the “hands” (Tools).
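
On the wire, every exchange is a JSON-RPC 2.0 message. A tool-discovery round trip looks roughly like this (`tools/list` is the protocol's discovery method; the `get_weather` tool is a made-up example):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
```

And the server's response advertising its capabilities:

```json
{"jsonrpc": "2.0", "id": 1,
 "result": {"tools": [{"name": "get_weather",
                       "description": "Look up the current weather for a city"}]}}
```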

  1. MCP Host (The Interface): Applications like the Claude Desktop App, Cursor, or Replit. The Host is responsible for running the AI and managing permissions.
  2. MCP Client (The Connector): A component that lives inside the Host and holds a one-to-one connection to a single server. It speaks the protocol on the model's behalf, asking things like "What tools do you offer?" and "Please read this file."
  3. MCP Server (The Data): This is the magic part. Developers build a lightweight “server” for their data (e.g., a “Google Drive MCP Server”). This server exposes Resources (files), Prompts (templates), and Tools (functions executable by the AI).
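
The three roles above can be sketched as a toy JSON-RPC dispatcher. This is an illustrative stand-in, not the official SDK (Anthropic publishes real MCP SDKs); the `get_weather` tool and the in-process wiring are invented for the example:

```python
import json

# Toy "MCP-style" server: a JSON-RPC 2.0 dispatcher exposing one tool.
# (Illustrative only -- a real server would use an official MCP SDK and
# implement the full protocol.)

TOOLS = {
    "get_weather": {
        "description": "Return a canned weather report for a city.",
        "handler": lambda args: f"Sunny in {args['city']}",
    }
}

def handle_request(raw: str) -> str:
    """Dispatch one JSON-RPC request string, return the response string."""
    req = json.loads(raw)
    if req["method"] == "tools/list":        # client asks: what tools exist?
        result = {"tools": [
            {"name": name, "description": t["description"]}
            for name, t in TOOLS.items()
        ]}
    elif req["method"] == "tools/call":      # client asks: run this tool
        tool = TOOLS[req["params"]["name"]]
        result = {"content": tool["handler"](req["params"]["arguments"])}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601,
                                     "message": "Method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# A host would exchange these messages with a server process over a
# transport; here we just call the dispatcher directly.
listing = handle_request(json.dumps(
    {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}))
call = handle_request(json.dumps(
    {"jsonrpc": "2.0", "id": 2, "method": "tools/call",
     "params": {"name": "get_weather", "arguments": {"city": "Lisbon"}}}))
print(listing)
print(call)
```

A real server would loop over stdin and answer each request on stdout, which is essentially all the stdio transport amounts to.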

Because it uses JSON-RPC over standard transports (stdio for local processes, Server-Sent Events for remote connections), it is lightweight and keeps credentials where they belong. Authentication happens at the transport layer, meaning you don't have to hand your API keys to the AI model provider. The data stays local or flows directly between your app and your data source.
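
In practice, a host discovers local servers through a small config file. Claude Desktop, for example, reads a `claude_desktop_config.json` listing the server processes to spawn (the filesystem server shown is a real reference server; the path is illustrative):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem",
               "/Users/me/Documents"]
    }
  }
}
```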

The Industry Rallies: Replit, Cursor, and More

A standard is only as good as its adoption, and MCP is seeing unprecedented buy-in.

  • Replit: The popular online IDE has announced full support, allowing their “Replit Agent” to plug into any MCP-compliant tool.
  • Cursor: The AI code editor that has taken the world by storm is adopting MCP, meaning Cursor can now natively “read” your Linear tickets or “query” your production database, provided an MCP server exists for them.
  • Block & Apollo: Large enterprise players are already building internal MCP servers to safely expose their massive datasets to internal AI assistants.

The End of “Siloed AI”

For the average user, this means the era of the “dumb chatbot” is ending.

Soon, you won’t just chat with Claude. You will open your IDE, and Claude will already know what is in your Jira backlog, have read the latest pull requests on GitHub, and be ready to deploy to AWS—all because those services simply “plugged in” via MCP.

We are moving from “Chatting with AI” to “Working with AI.” And thanks to MCP, we finally have a universal language for that work.
