April 2026 · Founder's Log
The Multi-Protocol Future: MCP, Agent Skills, and Making APIs Agent-Ready
There are now four different ways to make your API discoverable by AI agents. They're not competing — they're layers of the same stack. Here's what we learned shipping all four for Signbee.
Founder, Signbee

The convergence
Six months ago, the answer to “how do I make my API work with AI agents?” was simple: write good docs and hope the LLM has seen them in its training data.
Today, there are four distinct protocols for making your API agent-accessible — each serving a different layer of the stack. They look like they're competing. They're not. They're complementary, and if you're building an API-first product, you probably need all of them.
The four layers
| Layer | Purpose | Consumer | Effort |
|---|---|---|---|
| llms.txt | Discovery & context | LLMs, AI search | 30 minutes |
| OpenAPI | Schema & structure | Code generators, SDKs | 1–2 hours |
| Agent Skills | Usage patterns & context | Hermes, autonomous agents | 1 hour |
| MCP Server | Live tool access | Claude, Cursor, Hermes | 2–4 hours |
Layer 1: llms.txt — Tell LLMs what you do
llms.txt is the simplest layer. It's a plain text file at your domain root that describes your product in a format optimised for large language models. When someone asks ChatGPT, Claude, or Perplexity “what is Signbee?”, the answer should come from your llms.txt, not from a cached training snapshot.
```markdown
# Signbee

> API-first e-signing for developers and AI agents.

## What it does

Signbee sends documents for legally binding electronic signatures via a single
API call. Write markdown or provide a PDF URL, send it, get a SHA-256 certified
signed PDF back.

## API

POST https://signb.ee/api/v1/send
Authorization: Bearer sb_live_...

## MCP Server

npx -y signbee-mcp
```
Who consumes this: AI chat models (ChatGPT, Claude, Perplexity), AI search crawlers, and developers asking LLMs about your product.
Build time: 30 minutes. It's a text file.
Layer 2: OpenAPI — Describe your schema
The OpenAPI specification (formerly Swagger) has been around since 2011. It defines your API's endpoints, parameters, request/response schemas, and authentication in a machine-readable JSON or YAML format.
OpenAPI isn't new, but it gains new importance in the agent era. LLMs with function calling can generate API requests directly from OpenAPI schemas. Code generation tools use it to auto-generate SDK wrappers and typed clients. And MCP servers can be auto-generated from OpenAPI specs.
Who consumes this: SDK generators, function-calling LLMs, documentation engines, and MCP server generators.
Build time: 1–2 hours if you're writing it manually, or auto-generated from your framework (e.g., @nestjs/swagger, FastAPI).
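To make this concrete, here's a minimal sketch of what an OpenAPI description of a single send endpoint could look like. The path, payload fields, and response shape are assumptions for illustration, not Signbee's published schema:

```yaml
openapi: 3.1.0
info:
  title: Signbee API
  version: "1.0"
paths:
  /api/v1/send:
    post:
      summary: Send a document for signature
      security:
        - bearerAuth: []
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              required: [recipient]
              properties:
                recipient:
                  type: string
                  format: email
                document_url:
                  type: string
                  format: uri
      responses:
        "200":
          description: Signing request accepted
components:
  securitySchemes:
    bearerAuth:
      type: http
      scheme: bearer
```

A spec this small already gives a function-calling LLM everything it needs to construct and validate a request: the path, the auth scheme, and which fields are required.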
Layer 3: Agent Skills — Teach agents when and how
We covered this in depth in yesterday's post. An Agent Skill is a structured markdown file that bundles documentation, examples, trigger conditions, and error handling into a single installable package. It tells the agent when to use your API, not just how.
This is the layer that frameworks like Hermes Agent consume natively. Hermes creates its own skills from experience — but pre-made skills give it a head start. The agentskills.io format has become the emerging standard.
Who consumes this: Hermes Agent, autonomous agents, and any framework that loads tool descriptions from markdown.
Build time: 1 hour.
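For a feel of the format, here's a rough sketch of what a skill file for Signbee might look like. We're sketching the shape from memory — check agentskills.io for the authoritative field names; the frontmatter keys and triggers below are illustrative:

```markdown
---
name: signbee-esign
description: Send documents for legally binding e-signature via the Signbee API.
triggers: ["sign document", "send NDA", "request signature"]
---

## When to use
Use this skill when the user wants a document legally signed,
not merely shared or emailed.

## How to call
POST https://signb.ee/api/v1/send with a Bearer API key.
Provide markdown content or a PDF URL plus the recipient's email.

## Errors
Surface API errors to the user instead of retrying blindly.
```

The point is the "When to use" and trigger sections — that's the context OpenAPI can't express, and it's exactly what an autonomous agent needs to pick the right tool unprompted.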
Layer 4: MCP Server — Give agents live access
The Model Context Protocol (MCP) by Anthropic is the runtime layer. An MCP server wraps your API in a standard protocol that any MCP-compatible client (Claude, Cursor, Windsurf, Hermes Agent) can connect to and use as a live tool.
Unlike the other three layers, an MCP server runs as a process. It handles authentication, transforms natural language parameters into API calls, and returns structured results. It's the highest-fidelity integration — the agent gets real-time access to your API with type-safe tool definitions.
```json
{
  "signbee": {
    "command": "npx",
    "args": ["-y", "signbee-mcp"],
    "env": {
      "SIGNBEE_API_KEY": "sb_live_..."
    }
  }
}
```
Who consumes this: Claude Desktop, Claude Co-work, Cursor, Windsurf, Hermes Agent, and any MCP-compatible agent.
Build time: 2–4 hours for a basic server; a day for a polished one with multiple tools.
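To make the "transforms natural language parameters into API calls" step concrete, here's a minimal sketch of the core of such a tool handler, stripped of the MCP protocol plumbing (which an SDK handles for you). The argument names, payload fields, and tool name are assumptions for illustration, not Signbee's actual schema:

```typescript
// Hypothetical sketch of an MCP tool handler's core, minus protocol plumbing.
// Argument and payload field names are illustrative, not Signbee's real schema.
interface SendDocumentArgs {
  recipientEmail: string;
  documentUrl: string;
}

// Pure translation step: typed tool arguments -> HTTP request.
function buildSendRequest(args: SendDocumentArgs, apiKey: string): Request {
  return new Request("https://signb.ee/api/v1/send", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      recipient: args.recipientEmail,
      document_url: args.documentUrl,
    }),
  });
}

// The handler the server would register for a "send_document" tool:
// call the API, return the structured result to the agent.
async function handleSendDocument(
  args: SendDocumentArgs,
  apiKey: string
): Promise<string> {
  const res = await fetch(buildSendRequest(args, apiKey));
  if (!res.ok) throw new Error(`Signbee API returned ${res.status}`);
  return res.text();
}
```

The server's real job is the translation in `buildSendRequest`: the agent speaks in named parameters, the API speaks HTTP, and the MCP server sits between them holding the credentials.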
Why you need all four
Each layer serves a different moment in the agent's journey:
- llms.txt → Discovery — The agent (or user) asks “is there a tool for document signing?” and the LLM knows about Signbee from its llms.txt
- OpenAPI → Understanding — The agent generates a function wrapper or validates its API call against the schema
- Agent Skill → Context — The agent knows when to use Signbee (trigger: “sign document”, “send NDA”) and has a proven example to follow
- MCP Server → Execution — The agent connects, calls the tool, gets the result. Live interaction with your API
Skip any layer and you create a gap. Without llms.txt, agents don't know you exist. Without OpenAPI, they can't validate their calls. Without a Skill, they don't know when to use you. Without an MCP server, they need to manually construct HTTP requests.
What we learned shipping all four
Signbee ships with all four layers. Here's what we learned:
llms.txt is the highest ROI. It took 30 minutes to write and immediately made Signbee discoverable in AI search. If you do nothing else, do this.
The MCP server drives real usage. When a developer installs the Signbee MCP server in Claude or Hermes, they use Signbee daily. It becomes ambient infrastructure — always available, always one natural language command away. This is the stickiest integration layer.
Agent Skills are underrated. The Hermes Agent community is growing fast, and skills are how capability spreads. One developer publishes a Signbee skill; every Hermes user can install it. It's the closest thing to viral distribution in the agent ecosystem.
OpenAPI is table stakes. You probably have one already. If you don't, generate it from your framework — most modern backend frameworks can do this automatically.
The playbook
If you're building an API-first product and want to be agent-ready, here's the priority order:
- Today: Write your llms.txt (30 minutes)
- This week: Create an Agent Skill and publish to agentskills.io (1 hour)
- This month: Build an MCP server and publish to npm (4 hours)
- Ongoing: Keep your OpenAPI spec up to date (generated)
Total investment: one day. Total reach: every AI agent, every AI chat model, every AI-powered IDE, and every agentic framework in the ecosystem.
The future isn't one protocol. It's all of them, working together.
Signbee ships llms.txt, OpenAPI, Agent Skills, and MCP. Agent-ready out of the box.