March 2026 · Technical Guide

What Is llms.txt and Why Your SaaS Needs One

SEO made your product discoverable by search engines. llms.txt makes it discoverable by AI agents. If you're building a SaaS product in 2026, this is the file you didn't know you needed.

[Illustration: llms.txt file at the center, with AI models — ChatGPT, Claude, Perplexity, Gemini — orbiting and consuming it]

The analogy everyone gets

In 1994, if you wanted search engines to find your website, you needed a robots.txt file. It told crawlers what to index, what to skip, and how to navigate your site. Virtually every website has one today.

In 2026, if you want AI agents to find and use your product, you need an llms.txt file. It tells language models what your product does, what API endpoints are available, how to authenticate, and what the request/response formats look like. All in plain text.

robots.txt was for search engines. llms.txt is for AI agents.

What goes in an llms.txt file

An llms.txt file is a plain text document served at /llms.txt on your domain. It contains everything an AI model needs to understand and use your product:

  • Product description — What does your product do, in one sentence?
  • Connection methods — How does an agent connect? API? MCP? SDK?
  • Authentication — What credentials are needed and how to get them?
  • Endpoints — Every API endpoint with required/optional fields and examples
  • Request/response formats — Exact JSON payloads with sample data
  • Flow — Step-by-step explanation of how the process works
  • Pricing/limits — What does the free tier include? What are the rate limits?
  • Error codes — What do errors mean and how to handle them

The key principle: write for a machine, not a marketer. No promotional language. No “industry-leading” adjectives. Just facts, endpoints, and examples. The model will decide whether to use your product based on whether it solves the user's problem, not because you called yourself “best-in-class”.
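On the consuming side, an agent typically fetches the file and splits it into sections by heading so it can look up "Endpoints" or "Pricing" directly. A minimal sketch in TypeScript; the `## ` heading convention follows the examples in this article, not any formal spec:

```typescript
// Sketch: split an llms.txt body into sections keyed by "## " headings.
// Heading convention is an assumption based on common llms.txt examples.
function parseSections(text: string): Map<string, string> {
  const sections = new Map<string, string>();
  let current = "_preamble"; // title line and one-sentence description
  let buf: string[] = [];
  for (const line of text.split("\n")) {
    if (line.startsWith("## ")) {
      sections.set(current, buf.join("\n").trim());
      current = line.slice(3).trim();
      buf = [];
    } else {
      buf.push(line);
    }
  }
  sections.set(current, buf.join("\n").trim());
  return sections;
}
```

Anything that parses this cleanly is a good sign your file is machine-readable; if your own tooling struggles with it, a model will too.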

A real example: Signbee's llms.txt

Here's a condensed version of what we serve at signb.ee/llms.txt:

signb.ee/llms.txt (condensed)
# Signbee
> Send documents for e-signing via a single API call. Built for AI agents.

## Connect (pick one)

### 1. Direct API (zero setup)
POST https://signb.ee/api/v1/send
Content-Type: application/json
Authorization: Bearer <api_key> (optional)

### 2. MCP Server
{"mcpServers": {"signbee": {"command": "npx", "args": ["-y", "signbee-mcp"]}}}

## Endpoints

### POST /api/v1/send
Required fields:
- recipient_name, recipient_email
- Either markdown (min 10 chars) OR pdf_url

Example request:
{"markdown": "# Mutual NDA\n...", "sender_name": "Alice",
 "sender_email": "alice@startup.com", "recipient_name": "Bob",
 "recipient_email": "bob@acme.com"}

## Plan limits
- Free: 5 documents/month
- Pro ($9/mo): 100 documents/month
- Business ($19/mo): Unlimited

No fluff. No marketing copy. Just actionable information that a model can parse, understand, and act on immediately.
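The required-field rules in that file are concrete enough to check client-side before ever hitting the API. A hedged sketch: the field names come from the llms.txt above, but the validation logic is my own illustration, not Signbee's actual server behaviour:

```typescript
// Sketch: pre-flight validation of a send request against the rules
// stated in the llms.txt above (field names from the file; logic assumed).
interface SendRequest {
  recipient_name: string;
  recipient_email: string;
  sender_name?: string;
  sender_email?: string;
  markdown?: string; // either markdown (min 10 chars)...
  pdf_url?: string;  // ...or pdf_url must be present
}

function validateSendRequest(req: SendRequest): string[] {
  const errors: string[] = [];
  if (!req.recipient_name) errors.push("recipient_name is required");
  if (!req.recipient_email) errors.push("recipient_email is required");
  const hasMarkdown = (req.markdown ?? "").length >= 10;
  if (!hasMarkdown && !req.pdf_url) {
    errors.push("either markdown (min 10 chars) or pdf_url is required");
  }
  return errors;
}
```

This is exactly the kind of logic an agent can derive on its own from a precise llms.txt, which is why stating constraints like "min 10 chars" explicitly pays off.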

Why this matters now

Something fundamental is shifting in how users discover software. The old funnel was: Google search → landing page → sign up → read docs → integrate. That flow assumed a human in a browser at every step.

The new funnel is increasingly: User asks AI → AI searches for tools → AI reads your llms.txt → AI integrates your product → User sees the result. The human never visits your website. They never read your docs. They never see your pricing page.

If an AI model can't find or understand your API, it will recommend a competitor that it can. That's the new competitive dynamic.

📄 robots.txt → “Search engines, here's how to crawl me”
📄 sitemap.xml → “Search engines, here's everything I have”
📄 openapi.json → “Developers, here's my API spec”
📄 llms.txt → “AI agents, here's how to use me”

Each file in this stack serves a different consumer. The first three have been standard for years. The fourth is becoming essential in 2026.

How to build your own llms.txt

Here's a template you can adapt for any SaaS product:

llms.txt template
# [Product Name]
> [One-sentence description of what the product does]

## Authentication
[How to get and use credentials]

## Endpoints

### [METHOD] [/path] — [What it does]
[Description]

Required fields:
- field_name: description

Optional fields:
- field_name: description (default: value)

Example request:
{JSON example}

Example response:
{JSON example}

## Flow
1. [Step one]
2. [Step two]
3. [Step three]

## Pricing
- Free: [what's included]
- Paid: [what's included, price]

## Error codes
- 400: [meaning]
- 401: [meaning]
- 429: [meaning]

## Links
- Docs: [url]
- OpenAPI: [url]
- Support: [email]

Key rules:

  1. Plain text only — No HTML, no rich formatting. Markdown headings are fine
  2. Be specific — Include real field names, real JSON, real URLs
  3. Include examples — Show complete request/response pairs
  4. State limitations — Rate limits, plan restrictions, field constraints
  5. Skip the marketing — No superlatives, no testimonials, no social proof
  6. Keep it current — Update it whenever your API changes
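Rule 5 is the one teams break most often, and it is mechanically checkable. A toy lint sketch; the phrase list is my own illustration, not from any standard:

```typescript
// Toy lint: flag marketing superlatives that have no place in an llms.txt.
// The banned-phrase list is an assumption for illustration only.
const BANNED_PHRASES = [
  "industry-leading",
  "best-in-class",
  "world-class",
  "revolutionary",
];

function findMarketingSpeak(text: string): string[] {
  const lower = text.toLowerCase();
  return BANNED_PHRASES.filter((phrase) => lower.includes(phrase));
}
```

A check like this could run in CI alongside your API tests, which also helps with rule 6: if the file is part of the build, it is harder to forget.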

Serving it from Next.js

If you're running a Next.js app, you can serve llms.txt as a route handler:

src/app/llms.txt/route.ts
import { NextResponse } from "next/server";

export async function GET() {
  const content = `# Your Product
> One-line description

## Endpoints
### POST /api/v1/action
...
`;

  return new NextResponse(content, {
    headers: {
      "Content-Type": "text/plain; charset=utf-8",
      "Cache-Control": "public, max-age=3600",
    },
  });
}

Then add explicit allow rules to your robots.txt so AI crawlers that respect it are permitted to fetch the file:

robots.txt addition
User-agent: GPTBot
Allow: /llms.txt

User-agent: ClaudeBot
Allow: /llms.txt

User-agent: PerplexityBot
Allow: /llms.txt
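If you'd rather not maintain a per-bot list (new AI crawlers appear regularly), a single wildcard rule covers them all, at the cost of granularity:

```text
User-agent: *
Allow: /llms.txt
```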

The GEO advantage

This is part of a broader shift called Generative Engine Optimisation (GEO) — the practice of making your product discoverable and usable by AI-powered search and recommendation systems.

Traditional SEO optimises for Google's ranking algorithm. GEO optimises for the information retrieval and tool-use capabilities of large language models. Different engines, different optimisation strategies:

|             | SEO                     | GEO                                |
|-------------|-------------------------|------------------------------------|
| Target      | Google, Bing            | ChatGPT, Claude, Perplexity        |
| Format      | HTML, meta tags, schema | llms.txt, openapi.json, plain text |
| Goal        | Rank on page 1          | Be recommended and integrated      |
| User action | Click a link            | Zero-click (AI takes action)       |

The companies that invest in GEO now will have a significant advantage as AI-assisted discovery becomes the default. An llms.txt file is the lowest-effort, highest-impact first step you can take.

Start today

Creating an llms.txt file takes about 30 minutes. Here's the checklist:

  1. Write a one-line product description (no marketing language)
  2. List every public API endpoint with required fields and examples
  3. Document authentication requirements
  4. Include real JSON request/response pairs
  5. State pricing, limits, and error codes
  6. Serve it at /llms.txt as text/plain
  7. Allow AI crawlers to access it in your robots.txt

You can see a production example at signb.ee/llms.txt. Copy the structure. Adapt it for your product. Ship it today.

Signbee is AI-discoverable by design — llms.txt, openapi.json, MCP server.