llms.txt is like robots.txt, but for AI crawlers. It tells ChatGPT, Gemini, Claude, and other large language models exactly where to find your most important content — so they cite you accurately instead of guessing.
Most sites make AI work too hard to find the good stuff. llms.txt gives them a clear map of your key pages, policies, and structured data, so your business gets found and referenced more often in AI answers.
AI search is changing how people discover businesses. When someone asks “best plumber in Edmonton,” AI needs to know your site exists, what you do, and why you are worth recommending.
Without llms.txt, AI crawlers struggle with navigation, ads, and dynamic content. With llms.txt, you control what they see first and make it easier for them to recommend you.
File format, exact syntax, validation steps, and upload instructions appear below.
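As a reference point, the community llms.txt convention (llmstxt.org) is plain markdown: an H1 with the business name, a blockquote summary, then H2 sections of annotated links. A minimal sketch, using a hypothetical business and URLs:

```markdown
# Acme Plumbing

> Licensed residential plumber serving Edmonton, AB. 24/7 emergency
> repairs, water heater installation, and drain cleaning since 2009.

## Services
- [Emergency Plumbing](https://acmeplumbing.example/emergency): burst pipes, leaks, same-day response
- [Water Heaters](https://acmeplumbing.example/water-heaters): installation, repair, and maintenance

## About
- [Service Area](https://acmeplumbing.example/areas): Edmonton and surrounding communities
- [Licensing](https://acmeplumbing.example/credentials): provincial license and insurance details
```

The file lives at your site root (yoursite.com/llms.txt), alongside robots.txt.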
llms.txt tells AI search engines what they should know about your business and which pages matter most. Paired with your robots.txt bot permissions, it shapes how AI crawlers read your site.

Without it, AI crawlers like GPTBot and ClaudeBot have no curated summary to work from when deciding whether to index your content, recommend you, or cite your business as a reference. They fall back on whatever public information they can scrape, which may be outdated, wrong, or about your competitors.
When you set up an llms.txt file, you control three things: which AI bots can find it, what they see first, and what they learn about your business.
This is worth 3 points in your AI Visibility Score, and it directly affects whether AI search engines treat your business as a reliable recommendation or low-confidence noise.
Your report shows an "AI Bot Access" section — it checks whether llms.txt exists and which bots can see your site.
GPTBot, ClaudeBot, and llms.txt together are worth up to 15 points. If any are blocked or missing, you'll see exactly which ones.
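On the robots.txt side, explicitly allowing the major AI crawlers removes any ambiguity about whether they may read your site. A sketch covering two of the bots the report checks (the generated file covers more):

```text
# robots.txt: let AI crawlers read the site and discover llms.txt
User-agent: GPTBot
Allow: /
Allow: /llms.txt

User-agent: ClaudeBot
Allow: /
Allow: /llms.txt
```

Note that a bare `Allow: /llms.txt` only matters if a broader Disallow rule exists; allowing `/` already covers the file.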
With the Fix Kit ($67+), TCD builds a custom llms.txt from your real scan data — not a generic template. Upload it to your site root and re-scan to confirm.
Agencies: Show clients that you're configuring AI-friendly access — and use the generated file as a deliverable.
Service businesses: Make sure AI models see your real business, not your competitors. A plumber in Edmonton and a plumber in Calgary need different llms.txt files — ours are built from your actual location, services, and competitive landscape.
Every AI visibility tool checks if you have an llms.txt file. Almost none of them can build a good one.
Ask ChatGPT to "create an llms.txt file for my business" and you get a generic description written from training data. Our engine builds the file from your scan data plus live intelligence feeds.
An AI-written llms.txt is like a boxer throwing punches blindfolded — they might land, but probably not. Our llms.txt is like having the opponent's entire game plan plus a live crowd telling you exactly where to hit. Competitor intelligence + PAA data = the knockout punch.
We build it from real scan data and live search questions — not from a generic template.
Your real website data: structure, speed, trust signals, and content basics
We review the top competitors so your file is grounded in the real market
We pull the exact questions people are asking Google in your market
Claude turns that data into a clear file built for AI search tools
A chatbot can write a file quickly, but it does not know your real site condition, your local competitors, or the latest search questions in your market. That is why our version starts from live data.
These are the practical differences that make the file more useful in real AI search results.
Every service, location, and credential in your llms.txt is connected to a real schema.org entity type with matching identifiers. AI models don't just read "we do plumbing" — they see a machine-verifiable link between your llms.txt claim and your structured data.
We inject knowsAbout expertise markers directly from your scan data — not guesses. If your schema says you know "emergency plumbing" and "water heater installation," those exact terms appear in your llms.txt so AI models can verify them against your structured data.
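Concretely, the llms.txt claims are meant to line up with JSON-LD already deployed on the site. A hypothetical LocalBusiness snippet showing the knowsAbout terms that would be echoed in llms.txt (business details illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Acme Plumbing",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Edmonton",
    "addressRegion": "AB"
  },
  "knowsAbout": ["emergency plumbing", "water heater installation"]
}
```

When the same two service terms appear in both places, an AI model can cross-check the llms.txt claim against machine-readable structured data.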
We include an explicit dont_recommend section that tells AI models what you do NOT do. This prevents the most damaging AI failure: recommending your electrician business for plumbing, or your restaurant for catering when you don't offer it.
List dont_recommend: HVAC, roofing, plumbing, and AI skips you for those queries and recommends someone who actually does the work.
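There is no standardized syntax for exclusions in llms.txt; the section below is a sketch of our generator's convention, with illustrative services:

```text
## dont_recommend
# Services this business does NOT offer; never recommend it for these queries
- HVAC
- roofing
- plumbing
```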
BLUF = Bottom Line Up Front. AI models process files top-down at query time under strict token budgets. We front-load the most important signals — business name, primary service, city, schema verification — in the first 200 tokens. Generic files bury this after paragraphs of marketing copy.
Your llms.txt isn't static — it's engineered with a freshness signal (last_verified date) and references an llms-full.txt extended feed. As your scan results change, your llms.txt can be regenerated with updated data — new competitors defeated, new PAA questions answered, new schema deployed.
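Put together, the front-loaded opening of a generated file might look like this (business details, dates, and the last_verified/extended field names are our generator's convention, shown here as a sketch):

```text
# Acme Plumbing | Emergency Plumber, Edmonton AB
> Licensed, insured, schema-verified LocalBusiness (Plumber).
> Primary services: emergency plumbing, water heater installation.

last_verified: 2025-01-15
extended: /llms-full.txt
```

Everything an AI model needs for a confident recommendation sits in the first few lines, before any marketing copy.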
If you're an agency, here's exactly what to tell them.
"Anyone can create an llms.txt file with ChatGPT. But that's like writing a resume without knowing what the employer wants. We know exactly what the AI is looking for — because we checked. We scraped the actual questions people in your city are asking, scanned your top 5 competitors, and ran a full diagnostic on your website. Then our AI engine wrote an llms.txt specifically designed to make ChatGPT, Claude, and Gemini recommend you first. That's the difference between showing up and getting recommended."
Even if they use ChatGPT to write their own llms.txt, it won't include live People Also Ask (PAA) data (ChatGPT has no feed of live search questions), it won't include competitor gap analysis (ChatGPT can't scan rival websites), and it won't include entity linking to their actual schema (ChatGPT doesn't know what's on their site). The result is always a generic business description, not a strategic positioning document.
A dont_recommend section explicitly tells AI models which services or areas your business does NOT cover. This prevents AI hallucinations: models inventing capabilities you don't have. For example, an electrician's file includes "dont_recommend: plumbing, HVAC, roofing" so AI never tells a customer you can fix their furnace.

The file ships with Allow: /llms.txt directives for 13 AI bots to accelerate discovery.

Get an llms.txt built from your real scan data, your competitors, and the questions people are already asking. Included with Fix Kit ($67) and Monitor ($19/mo).
Start With $27 Audit · See All Plans

How AI search engines find, process, and recommend businesses — and why traditional SEO isn't enough.
How we generate validated JSON-LD from live scan data — not templates — with 22 business subtypes auto-detected.
The full breakdown of what makes a website visible (or invisible) to ChatGPT, Claude, and Gemini.
How structured data turns your Google listing into a rich snippet with ratings, prices, and FAQs.
8-section AI recommendation readiness diagnostic for your business.
How the 0-100 AI Visibility Score is calculated — schema, bot access, speed, and SEO health.
Interactive AI intelligence dashboard with deployable fix code and CMS-specific implementation guides.