
What ConduitScore Checks

ConduitScore evaluates any website across 7 categories that determine how well AI agents — including ChatGPT, Perplexity, Claude, and Gemini — can access, understand, and cite your content. Here is exactly what each category checks, why it matters, and what common issues look like.

Crawler Access (15 pts)

What it checks

  • robots.txt presence and AI bot access rules (GPTBot, ClaudeBot, OAI-SearchBot, PerplexityBot)
  • Sitemap directive in robots.txt
  • sitemap.xml presence and accessibility
  • Blocked important pages or assets
  • Crawlable HTML availability

Why it matters for AI visibility

If robots.txt blocks AI bots, whether by naming them explicitly or through a blanket Disallow rule, they cannot index or cite your content, no matter how good it is. Crawler access is a prerequisite for any AI visibility.

Common Issue

GPTBot is blocked in robots.txt

Fix

Add a robots.txt rule that explicitly allows GPTBot: a `User-agent: GPTBot` line followed by `Allow: /`.
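A minimal sketch of a robots.txt that allows the AI crawlers named above and declares a sitemap (the domain is a placeholder):

```txt
# Allow OpenAI's crawlers
User-agent: GPTBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

# Allow Anthropic's crawler
User-agent: ClaudeBot
Allow: /

# Allow Perplexity's crawler
User-agent: PerplexityBot
Allow: /

# Point all crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```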

Structured Data (20 pts)

What it checks

  • Organization schema (JSON-LD)
  • WebSite schema, optionally with a potentialAction (SearchAction)
  • FAQPage schema for Q&A content
  • BreadcrumbList schema for navigation
  • Article/BlogPosting schema for content pages

Why it matters for AI visibility

JSON-LD schema gives AI systems a machine-readable understanding of your organization, content type, and site structure. Without it, AI agents have to guess what your pages are about — and they often guess wrong.

Common Issue

No Organization schema found

Fix

Add a JSON-LD Organization block to your `<head>` with your name, URL, logo, and description.
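A minimal sketch of such a block, with placeholder values for every organization field:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "description": "One sentence describing what the organization does."
}
</script>
```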

LLMs.txt (10 pts)

What it checks

  • File existence at /llms.txt
  • File readability and structure (sections, headers)
  • Number of URLs listed
  • Whether key pages (about, pricing, docs) are referenced

Why it matters for AI visibility

The /llms.txt file is the AI equivalent of sitemap.xml — it tells AI agents exactly which pages are important and what your site is for. Sites with a well-structured llms.txt are easier to summarize, cite, and recommend.

Common Issue

No /llms.txt file found

Fix

Create an llms.txt file served at the root of your site (in many frameworks, public/llms.txt) that lists your key pages with short descriptions. See llmstxt.org for the standard format.
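A short example following the llmstxt.org layout (all names, URLs, and descriptions are placeholders):

```txt
# Example Co

> One-line summary of what the site offers.

## Key pages

- [About](https://example.com/about): Who we are and what we do
- [Pricing](https://example.com/pricing): Plans and what each includes
- [Docs](https://example.com/docs): Product documentation and guides
```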

Content Structure (15 pts)

What it checks

  • Single H1 tag (exactly one)
  • Multiple H2 subheadings
  • H3 subsections
  • Introductory paragraph near page top
  • FAQ or Q&A section detection
  • Semantic HTML (article, main, section)

Why it matters for AI visibility

AI systems parse content the same way screen readers and structured data parsers do. Clear heading hierarchies, semantic HTML, and FAQ sections make it far easier for AI to extract and attribute answers from your content.

Common Issue

Page has no H2 subheadings

Fix

Break long content pages into clearly labeled sections using H2 tags. Each major topic should have its own heading.
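A sketch of the heading hierarchy and semantic elements this category looks for (all content is placeholder):

```html
<main>
  <article>
    <h1>Page Topic</h1>
    <p>Introductory paragraph summarizing the page.</p>

    <section>
      <h2>First Major Topic</h2>
      <h3>Subsection</h3>
      <p>Supporting detail.</p>
    </section>

    <section>
      <h2>FAQ</h2>
      <h3>Is this a common question?</h3>
      <p>A short, direct answer.</p>
    </section>
  </article>
</main>
```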

Technical Health (15 pts)

What it checks

  • Page load speed (target: under 2 seconds)
  • Canonical tag presence
  • noindex directive detection
  • Viewport meta tag
  • Meta description presence
  • Charset declaration

Why it matters for AI visibility

Slow pages, missing canonical tags, or accidental noindex directives can prevent AI crawlers from successfully reading your content. Technical health ensures your pages are actually reachable and indexable.

Common Issue

noindex meta tag found on a public-facing page

Fix

Remove `<meta name='robots' content='noindex'>` from pages you want AI agents to crawl and cite.
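For reference, a `<head>` that satisfies the tag checks in this category might include (the URL and description are placeholders):

```html
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <meta name="description" content="A concise summary of this page, 50-160 characters long.">
  <link rel="canonical" href="https://example.com/page">
  <!-- No robots noindex directive on public-facing pages -->
</head>
```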

Citation Signals (15 pts)

What it checks

  • External links to authoritative sources
  • About page link
  • Contact page link
  • Author attribution (meta or HTML class)
  • Organization identity signals
  • Legal/trust pages (privacy, terms)

Why it matters for AI visibility

AI agents are more likely to cite sources that demonstrate credibility, authorship, and institutional identity. Sites with clear author information, external links to authoritative sources, and trust pages are more likely to be chosen as citations.

Common Issue

No author attribution found

Fix

Add author information using a `<meta name='author'>` tag, or mark up author names with `itemprop='author'` in your HTML.
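Either approach can be sketched as follows (the name is a placeholder; note that microdata `itemprop` attributes belong inside an `itemscope` element):

```html
<!-- Option 1: meta tag in <head> -->
<meta name="author" content="Jane Doe">

<!-- Option 2: microdata byline inside an itemscope article -->
<article itemscope itemtype="https://schema.org/Article">
  <p>By <span itemprop="author">Jane Doe</span></p>
</article>
```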

Content Quality (10 pts)

What it checks

  • Word count (target: 1000+ words for key pages)
  • Title tag quality
  • Meta description length (target: 50+ characters)
  • Publish or updated date
  • Paragraph count and structure

Why it matters for AI visibility

Thin, undated, or poorly titled content is less likely to be cited by AI systems. Longer content with good structure, a clear publish date, and a quality title and description gives AI agents more confidence in using it as a source.

Common Issue

Meta description is too short (under 50 characters)

Fix

Write a meta description of 50–160 characters that accurately summarizes the page content. This helps AI agents understand what the page is about before reading it.
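Put together, the page-level signals this category checks might look like this in the `<head>` (all values are placeholders):

```html
<title>Descriptive Page Title | Example Co</title>
<meta name="description" content="A 50-160 character summary that accurately describes what this page covers.">
<meta property="article:published_time" content="2025-01-15">
<meta property="article:modified_time" content="2025-03-01">
```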

Run a Free Scan to See Your Results

Get your 0–100 AI visibility score across all 7 categories, plus prioritized copy-paste fixes. No sign-up required.

Scan Your Website Free →

See also: Methodology · Sample Reports