Glossary

Robots.txt

A plain-text file at a site's root that tells crawlers which pages they may access.

Definition

What this term means

A plain-text file served from the root of a site (at the path /robots.txt) that tells crawlers, including AI bots, which paths they may and may not fetch. Well-behaved crawlers request this file before crawling and follow its per-user-agent Allow and Disallow rules.
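As a sketch, a minimal robots.txt might look like the following; the domain and paths are illustrative:

```
# Rules for every crawler
User-agent: *
Disallow: /admin/
Allow: /

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by User-agent, and a crawler uses the most specific group that matches its name.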

Why it matters

The business impact

If AI crawlers cannot reach your pages, your content cannot surface in AI-generated answers. A correctly configured robots.txt lets those systems find and index your content, while a misconfigured one can silently block them.

Used in context

How you might use this term

We allow AI crawlers to access documentation while blocking staging pages.
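A policy like this can be checked with Python's standard-library urllib.robotparser. GPTBot is OpenAI's crawler user agent; the domain and paths below are illustrative:

```python
from urllib import robotparser

# Rules mirroring the policy in the example: the AI crawler may
# read documentation but not staging pages.
rules = """
User-agent: GPTBot
Allow: /docs/
Disallow: /staging/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Documentation is crawlable; staging is not.
print(parser.can_fetch("GPTBot", "https://example.com/docs/intro"))    # True
print(parser.can_fetch("GPTBot", "https://example.com/staging/home"))  # False
```

Running the same checks against your live file (via RobotFileParser.set_url and read) is a quick way to confirm a deployment behaves as intended.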

Ready to improve AI visibility?

Put This Knowledge Into Action

Understanding the language of AI visibility is the first step. See how your brand performs across AI systems with a free scan.