Glossary

Token

The basic unit of text that AI models process, roughly equivalent to three-quarters of a word in English.

Definition

What this term means

The fundamental unit of text that AI models process, roughly equivalent to three-quarters of a word in English. AI systems break all input text into tokens before processing it. Every word, punctuation mark, and space is tokenised. The number of tokens determines how much content fits within a model's context window and how much it costs to process.
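The three-quarters-of-a-word rule of thumb above can be sketched in a few lines. This is a rough heuristic for English prose only, not a real tokeniser; the function name and ratio are illustrative assumptions:

```python
def estimate_tokens(text: str, words_per_token: float = 0.75) -> int:
    """Rough token estimate: tokens ~= word count / 0.75.

    Real tokenisers (e.g. byte-pair encoding) split on subwords and
    punctuation, so actual counts vary by model and by text.
    """
    word_count = len(text.split())
    return round(word_count / words_per_token)

# Under this heuristic, a 3,000-word guide is roughly 4,000 tokens.
print(estimate_tokens(" ".join(["word"] * 3000)))  # 4000
```

For precise counts against a specific model, the model provider's own tokeniser should be used instead of a word-count heuristic.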

Why it matters

The business impact

Token limits affect how much of your content an AI system can consider and process in a single query. Content that is unnecessarily verbose uses more tokens without adding value, potentially causing key information to be truncated or excluded. Writing token-efficient content (concise, clear, and information-dense) increases the likelihood that AI systems will fully process and cite your most important claims.

Used in context

How you might use this term

During a content optimisation audit, a team discovered that their 3,000-word guide used 4,200 tokens, much of it spent on repetitive filler. After streamlining to 1,800 words (2,400 tokens) without losing any substantive information, the page's AI citation rate doubled.
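The savings in that scenario can be checked with simple arithmetic; the figures below are taken directly from the example above:

```python
before_words, before_tokens = 3000, 4200
after_words, after_tokens = 1800, 2400

# Tokens per word stays roughly constant (1.4 before, about 1.33 after),
# consistent with the three-quarters-of-a-word rule of thumb.
token_reduction = 1 - after_tokens / before_tokens
print(f"Token reduction: {token_reduction:.0%}")  # Token reduction: 43%
```

A 40-percent-plus cut in token count with no loss of substance is the kind of streamlining the audit describes.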