Glossary

Context Window

The maximum amount of text an AI model can consider at once when generating a response, measured in tokens.

Definition

What this term means

The maximum amount of text, measured in tokens, that an AI model can process in a single interaction. The context window determines how much information the model can consider when generating a response. Modern models such as GPT-4o and Claude support context windows of 128,000 tokens or more, but the snippets that retrieval-augmented generation (RAG) systems pass to the model are typically far shorter, making concise content crucial for citation.
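Because limits are measured in tokens rather than words, it helps to estimate how much of a window a page consumes. A minimal sketch, assuming the commonly cited heuristic of roughly four characters per English token (exact counts require the model's own tokenizer, such as OpenAI's tiktoken library); the 128,000-token default mirrors the figure above:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for typical English text.
    # A real tokenizer (e.g. tiktoken) gives exact counts per model.
    return max(1, len(text) // 4)

def fits_in_window(text: str, window: int = 128_000) -> bool:
    # True if the estimated token count fits in the given context window.
    return estimate_tokens(text) <= window
```

For example, a 5,000-word page (~25,000 characters) estimates to roughly 6,000 tokens, which fits easily in a 128,000-token window yet may still be far longer than the snippet a RAG system retrieves.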

Why it matters

The business impact

While context windows are growing larger, the practical implication for brand visibility is about efficiency: AI systems select and prioritise the most relevant content snippets within their context window. Pages that communicate key information clearly and concisely are more likely to be included in the model's working context, and therefore more likely to be cited in the response.

Used in context

How you might use this term

A brand with 5,000-word product pages found that its key differentiators were buried too deep to be reliably retrieved. By restructuring content to place critical information within the first 500 words and using clear heading hierarchies, the brand significantly improved its citation rate in AI responses.

Ready to improve AI visibility?

Put This Knowledge Into Action

Understanding the language of AI visibility is the first step. See how your brand performs across AI systems with a free scan.