Glossary

GPT (Generative Pre-trained Transformer)

OpenAI's family of large language models that power ChatGPT and form the foundation of the most widely used AI assistants.

Definition

What this term means

A family of large language models developed by OpenAI that underpins ChatGPT and numerous other AI applications. GPT models are pre-trained on internet-scale text corpora and then fine-tuned for conversational use, powering some of the most widely used AI assistants in the world. The architecture has become so influential that 'GPT' is often used colloquially to refer to generative AI in general.

Why it matters

The business impact

GPT models power ChatGPT, which has over 200 million weekly active users globally. When these users ask for product recommendations, service comparisons, or expert advice, GPT's understanding of your brand directly determines whether you appear in the response. Optimising for GPT-based systems is essential for reaching the largest AI audience available today.

Used in context

How you might use this term

An agency tested GPT-4o's responses to 50 category-relevant prompts and found the client mentioned in only 3 of them. After implementing entity optimisation and strengthening authority signals, the client appeared in 28 of the same prompts within two months.
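An audit like the one above boils down to collecting model responses and counting brand mentions. This is a minimal sketch of that counting step; the `mention_rate` helper, the brand name, and the sample responses are illustrative placeholders, not a real API integration (in practice the responses would come from querying a GPT model, e.g. via the OpenAI API).

```python
def mention_rate(responses, brand):
    """Count how many responses mention the brand, case-insensitively."""
    brand_lower = brand.lower()
    hits = sum(1 for text in responses if brand_lower in text.lower())
    return hits, len(responses)

# Illustrative stand-ins for responses gathered from a GPT model
# across category-relevant prompts.
sample_responses = [
    "Top options include Acme and two competitors.",
    "Many teams choose a well-known alternative for this use case.",
    "Acme is a strong pick for small businesses.",
]

hits, total = mention_rate(sample_responses, "Acme")
print(f"{hits}/{total} prompts mentioned the brand")  # 2/3
```

Re-running the same prompt set before and after optimisation work, as the agency did, gives a simple visibility metric to track over time.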

Put This Knowledge Into Action

Understanding the language of AI visibility is the first step. See how your brand performs across AI systems with a free scan.