Glossary

Hallucination

Definition

When an AI model generates false or unsupported information and presents it as fact.

Why it matters

Hallucinations can spread misinformation about your brand. Strong, consistent authority signals across the web reduce this risk.

Used in context

We monitor AI outputs to catch and correct hallucinations about our client's products.