Definition
What this term means
An HTTP response header that communicates the date and time a web page was last modified, expressed in HTTP-date format (for example, Last-Modified: Wed, 01 May 2024 09:30:00 GMT). When a crawler or AI system requests a page, the Last-Modified header tells it how recent the content is. This header works alongside other freshness signals to help AI systems assess whether content is current and relevant, or potentially outdated.
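To make the format concrete, here is a minimal sketch of reading a Last-Modified value with Python's standard library; the header string and reference date are illustrative, not taken from a real response.

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# Illustrative Last-Modified value in HTTP-date format.
last_modified = "Wed, 01 May 2024 09:30:00 GMT"

# Parse the header value into a timezone-aware datetime.
modified_at = parsedate_to_datetime(last_modified)

# Content age relative to a fixed reference time (in practice,
# you would use datetime.now(timezone.utc)).
reference = datetime(2024, 5, 15, 9, 30, tzinfo=timezone.utc)
age_days = (reference - modified_at).days
print(age_days)  # 14
```

This is the same calculation a crawler can perform to rank one source as fresher than another.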
Why it matters
The business impact
AI systems prefer fresh, well-maintained content, especially for topics where information changes frequently. Accurate Last-Modified headers reinforce freshness signals, making your content more likely to be selected over older competing sources. Conversely, stale or missing Last-Modified headers can cause AI systems to deprioritise your content in favour of more recently updated alternatives.
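Accurate Last-Modified values also power conditional requests: a crawler can resend the stored value as If-Modified-Since, and the server replies 304 Not Modified when nothing has changed, making revalidation cheap. A minimal sketch of that comparison, assuming both inputs are HTTP-date strings (the function name is illustrative):

```python
from email.utils import parsedate_to_datetime

def is_still_fresh(if_modified_since, last_modified):
    """Return True when a 304 Not Modified response is appropriate,
    i.e. the resource has not changed since the client's cached copy."""
    if not if_modified_since:
        return False  # No validator sent: serve the full response.
    try:
        cached = parsedate_to_datetime(if_modified_since)
        current = parsedate_to_datetime(last_modified)
    except (TypeError, ValueError):
        return False  # Unparseable date: fall back to a full response.
    return current <= cached

# Unchanged since the cached copy: a 304 is appropriate.
print(is_still_fresh("Wed, 01 May 2024 09:30:00 GMT",
                     "Wed, 01 May 2024 09:30:00 GMT"))  # True

# Updated since the cached copy: send the full, fresh response.
print(is_still_fresh("Wed, 01 May 2024 09:30:00 GMT",
                     "Thu, 02 May 2024 12:00:00 GMT"))  # False
```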
Used in context
How you might use this term
“A company was losing AI citations to competitors whose content was objectively less comprehensive but more recently updated. After implementing accurate Last-Modified headers and establishing a regular content refresh schedule, their citation rate recovered as AI systems recognised their content as both authoritative and current.”
Related terms
Explore connected concepts
Freshness Signals
The collection of indicators that tell search engines and AI systems how recently content was created or updated. Freshness signals include Last-Modified headers, sitemap lastmod dates, visible 'last updated' dates on pages, recent internal and external references, and the frequency of content changes detected by crawlers. Together, these signals help AI systems determine whether content is current and reliable.
Sitemap
An XML file that provides search engines and AI crawlers with a structured list of all important URLs on a website, along with metadata about each page, including when it was last modified, how frequently it changes, and its relative priority. Sitemaps serve as a roadmap that helps crawlers discover, prioritise, and efficiently index your content.
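For illustration, a single sitemap entry with a lastmod date might look like this (the URL and date are placeholders); when the lastmod value agrees with the page's Last-Modified header, the two freshness signals reinforce each other:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/pricing</loc>
    <!-- W3C Datetime format; keep consistent with the Last-Modified header -->
    <lastmod>2024-05-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```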
Crawl Budget
The total number of pages that search engine and AI crawlers will fetch from your website within a given time period. Crawl budget is determined by a combination of your site's perceived authority, server performance, URL structure, and content freshness signals. Crawlers allocate their budget based on these factors, spending more time on sites they consider valuable and efficient to crawl.