Free Noindex Drift Monitor
Bulk-check your URLs for accidental noindex directives. Paste a list of URLs or provide a sitemap to scan up to 500 pages at once. This tool detects noindex tags in HTML meta robots and X-Robots-Tag HTTP headers, helping you catch noindex drift before it costs you search traffic.
Check URLs for Noindex Drift
Up to 500 URLs. One URL per line.
What Is Noindex Drift?
Noindex drift is the gradual, unintentional spread of noindex directives across a website. Unlike a single accidental noindex on one page, drift affects multiple pages over time and often goes undetected for weeks or months. By the time you notice the traffic drop, dozens or hundreds of pages may have already been removed from search results.
Common causes include staging environment settings that leak into production during deployments, CMS plugins that add noindex to certain page types by default, server configurations that inject X-Robots-Tag headers globally, and template changes that accidentally include noindex in shared layouts.
Common Sources of Noindex Drift
- Staging environment leaks: A "Discourage search engines" checkbox in your CMS (like WordPress) that stays enabled after going live.
- CMS plugin misconfiguration: SEO plugins like Yoast or Rank Math that apply noindex to taxonomy pages, archives, or pagination by default.
- Server-level X-Robots-Tag: Nginx or Apache configurations that inject noindex headers for certain URL patterns or entire directories.
- Template inheritance: A shared layout or base template that inadvertently includes a noindex meta tag, affecting all pages that extend it.
- CDN or proxy rules: Cloudflare Workers, Vercel middleware, or reverse proxy rules that add X-Robots-Tag headers to responses.
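Because drift is defined by change over time, the simplest way to catch it is to diff two scans of the same site. A hypothetical sketch (the scan format is our assumption, not this tool's internal data model):

```python
def new_noindexed(previous_scan, current_scan):
    """URLs that flipped from indexable to noindexed between two scans.

    Each scan maps URL -> True if the page carried a noindex directive.
    A result set that keeps growing across successive scans is the
    signature of noindex drift.
    """
    return sorted(
        url for url, blocked in current_scan.items()
        # Only count URLs that were scanned before and were indexable then;
        # brand-new pages (absent from the previous scan) are not drift.
        if blocked and previous_scan.get(url) is False
    )
```

Running this weekly against the same URL list turns a silent, gradual problem into an explicit change report.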
How to Use This Tool
- Choose your input method — Paste a list of URLs directly or enter a sitemap URL to have the tool extract URLs automatically.
- Provide URLs — For the paste method, enter one URL per line (up to 500). For the sitemap method, provide the URL of your XML sitemap or sitemap index.
- Click "Check for Noindex" — The tool fetches each URL in parallel batches and checks for noindex directives in both meta tags and HTTP headers.
- Review the summary — See total URLs checked, how many are noindexed, how many are indexable, and how many had errors (timeout or unreachable).
- Filter results — Use the filter buttons to show only noindexed pages or pages with errors. Noindexed rows are highlighted for easy identification.
- Take action — For each noindexed URL, determine if the directive is intentional. If not, remove the noindex tag or header and verify the fix.
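The fetch-and-check flow in the steps above can be sketched with Python's standard library. Here `fetch` is a stand-in for a real HTTP request, and the batch size of 10 and 500-URL cap mirror the limits this tool describes:

```python
from concurrent.futures import ThreadPoolExecutor

def check_urls(urls, fetch, batch_size=10, limit=500):
    """Run `fetch` over up to `limit` URLs, `batch_size` at a time."""
    def safe(url):
        try:
            # `fetch` is expected to return e.g. {"status": 200, "noindex": False}
            return fetch(url)
        except Exception as exc:  # timeout, DNS failure, connection refused
            return {"status": None, "noindex": None, "error": str(exc)}

    with ThreadPoolExecutor(max_workers=batch_size) as pool:
        return dict(zip(urls[:limit], pool.map(safe, urls[:limit])))
```

Errors are captured per URL rather than raised, so one unreachable page never aborts the whole scan — matching the "errors" bucket in the summary.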
What This Tool Checks
- Meta robots tags: Scans the HTML for <meta name="robots" content="noindex"> and bot-specific variants like <meta name="googlebot" content="noindex">.
- X-Robots-Tag HTTP header: Checks the server response headers for X-Robots-Tag: noindex or X-Robots-Tag: none.
- HTTP status codes: Reports the response status code for each URL so you can identify 404s, 500s, and redirects alongside noindex issues.
- Combined detection: Identifies pages with noindex from multiple sources (both meta tag and HTTP header simultaneously).
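A minimal sketch of the two checks in Python — the regexes and function name are illustrative, not this tool's actual implementation:

```python
import re

# Matches <meta name="robots" ...> and bot-specific variants such as
# <meta name="googlebot" ...>.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\'](?:robots|googlebot|bingbot)["\'][^>]*>',
    re.IGNORECASE,
)

def noindex_sources(html, headers):
    """Return which sources (if any) declare noindex for a page."""
    sources = []
    for tag in META_ROBOTS.findall(html):
        content = re.search(r'content=["\']([^"\']*)["\']', tag, re.IGNORECASE)
        # "none" is shorthand for "noindex, nofollow".
        if content and re.search(r'\b(?:noindex|none)\b', content.group(1), re.IGNORECASE):
            sources.append("meta")
            break
    # HTTP header names are case-insensitive.
    xrt = next((v for k, v in headers.items() if k.lower() == "x-robots-tag"), "")
    if re.search(r'\b(?:noindex|none)\b', xrt, re.IGNORECASE):
        sources.append("header")
    return sources
```

A page blocked in both places returns ["meta", "header"], which corresponds to the combined-detection case above.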
Frequently Asked Questions
What is noindex drift?
Noindex drift is the gradual, often accidental, spread of noindex directives across a website. It typically happens when staging environment settings leak into production, CMS plugins add noindex tags globally, or deployment scripts misconfigure robots meta tags. Over time, important pages silently disappear from search results without anyone noticing.
How does this tool detect noindex drift?
This tool fetches each URL you provide (via paste or sitemap) and checks for noindex directives in two places: HTML meta robots tags and X-Robots-Tag HTTP response headers. It reports which pages are noindexed, the source of the directive, and highlights potential issues like important pages being accidentally blocked.
What is the difference between meta robots noindex and X-Robots-Tag?
Meta robots noindex is an HTML tag placed in the page <head> section. X-Robots-Tag is an HTTP response header set by the server. Both achieve the same result — preventing indexing — but X-Robots-Tag is harder to spot since it does not appear in the page source. This tool checks both sources to give you complete coverage.
Why would a page in my sitemap have noindex?
A page in your sitemap with noindex is a conflicting signal. The sitemap tells search engines the page should be indexed, while the noindex directive says the opposite. This usually indicates an accidental noindex — perhaps from a CMS setting, a plugin, or a server configuration. You should either remove the noindex or remove the page from the sitemap to resolve the conflict.
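In code, this conflict check reduces to a set intersection. A sketch with hypothetical names:

```python
def sitemap_conflicts(sitemap_urls, noindexed_urls):
    """URLs that a sitemap advertises but a noindex directive blocks."""
    return sorted(set(sitemap_urls) & set(noindexed_urls))
```

Every URL this returns needs a decision: remove the noindex directive, or remove the URL from the sitemap.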
How many URLs can I check at once?
You can check up to 500 URLs at once, either by pasting them directly or by providing a sitemap URL. For sitemap index files, the tool processes up to 3 child sitemaps. URLs are checked in parallel batches of 10 for speed, with a 15-second timeout per request.
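Extracting URLs from a sitemap (or from the first few children of a sitemap index) can be sketched with Python's standard library; the three-child cap mirrors the limit described above, and `fetch_child` is an assumed callback for retrieving child sitemaps:

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_text, fetch_child=None, max_children=3):
    """Return page URLs from a sitemap, or from up to `max_children`
    child sitemaps when given a sitemap index.

    `fetch_child` takes a child-sitemap URL and returns its XML text;
    it is only needed for index files.
    """
    root = ET.fromstring(xml_text)
    if root.tag.endswith("sitemapindex"):
        urls = []
        for loc in root.findall("sm:sitemap/sm:loc", NS)[:max_children]:
            urls.extend(parse_sitemap(fetch_child(loc.text.strip())))
        return urls
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", NS)]
```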
Track Your Brand Across Google & AI
QuickSEO connects your Google Search Console data with AI visibility tracking across ChatGPT, Claude, and Gemini — all in one dashboard.
Try QuickSEO →