fingerprint by Scraping Central

Detect headless browser

Is your scraper leaking automation?

Every classic headless browser tell, checked in one shot: webdriver, chrome.runtime, plugins, permissions, headless UA, GPU.


The headless browser detection checklist

  1. navigator.webdriver === false, must be deleted, not just set to undefined.
  2. window.chrome.runtime exists, required if your UA claims Chrome.
  3. Notification.permission === 'default', not 'denied' on Chrome UA.
  4. navigator.plugins.length > 0, at least PDF Viewer should be present on desktop Chrome.
  5. navigator.languages non-empty, and matching the UA's expected locale.
  6. WebGL renderer is hardware, not SwiftShader, llvmpipe, or Mesa.
  7. UA does not contain HeadlessChrome/PhantomJS, override at launch.
  8. Screen size is plausible, not 800x600 (Puppeteer default) or 1024x768.
  9. Timezone matches IP geolocation, set TZ env or use Playwright's timezoneId.
  10. Mouse and keyboard events have human entropy, bezier curves, varying velocities, dwell times. Out of scope of this page, but worth knowing.
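The scriptable items on this checklist can be sketched as one audit function. The signal names and shapes below are assumptions for illustration; a real page would collect these values from navigator, screen, and the WebGL context itself.

```javascript
// Software renderers that indicate no real GPU (check #6).
const SOFTWARE_RENDERERS = ['SwiftShader', 'llvmpipe', 'Mesa'];

// Takes a plain object of collected signals, returns the list of leaks.
function auditFingerprint(s) {
  const leaks = [];
  if (s.webdriver !== false) leaks.push('navigator.webdriver');                  // #1
  if (!s.chromeRuntime && s.userAgent.includes('Chrome')) leaks.push('chrome.runtime'); // #2
  if (s.notificationPermission === 'denied') leaks.push('Notification.permission');     // #3
  if (s.pluginCount === 0) leaks.push('navigator.plugins');                      // #4
  if (!s.languages || s.languages.length === 0) leaks.push('navigator.languages'); // #5
  if (SOFTWARE_RENDERERS.some(r => s.webglRenderer.includes(r))) leaks.push('WebGL renderer'); // #6
  if (/HeadlessChrome|PhantomJS/.test(s.userAgent)) leaks.push('user agent');    // #7
  if (s.screen.width === 800 && s.screen.height === 600) leaks.push('screen size'); // #8
  if (s.timezone !== s.ipTimezone) leaks.push('timezone');                       // #9
  return leaks;
}
```

Feed it a default headless profile and a patched one: the first should light up nearly every check, the second should return an empty list.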

Recommended stack

For most scrapers: Playwright with playwright-stealth on a real-GPU host, behind residential proxies whose geolocation matches the timezone you set. Verify with this page before every deploy.
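A minimal sketch of the context setup this implies. `timezoneId`, `locale`, and `viewport` are real Playwright context options; the concrete values assume a hypothetical Berlin residential proxy and must be aligned with your own proxy's geolocation.

```javascript
// Context options aligned with an assumed Europe/Berlin proxy exit.
const contextOptions = {
  timezoneId: 'Europe/Berlin',            // must match the proxy IP's timezone (check #9)
  locale: 'de-DE',                        // keeps navigator.languages consistent (check #5)
  viewport: { width: 1366, height: 768 }, // plausible desktop size, not 800x600 (check #8)
};

// Usage (requires `npm i playwright` and a real-GPU host):
//   const browser = await chromium.launch({ headless: false });
//   const context = await browser.newContext(contextOptions);
```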

Frequently asked questions

How do websites detect a headless browser?

By looking for the differences between headless and headed Chrome. The classic tells: navigator.webdriver === true, window.chrome.runtime missing on a Chrome UA, Notification.permission stuck on 'denied' while the Permissions API reports 'prompt', navigator.plugins.length === 0, empty navigator.languages, missing media codecs, and unusual WebGL renderers. Anti-bot scripts test all of these and more.

What is navigator.webdriver?

A boolean property set to true by any browser launched under the WebDriver protocol (Selenium, Playwright, Puppeteer with default settings). It's specified by the W3C, easy to read, and the single most-checked bot signal. Modern stealth plugins delete it from the navigator object, but anti-bot scripts also check whether it can be redefined; if the property descriptor looks tampered with, that's its own giveaway.
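The descriptor check can be sketched as a detector-side function. In real Chrome, `webdriver` is a getter on `Navigator.prototype`, so it is never an own property of the navigator instance; stealth patches that redefine it on the instance leave an own-property descriptor behind, and that descriptor is itself a tell.

```javascript
// True if `webdriver` appears as an own property of the navigator
// instance, which real Chrome never produces.
function webdriverLooksTampered(nav) {
  return Object.getOwnPropertyDescriptor(nav, 'webdriver') !== undefined;
}
```

The same shape of check works with plain objects, which is how you can unit-test a stealth patch off-browser: put the getter on a prototype to simulate real Chrome, then redefine it on the instance to simulate a clumsy patch.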

How do I make Puppeteer / Playwright look like real Chrome?

Install a stealth plugin: 'puppeteer-extra-plugin-stealth' for Puppeteer, 'playwright-stealth' or undetected-playwright for Playwright. They patch roughly twenty known leaks (webdriver, chrome.runtime, plugins, languages, permissions, WebGL noise, etc.). Then verify with this tool; if any signal still shows as a leak, patch it manually before deploying.

Why is Notification.permission a bot signal?

Real Chrome reports Notification.permission === 'default' when the user hasn't decided. Headless Chrome historically reports 'denied' because it has no UI to ask. The mismatch (Chrome UA + 'denied' permission) was documented around 2017 and remains one of the cheapest signals to check.
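That classic check reduces to a two-value comparison. In the browser the inputs come from `Notification.permission` and `navigator.permissions.query({ name: 'notifications' })`; here they are plain arguments so the logic is testable anywhere.

```javascript
// Real Chrome, undecided user: 'default' + 'prompt'.
// Old headless Chrome: 'denied' while the Permissions API still says 'prompt'.
function notificationLeak(notificationPermission, permissionsApiState) {
  return notificationPermission === 'denied' && permissionsApiState === 'prompt';
}
```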

Does using 'new headless mode' (--headless=new) help?

Somewhat. Chrome's --headless=new uses the same code paths as headed Chrome, so navigator.plugins, navigator.mimeTypes, and Notification.permission behave correctly. But navigator.webdriver is still true under the WebDriver protocol, and the UA still says 'HeadlessChrome' unless you override it. It helps; it doesn't solve the problem.
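The UA tell that survives --headless=new is a plain substring check. The token list below is an assumption covering the tells named on this page.

```javascript
const HEADLESS_UA_TOKENS = ['HeadlessChrome', 'PhantomJS'];

// True if the user-agent string still advertises a headless engine.
function uaLeaksHeadless(ua) {
  return HEADLESS_UA_TOKENS.some(t => ua.includes(t));
}
```

Override the UA at launch, e.g. via Chrome's `--user-agent=` flag or Playwright's `userAgent` context option, then re-run the check against the value the page actually sees.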

Can a fully patched headless browser ever be undetectable?

Against the easy signals: yes. Against advanced detectors (DataDome, Cloudflare Bot Management, Akamai Bot Manager) that fingerprint the TLS handshake (JA3), mouse-movement entropy, and inter-keystroke timing: not without significant effort. At some point it's cheaper to use a managed scraping API or a real-browser proxy service.
