Anti-Scraping Challenge - Case Page Blocked
Summary
BAILII England & Wales has implemented Anubis anti-bot protection on case page EWHC/Comm/2026/765. The protection uses a Proof-of-Work scheme inspired by Hashcash to deter aggressive AI website scraping. Users must enable JavaScript and disable plugins like JShelter to access the protected content. The measure aims to reduce server downtime caused by mass scraping while legitimate users experience minimal additional load.
What changed
This is an administrative notice from BAILII regarding the implementation of Anubis, a Proof-of-Work-based anti-scraping protection system, on the England & Wales High Court case page EWHC/Comm/2026/765. The protection is designed to make mass scraping more computationally expensive while adding negligible load for individual legitimate users.
For users attempting to access case documents: the primary requirement is enabling modern JavaScript support and disabling anti-tracking browser extensions. There are no compliance obligations, regulatory requirements, or penalties associated with this page—it is a technical access control mechanism, not a regulatory document. The underlying case content remains accessible to human users who complete the challenge.
What to do next
- Enable JavaScript in your browser
- Disable JShelter or similar anti-tracking extensions for this domain
- Wait for the challenge to complete before accessing the content
Archived snapshot
Apr 15, 2026: GovPing captured this document from the original source. If the source has since changed or been removed, this is the text as it existed at that time.
Making sure you're not a bot!
You are seeing this because the administrator of this website has set up Anubis to protect the server against the scourge of AI companies aggressively scraping websites. This can and does cause downtime for the websites, which makes their resources inaccessible for everyone.
Anubis is a compromise. It uses a Proof-of-Work scheme in the vein of Hashcash, a proposed proof-of-work system for reducing email spam. The idea is that at individual scale the additional load is negligible, but at mass-scraper scale it adds up and makes scraping much more expensive.
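The page does not specify Anubis's actual challenge format, but the cost asymmetry behind a Hashcash-style scheme can be sketched as follows. This is a minimal illustration, assuming a hypothetical challenge string and a difficulty expressed as a count of leading zero hex digits in a SHA-256 digest; the function names and encoding are illustrative, not Anubis's real protocol.

```python
import hashlib

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Client side: brute-force a nonce so that SHA-256(challenge + nonce)
    begins with `difficulty` zero hex digits. Expected cost grows
    exponentially with difficulty (roughly 16**difficulty attempts)."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: a single hash suffices to check the client's work."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)
```

The asymmetry is the point: one human visitor pays a fraction of a second of CPU once, while a scraper fetching millions of pages pays that cost millions of times, and the server verifies each solution with a single hash.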
Ultimately, this is a placeholder solution so that more time can be spent on fingerprinting and identifying headless browsers (e.g., via how they render fonts), so that the proof-of-work challenge page doesn't need to be presented to users who are much more likely to be legitimate.
Please note that Anubis requires the use of modern JavaScript features that plugins like JShelter will disable. Please disable JShelter or other such plugins for this domain.
About this page
Source document text, dates, docket IDs, and authority are extracted directly from BAILII.
The summary, classification, recommended actions, deadlines, and penalty information are AI-generated from the original text and may contain errors. Always verify against the source document.