I just started using this myself, seems pretty great so far!
Clearly doesn’t stop all AI crawlers, but a significant chunk of them.
It’s a clever solution, but I did see one recently that IMO was more elegant for noscript users. I can’t remember the name, but it creates a dummy link that human users won’t touch and web crawlers will naturally follow, and then generates an infinitely deep tree of super basic HTML to force bots into endlessly trawling a cheap-to-serve portion of your webserver instead of something heavier. It might have even integrated with fail2ban to pick out obvious bots and keep them off your network for good.
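The core trick is tiny. A rough sketch of the idea (my own toy version in Python, not whatever that project actually does): every request returns a cheap page of random links one level deeper, so a crawler that blindly follows links never runs out of pages.

```python
# Minimal link-maze tarpit sketch: every path serves a trivial page whose
# links point one level deeper into a tree that never ends.
import random
import string
from http.server import BaseHTTPRequestHandler, HTTPServer

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A handful of random child links under the current path.
        links = "".join(
            f'<a href="{self.path.rstrip("/")}/{self._slug()}">more</a><br>'
            for _ in range(5)
        )
        body = f"<html><body><p>Nothing to see here.</p>{links}</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body.encode())

    @staticmethod
    def _slug(length: int = 8) -> str:
        return "".join(random.choices(string.ascii_lowercase, k=length))

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TarpitHandler).serve_forever()
```

Serving that costs next to nothing compared to the dynamic pages the bots were actually after.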
If you remember the project I would be interested to see it!
But I’ve seen some AI-poisoning sinkholes before too, which is a novel concept as well. I haven’t heard of any real-world experiences with them yet.
I’m assuming they’re thinking about this:
A pseudonymous coder has created and released an open source “tar pit” to indefinitely trap AI training web crawlers in an infinitely, randomly-generating series of pages to waste their time and computing power. The program, called Nepenthes after the genus of carnivorous pitcher plants which trap and consume their prey, can be deployed by webpage owners to protect their own content from being scraped or can be deployed “offensively” as a honeypot trap to waste AI companies’ resources.
Which was posted here a while back
Maybe this is it -
generates an infinitely deep tree
Wouldn’t the bot simply limit the depth of its seek?
That would be reasonable. The people running these things aren’t reasonable. They ignore every established mechanism to communicate a lack of consent to their activity because they don’t respect others’ agency and want everything.
Why SHA-256? Literally every processor has a crypto accelerator and will pass the challenge easily. And datacenter servers have beefy CPUs. This is only effective against no-JS scrapers.
It requires a bunch of browser features that non-user browsers don’t have, and the proof-of-work part is like the least relevant piece of this; it only gets invoked once a week or so to generate a unique cookie.
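For reference, this is roughly all the proof-of-work step amounts to (the challenge format and difficulty here are invented for illustration, not Anubis’s actual protocol): grind nonces until the SHA-256 hash falls under a target.

```python
# Toy proof-of-work solver: search for a nonce whose hash clears a threshold.
import hashlib

def solve(challenge: str, difficulty_bits: int = 16) -> int:
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# On average ~2**16 hashes, a fraction of a second even on a phone, and the
# resulting cookie gets reused, so a normal visitor barely notices.
print(solve("example-challenge"))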
I sometimes have the feeling that as soon as some cryptocurrency-related feature is mentioned, people shut off part of their brain. Either because they hate cryptocurrencies, or because cryptocurrency scammers have trained them to only look at some technical implementation details and fail to see the larger picture that they are being scammed.
So if you try to access a website using this technology via terminal, what happens? The connection fails?
If your browser doesn’t have a Mozilla user agent (i.e. like Chrome or Firefox), it will pass directly. Most AI crawlers use these user agents to pretend to be human users.
What I’m thinking about is more that in Linux, it’s common to access URLs directly from the terminal for various purposes, instead of using a browser.
If you’re talking about something like curl, that also uses its own user agent unless asked to impersonate some other UA. If not, then maybe I can’t help.
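A quick Python sketch of the difference (placeholder URL, and whether the challenge actually fires depends on the deployment): the default library User-Agent isn’t Mozilla-flavoured and should pass straight through, while impersonating a browser UA is what opts you into the checks.

```python
import urllib.request

url = "https://example.org/"  # placeholder, not a real Anubis deployment

# Default User-Agent is "Python-urllib/3.x", i.e. not Mozilla, so per the
# comment above this should be let through directly.
plain = urllib.request.urlopen(url)
print(plain.status)

# Explicitly pretending to be a browser is what the crawlers do, and what
# puts a client in the group that gets challenged.
browser_like = urllib.request.Request(
    url,
    headers={"User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:125.0) Gecko/20100101 Firefox/125.0"},
)
print(urllib.request.urlopen(browser_like).status)
```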
I think the maze approach is better; this seems like it hurts valid users of the web more than it would hurt a company.
For those not aware, Nepenthes is an example of the above-mentioned approach!
This looks like it can actually fuck up some models, but the unnecessary CPU load it will generate means most websites won’t use it, unfortunately.
I did not find any instructions on the source page on how to actually deploy this. That would be a nice touch imho.
There are some detailed instructions on the docs site, tho I agree it’d be nice to have them in the readme, too.
Sounds like the dev was not expecting this much interest in the project out of nowhere, so there will def be gaps.
Or even a quick link to the relevant portion of the docs at least would be cool
Meaning it wastes time and power such that it gets expensive on a large scale? Or does it mine crypto?
Yes, Anubis uses proof of work, like some cryptocurrencies do as well, to slow down/mitigate mass-scale crawling by making the crawlers do expensive computation.
https://lemmy.world/post/27101209 has a great article attached to it about this.
–
Edit: Just to be clear, this doesn’t mine any cryptos, just uses the same idea for slowing down the requests.
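To be concrete about the “expensive computation” part: the client has to search for a nonce whose SHA-256 hash clears a threshold, while the server can check a submitted answer with a single hash. A toy sketch of the verifying side (the format is invented, not Anubis’s real cookie scheme):

```python
# Checking a proof-of-work answer is one hash; producing it took thousands.
import hashlib

def verify(challenge: str, nonce: int, difficulty_bits: int = 16) -> bool:
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

That asymmetry is the whole point: negligible cost for the server and for a real visitor, but it adds up fast for someone hammering out millions of fresh sessions.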
And yet, the same people here lauding this for intentionally burning energy will turn around and spew vitriol at cryptocurrencies, which are reviled for doing exactly the same thing.
Proof of work contributes to global warming. The only functional, IRL, difference between this and crypto mining is that this doesn’t generate digital currency.
There are a very few PoW systems that do good, like BOINC, which awards points for work done; the work is science: protein analysis, SETI searches, that sort of thing. The work itself is valuable and needs doing; they found a way to make the PoW constructive. But just causing a visitor to use more electricity to “stick it” to crawlers is not ethically better than crypto mining.
Just be aware of the hypocrisy.
the functional difference is that this does it once. you could just as well accuse git of being a major contributor to global warming.
hash algorithms are useful. running billions of them to make monopoly money is not.
Which part of git performs proof-of-work? Specifically, intentionally inefficient algorithms whose output is thrown away?
the hashing part? it’s the same algo as here.
That’s not proof of work, though.
git is performing hashes to generate identifiers for versions of files so it can tell when they changed. It’s like moving rocks to build a house.
Proof of work is moving rocks from one pile to another and back again, for the only purpose of taking up your time all day.
okay, git using the same algorithm may have been a bad example. let’s go with video games then. the energy usage for the fraction of a second it takes for the anubis challenge-response dance to complete, even on phones, is literally nothing compared to playing minecraft for a minute.
if you’re mining, you do billions of cycles of sha256 calculations a second for hours every day. anubis does maybe 1000, once, if you’re unlucky. the method of “verification” is the wrong thing to be upset at, especially since it can be changed
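putting those rough numbers side by side (these are just the figures from above, not measurements):

```python
anubis_hashes_per_visitor = 1_000            # "maybe 1000, once"
miner_hashes_per_second = 1_000_000_000      # "billions ... a second"

# one challenge is about a microsecond's worth of mining
print(anubis_hashes_per_visitor / miner_hashes_per_second)  # 1e-06
```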
This isn’t hypocrisy. The git repo said this was “a bit like a nuclear response”, and like any nuclear response, I believe they expect everyone to suffer.
Not hypocrisy by the author, but by every reader who cheers this while hating on cryptocurrency.
IME most of these people can’t tell the difference between a cryptocurrency, a blockchain, and a public ledger, but have very strong opinions about them anyway.
This is a stopgap while we try to find a new way to stop the DDOS happening right now. It might even be adapted to do useful work, if need be.
Hook into BOINC, or something? That’s an idea.
Sucks for people who have scripts disabled, or are using browsers without JS support, though.
It does, and I’m sure everyone will welcome a solution that lets them open things back up for those users without the abusers crippling them. It’s a matter of finding one.
Found the FF14 fan lol
The release names are hilarious
What’s the ffxiv reference here?
Anubis is from Egyptian mythology.
The names of release versions are famous FFXIV Garleans
It’s a rather brilliant idea really, but when you consider the environmental implications of forcing web requests to perform proof of work to function, this effectively burns more coal for every site that implements it.
You have a point here.
But when you consider the current world’s web traffic, this isn’t actually the case today. For example, the GNOME project, which was forced to start using this on their GitLab, found that 97% of their traffic could not complete this PoW calculation.
I.e. they now require only a fraction of the computational cost to serve their GitLab, which saves a lot of resources, coal, and most importantly, the time of hundreds of real humans.
Hopefully in the future we can move back to proper netiquette and just a plain old robots.txt file!
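For reference, the netiquette version is only a few lines. GPTBot and CCBot are real, documented crawler tokens, though plenty of scrapers simply ignore robots.txt entirely, which is how we ended up here:

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```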
I don’t think AI companies care, and I wholeheartedly support any and all FOSS projects using PoW when serving their websites. I’d rather have that than have them go down
Upvote for the name and tag line alone!
Giant middle finger from me – and probably everyone else who uses NoScript – for trying to enshittify what’s left of the good parts of the web.
Seriously, FUCK THAT.
They’re working on no-js support too, but this just had to be put out without it due to the amount of AI crawler bots causing denial of service to normal users.
You should fuck capitalism and corporations instead because they are the reason we can’t have nice things. They took the web from us
Nice. Crypto miners disguised as anti-AI.
what about this is crypto mining?
Anubis is provided to the public for free in order to help advance the common good. In return, we ask (but not demand, these are words on the internet, not word of law) that you not remove the Anubis character from your deployment.
If you want to run an unbranded or white-label version of Anubis, please contact Xe to arrange a contract.

This is icky to me. Cool idea, but this is weird.