1114
top 50 comments
sorted by: hot top controversial new old
[-] bizza@lemmy.zip 14 points 1 day ago

I use Anubis on my personal website, not because I think anything I’ve written is important enough that companies would want to scrape it, but as a “fuck you” to those companies regardless

That the bots are learning to get around it is disheartening, Anubis was a pain to setup and get running

[-] nialv7@lemmy.world 32 points 1 day ago* (last edited 1 day ago)

We had a trust based system for so long. No one is forced to honor robots.txt, but most big players did. Almost restores my faith in humanity a little bit. And then AI companies came and destroyed everything. This is why we can't have nice things.

[-] Shapillon@lemmy.world 17 points 1 day ago

Big players are the ones behind most AIs though.

[-] prole 84 points 1 day ago

Tech bros just actively making the internet worse for everyone.

[-] ShaggySnacks@lemmy.myserv.one 64 points 1 day ago

Tech bros just actively making ~~the internet~~ society worse for everyone.

FTFY.

[-] iopq@lemmy.world 1 points 1 day ago

I mean, tech bros of the past invented the internet

[-] notarobot@lemmy.zip 17 points 18 hours ago

Those are not the tech bros. The tech bros are the ones who move fast and break things. The internet was built by engineers and developers

[-] prole 7 points 17 hours ago* (last edited 17 hours ago)

Nah, that was DARPA

[-] CeeBee_Eh@lemmy.world 3 points 17 hours ago

Those were tech nerds. "Tech bros" are jabronis who see the tech sector as a way to increase the value of the money their daddies gave them.

reminder to donate to codeberg and forgejo :)

[-] thatonecoder@lemmy.ca 43 points 1 day ago

I know this is the most ridiculous idea, but we need to pack our bags and make a new internet protocol, to separate us from the rest, at least for a while. Either way, most “modern” internet things (looking at you, JavaScript) are not modern at all, and starting over might help more than any of us could imagine.

[-] Pro@programming.dev 43 points 1 day ago* (last edited 1 day ago)

Like Gemini?

From official Website:

Gemini is a new internet technology supporting an electronic library of interconnected text documents. That's not a new idea, but it's not old fashioned either. It's timeless, and deserves tools which treat it as a first class concept, not a vestigial corner case. Gemini isn't about innovation or disruption, it's about providing some respite for those who feel the internet has been disrupted enough already. We're not out to change the world or destroy other technologies. We are out to build a lightweight online space where documents are just documents, in the interests of every reader's privacy, attention and bandwidth.

[-] 0x0@lemmy.zip 4 points 1 day ago

It's not the most well thought-out, from a technical perspective, but it's pretty damn cool. Gemini pods are a freakin' rabbi hole.

[-] cwista@lemmy.world 9 points 1 day ago

Won't the bots just adapt and move there too?

[-] thatonecoder@lemmy.ca 12 points 1 day ago

Yep! That was exactly the protocol on my mind. One thing, though, is that the Fediverse would need to be ported to Gemini, or at least for a new protocol to be created for Gemini.

[-] echodot@feddit.uk 10 points 1 day ago

If it becomes popular enough that it's used by a lot of people then the bots will move over there too.

They are after data, so they will go where it is.

One of the reasons that all of the bots are suddenly interested in this site is that everyone's moving away from GitHub, suddenly there's lots of appealing tasty data for them to gobble up.

This is how you get bots, Lana

load more comments (1 replies)
load more comments (1 replies)
load more comments (7 replies)
[-] mfed1122@discuss.tchncs.de 15 points 1 day ago* (last edited 1 day ago)

Okay what about...what about uhhh... Static site builders that render the whole page out as an image map, making it visible for humans but useless for crawlers 🤔🤔🤔

[-] iopq@lemmy.world 4 points 1 day ago

AI these days reads text from images better than humans can

[-] lapping6596@lemmy.world 25 points 1 day ago

Accessibility gets throw out the window?

[-] mfed1122@discuss.tchncs.de 13 points 1 day ago

I wasn't being totally serious, but also, I do think that while accessibility concerns come from a good place, there is some practical limitation that must be accepted when building fringe and counter-cultural things. Like, my hidden rebel base can't have a wheelchair accessible ramp at the entrance, because then my base isn't hidden anymore. It sucks that some solutions can't work for everyone, but if we just throw them out because it won't work for 5% of people, we end up with nothing. I'd rather have a solution that works for 95% of people than no solution at all. I'm not saying that people who use screen readers are second-class citizens. If crawlers were vision-based then I might suggest matching text to background colors so that only screen readers work to understand the site. Because something that works for 5% of people is also better than no solution at all. We need to tolerate having imperfect first attempts and understand that more sophisticated infrastructure comes later.

But yes my image map idea is pretty much a joke nonetheless

[-] deaf_fish@midwest.social 1 points 17 hours ago

Don't worry, we were never going to make anything 100% accessible anyway, that would be impossible.

[-] echodot@feddit.uk 7 points 1 day ago

AI is pretty good at OCR now. I think that would just make it worse for humans while making very little difference to the AI.

[-] mfed1122@discuss.tchncs.de 5 points 1 day ago

The crawlers are likely not AI though, but yes OCR could be done effectively without AI anyways. This idea ultimately boils down to the same hope Anubis had of making the processing costs large enough to not be worth it.

[-] nymnympseudonym@lemmy.world 6 points 1 day ago

OCR could be done effectively without AI

OCR has been neural nets even before convolutional networks emerged in the 2010s

[-] mfed1122@discuss.tchncs.de 3 points 1 day ago

Yeah you're right, I was using AI in the colloquial modern sense. My mistake. It actually drives me nuts when people do that. I should have said "without compute-heavy AI".

[-] nymnympseudonym@lemmy.world 5 points 1 day ago

My mistake

hold on I am still somewhat new to Fedi & not fully used to people being polite

load more comments (2 replies)
[-] SufferingSteve@feddit.nu 304 points 2 days ago* (last edited 2 days ago)

There once was a dream of the semantic web, also known as web2. The semantic web could have enabled easy to ingest information of webpages, removing soo much of the computation required to get the information. Thus preventing much of the AI crawling cpu overhead.

What we got as web2 instead was social media. Destroying facts and making people depressed at a newer before seen rate.

Web3 was about enabling us to securely transfer value between people digitally and without middlemen.

What crypto gave us was fraud, expensive jpgs and scams. The term web is now even so eroded that it has lost much of its meaning. The information age gave way for the misinformation age, where everything is fake.

[-] Marshezezz 97 points 2 days ago

Capitalism is grand, innit. Wait, not grand, I meant to say cancer

load more comments (38 replies)
[-] tourist@lemmy.world 65 points 2 days ago

Web3 was about enabling us to securely transfer value between people digitally and without middlemen.

It's ironic that the middlemen showed up anyway and busted all the security of those transfers

You want some bipcoin to buy weed drugs on the slip road? Don't bother figuring out how to set up that wallet shit, come to our nifty token exchange where you can buy and sell all kinds of bipcoins

oh btw every government on the planet showed up and dug through our insecure records. hope you weren't actually buying shroom drugs on the slip rod

also we got hacked, you lost all your bipcoins sorry

At least, that's my recollection of events. I was getting my illegal narcotics the old fashioned way.

load more comments (12 replies)
load more comments (8 replies)
[-] zifk@sh.itjust.works 98 points 2 days ago

Anubis isn't supposed to be hard to avoid, but expensive to avoid. Not really surprised that a big company might be willing to throw a bunch of cash at it.

load more comments (9 replies)
[-] Monument@lemmy.sdf.org 9 points 1 day ago

Increasingly, I’m reminded of this: Paul Bunyan vs. the spam bot (or how Paul Bunyan triggered the singularity to win a bet). It’s a medium-length read from the old internet, but fun.

[-] PhilipTheBucket@piefed.social 97 points 2 days ago

I feel like at some point it needs to be active response. Phase 1 is a teergrube type of slowness to muck up the crawlers, with warnings in the headers and response body, and then phase 2 is a DDOS in response or maybe just a drone strike and cut out the middleman. Once you've actively evading Anubis, fuckin' game on.

[-] turbowafflz@lemmy.world 108 points 2 days ago

I think the best thing to do is to not block them when they're detected but poison them instead. Feed them tons of text generated by tiny old language models, it's harder to detect and also messes up their training and makes the models less reliable. Of course you would want to do that on a separate server so it doesn't slow down real users, but you probably don't need much power since the scrapers probably don't really care about the speed

load more comments (8 replies)
load more comments (11 replies)
[-] zbyte64@awful.systems 29 points 2 days ago

Is there nightshade but for text and code? Maybe my source headers should include a bunch of special characters that then give a prompt injection. And sprinkle some nonsensical code comments before the real code comment.

load more comments (6 replies)
[-] londos@lemmy.world 43 points 2 days ago

Can there be a challenge that actually does some maliciously useful compute? Like make their crawlers mine bitcoin or something.

[-] raspberriesareyummy@lemmy.world 68 points 2 days ago

Did you just say use the words "useful" and "bitcoin" in the same sentence? o_O

[-] polle@feddit.org 74 points 2 days ago

The saddest part is, we thought crypto was the biggest waste of energy ever and then the LLMs entered the chat.

load more comments (9 replies)
[-] kameecoding@lemmy.world 38 points 2 days ago

Bro couldn't even bring himself to mention protein folding because that's too socialist I guess.

load more comments (8 replies)
load more comments (4 replies)
load more comments (3 replies)
load more comments
view more: next ›
this post was submitted on 18 Aug 2025
1114 points (100.0% liked)

Technology

74247 readers
3795 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS