[-] Sythelux@social.tchncs.de 3 points 9 months ago

@ajsadauskas @degoogle I started that because it bothered me that you couldn't just report a website to DuckDuckGo that was obviously a Stack Overflow crawler. This problem has existed for as long as Reddit and Stack Overflow themselves, so why are there no measures from the search engines to get a handle on it?

I never understood that.

[-] Sythelux@social.tchncs.de 2 points 9 months ago

@ajsadauskas @degoogle I mean, we could still use all the modern tools. I'm self-hosting a SearXNG instance, and there's an ever-growing block list of AI-generated websites that I import regularly to keep it up to date. You could also turn it into an allow-list approach: block all websites by default and allow sites gradually.
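
For illustration, a minimal sketch of what such an import could look like, assuming a plain-text blocklist (the URL below is hypothetical) and SearXNG's hostnames plugin with a `remove` list in settings.yml; the exact keys and paths may differ on your instance:

```python
#!/usr/bin/env python3
"""Sketch: merge a hosted blocklist of AI-generated sites into a SearXNG settings.yml.

Assumptions: the blocklist URL is hypothetical, and the settings layout
(a 'hostnames' section with a 'remove' list of regexes) is assumed --
adjust to match your SearXNG version and install path.
"""
import re
import urllib.request

import yaml  # PyYAML

BLOCKLIST_URL = "https://example.com/ai-blocklist.txt"  # hypothetical source
SETTINGS_PATH = "/etc/searxng/settings.yml"             # common install path


def fetch_blocklist(url: str) -> list[str]:
    """Download a newline-separated list of domains, skipping comments and blanks."""
    with urllib.request.urlopen(url) as resp:
        lines = resp.read().decode("utf-8").splitlines()
    return [ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")]


def merge_into_settings(domains: list[str], path: str) -> None:
    """Append each domain as a regex to the hostnames 'remove' list, deduplicated."""
    with open(path) as f:
        settings = yaml.safe_load(f) or {}
    remove = settings.setdefault("hostnames", {}).setdefault("remove", [])
    for domain in domains:
        pattern = rf"(.*\.)?{re.escape(domain)}$"  # match the domain and its subdomains
        if pattern not in remove:
            remove.append(pattern)
    with open(path, "w") as f:
        yaml.safe_dump(settings, f, sort_keys=False)


if __name__ == "__main__":
    merge_into_settings(fetch_blocklist(BLOCKLIST_URL), SETTINGS_PATH)
```

Run it from a cron job (and restart/reload SearXNG afterwards) to keep the block list current; flipping the same idea into an allow list would just mean filtering results against a whitelist instead of appending to `remove`.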
