after contact, the company acknowledged the problem and removed the accounts
Meta is outsourcing content moderation to journalists.
Meta profits from these accounts, and it also profits off scam and fraud posts, because they pay for ad space. They have literally no incentive to moderate beyond the bare minimum their automated tools do.
Parents should get their kids to never touch anything “Meta” made or bought.
But then again, those same parents are currently telling the world what their neighbours are doing, what they’re eating, and how cute “insert name here” looked in their new school uniform. 🤦♂️
Too bad VR has taken hold, and VRChat's so much worse than the internet chatrooms we grew up with.
Please, please, please abandon these platforms. Just stop using them. There's a cycle to these things and once they are past the due date all that's left is rotten. It really is as simple as stop using their platform.
When I saw this, 2 questions came to mind: How come this isn't immediately reported? And why would anyone upload illegal material to a platform that tracks as thoroughly as Meta's does?
The answer is:
All of those accounts followed the same visual pattern: blonde characters with voluptuous bodies and ample breasts, blue eyes, and childlike faces.
The 1 question that came to mind upon reading this is: What?
I’m a little confused as to how it can still be AI CSAM if the bodies are voluptuous and the breasts are ample. Childlike faces have been the bread and butter of face filters for years.
Which parts specifically have to be childlike for it to be AI CSAM? This is why we need some laws ASAP.
Things that you want to understand but sure as fuck ain't gonna Google.
My guess is that the algorithm is really good at predicting who will be likely to follow that kind of content, rather than report it. Basically, it flies under the radar purely because the only people who see it are the ones who have a vested interest in it flying under the radar.
Look again. The explanation is that these images simply don't look like any kind of CSAM. The whole story looks like some sort of scam to me.
ai generated content is like plastic pollution
... Meta's security systems were unable to identify...
I think you mean incentivized to ignore
I never saw a child with "voluptuous bodies and ample breasts" though.
Shh! We're trying to ragebait here! Be outraged!
Meta doesn’t care about AI generated content. There are thousands of fake accounts with varying quality of AI generated content and reporting them does exactly shit.
IG is a total fascist shithole. I closed my "political" acct because all of the sponsored content was fascist trash: zionism, flat earthism, qanon, racist stuff, anti-vax, etc.
Switched to Pixelfed and RSS... and Lemmy ofc.
Stuff like this is a good ad for Pixelfed.
That's not surprising, but it's messed up.