It used to be video games and movies taking the blame. Now it's websites. When are we going to decide that people are just bat shit crazy and guns need some form of regulation?
I can see the nuance in an argument that an online community, unmoderated, could be using an algorithm to group these violent people together and amplifying their views. The same can't really be said for most other platforms. Writing threats of violence should still be taken seriously over the internet, especially if it was later acted upon. I don't disagree with you that there's a lot of bat shit crazy out there though.
The thing about bat shit crazy people is that they don't need guns to be violent; they will find another way.
I can't realistically stab ten people in a crowd before I'm disarmed by the mob. And I certainly can't do it from a hotel window.
Idk about this suit, but let's not forget how Facebook did in fact get a fascist elected president.
https://www.wired.com/2016/11/facebook-won-trump-election-not-just-fake-news/
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.
Yeah this is going nowhere.
They're just throwing shit at the wall to see what sticks, hoping to get some money. Suing Google for delivering search results? It shows how ridiculous blaming tools is. The only person liable here is the shooter.
Well, maybe. I want to be up-front that I haven't read the actual lawsuit, but it seems from the article that the claim is that youtube and reddit both have an algorithm that helped radicalize him:
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.
I'd say that case is worth pursuing. It's long been known that social media companies tune their algorithms to increase engagement, and that pissed off people are more likely to engage. This results in algorithms that output content that makes people angry, by design, and that's a choice these companies make, not "delivering search results".
The only person liable here is the shooter.
On the very specific point of liability: while the shooter is the specific person that pulled the trigger, is there no liability for those that radicalised the person into becoming a shooter? If I were selling foodstuffs that poisoned people, I'd be held to account by various regulatory bodies, yet pushing out material that poisons people's minds goes for the most part unpunished. If a preacher at a local religious centre was advocating terrorism, they'd face charges.
The UK government has a whole ream of context about this: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/97976/prevent-strategy-review.pdf
Google's "common carrier" type of defence only takes you so far, as it's not a purely neutral party: it "recommends", not merely "delivers results", as @joe points out. That recommendation should come with some editorial responsibility.
The article doesn't really expand on the Reddit point: apart from the weapon trading forum, it's about the shooter being a participant in PoliticalCompassMemes, a right-wing subreddit. After the shooting, the Reddit admins made a weak threat towards the mods of PCM, prompting the mods to sticky a "stop being so racist or we'll get deleted" post that itself included loads of examples of the racist dog whistles the users needed to stop using.
I don't imagine they'll have much success against Reddit in this lawsuit, but Reddit is aware of PCM and its role and it continues to thrive to this day.
PCM isn't just a Right wing subreddit, it's a Nazi recruitment sub under the guise of "political discussion".
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
This seems like the only part of the suits that might have traction. All the other bits seem easy to dismiss. That's not a statement on whether others share responsibility, only on what seems legally actionable in the US.
Ahh, one of those "we're mad and we don't have anyone to be angry with" style lawsuits. Pretty much a Hail Mary from a lawyer who is getting their name in the paper but knows it won't go anywhere.
The "easily removable gun lock" claim has been tried multiple times and usually fails. A gun lock doesn't seem related to assault weapons or large-capacity magazines, but who knows what they mean. Even when a gun is "easily modifiable", it's usually not treated as illegal, because someone still has to actually make those modifications. The same will probably be the case for the Kevlar (at the time of the shooting, it was legal).
The claim that YouTube contributed to radicalization is a laugh; it's an attempt to get their name in the papers and will be dismissed easily. They'd have a better chance naming the channels that radicalized him, but First Amendment protections would be near absolute there. Besides, "radicalization" isn't the same as a conspiracy or orders. It's the difference between someone riling up a crowd into a fervor that ends in a riot, and someone specifically telling people how to riot and who to target. (Even if both can be tried as crimes, one is a conspiracy and one is not, and "radicalization" would be neither.) Even "I wish someone would go shoot up ..." would be treated as hyperbole and thrown out as well. It's pretty hard to break First Amendment protections in America. (And that's a good thing. If you think it's not, imagine the other party in power wanting to squash your speech... yeah, let's keep that amendment in place.)
The same will be the case against Facebook for all the same reasons.
If you think Google should be responsible, then you'd have to think the park someone is radicalized in should be responsible for what's said in it, or that an email provider is responsible for every single piece of mail sent through it, even though it might not have access to see that mail. It's a silly idea, even assuming they could do it. Maybe they're hoping to scare Google into changing its algorithm, but I doubt that will happen either.
The case against the parents is another one people try again and again... unless there's more than they're saying, you still can't sue someone for being a bad parent. Hell, there's a better case against the parents of Ethan Crumbley, and even that case is still pretty shaky, and it involved the parents actively ignoring every warning sign and buying the kid the gun. Here there's nothing that seems pinnable on the parents.
You know it sucks, and I know there are a lot of hurt people, but lawsuits like this ultimately fail. It's like rolling the dice: history pretty much shows they're hoping for a one-in-a-million chance that they get lucky, and they won't, because it's one in a million. And even if they do win, they'd have to hope it isn't overturned.
- RMA Armament is named for providing the body armor Gendron wore during the shooting.
No, he bought it.
- Vintage Firearms of Endicott, New York, is singled out for selling the shooter the weapon used in the attack.
Not their issue; he passed the background check.
- The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
Any knob with a Dremel can make a gun full auto, let alone defeat a mag lock. And he broke NY law doing this.
- YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
This is just absurd.
My guess is they are hoping for settlements vs going to trial where they lose.
This is the best summary I could come up with:
YouTube, Reddit and a body armor manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.
The complementary lawsuits filed by Everytown Law in state court in Buffalo claim that the massacre at Tops supermarket in May 2022 was made possible by a host of companies and individuals, from tech giants to a local gun shop to the gunman’s parents.
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
“We aim to change the corporate and individual calculus so that every company and every parent recognizes they have a role to play in preventing future gun violence,” said Eric Tirschwell, executive director of Everytown Law.
Last month, victims’ relatives filed a lawsuit claiming tech and social media giants such as Facebook, Amazon and Google bear responsibility for radicalizing Gendron.
I dislike Reddit now but this is fucked up. It’s not like the platform itself said “hey man, you should totally commit this barbaric, racist act and we’ll supply you with the weapons.”