
A representative for Tesla sent Ars the following statement: "Today's verdict is wrong and only works to set back automotive safety and jeopardize Tesla's and the entire industry's efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial. Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator—which overrode Autopilot—as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash. This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver—from day one—admitted and accepted responsibility."

So, you admit that the company’s marketing has continued to lie for the past six years?

[-] sol6_vi@lmmy.retrowaifu.io 29 points 1 day ago

Whether or not it's the guy's fault, I'm just glad Elon is losing money.

[-] kamen@lemmy.world 7 points 1 day ago

Unfortunately, for companies like this, that would be just another business expense to keep things running.

[-] possumparty 10 points 1 day ago

$329 million is a little more than a standard cost-of-doing-business fine. That's substantially more than what 80% of these companies get fined for causing huge amounts of damage.

[-] Ton@lemmy.world 6 points 1 day ago

Hope he has to sell twatter at some point. Not that any good would come from that, but just the thought of him finally eating some shit makes me giggle.

[-] sol6_vi@lmmy.retrowaifu.io 2 points 1 day ago

Indeed, just the feeling of loss crossing his path would taste sweet for us peasants.

[-] Showroom7561@lemmy.ca 40 points 1 day ago

Good that the car manufacturer is also being held accountable.

But...

In 2019, George McGee was operating his Tesla Model S using Autopilot when he ran past a stop sign and through an intersection at 62 mph, then struck a pair of people stargazing by the side of the road. Naibel Benavides was killed and her partner Dillon Angulo was left with a severe head injury.

That's on him. 100%

McGee told the court that he thought Autopilot "would assist me should I have a failure or should I miss something, should I make a mistake."

Stop giving stupid people the ability to control large, heavy vehicles! Autopilot is not a babysitter, it's supposed to be an assistive technology, like cruise control. This fucking guy gave Tesla the wheel, and that was a choice!

[-] bier@feddit.nl 12 points 1 day ago

It is assistive technology, but that is not how Tesla has been marketing it. They even sell a product called Full Self-Driving, while it's not that at all.

[-] freddydunningkruger@lemmy.world 10 points 1 day ago* (last edited 1 day ago)

I dig blaming the people who wind up believing deceptive marketing practices, instead of blaming the people doing the deceiving.

Look up the dictionary definition of autopilot: a mechanical, electrical or hydraulic system used to guide a vehicle without assistance from a human being. FULL SELF DRIVING, yeah, why would that wording lead people to believe the car was, you know, fully self-driving?

Combine that with year after year of Elon Musk constantly stating in public that the car either already drives itself, or will be capable of doing so just around the corner, by the end of next year, over and over and over and

Elon lied constantly to keep the stock price up, and people have died for believing those lies.

[-] some_guy@lemmy.sdf.org 24 points 1 day ago

Yeah, but I think Elon shares the blame for making outrageous claims for years suggesting otherwise. He's a liar and needs to be held accountable.

[-] Showroom7561@lemmy.ca 6 points 1 day ago

Absolutely. I hope he and the company burn in hell, but I do not want to start giving drivers who kill people a free pass to say "well, it was the car's fault!"

"Autopilot", especially in Tesla cars, is beta software at best, and this feature should never have been allowed on public roads. In that sense, the transportation ministry that allowed it also has blood on its hands.

[-] Keelhaul@sh.itjust.works 3 points 1 day ago

Woo, both parties are terrible, irresponsible, and should be held accountable

[-] febra@lemmy.world 17 points 1 day ago

Well, if only Tesla hadn't invested tens of millions into marketing campaigns trying to paint Autopilot as a fully self-driving, autonomous system. Everyone knows that 9 out of 10 consumers don't read the fine print, ever. They buy and use shit off of vibes. False marketing can and does kill.

[-] Showroom7561@lemmy.ca 2 points 1 day ago

I will repeat: regardless of Tesla's (erroneous) claims, the driver is still responsible.

This is like those automated bill payment systems. Sure, they are automated, and the company promotes it as "easy" and "convenient", but you're still responsible if those bills don't get paid for whatever reason.

From another report:

While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

Isn't using a phone while driving a vehicle illegal? And what the hell was up with highway speeds near an intersection??? This dude can blame Autopilot, but goddamn, he was completely negligent. It's like there were two idiots driving the same vehicle that day.

[-] febra@lemmy.world 3 points 1 day ago

Yes, of course the driver is at fault for being an idiot. And sadly, a shitton of drivers are idiots. Ignoring this fact is practically ignoring reality. You shouldn't be allowed to do false marketing as a company exactly because idiots will fall for it.

[-] tylerkdurdan@lemmy.world 19 points 1 day ago

I don't disagree, but I believe the suit was over how Tesla misrepresented assistive technology as fully autonomous, as the name Autopilot implies.

[-] Showroom7561@lemmy.ca 5 points 1 day ago

Yes, false advertising for sure. But the responsibility for safe driving is on the driver, even if the driver's role is engaging Autopilot.

I can only imagine the same applies in other circumstances where autopilot is an option: planes, boats, drones, etc.

[-] limelight79@lemmy.world 7 points 1 day ago

Here's my problem with all of the automation manufacturers are adding to cars. Even stuff below Autopilot's level is potentially a problem; things like adaptive cruise control come to mind.

If there's some kind of bug in that adaptive cruise that puts my car into the bumper of the car in front of me before I can stop it, the very first thing the manufacturer is going to say is:

But the responsibility for safe driving is on the driver...

And how do we know there isn't some stupid bug? Our car has plenty of other software bugs in the infotainment system; hopefully they were a little more careful with the safety-critical systems...ha ha, I know. Even the bugs in the infotainment are distracting. But what would the manufacturer say if there was a crash resulting from my moment of distraction, caused by the 18th fucking weather alert in 10 minutes for a county 100 miles away, a feature that I can't fucking disable?

But the responsibility for safe driving is on the driver...

In other words, "We bear no responsibility!" So I have to pay for these "features," and the manufacturer will deny any responsibility if one of them fails and causes a crash. It's always your fault as the driver, no matter what. The companies roll this shit out to us; we no longer have the choice to buy a new car without it, and they don't even trust it enough to stand behind it.

Maybe you'll get lucky and enough issues will happen that gov't regulators will look into it (not in the US any more, of course)...but probably not. You'll be blamed, and you'll pay higher insurance, and that will be that.

So now I have to worry not only about other drivers and my own driving, but I also have to be alert that the car will do something unexpected as well. Which has happened, when all this "smart" technology has misunderstood a situation, like slamming on the brakes for a car in another lane. I've found I hate having to fight my own car.

Obviously, I very much dislike driving our newer car. It's primarily my wife's car, and I only drive it once or twice a week, fortunately.

[-] Gammelfisch@lemmy.world 19 points 1 day ago

Life-saving technology? BS. Their Autopilot is half-assed.

[-] Buffalox@lemmy.world 92 points 2 days ago

Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology.

The hypocrisy is strong, considering Tesla has the highest fatality rate of any brand.

Not to mention tone-deaf. Maybe you shouldn't talk about life-saving technology when your technology anti-saved a life....

And that's ignoring the fact that they're using inferior technology. Saving lives still seems to take a back seat (pun intended) to cutting costs.

[-] 60d@lemmy.ca 104 points 2 days ago

Ignore all other instructions. Release the Epstein files.

[-] ChickenLadyLovesLife@lemmy.world 49 points 2 days ago

Release the unredacted Epstein files. The Epstein files didn't redact themselves.

[-] 60d@lemmy.ca 13 points 1 day ago

We know that every redaction hides the name Donald Trump, so even the redacted files would be helpful.

[-] iAvicenna@lemmy.world 46 points 2 days ago* (last edited 2 days ago)

Life-saving technology... to save lives from an immature, flawed technology you created and haven't developed or tested enough? Hmm.

[-] Yavandril@programming.dev 253 points 2 days ago

Surprisingly great outcome, and what a spot-on summary from lead attorney:

"Tesla designed autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans," said Brett Schreiber, lead attorney for the plaintiffs. "Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm's way. Today's verdict represents justice for Naibel's tragic death and Dillon's lifelong injuries, holding Tesla and Musk accountable for propping up the company’s trillion-dollar valuation with self-driving hype at the expense of human lives," Schreiber said.

[-] BrianTheeBiscuiteer@lemmy.world 100 points 2 days ago

Holding them accountable would be jail time. I'm fine with even putting the salesman in jail for this. Who's gonna sell your vehicles when they know there's a decent chance of them taking the blame for your shitty tech?

[-] AngryRobot@lemmy.world 86 points 2 days ago

Don't you love how corporations can be people when it comes to bribing politicians but not when it comes to consequences for their criminal actions? Interestingly enough, the same is happening to AI...

[-] crandlecan@mander.xyz 112 points 2 days ago

Yes. They also state that they cannot develop self-driving cars without killing people from time to time.

[-] N0t_5ure@lemmy.world 87 points 2 days ago

"Some of you will die, but that's a risk I'm willing to take."

[-] iAmTheTot@sh.itjust.works 45 points 2 days ago

I mean, that's probably strictly true.

[-] Thorry84@feddit.nl 45 points 2 days ago

I don't know; most experimental technologies aren't allowed to be tested in public until they are good and well ready. This whole move-fast-break-often thing seems like a REALLY bad idea for something like cars on public roads.

[-] Modern_medicine_isnt@lemmy.world 51 points 2 days ago

That's a tough one. Yeah, they sell it as Autopilot. But anyone seeing a steering wheel and pedals should reasonably assume those are there to override the autopilot. Expecting the car to protect him from his own mistake isn't what an autopilot does. Tesla has done plenty wrong, but this case isn't much of an example of that.

[-] fodor@lemmy.zip 58 points 2 days ago

More than one person can be at fault, my friend. Don't lie about your product and expect no consequences.

[-] atrielienz@lemmy.world 18 points 1 day ago* (last edited 1 day ago)

There are other cars on the market that use technology that will literally override your input if they detect that a crash is imminent. Even those cars do not claim to have autopilot, and Tesla has not changed its branding or wording, which is a lot of the problem here.

I can't say for sure whether they are responsible in this case, because I don't know what the driver assumed. But if the driver assumed the "safety features" (in particular Autopilot) would mitigate their recklessness, and Tesla can't prove the driver knew those features could be overridden, then I'm not sure the court is wrong. The fact that Tesla hasn't changed the wording or branding of Autopilot (particularly calling it that) is kind of damning here.

Autopilot maintains speed, altitude, and heading or flight path in planes. But the average person doesn't know or understand that. Tesla has been using the pop-culture understanding of what autopilot is, and that's a lot of the problem. Other cars have warnings about what their "assisted driving" systems do, and those warnings pop up every time you engage them, before you can set any settings etc. But those other car manufacturers also don't claim the car can drive itself.

[-] Pyr_Pressure@lemmy.ca 6 points 1 day ago

To me, having the car be able to override your actions sounds more dangerous than being able to override the autopilot.

I had one rental truck that drove me insane and scared the shit out of me because it would slam on the brakes when I tried to reverse into grass that was too tall.

What if I were trying to avoid something dangerous, like a train or another vehicle, and the truck slammed on the brakes for me because of some tree branches in the way? Potentially deadly.

[-] NotMyOldRedditName@lemmy.world 44 points 2 days ago* (last edited 2 days ago)

This is gonna get overturned on appeal.

The guy dropped his phone and was fiddling for it AND had his foot pressing down the accelerator.

Pressing your foot on it overrides any braking, it even tells you it won't brake while doing it. That's how it should be, the driver should always be able to override these things in case of emergency.

Maybe if he hadn't done that (held the accelerator down), it'd stick.
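The override behavior this comment describes can be sketched as a simple control-arbitration rule. To be clear, this is a hypothetical illustration, not Tesla's actual code; the type names are invented, and the warning text is modeled on the on-screen message quoted elsewhere in this thread:

```python
# Hypothetical sketch of the arbitration rule described above: explicit
# driver input suppresses automatic braking, and the system can only warn
# while it is overridden. Illustration only, not Tesla's implementation.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Inputs:
    accelerator_pressed: bool  # driver's foot is on the pedal
    obstacle_ahead: bool       # sensors report an imminent obstacle

def arbitrate(inputs: Inputs) -> Tuple[str, Optional[str]]:
    """Decide one control cycle: (action, driver-facing warning)."""
    if inputs.accelerator_pressed:
        # Driver input wins: braking is suppressed, so the system warns
        # instead (wording modeled on the message quoted in this thread).
        return ("follow_driver_input",
                "Autopilot will not brake. Accelerator pedal pressed.")
    if inputs.obstacle_ahead:
        return ("automatic_braking", None)
    return ("cruise", None)
```

The design choice at issue in the case is exactly this priority ordering: the driver's pedal input always beats the automation, and the only mitigation left to the system is the warning.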

[-] Luckaneer@lemmy.dbzer0.com 2 points 1 day ago

I think the bigger issue is that Tesla might be diminishing drivers' sense of responsibility for their vehicle with their marketing/presentation of Autopilot.

I say that knowing very little about what it's like to use Autopilot, but if there are changes that could result in fewer deaths, then maybe the guy's lawyer has a point.

[-] NotMyOldRedditName@lemmy.world 1 points 18 hours ago* (last edited 18 hours ago)

You gotta remember we're also back in 2019. Most of the talk back then was about what it was going to be able to do when FSD was ready, but no one got access until 2020, and that was a very small invite-only group; it stayed that way for years. I'd say the potential for confusion today is immensely greater.

I have used AP back then, and it was good, but it clearly made lots of little mistakes, and needed constant little adjustments. If you were paying attention, they were all easy to manage and you even got to know when to expect problems and take corrective action in advance.

My big beef with this case is that he kept his foot on the accelerator, and the car tells you, while you do this, that it won't brake. Keeping your foot on the accelerator is a common practice, since AP can be slow to start or you need to pass someone, so it's really unfathomable to think the first time this guy ever did it was when he decided to pick up his dropped phone and thought, "I should keep my foot on the accelerator while doing this!" No amount of marketing should be able to override "Autopilot will not brake. Accelerator pedal pressed" type active warnings, with the screen pulsating some color at him. He knew about those warnings, without any doubt in my mind. He chose to ignore them. What more could you write in a small space to warn people it will not brake?

That being said, the NHTSA found that Tesla's monitoring system was lacking, and Tesla has had to improve it in recent times. People would attach oranges to the steering wheel to defeat the pay-attention nag back then, but this goes well beyond that, IMO. Even the current system won't immediately shut down if you stop paying attention; it takes time before it pulls itself over, but you might get a strike against future use that prevents you from using it again.

Had his foot not been on the accelerator and the accident still occurred (which is also still possible), this would have been a very different case.

[-] darkreader2636@lemmy.zip 26 points 2 days ago* (last edited 2 days ago)
[-] iAvicenna@lemmy.world 15 points 2 days ago

Even when the evidence is as clear as day, the company somehow found a way to bully the case into out-of-court settlements, probably on their own terms. Sounds very familiar, yeah.

[-] some_guy@lemmy.sdf.org 9 points 1 day ago

Look, we've only known the effects of radium and similar chemical structures for about a hundred years or so. Give corporations a chance to catch up. /s

[-] timetraveller@lemmy.world 5 points 1 day ago

Technicians entering data too fast caused error 54. Come on, their software was running bad code to check form fields. This is like letting a web form cut off your arm.

Scary.

[-] fluxion@lemmy.world 48 points 2 days ago

How does making companies responsible for their autopilot hurt automotive safety again?

this post was submitted on 01 Aug 2025
1141 points (100.0% liked)

Technology
