
TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

top 50 comments
[-] aeternum 2 points 5 hours ago

If only there were a government department to investigate these kinds of things... Too soon?

[-] KayLeadfoot@fedia.io 2 points 5 hours ago

Disbanded!

... for efficiency!

Tesla self driving is never going to work well enough without sensors - cameras are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

[-] KayLeadfoot@fedia.io 83 points 1 week ago

Accurate.

Each fatality I found where a Tesla kills a motorcyclist is a cascade of 3 failures.

  1. The car's cameras don't detect the biker, or it just doesn't stop for some reason.
  2. The driver isn't paying attention to detect the system failure.
  3. The Tesla's driver alertness tech fails to detect that the driver isn't paying attention.

Taking out the driver will make this already-unacceptably-lethal system even more lethal.
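To put rough numbers on that cascade: treating the three safeguards as independent layers, removing the human layers multiplies the failure rate. The probabilities below are invented purely for illustration, not measured Tesla figures.

```python
# Invented per-encounter probabilities, purely illustrative.
p_vision_miss = 0.001      # 1. cameras fail to detect the biker or don't stop
p_driver_distracted = 0.2  # 2. driver isn't watching at that moment
p_monitor_fails = 0.5      # 3. attention tech doesn't catch the distraction

# All three layers must fail for a crash to reach the road.
with_driver = p_vision_miss * p_driver_distracted * p_monitor_fails

# Take the driver out, and only the vision layer remains.
without_driver = p_vision_miss

print(without_driver / with_driver)  # 10x more failures reach the road
```

Under these made-up numbers, removing the driver makes a vision failure ten times more likely to end in a crash, which is the commenter's point in miniature.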

[-] jonne@infosec.pub 65 points 1 week ago
  4. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short window to rectify the situation.
[-] KayLeadfoot@fedia.io 61 points 1 week ago

... Also accurate.

God, it really is a nut punch. The system detects the crash is imminent.

Rather than automatically try to evade... the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

[-] jonne@infosec.pub 35 points 1 week ago* (last edited 1 week ago)

Yep, that one was purely about hitting a certain KPI of 'miles driven on Autopilot without incident'. If it turns off before the accident, technically the driver was in control and to blame, so it won't show up in the stats and probably also won't be investigated by the NHTSA.
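A toy sketch of that attribution difference: whether a crash "counts" against the self-driving stats depends entirely on the accounting rule. The timings and the 30-second window below are invented to illustrate the idea, not Tesla's or NHTSA's actual policy.

```python
# Seconds before impact at which self-driving disengaged, per crash.
# Invented numbers: two last-second shutoffs, one genuinely manual drive.
disengage_before_impact_s = [0.5, 0.8, 600.0]

# Naive rule: only crashes with the system still on at impact count.
naive_count = sum(1 for t in disengage_before_impact_s if t == 0.0)

# Window rule: automation active within 30 s of impact still counts.
window_count = sum(1 for t in disengage_before_impact_s if t < 30.0)

print(naive_count, window_count)  # 0 2: the shutoffs vanish under the naive rule
```

Under the naive rule both last-second shutoffs get booked as driver error; under a recency window they count against the system.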

[-] br3d@lemmy.world 10 points 1 week ago* (last edited 1 week ago)

There are at least two steps before those three:

-1. Society has been built around the needs of the auto industry, locking people into car dependency

  0. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
[-] ascense@lemm.ee 16 points 1 week ago

Most frustrating thing is, as far as I can tell, Tesla doesn't even have binocular vision, which makes all the claims about humans being able to drive with vision only even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

[-] Buffalox@lemmy.world 79 points 1 week ago* (last edited 1 week ago)

Hey guys relax! It's all part of the learning experience of Tesla FSD.
Some of you may die, but that's a sacrifice I'm willing to make.

Regards
Elon Musk
CEO of Tesla

[-] Gork@lemm.ee 65 points 1 week ago

Lidar needs to be a mandated requirement for these systems.

[-] echodot@feddit.uk 18 points 1 week ago* (last edited 1 week ago)

Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

[-] HK65@sopuli.xyz 16 points 1 week ago

Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

[-] lnxtx@feddit.nl 37 points 1 week ago

Stop dehumanizing the drivers who killed people.
The feature, misleadingly called Full Self-Driving, must be supervised at all times.

[-] SouthEndSunset@lemm.ee 27 points 1 week ago

If you're going to say your car has "full self driving", it should have that, not "full self driving (but needs monitoring)" or "full self driving (but it disconnects 2 seconds before impact)".

[-] Ulrich@feddit.org 11 points 1 week ago

I think it's important to call out inattentive drivers while also calling out the systems and false advertising that may lead them to become less attentive.

If these systems were marketed as "driver assistance systems" instead of "full self driving", certainly more people would pay attention. The fact that they've been allowed to get away with this blatant false advertising is astonishing.

They're also obviously not adequately monitoring for driver attentiveness.

[-] 9488fcea02a9@sh.itjust.works 34 points 1 week ago

Sounds like NHTSA needs a visit from DOGE!

[-] Visstix@lemmy.world 31 points 1 week ago

Why is self-driving even allowed?

[-] kameecoding@lemmy.world 12 points 1 week ago* (last edited 1 week ago)

Because muh freedum, EU are a bunch of commies for not allowing this awesome innovation on their roads

(I fucking love living in the EU)

[-] Not_mikey@lemmy.dbzer0.com 12 points 1 week ago

Robots don't get drunk, or distracted, or text, or speed...

Anecdotally, I think the Waymos are more courteous than human drivers. Though Waymo seems to be the best one out so far; idk about the other services.

[-] AnimalsDream@slrpnk.net 31 points 1 week ago* (last edited 1 week ago)

I imagine bicyclists must be æffected as well if they're on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

Time to go to the Netherlands.

[-] poopkins@lemmy.world 16 points 1 week ago
[-] NikkiDimes@lemmy.world 13 points 1 week ago

Affectively, does it realy mater if someone has slite misstakes in there righting?

[-] nulluser@lemmy.world 10 points 1 week ago

Thank you for your service.

[-] keesrif@lemmy.world 26 points 1 week ago

On a quick read, I didn't see the struck motorcycles listed. Last I heard, a few years ago, this mainly affected motorcycles with two rear lights that are spaced apart and fairly low to the ground. I believe this is mostly true of Harleys.

The theory I recall was that this rear light configuration made the Tesla assume it was looking (remember: only cameras, no depth data) at a car further down the road, so accelerating seemed safe. It miscategorised the motorcycle so badly that it misjudged its position entirely.
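The geometry behind that theory is easy to sketch. A single camera only sees the angle between the two tail lights; turning that angle into a distance requires assuming how far apart the lights really are. Assume car-width spacing for a motorcycle's narrow lights and the distance estimate inflates proportionally. The numbers below are illustrative, not anything from Tesla's actual pipeline.

```python
def distance_from_lamp_angle(angular_separation_rad, assumed_spacing_m):
    # Small-angle pinhole model: distance ~ real lamp spacing / angle subtended
    return assumed_spacing_m / angular_separation_rad

# A motorcycle with tail lights 0.3 m apart, actually 15 m ahead,
# subtends an angle of 0.3 / 15 = 0.02 rad.
angle_rad = 0.3 / 15.0

true_distance = distance_from_lamp_angle(angle_rad, 0.3)   # ~15 m, correct
misread_as_car = distance_from_lamp_angle(angle_rad, 1.5)  # ~75 m: "safe" to accelerate
print(true_distance, misread_as_car)
```

Same angle, five times the assumed lamp spacing, five times the estimated distance: a 15 m gap reads as 75 m.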

[-] KayLeadfoot@fedia.io 30 points 1 week ago

I also saw that theory! That's in the first link in the article.

The only problem with the theory: Many of the crashes are in broad daylight. No lights on at all.

I didn't include the motorcycle make and model, but I did find it. Because I do journalism, and sometimes I even do good journalism!

The models I found are: Kawasaki Vulcan (a cruiser bike, just like the Harleys you describe), Yamaha YZF-R6 (a racing-style sport bike with high-mount lights), and a Yamaha V-Star (a "standard" bike with fairly low lights and a generally low-slung frame). Weirdly, the bike models run the full gamut of the motorcycles people ride on highways; every type is represented (sadly) in the fatalities.

I think you're onto something with the faulty depth perception. Sensing distance is difficult with optical sensors alone. That's why Tesla would be alone in the motorcycle fatality bracket, and why it would always be rear-end crashes by the Tesla.

[-] littleomid@feddit.org 10 points 1 week ago

At least in the EU, you can't turn off motorcycle lights; they're always on. In the EU since 2003, and in the US, according to the internet, since the 70s.

[-] jonne@infosec.pub 29 points 1 week ago

Whatever it is, it's unacceptable and they should really ban Tesla's implementation until they fix some fundamental issues.

[-] ExcessShiv@lemmy.dbzer0.com 16 points 1 week ago* (last edited 1 week ago)

The ridiculous thing is, it has 3 cameras pointing forward, and you only need 2 to get stereoscopic depth perception with cameras... why the fuck are they not using that!?

Edit: I mean, I know why: the three cameras use different lenses for different jobs (normal, wide angle, and telephoto), so they're not suitable for it. But it still seems stupid not to utilise that concept when you insist on a camera-only solution.
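For reference, the rectified-stereo relation the comment is gesturing at: with two matched cameras a known baseline apart, depth falls straight out of pixel disparity. The focal length and baseline below are generic example values, not Tesla camera specs.

```python
def stereo_depth_m(focal_length_px, baseline_m, disparity_px):
    # Classic rectified-stereo relation: Z = f * B / d
    return focal_length_px * baseline_m / disparity_px

# Generic example: 1000 px focal length, 0.3 m baseline.
# A point 15 m away shows a disparity of f * B / Z = 20 px.
print(stereo_depth_m(1000, 0.3, 20))  # 15.0
```

This only works with two matched, calibrated views of the same scene, which is exactly what mismatched normal/wide/telephoto lenses make difficult.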

[-] Ledericas@lemm.ee 25 points 1 week ago

The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.


Remember, you have the right to self-defence, against both rogue robots and rogue humans.

[-] expatriado@lemmy.world 20 points 1 week ago

As a daily rider, I must add having a Tesla behind me to the list of road hazards to look out for.

[-] werefreeatlast@lemmy.world 20 points 1 week ago

Every captcha... can you see the motorcycle? I would be afraid if they wanted all the squares with small babies, or maybe just regular folk... can you pick all the hotties? Which of these are body parts?

[-] SkunkWorkz@lemmy.world 14 points 1 week ago

It's because the system has to rely on visual cues, since Teslas have no radar. The system looks at the tail lights when it's dark to gauge the distance to the vehicle. And since some bikes have a double light, the system thinks it's a far-away car in front of them, when in reality it's a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from its customers. And we all know how well the average human drives around two-wheeled vehicles.

[-] PalmTreeIsBestTree@lemmy.world 13 points 1 week ago

This is another reason I’ll never drive a motorcycle. Fuck that shit.

[-] KayLeadfoot@fedia.io 14 points 1 week ago

It's like smoking: if you haven't started, don't XD

[-] mutual_ayed@sh.itjust.works 10 points 1 week ago

As a fellow meat crayon I agree

[-] misteloct@lemmy.world 13 points 1 week ago

I'm wondering how that stacks up to human drivers. Since the data is redacted I'm guessing not well at all.

[-] Litebit@lemmy.world 12 points 1 week ago

Elon needs to take responsibility for their deaths.

[-] Redex68@lemmy.world 12 points 1 week ago

Cuz other self-driving cars use LIDAR, so it's basically impossible for them not to realise that a bike is there.

[-] jdeath@lemm.ee 1 points 6 days ago

unless it's foggy, etc.

this post was submitted on 02 Apr 2025
1101 points (100.0% liked)

Technology
