[-] Pika@sh.itjust.works 152 points 8 months ago

Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-drive mode or not when it went onto the track. So the fact that they didn't go public saying it wasn't means it was in self-drive mode, and they want to save face and limit liability.

[-] IphtashuFitz@lemmy.world 93 points 8 months ago

I have a nephew that worked at Tesla as a software engineer for a couple years (he left about a year ago). I gave him the VIN to my Tesla and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information then clearly they are logging a LOT of data.

[-] atomicbocks@sh.itjust.works 25 points 8 months ago

Modern cars (in the US) are required to have an OBD-II Port for On-Board Diagnostics. I always assumed most cars these days were just sending some or all of the real-time OBD data to the manufacturer. GM definitely has been.
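As a rough illustration of the kind of real-time data the OBD-II port exposes, here's a minimal sketch decoding two standard SAE J1979 mode-01 PIDs (engine RPM and vehicle speed) from raw response bytes. The decoding formulas are from the public PID spec; the example byte values are made up:

```python
# Minimal decoder for two standard OBD-II (SAE J1979) mode-01 PIDs.
# Formulas come from the public PID spec; the response bytes are invented.

def decode_rpm(a: int, b: int) -> float:
    """PID 0x0C: engine RPM = (256*A + B) / 4."""
    return (256 * a + b) / 4

def decode_speed_kph(a: int) -> int:
    """PID 0x0D: vehicle speed in km/h, a single data byte."""
    return a

# Example raw reply to request "01 0C": 41 0C 1A F8 -> A=0x1A, B=0xF8
print(decode_rpm(0x1A, 0xF8))    # 1726.0 rpm
print(decode_speed_kph(0x55))    # 85 km/h
```

A manufacturer streaming even this handful of PIDs continuously would explain the brake-light error logs mentioned above.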

[-] Pika@sh.itjust.works 25 points 8 months ago

Dude, in today's world we're lucky if it stops at the manufacturer. I know of a few insurers that have contracts with major dealers and automatically get the data registered by the car's systems, so they can make better decisions about people's car insurance.

Nowadays it's a red flag if you sign up with a car insurer and they don't offer a discount for installing something like Drive Pass, which logs your driving, because it probably means your car is already sending them that data.

[-] CmdrShepard49@sh.itjust.works 3 points 8 months ago

We just got back from a road trip in a friend's '25 Tundra, and it popped up a TPMS warning for a faulty sensor; minutes later he got a text from the dealership telling him about it and asking him to bring it in for service.

[-] catloaf@lemm.ee 45 points 8 months ago

I've heard they also like to disengage self-driving mode right before a collision.

[-] sylver_dragon@lemmy.world 9 points 8 months ago

That actually sounds like a reasonable response. Driver assist means a human is supposed to stay attentive and ready to take control. If the system detects a situation where it's unable to make a good decision, handing that decision to the human in control seems like the closest thing it has to a "fail safe" option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention long ago. So, maybe a "human, take the wheel" warning followed by slamming the brakes if no input is detected within 2-3 seconds. While an emergency stop isn't always the right choice, it beats leaving a several-ton metal object hurtling along uncontrolled in nearly every circumstance.
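The "hand control back, then brake if nobody responds" idea sketches out as a simple timeout loop. Everything here (the grace window, function names, polling rate) is hypothetical illustration of the comment's proposal, not how any real driver-assist system is implemented:

```python
import time

TAKEOVER_GRACE_S = 2.5  # hypothetical window for the human to respond

def handle_disengagement(driver_responded, brake, warn,
                         grace_s=TAKEOVER_GRACE_S):
    """On self-drive disengagement: warn the driver, then emergency-brake
    if no steering/pedal input arrives within the grace window.

    driver_responded: callable returning True once input is detected.
    brake, warn: callables standing in for vehicle actuators/alerts.
    """
    warn("TAKE THE WHEEL")
    deadline = time.monotonic() + grace_s
    while time.monotonic() < deadline:
        if driver_responded():
            return "driver_took_over"
        time.sleep(0.05)  # poll for input at ~20 Hz
    brake()
    return "emergency_stop"
```

The point of the grace window is exactly the complaint further down the thread: a handback only counts as a fail-safe if the window is long enough for a distracted human to react.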

[-] zaphod@sopuli.xyz 55 points 8 months ago* (last edited 8 months ago)

That actually sounds like a reasonable response.

If you give the driver enough time to act, which Tesla doesn't: they turn it off a second before impact and then claim it wasn't in self-driving mode.

[-] whotookkarl@lemmy.world 18 points 8 months ago

Not even a second; it's sometimes less than 250-300 ms. If I hadn't already been anticipating it to fail and disengage as it went through the two-lane-wide turn, I would have gone straight into oncoming traffic.

[-] catloaf@lemm.ee 26 points 8 months ago

Yeah, but I googled it after making that comment, and it was sometimes less than one second before impact: https://futurism.com/tesla-nhtsa-autopilot-report

[-] elucubra@sopuli.xyz 3 points 8 months ago* (last edited 8 months ago)

I don't know if that is still the case, but a lot of electronic stuff in the US had warnings, with pictures, like "don't put it in the bath" and the like.

People are dumb, and you should take that into account.

[-] sturmblast@lemmy.world 2 points 8 months ago* (last edited 8 months ago)

That sounds a lot more like a rumor to me. It would be extremely suspicious and would leave them open to GIGANTIC liability issues.

[-] catloaf@lemm.ee 36 points 8 months ago

In the report, the NHTSA spotlights 16 separate crashes, each involving a Tesla vehicle plowing into stopped first responders and highway maintenance vehicles. In the crashes, it claims, records show that the self-driving feature had "aborted vehicle control less than one second prior to the first impact"

https://futurism.com/tesla-nhtsa-autopilot-report

[-] sem 19 points 8 months ago

It's been well documented. It lets them say in their statistics that the owner was in control of the car during the crash
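To illustrate the statistical point with entirely made-up numbers: if crashes are only counted against the system when it was engaged at the instant of impact, a disengagement a fraction of a second earlier moves the crash into the "driver in control" column.

```python
# Toy data: seconds between self-drive disengagement and impact.
# None means the system stayed engaged through the crash.
crashes = [None, 0.3, 0.8, 4.0, None, 0.5]

# Naive attribution: only count crashes where it never disengaged.
engaged_at_impact = sum(1 for t in crashes if t is None)

# Window-based attribution: also count last-second disengagements.
engaged_within_1s = sum(1 for t in crashes if t is None or t <= 1.0)

print(engaged_at_impact)   # 2 crashes attributed to the system
print(engaged_within_1s)   # 5 once sub-second handbacks are included
```

Which counting rule you pick determines the headline safety statistic, which is the commenter's point.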

[-] sturmblast@lemmy.world 1 points 8 months ago

That's my whole point

[-] ayyy@sh.itjust.works 4 points 8 months ago* (last edited 8 months ago)

How so? The human in the car is always ultimately responsible when using level 3 driver assists. Tesla does not have level 4/5 self-driving and therefore doesn’t have to assume any liability.

[-] Pika@sh.itjust.works 1 points 8 months ago* (last edited 8 months ago)

This right here is another fault in regulation that will eventually catch up with us. Especially with level three, where it's primarily the vehicle driving and the driver just gives periodic input, it's not the driver that's in control most of the time; it's the vehicle, so it shouldn't be the driver at fault.

Honestly, I think everything up to level two should be the driver's fault, because those levels require constant driver input. However, level three (conditional driving) and higher should be the company's liability, unless the company can prove that the autonomous system handed control back to the driver in a humanly manageable way (i.e. not within the last second, like Tesla currently does).
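The liability rule proposed above can be stated as a tiny decision function. The threshold and the rule itself are this commenter's proposal, not any actual regulation:

```python
HUMAN_CAPABLE_HANDBACK_S = 3.0  # assumed minimum warning time; not a legal standard

def at_fault(sae_level: int, handback_lead_s) -> str:
    """Assign fault per the comment's proposed rule.

    handback_lead_s: seconds of warning before the incident,
    or None if the system never handed control back.
    """
    if sae_level <= 2:
        return "driver"  # these levels require constant driver input
    if handback_lead_s is not None and handback_lead_s >= HUMAN_CAPABLE_HANDBACK_S:
        return "driver"  # control was returned with adequate warning
    return "manufacturer"

print(at_fault(2, None))   # driver
print(at_fault(3, 0.9))    # manufacturer (last-second handback)
print(at_fault(3, 5.0))    # driver
```

Under this rule, the sub-second disengagements described earlier in the thread would always land on the manufacturer.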

[-] sturmblast@lemmy.world 1 points 8 months ago

If you are monkeying with the car right before it crashes... wouldn't that raise suspicion?

this post was submitted on 23 Jun 2025
902 points (100.0% liked)

Technology
