Paraphrasing:
"We only have the driver's word they were in self driving mode..."
"This isn't the first time a Tesla has driven onto train tracks..."
Since it isn't the first time I'm gonna go ahead and believe the driver, thanks.
Furthermore, with the amount of telemetry those cars have, the company knows whether it was in self-driving mode when it went onto the tracks. So the fact that they didn't go public saying it wasn't means it probably was, and they want to save PR face and limit liability.
I have a nephew who worked at Tesla as a software engineer for a couple of years (he left about a year ago). I gave him the VIN to my Tesla and the amount of data he shared with me was crazy. He warned me that one of my brake lights was regularly logging errors. If their telemetry includes that sort of information, then clearly they are logging a LOT of data.
Modern cars (in the US) are required to have an OBD-II port for on-board diagnostics. I always assumed most cars these days were sending some or all of that real-time OBD data back to the manufacturer. GM definitely has been.
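For a sense of what that data stream actually contains: OBD-II mode 01 responses are raw bytes decoded with formulas from the SAE J1979 standard PID list. A minimal decoder sketch for a few common PIDs, assuming you've already pulled the response frames off the port (the sample bytes below are made up for illustration):

```python
# Minimal decoder for a few standard OBD-II mode 01 PIDs (per SAE J1979).
# A real setup would read these bytes from the OBD-II port through an
# ELM327-style adapter; the example frames here are hypothetical.

def decode_pid(pid: int, data: bytes) -> float:
    """Decode the payload bytes of an OBD-II mode 01 response."""
    if pid == 0x0C:  # engine RPM: ((256 * A) + B) / 4
        return (256 * data[0] + data[1]) / 4
    if pid == 0x0D:  # vehicle speed: A, in km/h
        return float(data[0])
    if pid == 0x05:  # coolant temperature: A - 40, in degrees C
        return float(data[0] - 40)
    raise ValueError(f"PID {pid:#04x} not handled in this sketch")

print(decode_pid(0x0C, bytes([0x1A, 0xF8])))  # 1726.0 (rpm)
print(decode_pid(0x0D, bytes([0x55])))        # 85.0 (km/h)
```

That's just the standardized diagnostic layer; manufacturers also log plenty of proprietary data beyond these PIDs.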
Dude, in today's world we're lucky if it stops at the manufacturer. I know of a few insurers that have contracts with major dealers and automatically get the data registered by the car's systems, so they can make "better" decisions about people's car insurance.
Nowadays it's a red flag if you sign up for car insurance and they don't offer a discount for installing something like DrivePass, which logs your driving, because it probably means your car is already sending that data to them.
I've heard they also like to disengage self-driving mode right before a collision.
That actually sounds like a reasonable response. Driving assist means that a human is supposed to be attentive to take control. If the system detects a situation where it's unable to make a good decision, dumping that decision on the human in control seems like the closest they have to a "fail safe" option. Of course, there should probably also be an understanding that people are stupid and will almost certainly have stopped paying attention a long time ago. So, maybe a "human take the wheel" followed by a "slam the brakes" if no input is detected in 2-3 seconds. While an emergency stop isn't always the right choice, it probably beats leaving a several ton metal object hurtling along uncontrolled in nearly every circumstance.
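The "hand off to the human, then slam the brakes if no input arrives" policy described above can be sketched as a tiny decision function. To be clear, the thresholds and names here are illustrative assumptions, not anything Tesla actually ships:

```python
# Illustrative sketch of the handoff fail-safe described above:
# when self-driving disengages, alert the driver; if no driver input
# arrives within a grace period, command an emergency stop.
# The 2.5 s window is a made-up value for illustration.

HANDOFF_GRACE_S = 2.5  # hypothetical window for the driver to respond

def failsafe_action(seconds_since_handoff: float, driver_input: bool) -> str:
    """Decide what the car should do after self-driving disengages."""
    if driver_input:
        return "driver_in_control"
    if seconds_since_handoff < HANDOFF_GRACE_S:
        return "alert_driver"     # chime/flash, hold current trajectory
    return "emergency_brake"      # no response: stop the several-ton object

print(failsafe_action(0.5, False))  # alert_driver
print(failsafe_action(3.0, False))  # emergency_brake
print(failsafe_action(3.0, True))   # driver_in_control
```

The point of the design is that the car never just abandons control: the worst-case fallback is a controlled stop rather than an uncontrolled vehicle.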
That actually sounds like a reasonable response.
If you give the driver enough time to act, which Tesla doesn't. They turn it off a second before impact and then claim it wasn't in self-driving mode.
Not even a second, it's sometimes less than 250-300ms. If I hadn't already been anticipating it to fail and disengage as it went through the 2-lane-wide turn, I would have gone straight into oncoming traffic.
Yeah but I googled it after making that comment, and it was sometimes less than one second before impact: https://futurism.com/tesla-nhtsa-autopilot-report
Since the story has 3 separate incidents where "the driver let their Tesla turn left onto some railroad tracks" I'm going to posit:
Teslas on self-drive mode will turn left onto railroad tracks unless forcibly restrained.
Prove me wrong, Tesla
The ~2010 runaway Toyota hysteria was ultimately blamed on mechanical problems less than half the time. Floor mats jamming the pedal, drivers mixing up the gas and brake pedals in a panic, or outright lying to evade a speeding ticket accounted for many of the cases.
Should a manufacturer be held accountable for legitimate flaws? Absolutely. Should drivers be absolved without the facts just because we don't like a company? I don't think so. But if Tesla has proof FSD was off, we'll know in a minute, when they invade the driver's privacy and release the driving events.
Tesla has constantly lied about their FSD for a decade. We don't trust them because they are untrustworthy, not because we don't like them.
I have no sources for this so take it with a grain of salt... but I've heard that Tesla turns off self-driving just before an accident so they can say it was the driver's fault. In this case, if it was on while the car drove onto the tracks, I would think that proves it's Tesla's faulty self-driving plus human error for not correcting it. Either way it would be at least partly Tesla's fault if it was on at the time.
It simply saw a superior technology and decided to attack.
That … tracks

Full steam ahead with the train puns
How the fuck do you let any level 2 system go 40 to 50 fucking feet down the railroad tracks.
Were they asleep?
I'm not sure I'd be able to sleep through driving on railroad tracks. I'm going to guess this person was simply incredibly fucking stupid and thought the car would figure it out, instead of doing the bare fucking minimum of driving their goddamn 2-ton death machine themselves.
LOL right? Like deciding to see what the car will do ON RAILWAY TRACKS is absolutely fucking bonkers.
Teslas do still have steering wheels, after all
You don't say!
If only there was a way to avoid the place where trains drive.
I checked first. They didn't make a turn into a crossing. It turned onto the tracks. Jalopnik says there's no official statement that it was actually driving under FSD(elusion) but if it was strictly under human driving (or FSD turned itself off after driving off) I guarantee Tesla will invade privacy and slander the driver by next day for the sake of court of public opinion
Damn. I hope the train is ok
Tesla's self-driving is pretty shite but they seem to have a particular problem with railway crossings, as also pointed out in the article. Of all of the obstacles for the self-driving system to fail to detect, the several thousand tons of moving steel is probably one of the worst outcomes.
Maybe if they used LIDAR like they should have, instead of just cameras, it wouldn't be such an issue. But they're determined to minimize costs and maximize profits at the expense of consumers, as are all publicly traded companies.
Teslas have a problem with lefts.
Elongated Musketon: UM THAT WAS JUST 1 FAULTY MODEL STOP CHERRY PICKING GUYS JUST BUY IT!!!1
Working as expected then.
Car drove itself on to the tracks, gets hit by a train. This is some Maximum Overdrive shit.
Driver failed to control their car and avoid a collision.
FTFY.
I'm sure the car did actually take the action. But there are TONS of unavoidable warnings and reminders to the driver to supervise and take control when FSD goes wrong.
Which you can do by such super-technical means as "hitting the brake" or "steering the other way" or "flipping the right stalk up". Rocket science, I know.
Driver's fault. Bad technology, yes. Worse driver.
I just don't know how they're getting away with calling it 'full self driving' if it's not fully self driving.
What a cool and futuristic car. It’s all computer!
I’m still waiting for Elon’s car to drive onto train tracks.
It's stupider than I thought from reading the headline. The car started driving down the f'ing tracks.
Hope no one was hurt, regardless of whether they were stupid, distracted, or whatever! If we can't build fail-safes into cars, what are our chances with real AI?
At this point, if anybody buys one of these vehicles from Tesla, they absolutely deserve what they get. It is absurd.