[-] coffeebiscuit@lemmy.world 126 points 1 year ago

Autopilot beta? People are willing to test betas for cars? Are you insane? Insurance is going to have a field day.

[-] zeppo@lemmy.world 66 points 1 year ago

What bothers me is that I have to share the road with people running some braindead Elon Musk software.

[-] Ocelot@lemmies.world 26 points 1 year ago

Have you seen how humans drive? It's not a very high bar to do better.

[-] elxeno@lemm.ee 37 points 1 year ago

From what I read, Autopilot (AP) just keeps you in your lane, while Full Self-Driving (FSD) just switches lanes into oncoming traffic.

[-] drdabbles@lemmy.world 22 points 1 year ago

Even better, several people have died using it or have killed someone else. It also has a long history of driving underneath semi-truck trailers. Only Europe was smart enough to ban this garbage.

[-] lucidinferno@lemmy.world 85 points 1 year ago

“Some of you may die, but it’s a risk I’m willing to take.” - Lord Farquaad and Musk

[-] bezerker03@lemmy.bezzie.world 8 points 1 year ago

I mean... They opted into a beta. Beta means this may happen.

[-] grue@lemmy.world 15 points 1 year ago

Even if those dipshits "opted in," the rest of us sharing the road sure as Hell didn't!

[-] ours@lemmy.film 13 points 1 year ago

This isn't just some email web app that may have a few bugs; it's putting lives at risk on the road. They shouldn't be able to just label it a beta, overpromise its capabilities, and dodge all responsibility.

[-] BCat70@lemmy.world 9 points 1 year ago

I guarantee that the other drivers on that road didn't opt for a "beta".

[-] FlyingSquid@lemmy.world 69 points 1 year ago

I'm not especially sympathetic to the Tesla drivers this might kill.

I'm worried about everyone else.

[-] PsychedSy@sh.itjust.works 8 points 1 year ago

I consider the suicide attempts a feature. I'll test for you, Tesla.

[-] Asudox@lemmy.world 49 points 1 year ago* (last edited 1 year ago)

It shouldn't even have been released for normal people to use in daily life, on real roads full of other cars. This poses a big risk to life if you ask me. I hope countries start banning this feature soon; otherwise many more deaths will happen, and Elon will somehow get away with them. What's so hard about driving a real car manually? Did you all become fatass lazy people who don't even have the willpower to drive a car? Ridiculous. ML is experimental, and for a machine it's amazing, but it isn't as good as a human YET, and that causes life-threatening accidents. FSD is literally still in beta, and people are driving at full speed on public roads with this beta software.

[-] dufr@lemmy.world 23 points 1 year ago

It can't be used in the EU; it would need to pass a review first. Elon has claimed they are close to getting it through, but Elon says a lot of things.

[-] echodot@feddit.uk 18 points 1 year ago

Self-driving cars are actually only legal in a few countries. And those countries have tests.

It's only the United States that just lets anyone do whatever on earth they want, even if it's insanely dangerous.

Everywhere else, any car company espousing self-driving tech actually has to prove that it is safe. Only a few companies have managed to do this, and even then the cars are limited to predefined areas where they are sure they're not going to come across difficult situations.

[-] Ocelot@lemmies.world 13 points 1 year ago* (last edited 1 year ago)

Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it never experiences road rage. It has superhuman reaction time and can see a full 360 degrees. It is not about being a lazy fatass; it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were from self-driving cars.

Also, please provide an example of a life-threatening accident caused by FSD.

[-] Zummy@lemmy.world 20 points 1 year ago

The article listed two life-threatening near-accidents that were only prevented because the person behind the wheel took over and kicked FSD out. Read the article and then comment.

[-] const_void@lemmy.ml 28 points 1 year ago

Lol who would trust their life to Elon Musk? 🤣

[-] sdf05@lemmy.world 19 points 1 year ago

This is like that show "Upload"; the guy literally gets killed by a self-driving car.

[-] Ocelot@lemmies.world 15 points 1 year ago

Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.

Teslas come factory-equipped with a 360-degree dashcam, yet we never see any footage of these alleged incidents.

[-] silvercove@lemdro.id 45 points 1 year ago

Are you kidding me? YouTube is full of Tesla FSD/Autopilot doing batshit crazy things.

[-] Ocelot@lemmies.world 7 points 1 year ago

So can you provide a link to an accident caused by FSD?

[-] zeppo@lemmy.world 28 points 1 year ago

Musk just did a 20-minute video that ended with the car trying to drive into traffic.

[-] naeemthm@lemmy.world 25 points 1 year ago

Your posts here show you’re not interested in reality, but I’ll leave a link anyway

https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-investigations/

Excited to see your response about how this is all user error.

[-] Ocelot@lemmies.world 10 points 1 year ago* (last edited 1 year ago)

I'm sure you're just going to downvote this and move on without reading, but I'm going to post it anyway for posterity.

First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.

I'm familiar with all of these incidents. It's great that they're in chronological order; that will be important later.

I need to set some context and history because it confuses many people when they refer to the capabilities of autopilot and FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.

The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.

Tesla's Autopilot system is an LKAS (lane keep assist system). This is the same as what is offered by Honda (Honda Sensing), Nissan (ProPILOT Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such. Tesla has never advertised it as any sort of "hands-off" system where the driver does not need to pay attention.

They do not allow the driver to lose attention from the road in FSD either, requiring hands on the wheel and constant torque, as well as eyes on the road (via an interior camera), in order to work. If you are caught not paying attention enough times, the system disengages, and with enough violations it kicks you out of the program entirely.
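
As I read the enforcement described above, it amounts to an escalating counter: ignored nags lead to a disengagement, and repeated disengagements lead to removal from the beta. A minimal sketch of that flow, with made-up thresholds and signal names (this is an illustration, not Tesla's actual code):

```python
from dataclasses import dataclass

WARNINGS_BEFORE_DISENGAGE = 3   # assumed threshold, not a Tesla figure
STRIKES_BEFORE_REMOVAL = 5      # assumed "strikeout" count, not a Tesla figure

@dataclass
class DriverMonitor:
    warnings: int = 0
    strikes: int = 0
    engaged: bool = True

    def update(self, wheel_torque_ok: bool, eyes_on_road: bool) -> None:
        """Run once per monitoring cycle with the attention signals."""
        if not self.engaged:
            return
        if wheel_torque_ok and eyes_on_road:
            self.warnings = 0            # attentive driver resets the nags
            return
        self.warnings += 1               # escalating visual/audible nag
        if self.warnings >= WARNINGS_BEFORE_DISENGAGE:
            self.engaged = False         # system disengages for this drive
            self.strikes += 1
            if self.strikes >= STRIKES_BEFORE_REMOVAL:
                print("Too many violations: removed from the FSD beta")
```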

OK, with all that said, let's dig in:

November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge

  • I'm from the area and have driven this exact spot hundreds of times on FSD and have never experienced anything even remotely close to what is shown here
  • "Allegedly" with FSD engaged
  • Tesla FSD "phantom" braking does not behave like this, and never has in the past. Teslas have 360-degree vision and are aware of traffic both in front of and behind them.
  • Notice at the beginning of the video that this car was in the process of a lane change; this introduces a couple of possibilities as to what happened here, namely:
  • Teslas do have a feature under Autopilot/FSD where, if the driver ignores multiple warnings to pay attention, the car will slow down, pull over to the shoulder, and stop. This particular part of the Bay Bridge does not have a shoulder, so it would have stopped where it was. This seems unlikely, though: neural networks are very capable of identifying what a shoulder is and recognizing that the car is in an active lane of traffic, and even with Tesla's massive fleet of vehicles on FSD there are no other recorded instances of this happening anywhere else.
  • This particular spot on the Bay Bridge eastbound has a very sudden and sharp exit to Yerba Buena Island. What I think happened is that the driver was aiming for this exit, saw that they were about to miss it, tapped the brake, and put on the turn signal, not realizing that they had just disengaged FSD. The car then engaged regen braking and came to a full stop.
  • When a Tesla comes to a full stop automatically (an emergency stop), it turns the hazards on automatically. This has been a feature since the v1 Autopilot days. This car's hazards do not come on after the stop.
  • What seems especially weird to me is that the driver let the car continue to sit there at a full stop while traffic piled up behind them. In FSD you are always in control of your own car, and all it would have taken to get moving again is a tap on the accelerator pedal. FSD always relinquishes control of the car to you if you tap the brakes or move the steering wheel hard enough. The exception would be some mechanical issue that brought the car to a stop and prevented it from moving, in which case this is not the fault of the FSD software.
  • Looking at how quickly (or rather, how slowly) the car slowed down, this very clearly seems to be regen braking, not emergency braking; a rough back-of-envelope comparison follows after this list. I'm almost positive this means that FSD was disengaged completely.
  • We don't have all the facts on this case yet, and I'll be anxious to see how it plays out in court. There are definitely many red flags here that have me questioning what actually happened, but I doubt FSD had anything to do with it.
  • If my earlier point is true, this is actually an instance of an accident caused by the driver disengaging self-driving. The car would have been much safer if the driver wasn't even there.
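
To put numbers on that regen-versus-emergency-braking point, here is a quick back-of-envelope calculation. The deceleration figures (~0.2 g for regen-only slowing, ~0.9 g for a hard emergency stop) and the 50 mph starting speed are my own rough assumptions, not values from Tesla or the article:

```python
G = 9.81    # gravity, m/s^2
v0 = 22.4   # initial speed, m/s (~50 mph); an assumed bridge speed

for label, decel_g in [("regen only", 0.2), ("emergency", 0.9)]:
    a = decel_g * G
    distance = v0 ** 2 / (2 * a)   # from v^2 = 2*a*d
    duration = v0 / a
    print(f"{label:>10}: stops in ~{distance:.0f} m over ~{duration:.1f} s")

# regen only: stops in ~128 m over ~11.4 s
#  emergency: stops in ~28 m over ~2.5 s
```

A stop that takes four to five times as long as hard braking looks, on video, exactly like the gentle coast-down described above.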

April 22, 2022: Model Y in "summon mode" tries to drive through a $2 million jet

  • This one is a favorite among the Tesla hate community. Understandably so.
  • Smart Summon has nothing to do with FSD or even Autopilot. It is a party trick to be used under very specific, supervised conditions.
  • Smart Summon relies exclusively on the front camera and the ultrasonic sensors.
  • While Smart Summon is engaged, the user still has full control over the car via the phone app. If the car does anything unexpected, you only need to release your finger from the button and the car stops immediately (see the sketch after this list). The "driver" did not do this and was not supervising the car. The car did not see the jet because it sat entirely above the ultrasonic sensors, and, as I'm sure you can understand, the object recognition isn't exactly trained on parked airplanes.
  • The app and the car remind the driver every time it is engaged that they need to stay within a certain range and within eyesight of the car to use it. If you remote-control your car into an obstacle and cause an accident, it's your fault, period.
  • Tesla is working on a new version of Smart Summon which will make the feature more useful in the future.
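
The release-the-button behavior described above is a classic dead man's switch. A hypothetical sketch of how such a hold-to-operate gate can work (the names, the heartbeat scheme, and the timeout are my assumptions, not Tesla's actual app protocol):

```python
import time

HEARTBEAT_TIMEOUT_S = 0.5   # assumed: max gap between "button held" pings

class SummonController:
    """Hold-to-operate gate: the car may only move while the button is held."""

    def __init__(self) -> None:
        self.last_heartbeat = float("-inf")

    def on_button_held(self) -> None:
        # The phone app calls this repeatedly while the user holds the button.
        self.last_heartbeat = time.monotonic()

    def drive_allowed(self) -> bool:
        # Checked every control cycle; releasing the button (or losing the
        # connection) lets the heartbeat go stale and halts the car.
        return time.monotonic() - self.last_heartbeat < HEARTBEAT_TIMEOUT_S
```

Designing the gate around a continuously refreshed heartbeat, rather than an explicit "stop" message, means any failure mode (dropped connection, closed app, distracted user) defaults to stopping the car.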

February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system's safety

  • I suggest actually watching the video here. The claim is highly at odds with what is actually in the video, but the vid is just over an hour long, so I bet most people don't bother watching it.
  • "It wouldn't have hit them, it definitely wouldn't have hit them. Do we need to cut that?" "No, you can keep it in"
  • If you look at what was happening on the car's display, it detected someone entering the crosswalk and stepping out into traffic on the left side. The car hit the brake, sounded an alert, and swerved to the right. There was a bicycle ahead of where the car swerved, but at no point was it about to "take out a bicyclist". It definitely overreacted here out of caution, but at no point was anyone in danger.
  • Relatively speaking, this is a very old version of the FSD software, from just after the first wave of semi-public release.

December 6, 2021: Tesla accused of faking 2016 Full Self Driving video

  • lol

March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car

  • Now we're getting into pre-FSD Autopilot. See my comments above about the capabilities of Autopilot. Feel free to compare these to other cars' LKAS systems; you will see that there are still lots of accidents across the board even with LKAS. That is because it is an assist system, and the driver is still fully responsible for and in control of the car.

June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck

  • Again, pre-FSD. If the driver didn't see the overturned truck and didn't disengage to stop, then I'm not sure how anyone expects a basic LKAS system to do that for them.

March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes

  • This one involves a fatality, unfortunately. However, the car was not self-driving. There is something else very important to point out here:
  • The feature that allows Teslas to change lanes automatically on the freeway (Navigate on Autopilot) was not released until a year after this accident happened. That means that if AP was engaged in this accident, the driver deliberately instructed the car, by engaging the turn signal, to merge into that truck.

May 7, 2016: First known fatality involving Tesla's Autopilot system

  • Now we're getting way back into the V1 Autopilot systems, which weren't even made by Tesla. They used Mobileye, a third-party system that is even less capable than V2 Autopilot.

So, there we go. FSD has been out to the public for a few years now on a massive fleet of vehicles collectively driving millions upon millions of miles, and this is the best we've got in terms of a list showing how "dangerous" it is? That is pretty remarkable.

Excited to see your response.

[-] naeemthm@lemmy.world 14 points 1 year ago* (last edited 1 year ago)

Interesting. You wrote an entire dissertation on why you think this is all a false flag against Full Self-Driving, but it is mostly anecdote and what you think is happening. Being a "software engineer by trade" doesn't change the fact that something fishy is 100% going on with Tesla's Autopilot system.

“The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s "Full Self-Driving" software from around 12,000 vehicles to almost 400,000 in about a year”

https://www.caranddriver.com/news/a44185487/report-tesla-autopilot-crashes-since-2019/#

You claim the timeline is important here, and this is all post-2022.

[-] Hotdogman@lemmy.world 23 points 1 year ago

I saw the videos of them running over infants in strollers. Does that count?

[-] Ocelot@lemmies.world 11 points 1 year ago

On FSD? Link please.

[-] drdabbles@lemmy.world 20 points 1 year ago

Bud, we've seen literally thousands of videos of this happening, even from the Tesla simps. You're seven years behind on your talking points.

[-] Ocelot@lemmies.world 7 points 1 year ago

Can you link a few? Something where FSD directly or indirectly causes an accident?

[-] drdabbles@lemmy.world 12 points 1 year ago

You're working very hard in this thread to remain in the dark. You could take two seconds to look for yourself, but it seems like you won't. Hell, they performed a recall because it was driving through stops. Something it'll still do, of course, but they performed a recall.

[-] Astroturfed@lemmy.world 10 points 1 year ago

Elon literally had to hit the brakes manually in a livestream of the self-driving tech as the car was about to go straight through a red light. Like less than a week ago... SOOOO safe, all the news stories of it killing people are fake!

[-] kinther@lemmy.world 20 points 1 year ago

Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars...

[-] Ocelot@lemmies.world 8 points 1 year ago* (last edited 1 year ago)

It's saved onto a thumb drive; any user can pull the footage off and use or post it anywhere. It never gets uploaded to Tesla, only snapshots and telemetry.

lol, the anti-Tesla crew will downvote even the most basic facts.

[-] DingoBilly@lemmy.world 19 points 1 year ago

Given your posts and rampant Tesla fanboyism, I honestly wouldn't be surprised if you're Elon himself just anxiously trying to save face.

Then again, Elon would just publicly spout misinformation about it all, so it probably isn't him. Still, it's surprising that people are so obsessed with Tesla that they can't take the bad with the good.

[-] Astroturfed@lemmy.world 13 points 1 year ago

Ah yes, there's no readily available footage of the dead bodies flying into the street or being crushed under the wheels, so it's made up. Of course.

[-] LibertyLizard@slrpnk.net 9 points 1 year ago

Hilariously, I've also seen them accused of a pro-Tesla bias. Personally, I think they are pretty balanced.

[-] Mockrenocks@lemmy.world 8 points 1 year ago

Frankly, it speaks incredibly poorly of the NHTSA that this kind of behavior is allowed. "Beta testing" a machine-learning driving-assistance feature on active highways at 70+ mph is a recipe for disaster. Calling it Full Self-Driving while having no guardrails on its behavior is false advertising as well as just plain dangerous.
