Imagine naming a feature "Full Self-Driving," and yet you can't take your attention away from the road and must be ready to take over at a moment's notice.
This is par for the course for Elon's entire career. He loves claiming success and taking credit for things he either didn't accomplish himself, or things he hasn't accomplished yet.
Or ever will.
I remember reading a post claiming that Tesla's safety rating was only so high because a bunch of their crashes were ruled human error: the self-driving feature would automatically disengage when it faced a crash it couldn't avoid.
Fairly certain the statistic requires FSD to have been disabled for at least 10 seconds before the crash for it to be counted as human-caused.
Correct. It's documented.
The issue is a bit muddied by the fact that hitting the brake or the accelerator will deactivate it, and people will usually hit one of those if they believe that they are going to crash.
It's ok, it's in beta, so some features may not be complete just yet, but hey, let's just release this to the public anyways.
And charge a shit load for it
It's just a driving assistant, like in any other car. As far as I know, Mercedes is currently the only one who has implemented autonomous driving, and even that is limited to some specific areas. But at least that one is real. So much so that Mercedes (the company) is legally considered to be the driver of such cars, in case anything happens on the road.
I feel like even with fully autonomous cars, there are going to be laws saying the main driver must always stay alert. That would be the case unless our cars become their own independent drivers, like a cab.
Honestly, there should be laws against full self driving modes unless they can be proven to be good enough to not require driver intervention at all, and the manufacturer can be legally considered as the driver in case of an incident.
Requiring a driver to be alert and attentive to the road while not doing anything to operate the car runs contrary to human psychology. People cannot be expected to maintain focus on the road for extended periods while the car drives itself.
I don’t know exactly where the line should be drawn between basic cruise control and full self driving, but either the driver should be kept actively involved in driving or the car manufacturer should be held liable for whatever the car does.
You're absolutely right, it can be quite misleading to name a feature "Full Self-Driving" when it still requires constant attention and intervention from the driver. The expectations set by such a name may not align with the reality of the technology's current limitations.
Without LIDAR, this is a fool's endeavor.
I wish this was talked about every single time the subject came up.
Responsible, technologically progressive companies have been developing excellent, safe, self-driving car technology for decades now.
Elon Musk is eviscerating the reputation of automated vehicles with his idiocy and arrogance. They don't all suck, but Tesla sure sucks.
Even with LIDAR there are just too many edge cases for me to ever trust a self driving car that uses current-day computing technology. Just a few situations I’ve been in that I think a FSD system would have trouble with:
- I pulled up at a red light where a construction crew was working on the side of the road. They had a police detail with them. As I was watching the red light, the cop walked up to my passenger side and yelled “Go!” at me. Since I was looking at the light, I didn’t see him trying to wave me through the intersection. How would a car know to drive through a red light because a cop was there telling it to?
- I’ve seen cars drive the wrong way down a one-way street because the far end was blocked due to construction and backtracking was the only way out. (Residents were told to drive out the wrong way.) Would a self-driving car just drive down to the construction site and wait for hours for them to finish?
- I’ve seen more than one GPS want to route cars improperly. In some cases it thinks a practically impassable dirt track is a paved road. In other cases I’ve seen chains and concrete barriers block intersections that cities/towns have determined traffic shouldn’t be going through.
- Temporary detour or road closure signs?
- We are having record amounts of rain where I live, and we’ve seen roads covered by significant flooding that makes them unsafe to drive on. Often there aren’t any warning signs or barricades for a day or so after the rain stops. Would an FSD car recognize a flooded-out road and turn around, or drive into the water at full speed?
In my opinion, FSD isn’t attempting to solve any of those problems. Those will require human intervention for the foreseeable future.
Musk’s vision is (was?) to eventually turn Teslas into driverless robo-taxis. At one point he even said he could see regular Tesla owners letting their cars drive around like automated Ubers, making money for them, instead of sitting idle in garages.
Just like that cheaper non-lidar Roomba with room mapping technology, it will get lost.
Do you have lidar on your head? No, yet you're able to drive with just two cameras on your face. So no, lidar isn't required. Not that driving in a very dynamic world isn't very difficult for computers to do; it's not a matter of if, just a matter of when.
Would lidar allow "super human" driving abilities? Like seeing through fog and in every direction in the dark, sure. But it's not required for the job at hand.
You have eyes that are way more amazing than any cameras that are used in self driving, with stereoscopic vision, on a movable platform, and most importantly, controlled via a biological brain with millions of years of evolution behind it.
I'm sorry, you can't attach a couple cameras to a processor, add some neural nets, and think it's anything close to your brain and eyes.
Do you have lidar on your head?
Nope.
And that's exactly why humans crash. Constantly.
Even when paying attention.
Eyes don't have the depth-perception resolution, nor the field of view.
So, when are we changing this forum's name from Technology to its actual purpose of late: "every click-bait and rage-bait post about Tesla and Musk so people can circlejerk worse than Reddit"?
It's literally nothing but bullshit about Tesla and Twitter. All day long. No one cares!
I want to know about some actual tech, not the drama.
You don't care. If no one cared, there wouldn't be so many posts and extremely active discussions about them. If you want different content, post it.
It seems like a new anti Tesla article hits lemmy every day. It's boring at this point.
Unfortunately, all the current tech news is either people running naked scams or people debunking them.
The tragedy of our modern era is how much money we've invested in selling people a box labeled "Newest Life Changing Gadget" that's just full of rocks.
Check out the podcast TrashFuture. They do a bit about a shitty tech enterprise every episode, sometimes twice a week. From Juicero to Neom, the list of awful tech bullshit is limitless.
The only way to fix it is to post more interesting stuff yourself. Me too, tbh.
The elephant in the room is that the NHTSA still doesn't have a director, and hasn't had a long-term director since 2017.
Steven Cliff was the director for 2 months in 2022. Aside from that, this important safety organization has been... erm... on autopilot (see what I did there?) and leaderless.
How are we supposed to keep tabs on car safety if the damn agency in charge of automobile safety doesn't even have a leader?
TBF, we have achieved an FSD that is safer than one human this year. But we took away grandma's driver's license, so now we have to find another human who's worse than FSD.
Tesla's software is not safe:
https://www.washingtonpost.com/technology/2023/06/10/tesla-autopilot-crashes-elon-musk/
I wonder how much impact there might have been on code quality when Elon forced lead devs from their projects at Tesla to work on Twitter. I've never seen a situation like that turn out well for either party.
I wonder how this statistically compares to non-Tesla crashes?
Edit: quick Google/math shows the average rate of lethal automobile crashes at 12 per 100,000 drivers. Tesla has supposedly sold 4.5 million cars. 4.5 million divided by the 17 deaths from the article ≈ 1 death per 265,000 Tesla drivers.
This isn't exactly apples-to-apples, and I would love for someone to "do the math" more accurately, but it seems like Tesla is much safer than a standard driver.
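A quick sanity check of that arithmetic, using only the commenter's own (unverified, and disputed) figures:

```python
# All inputs are the commenter's claimed figures, not verified data.
avg_deaths_per_100k = 12        # claimed average lethal-crash rate per 100,000 drivers
tesla_cars_sold = 4_500_000     # claimed total Teslas sold
autopilot_deaths = 17           # deaths cited in the article

# Put both rates on the same per-100,000 basis before comparing.
tesla_deaths_per_100k = autopilot_deaths / tesla_cars_sold * 100_000
drivers_per_death = tesla_cars_sold / autopilot_deaths

print(round(tesla_deaths_per_100k, 2))  # ≈ 0.38 per 100,000
print(round(drivers_per_death))         # ≈ 264,706, i.e. ~1 death per 265,000
```

Even granting the inputs, this ignores exposure time, geography, and how often Autopilot was actually engaged, as the replies below point out.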
The other confounding factor is we don't know how many of these drivers were abusing autopilot by cheating the rules (it requires hands on the wheel and full attention on the road)
Your statistical analysis is so bad that it's not even wrong. It's just a pile of disparate data strung together with false assumptions.
So all of those Teslas were sold in America? And all 4.5 million of those Teslas have Autopilot? And they're in Autopilot mode 100% of the time?
You forgot the most important issue: Tesla drivers are not representative of the average driver. They have more money and more education. They live in places with nicer weather. These all contribute to lower crash rates without self driving. I bet high end Mercedes have lower crash rates too, because people don't defer maintenance and then drive them crazily in the snow.
Compare apples to apples and I bet Teslas have average crash rates for luxury cars.
It is not a valid comparison. Many deaths happen in bad weather or on bad roads. Tesla's self-driving will not even turn on in those conditions. I do not believe apples-to-apples data exists.
This is the best summary I could come up with:
Back in 2016, Tesla CEO Elon Musk stunned the automotive world by announcing that, henceforth, all of his company’s vehicles would be shipped with the hardware necessary for “full self-driving.” You will be able to nap in your car while it drives you to work, he promised.
But while Musk would eventually ship an advanced driver-assist system that he called Full Self-Driving (FSD) beta, the idea that any Tesla owner could catch some z’s while their car whisks them along is, at best, laughable — and at worst, a profoundly fatal error.
Since that 2016 announcement, hundreds of fully driverless cars have rolled out in multiple US cities, and none of them bear the Tesla logo.
His supporters point to the success of Autopilot, and then FSD, as evidence that while his promises may not exactly line up with reality, he is still at the forefront of a societal shift from human-powered vehicles to ones piloted by AI.
You’ll also hear from a former Tesla employee who was fired after posting videos of FSD errors, experts who compare the company’s self-driving efforts to its competitors, and even from the competitors themselves — like Kyle Vogt, CEO of the General Motors-backed Cruise, who is unconvinced that Musk can fulfill his promises without rethinking his entire hardware strategy.
Listen to the latest episode of Land of the Giants: The Tesla Shock Wave, a co-production between The Verge and the Vox Media Podcast Network.
The original article contains 497 words, the summary contains 236 words. Saved 53%. I'm a bot and I'm open source!
I've been ranting about this since 2016.
Having consumer trust in developing AI vehicles is hard enough without this asshole's ego and lies muddying the water.