LiDAR is essentially radar that uses light instead of sound
Radar doesn’t use sound. It sounds like the author doesn’t know the difference between sonar and radar.
This is why comments are so useful. I was already on the fence about viewing a site named Futurism, and your comment made sure I'll avoid it moving forward.
I think that's a big qualm of mine in terms of the sources allowed here, and I suppose it will take time to weed out the trustworthy from the untrustworthy.
It was set to protect human lives, so it made the only logical play.
Just need to sneak in this code
if (facialMatch(elon)) { haltAndCatchFire(); }
Also, I bet he's more mad that it was videoed and publicised.
“We’re trying to have those conversations with Elon to establish what the sensors would need to do,” Baglino added. “And they were really difficult conversations, because he kept coming back to the fact that people have just two eyes and they can drive the car.”
Yes, and people crash cars all the time, Elon...
If you want an autopilot with the failure rate of a human, then you might only need two eyes. If you want an autopilot with a near-zero failure rate, you need much better telemetry data.
Our heads are just loaded with sensory capabilities that are more than just the two eyes. Our proprioception, balance, and mental mapping allows us to move our heads around and take in visual data from almost any direction at a glance, and then internally model that three dimensional space as the universe around us. Meanwhile, our ears can process direction finding for sounds and synthesize that information with our visual processing.
Meanwhile, the tactile feedback of the steering wheel, vibration of the actual car (felt by the body and heard by the ears), give us plenty of sensory information for understanding our speed, acceleration, and the mechanical condition of the car. The squeal of tires, the screech of brakes, and the indicators on our dash are all part of the information we use to understand how we're driving.
Much of it is trained through experience. But the fact is, I can tell when I have a flat tire or when I'm hydroplaning even if I can't see the tires. I can feel inclines or declines that affect my speed or lateral movement even when there aren't easy visual indicators, like at night.
To be fair, 98% of drivers seem to barely be able to hold a straight line and can't see past the end of their hood, let alone do shoulder checks and be able to hear anything over the stereo turned up to 11. So I'd take my chances with the half-baked autopilot that can at least discern what a red light looks like.
I followed one gentleman for about 10 blocks before he stopped and I could tell him that he was missing the entire tire on the rear left of his car. There were a lot of sparks and metal screeching. Not a clue.
Just adding to your point: when F1 drivers were asked to play a racing sim, they couldn't perform the way they do in real life. They said that no matter how good the sim is, it doesn't provide the feedback of a real car.
And people turn their heads, move their eyes across their windshield, change focus to look ahead or closer, look in their mirrors, listen for sounds (emergency vehicles, car honks, etc), are able to do things like look through gaps and other car windows to adjust to partial obstructions.
The fact that he doesn't realize you need a multitude of sensors to do even a little bit of what a human can do tells you all you need to know about Elon's so-called brilliance.
Even the social aspect of driving eludes him. You and another driver come up to a 4 way stop at the same time, crossing paths. They wave you on to be polite. You wave back and go first. How and when does he plan to handle that behavior?
Is Elon really this dense? People have two eyes and millions of years of evolution behind them.
We tamed massive animals to use them as means of transportation, ffs.
He's the epitome of cognitive bias. He knows a little, enough to think he knows enough, but not enough to recognise just how much there actually is to know. His own narcissism¹ and self-image as a genius would never allow him to critically reflect and question whether he might be wrong.
He's like the type of engineer that will abstract a premise to a concise and calculable model, solve the problem on paper, then assume the rest is implementation details. Except he doesn't even do the modeling - he takes the layman's approach to technology and biology where he assumes that it should be doable to replicate what biology does with machines.
Nevermind that biology is still flawed and you'd have to significantly outdo biology for a technology to reach public acceptance.
¹I'm not a psychiatrist nor familiar enough with him to actually diagnose a Narcissistic Personality Disorder, but his behaviour lines up with my lay understanding of it, so I'll use that shorthand. The irony of applying my own lay understanding while criticising his is not lost on me, but I hold that my assessment doesn't put anyone's life at risk.
Anybody else remember the now-removed Tesla blog post from 2016 arguing that FSD would require LIDAR? Idk why they (i.e. Elon) are so stubborn about it. It can see through fog and darkness. Add that data to their model and they'd probably already be near deployment readiness for real FSD.
Automotive lidar costs around $500-1000 to add to a car.
That's it. That's the whole reason.
Well we perform pretty well with just two eyes, but the difference is that we are a highly skilled general pattern recognition machine that you just can’t recreate in software yet. A few lines diverging with a bigger and smaller circle under it? Guess that’s a truck going that way. Oh the lines are changing angles? Holy shit the truck is coming into this lane!!
A person approaching on foot or a bicycle from my right side at the coincidentally perfect speed can accidentally stay within both my human eyes' blind spots (behind the support pillar) as I come to a stop at a 4-way. I have learned I need to crane around a bit before proceeding, or their frightened and angry face will suddenly lurch into view too close for comfort. The robot must be designed to have zero blind spots because humans are ridiculously good at hiding in them. Especially the little humans.
We're disappointed it didn't finish the job.
The car is simply looking out for its own best interest.
To be honest, I’m taking it as a sign of it developing genuine artificial intelligence. It examined its situation and surroundings, and made the only logical choice
I've had multiple people get so mad at me for comments about how poorly this shit works. I don't understand how this is the hill so many people want to die on. It doesn't work.
Sunk cost. The price for these "premium" cars is silly and the features don't work. But people wouldn't pay such a price for unfinished crap, right? Right?! So they justify it to themselves and get defensive.
This. People need to stop simping for billionaires. It's embarrassing to watch.
Well autopilot, you know what they say. If at first you don't succeed...
I remember how much better things were when nobody knew who this jerk was.
"We're trying to have those conversations with Elon to establish what the sensors would need to do," Baglino added. "And they were really difficult conversations, because he kept coming back to the fact that people have just two eyes and they can drive the car."
But people have human brains, unlike Teslas or their CEO. Conversely, goldfish have two eyes, yet cannot drive a car.
Autopilot engineers in meeting: 'Oh, hi.... You're.... here.'
You know who also didn't listen to their engineers? NASA back in the day with Space Shuttle Challenger. You'd think Musk would be cognizant of the importance of listening to engineers when they bring up safety concerns, particularly as he owns SpaceX.
But no, he'd rather be a knob.
That was a ride, thank you!
This is the best summary I could come up with:
Way back in 2015, Tesla CEO Elon Musk would frequently give his engineers an earful after his car company's infamous Autopilot driver assistance tech nearly got him killed during test drives on multiple occasions — though there's a chance its dangerous behavior may have been due to Musk's stubbornness on how the technology should be built.
Per the biography's chapter on the launch of the driver assistance tech, Musk learned firsthand that a curve on Interstate 405 caused Autopilot, thrown off by the road's faded lane lines, to steer into and "almost hit" oncoming traffic.
But if Musk wanted safer software, he perhaps should've listened to his engineers, who have frequently petitioned over the years to incorporate what's known as light detection and ranging technology, or LiDAR.
LiDAR is essentially radar that uses light instead of sound, and Tesla's competitors, including Google's Waymo, have long leveraged it to help their autonomous cars "see."
Musk, however, has insisted that Tesla's cars only use optical sensors, likening it to how humans primarily use their eyes to drive, according to the biography, and as such, he's been tepid on using plain old radar, too.
"We told Elon that it was best safety-wise to use it … but it was clear that he thought we should eventually be able to rely on camera vision only, "one young engineer who joined in 2014 recalled, as quoted in the biography.
The original article contains 466 words, the summary contains 233 words. Saved 50%. I'm a bot and I'm open source!
Looks like Autopilot has developed consciousness. Does it drink beer? Can we be friends?
So there is still hope that our machine overlords will make good decisions.
Given that 2023 FSD still frequently attempts potentially lethal actions, 2015 FSD must have been spectacularly awful. The headline neglects the fact that this was 8 years ago.
Well, yes. Because it specifically happened to him. He doesn't care about everyone else.
Typical conservative.