submitted 9 months ago by jorge@feddit.cl to c/technology@lemmy.world
[-] TypicalHog@lemm.ee 46 points 9 months ago

It only matters if the autopilot causes more deaths than an average human driver over the same distance traveled.

[-] NIB@lemmy.world 58 points 9 months ago

If the cars run over people at 30 km/h because they rely on cameras alone and a bug splattered on the camera caused the car to go crazy, that is not acceptable, even if the cars crash "less than humans".

Self-driving needs to be highly regulated by law and required to have some bare minimum of sensors, including radar, lidar, etc. Camera-only self-driving is beyond stupid. Cameras can't see in snow, in the dark, or in plenty of other conditions. Anyone who has a phone knows how fucky a camera can get under specific light exposures.

No one but Tesla is doing camera-only "self-driving", and they are only doing it to cut costs. Their older cars had more sensors than their newer cars. But Musk is living in his BioShock uber-capitalistic dream. Who cares if a few people die in the process of developing vision-based self-driving.

https://www.youtube.com/watch?v=Gm2x6CVIXiE

[-] PipedLinkBot@feddit.rocks 3 points 9 months ago

Here is an alternative Piped link(s):

https://www.piped.video/watch?v=Gm2x6CVIXiE

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[-] Geobloke@lemm.ee 28 points 9 months ago

No it doesn't. Every life lost matters, and if it can be shown that Tesla could have replicated industry best practice and saved more lives, but didn't so that they could sell more cars, then that is on them.

[-] PresidentCamacho@lemm.ee 26 points 9 months ago

This is the actual logical way to think about self-driving cars. Stop downvoting him because "Tesla bad", you fucking goons.

[-] gallopingsnail@lemmy.sdf.org 18 points 9 months ago

Tesla's self-driving appears to be less safe and to cause more accidents than its competitors'.

"NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” which turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities."

Tesla bad.

[-] TypicalHog@lemm.ee 8 points 9 months ago

Can you link me the data that says Tesla's competitors' self-driving is safer and causes fewer accidents, and WHICH ONES? I would really like to know who else has this level of self-driving while also having fewer accidents.

[-] gallopingsnail@lemmy.sdf.org 3 points 9 months ago

The data doesn't exist; no other company has a level of "autonomy" that will let your car plow through shit without you paying attention.

[-] nxdefiant@startrek.website 4 points 9 months ago

No one else has the same capability across as wide a geographic range. Waymo, Cruise, Blue Cruise, Mercedes, etc. are all geolocked to certain areas or certain stretches of road.

[-] GiveMemes@jlai.lu 5 points 9 months ago

Ok? Nobody else is being as wildly irresponsible, therefore tesla should be... rewarded?

[-] nxdefiant@startrek.website 2 points 9 months ago

I'm saying larger sample size == larger numbers.

Tesla announced 300 million miles on FSD v12 in just the last month.

https://www.notateslaapp.com/news/2001/tesla-on-fsd-close-to-license-deal-with-major-automaker-announces-miles-driven-on-fsd-v12

Geographically, that's spread all over the U.S., not just in hyper-specific metro areas or stretches of road.

The sample size is orders of magnitude bigger than everyone else, by almost every metric.

If you include the most basic autopilot, Tesla surpassed 1 billion miles in 2018.

These are not opinions, just facts. Take them into account when you decide to interpret the opinion of others.

[-] GiveMemes@jlai.lu 1 points 8 months ago* (last edited 8 months ago)

That's not how rates work, though. A larger sample size doesn't correlate with a higher rate of accidents, and a rate, not a raw number, is what any such study implies. Your bullshit rationalization is funny. In fact, a larger sample size tends to correspond with lower rates of flaws, since there is less chance that a single error or fault makes an outsized impact on the data.

[-] nxdefiant@startrek.website 1 points 8 months ago* (last edited 8 months ago)

No one's talking about rates. The article itself, and all the articles linked in these comments, are talking about counts: numbers of incidents. I'm not justifying anything, because I'm not injecting my opinion here. I'm only pointing out that without context, counts don't give you enough information to draw a conclusion; that's just math. You can't even derive a rate without that context!
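The counts-versus-rates point is simple arithmetic. A quick sketch (every number below is invented for illustration, not real crash data):

```python
# Illustrative only: all figures below are made up, not real crash statistics.
def crashes_per_million_miles(crash_count: int, miles_driven: float) -> float:
    """Convert a raw incident count into a rate per million miles driven."""
    return crash_count / (miles_driven / 1_000_000)

# Fleet A: many incidents, but a huge sample of miles.
rate_a = crashes_per_million_miles(crash_count=300, miles_driven=300_000_000)
# Fleet B: few incidents, but a tiny sample of miles.
rate_b = crashes_per_million_miles(crash_count=5, miles_driven=1_000_000)

print(rate_a)  # 1.0 crash per million miles
print(rate_b)  # 5.0 crashes per million miles
# Fleet A has 60x the raw count but one fifth the rate: counts alone mislead.
```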

[-] GiveMemes@jlai.lu 1 points 8 months ago

That's not my point though. We both know that the government agency doing this work is primarily interested in the rates, whether or not reports from the media talk about the total numbers. They only started the investigation because of individual incidents, yes, but they're not looking for a few cases; they're looking for a pattern.

(Like this one: https://www.ranzlaw.com/why-are-tesla-car-accident-rates-so-high/)

[-] nxdefiant@startrek.website 1 points 8 months ago* (last edited 8 months ago)

Once more, I'm literally not injecting an opinion here or arguing for or against anyone's point. All the articles here talked about counts of individual accidents with zero context about sample size, something that is absolutely crucial to establishing exactly what you're talking about: rates. You can shit all over that, and then pretend you didn't, but I'm only pointing out that the math doesn't work unless that context is there.

(I find it funny that the article you just posted is literally an ad for a traffic-accident lawyer. Here's the study the ad is citing; the ad did some creative interpretation of those numbers, ignoring things like DUIs, for example: https://www.lendingtree.com/insurance/brand-incidents-study/#:~:text=Tesla%20drivers%20have%20the%20highest%20accident%20rate%20compared%20with%20all,over%2020.00%20per%201%2C000%20drivers.)

[-] Socsa@sh.itjust.works 3 points 9 months ago

I don't quite understand what they mean by this. It tracks drivers with a camera and the steering wheel sensor and literally turns itself off if you stop paying attention. What more can they do?

[-] nxdefiant@startrek.website 2 points 9 months ago* (last edited 9 months ago)

The NHTSA hasn't issued rules for these things either.

The U.S. government has issued general guidelines for the technology/industry here:

https://www.transportation.gov/av/4

They have an article on it discussing levels of automation here:

https://www.nhtsa.gov/vehicle-safety/automated-vehicles-safety

By all the definitions laid out in that article:

BlueCruise, Super Cruise, and Mercedes' system are Level 3 systems (you must be alert and ready to re-engage when the conditions for their operation no longer apply).

Tesla's FSD is a Level 3 system (the system will warn you when you must re-engage, for any reason).

Waymo and Cruise are Level 4 systems (geolocked).

Level 5 systems don't exist.

What we don't have is any kind of federal laws:

https://www.ncsl.org/transportation/autonomous-vehicles

Separated into two sections – voluntary guidance and technical assistance to states – the new guidance focuses on SAE international levels of automation 3-5, clarifies that entities do not need to wait to test or deploy their ADS, revises design elements from the safety self-assessment, aligns federal guidance with the latest developments and terminology, and clarifies the role of federal and state governments.

The guidance reinforces the voluntary nature of the guidelines and *does not come with a compliance requirement or enforcement mechanism*.

(emphasis mine)

The U.S. has operated on a "states are laboratories for laws" principle since its founding. The current situation is in line with that principle.

These are not my opinions, these are all facts.

[-] doubtingtammy@lemmy.ml 12 points 9 months ago

It's not logical, it's ideological. It's the ideology that allows corporations to run a dangerous experiment on the public without their consent.

And where's the LIDAR again?

[-] iamtrashman1312@lemmy.world 12 points 9 months ago

So your stance is literally "human lives are a worthy sacrifice for this endeavor"

[-] MenigPyle@feddit.dk 4 points 9 months ago

Username checks out.

[-] PresidentCamacho@lemm.ee 4 points 9 months ago* (last edited 9 months ago)

My argument is that self-driving car fatalities have to be compared against human-driven car fatalities. If self-driving cars kill 500 people a year, but humans kill 1000 people a year, which one is better? Logic clearly isn't your strong suit, maybe sit this one out...

[-] fuckingkangaroos@lemm.ee 3 points 9 months ago

They're saying if this endeavor is overall saving lives then leave it alone...

[-] Tja@programming.dev 3 points 9 months ago

But... Panel gaps!

[-] mortemtyrannis@lemmy.ml 2 points 9 months ago

Knock knock

“Who is it?”

“Goons”

“Hired Goons”

[-] mojofrododojo@lemmy.world 23 points 9 months ago

this is bullshit.

A human can be held accountable for their failure. Bet you a fucking emerald mine Musk won't be held accountable for these and all the other foolish self-driving fuckups.

[-] sabin@lemmy.world 8 points 9 months ago

So you'd rather live in a world where people die more often, just so you can punish the people who do the killing?

[-] mojofrododojo@lemmy.world 12 points 9 months ago

That's a terrifically misguided interpretation of what I said, wow.

LISTEN UP BRIGHT LIGHTS, ACCOUNTABILITY ISN'T A LUXURY. It's not some 'nice to have add-on'.

Musk's gonna find out. Gonna break all his fanboys' hearts too.

[-] sabin@lemmy.world 7 points 9 months ago

Nothing was misguided, and if anything your tone-deaf attempt to double down only proves the point I'm making.

This stopped being about human deaths for you a long time ago.

Let's not even bother to ask the question of whether or not this guy could ultimately be saving lives. All that matters to you is that you have a target to take your anger out on in the event that a loved one dies in an accident or something.

You are shallow beyond belief.

[-] mojofrododojo@lemmy.world 7 points 9 months ago

This stopped being about human deaths for you a long time ago.

Nope, it's about accountability. The fact that you can't see how important accountability is just says you're a Musk fanboy. If Musk would shut the fuck up and do the work, he'd be better off. Instead he's cheaping out left and right on literally life-dependent tech so Tesla's stock gets a bump. It's ridiculous, like your entire argument.

[-] sabin@lemmy.world 3 points 9 months ago

I don't give a fuck about Musk. I think his Hyperloop is beyond idiotic and nothing he makes fucking works. In fact, I never even said I necessarily think the state of Tesla Autopilot is acceptable. All I said was that categorically rejecting autopilot (even for future generations where the tech can be much better) for the express purpose of being able to prosecute people is beyond empty and shallow.

If you need to make up lies about me and strawman me to disagree, you only prove my point. You stopped being a rational agent who weighs the good and bad of things a long time ago. You don't care about how good the autopilot is or can be. All you care about is your mental fixation against the CEO of the company in question.

Your political opinions should be based on principles, not whatever feels convenient in the moment.

[-] mojofrododojo@lemmy.world 1 points 9 months ago

You stopped being a rational agent who weighs the good and bad of things a long time ago.

sure thing, you stan musk for no reason, and call me irrational. pfft. gonna block you now, tired of your bullshit

[-] TypicalHog@lemm.ee 3 points 9 months ago

Where did I say that a human shouldn't be held accountable for what their car does?

this post was submitted on 27 Apr 2024
897 points (100.0% liked)
