olliej 12 days ago

It remains absurd that we have a mechanism on the road that allows “self driving” cars, but only if there’s a person in the driver’s seat with greater focus and skill than an airline pilot.

x86x87 12 days ago

> After closing the first investigation, regulators opened another, this one into whether that recall to install new Autopilot safeguards was adequate.

Tesla in a nutshell.

mrkandy 12 days ago

3,855/278,870,463 = 0.00138% - average for all

13/6,000,000 = 0.00021% - average per tesla
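
For anyone who wants to check, here is the same arithmetic as a rough Python sketch (the labels on the denominators are my reading of the figures; note they count vehicles, not miles driven):

    # Back-of-the-envelope fatality rates from the figures quoted above.
    # Denominators are vehicle counts, not miles driven, so this is at
    # best a very rough comparison.
    all_fatalities = 3_855
    all_vehicles = 278_870_463     # presumably U.S. registered vehicles
    tesla_fatalities = 13
    teslas_sold = 6_000_000        # Teslas sold worldwide

    print(f"{all_fatalities / all_vehicles:.5%}")   # ~0.00138% - average for all
    print(f"{tesla_fatalities / teslas_sold:.5%}")  # ~0.00022% - per Tesla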

  • 7e 12 days ago

    Are you trolling, or a pro-Tesla bot?

    It's not 13 fatalities. It's 13 incidents involving a total of 29 deaths. https://www.tesladeaths.com has the total fatality count involving Slaughterpilot at 42.

    Also, you're omitting all the deaths in Teslas for which it hasn't yet been proven that "Autopilot" was involved. Teslas are driven by more affluent people, are newer, and incorporate safer non-Autopilot tech than the average car.

    Finally, 6M is the worldwide number of cars Tesla has sold, which you're comparing to a U.S. count of non-Teslas.

    • snapplebobapple 11 days ago

      The math is wrong, but the idea isn't. It's never going to be zero; what matters is how it compares to humans driving.

    • 7e 11 days ago

      tesladeaths.com has the total number of Tesla fatalities, from any cause, at 500.

al_borland 12 days ago

It seems like this number needs context. Ideally computers would never make mistakes, but are these numbers better than a human driver? By how much?

I see people distracted and not paying attention every time I drive, and they don’t have anything in the car helping them. The only reason they don’t hit me is because I move out of the way when they drift into my lane.

  • vagrantJin 12 days ago

    Such being the case, a person recklessly endangering your life can and will be held to account. I'm afraid human lives being lost because of a bug in the software is less than tolerable. Who can hold the software to account? The court? Internal QA?

    If these systems actually did the hard part of driving - avoiding crashes with super-human reaction times - then they would be extremely valuable. But that's not the case, and they are essentially worthless. Might as well hire a driver if you feel driving isn't for you, and save yourself and the whole planet the trouble until these systems actually do the stuff that matters.

    • add-sub-mul-div 12 days ago

      Another problem with bugs is that they'll affect the entire fleet overnight with a software update. But human behavior (collectively) is predictable and stable.

  • hnlmorg 12 days ago

    Agreed though it’s also worth noting that every time an autopilot fatality hits the news it’s combined with the driver also not paying attention. Often it’s claimed that the driver isn’t paying attention because of their faith in autopilot. And that is the real issue. Even Tesla themselves say it shouldn’t be relied upon, despite what their marketing suggests.

  • creativeSlumber 12 days ago

    > but are these numbers better than a human driver

    This is the wrong way to look at it. Recently a Tesla just ran over a motorcyclist. These people would have been alive if it was a normal human driving. These are failure modes that did not exist before. Also, we have adapted to driving alongside other human drivers. Bikers are taught to make sure they are seen on the road, make eye contact with drivers, etc. How do you make sure a Tesla's ML model recognizes you? What does that recognition even mean?

    • ripjaygn 12 days ago

      > These people would have been alive if it was a normal human driving

      That's the wrong way to look at it. 109 people were killed on the road today. How many of them would still be alive if Autopilot had been active?

      Assuming your goal was to reduce fatalities.

      • medvezhenok 12 days ago

        The number of deaths matters, but the distribution of deaths also matters. When you drive, you are in control of the amount of risk you take, and your likelihood of a crash - it’s not the same probability for everyone.

  • Veserv 12 days ago

    Nobody knows.

    Tesla does not disclose the information needed to make a proper apples-to-apples comparison to the public or to researchers. They do not disclose the number of miles or the number of accidents, let alone the conditions or circumstances that would be needed to account for bias or confounding variables and compute a comparable metric. In the reports they make to NHTSA [1], which are public by default, they declare the vast majority of the information to be confidential business information, and their lawyers force NHTSA to redact it.

    A company kills a bunch of people and nobody is allowed to know how safe it is because that is confidential. That alone should worry you.

    But that is not all. Tesla does not know either. If you look at the reports, you will see that of the over 1,000 crashes, Tesla declined to investigate whether anybody was injured in ~95% of cases. Of the reported confirmed fatalities, Tesla telemetry missed the majority. I will say that again: they cannot even detect the majority of fatalities while their system is active. Their data collection procedures are so grossly inadequate that you cannot even determine an upper bound on the fatality rate, let alone more nuanced safety measures.

    To quote the safety investigation mentioned in the article [2]: "Tesla largely receives data for crashes only with pyrotechnic deployment, which are a minority of police reported crashes. A review of NHTSA’s 2021 FARS and Crash Report Sampling System (CRSS) finds that only 18 percent of police-reported crashes include airbag deployments." Tesla bases its safety statistics on data collection conditions that under-report actual crash rates by roughly a factor of five on average.
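
    As a quick sanity check of that factor (a rough sketch, taking the 18 percent figure at face value):

        # If only ~18% of police-reported crashes involve a pyrotechnic
        # (airbag) deployment, a crash count built only on such
        # deployments misses roughly 1 / 0.18 of the true total.
        reported_fraction = 0.18
        undercount_factor = 1 / reported_fraction
        print(f"~{undercount_factor:.1f}x")  # ~5.6x, i.e. roughly a factor of five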

    The statistics Tesla releases do not even attempt to account for the glaring discrepancies and under-reporting that are clearly visible in their fatality reports and data collection conditions; instead they present the under-reported figures that benefit them as simple fact. They are intentionally overstating product safety, based on data that any competent professional knows is faulty and inadequate, in order to push product. That level of misconduct is criminal.

    [1] https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...

    [2] https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf

  • Terr_ 12 days ago

    I'd like to point out that accident rate is not the only thing to look at here.

    We also want to know whether the accidents that do happen correlate with factors we wouldn't consider normal among human drivers.

    A contrived example to prove the point: we wouldn't be satisfied with an automatic driver that got into only half the accidents humans did... but also possessed an inexplicable tendency to charge at people wearing certain color combinations of clothes.

  • this_user 12 days ago

    The real question is how many fatal accidents have been prevented by Autopilot.

  • sjrjsksjf 12 days ago

    The root cause of the problem is more important, precisely because people will decide to recontextualise the statistics to justify poor performance. In good faith and in bad.

dools 12 days ago

It does seem rather ridiculous that they’re allowed to call this feature either Autopilot or Full Self-Driving beta.

  • BugsJustFindMe 12 days ago

    Oh, I don't know. When my brain is on autopilot I'm prone to accidents too.

    • sjrjsksjf 12 days ago

      [flagged]

      • madamelic 12 days ago

        Even if self-driving is 1% better in the best conditions, that's a safer car than one driven entirely by humans.

        Humans right now are the best they'll ever be at driving; you can't say the same for self-driving cars. Even with a mediocre system today, it's likely better to use self-driving in good conditions than to drive manually.

        Computers never get tired, distracted, or emotional. The humans' job is to understand when conditions are good for self-driving and when they aren't.

        I'd venture to guess these fatal self-driving accidents are one of the following:

        - freak accidents (i.e. not the self-driving car's fault / it would've happened either way)

        - accidents due to humans trusting the car too much (i.e. we aren't Level 5; be available to take over)

        - the self-driving just being bad. This is the category a lot of people want these to be in.

        To get really morbid in my full-throated apologia: cars are really dangerous. People die in car accidents all the time. The main reason these get attention is because they are notable. It's unfortunately not nationally notable when an entire family gets wiped out on their way to their child's soccer game in a non-self-driving car.

        • dools 12 days ago

          But the article specifically says that the features contributed to the accident in foreseeable ways, meaning a human without the feature would not have had the accident. It may well be a power-user problem, i.e. it makes you better if you use it properly but otherwise makes you worse.

      • dools 12 days ago

        I actually think it was just a joke, acknowledging that FSD and Autopilot are accident-prone and therefore appropriately named.