Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • anlumo@feddit.de · 1 year ago

    Anybody with even a basic idea of what an autopilot in a plane actually does is not misled. Do people really think that commercial airline pilots just hit the “autopilot” button in the cockpit after the boarding ramp disengages and then lean back until the ramp at the destination is attached?

    • r00ty@kbin.life · 1 year ago

      They’re not buying a plane though. They’re buying a car with an autopilot that is labeled as “full self driving”. That term does imply it will handle a complete route from A to B.

      People are wrongly buying into the marketing hype and that is causing crashes.

      I’m very concerned about some of the things I’ve seen regarding FSD on Teslas. Such as sudden hard braking on highways, failing to avoid an accident (but it’s OK, it disengaged seconds before impact, so the human was in control), and of course the viral video of FSD trying to kill a cyclist.

      They should not be allowed to market the feature this way and I don’t think it should be openly available to normal users as it is now. It’s just too dangerous to put in the hands (or not) of normal drivers.

      • Ocelot@lemmies.world · 1 year ago

        Autopilot has never been “Full Self Driving”. FSD is an additional $15,000 package on top of the car. Autopilot is the free system providing lane keeping with adaptive cruise, same as “Pro Pilot Assist” or “Honda Sensing” or any of the other packages from other car companies. The only difference is that when someone gets in an accident using one of those other systems, it never makes headlines.

      • anlumo@feddit.de · 1 year ago

        I’ve never sat in a Tesla, so I’m not really sure, but based on the things I’ve read online, autopilot and FSD are two different systems on Tesla cars you can engage separately. There shouldn’t be any confusion about this.

        • Miqo@lemmy.world · 1 year ago

          I’ve never sat in a Tesla, so I’m not really sure

          There shouldn’t be any confusion about this.

          U wot m8?

        • r00ty@kbin.life · 1 year ago

          Well, if it’s just the lane-assistance Autopilot that is causing this kind of crash, I’d agree it’s likely user error. The reason I say “if” is that I don’t trust journalists to know, or report on, the difference.

          I am still concerned the FSD beta is “out there” though. I do not trust normal users to understand what beta means, and of course no-one is going to read the agreement before clicking agree. They just want to see their car drive itself.

          • anlumo@feddit.de · 1 year ago

            If it were about the FSD implementation, things would be very different. I’m pretty sure that the FSD is designed to handle cross traffic, though.

            I do not trust normal users to understand what beta means

            Yeah, Google kinda destroyed that word in the public consciousness when they ran their search with the beta flag for more than a decade while growing into one of the biggest companies on Earth with it.

            When I first heard about it, I was very surprised that the US even allows vehicles with beta self-driving software on public roads. That’s like testing a new firefighting truck by randomly setting buildings on fire in a city and then trying to put them out with the truck.

          • Ocelot@lemmies.world · 1 year ago

            Yeah, I don’t trust a machine that has been trained for millions of hours and simulated every possible traffic scenario tens of millions of times and has millisecond reaction time while seeing the world in a full 360 degrees. A system that never drives drunk, distracted or fatigued. You know who’s really good at driving though? Humans. Perfect track record, those humans.

    • Einar@lemm.ee · 1 year ago

      So I need to understand the autopilot of a plane first before I buy a car?

      I would be misled then, as I have no idea how such autopilots work. I also suspect that those two systems don’t really work the same. One flies, the other drives. One has traffic lights, the other doesn’t. One is operated by well-paid professionals, the other, well, by me. Call me simple, but there seem to be some major differences.

      • Caculon@lemmy.world · 1 year ago

        I would have thought people read “autopilot” and think “automatic”. At least that’s what I do. I guess “pilot” is closely associated with planes, but that certainly isn’t what I think of.

      • CmdrShepard@lemmy.one · 1 year ago

        This is a pretty absurd argument. You could apply this to literally any facet of driving.

        “I have to learn what each color of a traffic light means before driving?”

        “I have to learn what white and yellow paint means and dashes versus lines? This is too confusing”

        God help you when you get to 4-way stops and roundabouts.

        • Einar@lemm.ee · 1 year ago

          Not absurd, but reality. We do that in driving school.

          I don’t know where you are from and which teaching laws apply, of course, but I definitely learned all those lessons you mentioned.

          • CmdrShepard@lemmy.one · 1 year ago

            That’s precisely my argument and why “learning my new car’s features is too confusing” is an absurd argument.

      • anlumo@feddit.de · 1 year ago

        Yeah, there are some major differences in the vehicles, but both disengage when anything out of the ordinary is going on. Maybe people base their understanding of autopilots on the movie “Airplane!”, where the inflatable puppet gropes the stewardess afterwards.

          • anlumo@feddit.de · 1 year ago

            True, good point. As far as I know, it does turn itself off if it detects something it can’t handle, though. The problem with cross traffic is that it evidently can’t detect it; otherwise, turning itself off would already be a way of handling it.

            Proximity detection is far easier up in the air, especially if you’re not bound by the weird requirement to only use visible spectrum cameras.

            (To make things clear, I’m just defending the engineers there who had to work within these constraints. All of this is a pure management failure.)

          • Ocelot@lemmies.world · 1 year ago

            I’m sorry, what? If you set an airplane to maintain altitude and heading with autopilot, it will 100% fly you into the side of a mountain if there’s one in front of you.

    • El_illuminacho@lemmy.world · 1 year ago

      Why do you think companies need to print warnings like “Caution: Contents are hot” on paper coffee cups? People are stupid.

      • anlumo@feddit.de · 1 year ago

        Those labels are there because people made a quick buck suing the companies when they messed up, not to protect the stupid customers.

        If the courts applied a reasonable level of common sense, those labels wouldn’t exist.