A 56-year-old Snohomish man had set his Tesla Model S on Autopilot and was looking at his cellphone on Friday when he struck and killed a motorcyclist in front of him in Monroe, court records show.

A Washington State Patrol trooper arrested the Tesla driver at the crash site on Highway 522 at Fales Road shortly before 4 p.m. on suspicion of vehicular manslaughter, according to a probable cause affidavit.

The motorcyclist, Jeffrey Nissen, 28, of Stanwood, died at the scene, records show.

The Tesla driver told a state trooper he was driving home from having lunch in Bothell and was looking at his phone when he heard a bang and felt his car lurch forward, accelerate and hit the motorcyclist, according to the affidavit.

The man told the trooper his Tesla got stuck on top of the motorcyclist and couldn’t be moved in time to save him, the affidavit states.

The trooper cited the driver’s “inattention to driving, while on autopilot mode, and the distraction of the cell phone while moving forward,” and trusting “the machine to drive for him” as probable cause for a charge of vehicular manslaughter, according to the affidavit.

The man was booked into the Snohomish County Jail and was released Sunday after posting bond on his $100,000 bail, jail records show.

  • FuglyDuck@lemmy.world · 7 months ago

    this guy should get everything coming to him. FSD/autopilot is not good enough to take your hands off the wheel and not pay attention to what the fuck’s going on around you.

    that said, Tesla absolutely should get hit with a massive wrongful-death lawsuit and get fucked by the courts.

    • jmcs@discuss.tchncs.de · 7 months ago

      Everyone from the driver, to whoever certified the car as roadworthy, to Elon Musk should be held responsible. In reality, I would be surprised if anyone except the driver even sees the inside of a courtroom.

      • FuglyDuck@lemmy.world · 7 months ago

        yup. everyone else has very expensive lawyers.

        reality is, Teslas are a shit product with false advertising.

    • Flying Squid@lemmy.world · 7 months ago

      I’m honestly not sure which is worse, that Tesla made a system they call Autopilot that isn’t an autopilot or that Tesla owners still think it is.

        • Flying Squid@lemmy.world · 7 months ago

          Yeah, but it was made clear years ago, even by Tesla themselves, that “autopilot” doesn’t actually mean “autopilot.” So maybe it’s 50/50?

          • Bonehead@kbin.social · 7 months ago (edited)

            They still sell it as “Autopilot and Full Self Driving”. Sure, they claim “Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment”, but that’s buried in the marketing text that no one reads. It really should be called “Enhanced Lane Assist with Auto-Follow Cruise Control”, but that doesn’t sell as many cars.

            • azertyfun@sh.itjust.works · 7 months ago

              No matter how they’re marketed and used, self-driving systems will make people less engaged (that’s the entire point, people don’t use it out of arm fatigue, they use it because it’s mentally relaxing!) and therefore more distracted.

              “The driver should keep their full attention on the road and be prepared to take over at any point” is an impossible standard and a lame-ass loophole that shouldn’t even be allowed to be cited in a court of law. Fully engaged drivers do not ask an “autopilot” to steer for them.

            • AA5B@lemmy.world · 7 months ago

              When I tried it out, after your hands are off the wheel for several seconds, the screen flashes blue, then Autopilot disengages and the car comes to a stop. I’m not excusing what it can and can’t do, but you can’t really drive hands-free without intentionally working around very obvious restrictions. The driver can’t claim ignorance.

      • FuglyDuck@lemmy.world · 7 months ago

        naw. I think the worst part is when, in tests (because that’s the only place they can turn it on in most states), it’ll slow down for an object in the road and then decide to floor it.

        • nxdefiant@startrek.website · 7 months ago (edited)

          You can turn it on anywhere in the U.S. I’m not sure if it’s geolocked elsewhere. You might be confusing it with GM, Ford, Mercedes, and other systems which only work on certain stretches of certain roads.

    • machinin@lemmy.world · 7 months ago

      FSD/autopilot is not good enough to take your hands off the wheel and not pay attention to what the fuck’s going on around you.

      What? There’s a video on Tesla’s website right now that says the driver is at the wheel only for legal reasons. There is no other purpose to have a driver.

      I’m stumped!

      • FuglyDuck@lemmy.world · 7 months ago

        Yeah. How many times did they say “Cybertruck is coming out this year,” only for it not to? I suspect they got it into production by mashing parts together because of consumer-protection rules coming into effect. They’re full of shit, and the “legal reasons” are there to make the driver responsible instead of them.

    • MakePorkGreatAgain@lemmy.basedcount.com · 7 months ago

      And you’re not even supposed to have a cellphone in use while driving in WA; that’s an automatic ticket… though the police have to catch you doing it first.

      • just_change_it@lemmy.world · 7 months ago (edited)

        Same deal almost everywhere… but firsthand experience is that a significant portion of all drivers have their phone out.

        Would love to see some proportional crash rates of autopilot use vs not autopilot use too. People focus on things like crash totals or death totals. 17 deaths is a tragedy to be sure.

        That being said, when the US has over 40,000 auto deaths per year… and this article is telling me only 17 deaths are in any way involved with Autopilot since 2019… I really wonder why this is somehow more outrageous than the ~240,913 other vehicle deaths in the US since 2019. Given that Tesla is about 5% of all autos in the US, I would expect Tesla deaths to be about 12,000 in that period, or 5%.

        Are so few people using autopilot? Shouldn’t the autopilot death toll be something closer to the 2000 deaths per year one would expect statistically from Tesla Drivers?

        Is autopilot much safer than human drivers? Is it more dangerous?

        Is Autopilot + Attentive safer than just attentive?

        Is the 40k deaths per year not something that should be considered simply because people stop thinking of so many deaths as a tragedy and just think of it as a statistic?

        Is the outrage and focus on car self driving just an extension of human phobia of technology and articles allow for people to have anecdotal confirmation bias?
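The back-of-the-envelope comparison in this comment can be written out in a few lines. This is only a sketch of the commenter's own arithmetic; every figure below comes from the comment or the article it cites, and none has been independently verified.

```python
# Sanity check of the figures quoted in the comment above.
# All numbers are taken from the thread, not independently verified.
total_us_deaths = 240_913   # claimed US road deaths since 2019
tesla_share = 0.05          # claimed Tesla share of US vehicles
autopilot_deaths = 17       # Autopilot-linked deaths cited since 2019

# If Tesla crashes tracked their share of the fleet, we'd expect:
expected = total_us_deaths * tesla_share

print(f"expected Tesla-involved deaths: ~{expected:,.0f}")  # ~12,046
print(f"reported Autopilot-linked deaths: {autopilot_deaths}")
print(f"Autopilot share of the expected figure: {autopilot_deaths / expected:.2%}")
```

The gap between ~12,000 expected Tesla-involved deaths and 17 reported Autopilot-linked deaths is exactly the ambiguity the comment raises: it could mean Autopilot is rarely engaged, or that it is safer, or simply that the two numbers count different things.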

    • AA5B@lemmy.world · 7 months ago

      I wish there were some sort of safe harbor for publishing the details; whatever liability they deserve shouldn’t be affected by being honest and transparent. It would be very useful to know what the car detected or didn’t, and what mode it was in, but I’m sure their lawyers will keep any details to a minimum for liability reasons.

      • FuglyDuck@lemmy.world · 7 months ago

        If I ever get into a self driving car and let it do its thing, it’s going to be fucking open source.

        There is zero chance of me trusting Tesla or any other giant corpo with my life and safety, and the life and safety of everyone around me.