In the piece, titled “Can You Fool a Self Driving Car?”, Rober found that a Tesla on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it: the electric vehicle plowed right through the barrier instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing into a child-sized mannequin. The Tesla was likewise fooled by simulated rain and fog.

  • LemmyFeed@lemmy.dbzer0.com · 13 days ago

    Don’t get me wrong, Autopilot turning itself off right before a crash is sus, and I wouldn’t put it past Tesla to do something like that (I mean, come on, why don’t they use lidar?), but maybe it’s so the car doesn’t try to keep powering the wheels after impact, which could make the crash worse.

    On the other hand, they’re POS cars and the autopilot probably just shuts off cause of poor assembly, standards, and design resulting from cutting corners.

    • T156@lemmy.world · 13 days ago

      Rober seems to think so, since he says in the video that it’s likely disengaging because the parking sensors detect the object in front, conclude the car is parked, and shut off the cruise control.

    • skuzz@discuss.tchncs.de · 13 days ago

      Normal cars do whatever is in their power to cease movement while staying upright. In a wreck, the safest state for a car is stationary.

    • FiskFisk33@startrek.website · 13 days ago

      If it can actually sense that a crash is imminent, why wouldn’t it be programmed to slam the brakes instead of just turning off?

      Do they have a problem with false positives?
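Neither commenter’s hypothesis can be verified from the video alone, but the trade-off the thread is circling is easy to state in code. Below is a purely hypothetical sketch: the names, thresholds, and structure are invented for illustration and are not Tesla’s actual software. It contrasts the two behaviors being debated, braking hard when an obstacle is detected with high confidence versus quietly dropping out of cruise control when the sensors conclude the car is parked against something.

```python
# Hypothetical sketch of the two disengagement behaviors debated above.
# None of these names or thresholds come from any real driver-assist
# system; they only illustrate the false-positive trade-off.

from dataclasses import dataclass

@dataclass
class Perception:
    obstacle_ahead: bool        # e.g. from cameras or parking sensors
    obstacle_confidence: float  # 0.0..1.0, hypothetical detection score
    speed_mps: float            # current speed in meters per second

# Assumed threshold: set it too low and false positives cause phantom
# braking; set it too high and a convincing fake wall slips through.
BRAKE_CONFIDENCE = 0.9

def driver_assist_step(p: Perception) -> str:
    """Return the action a hypothetical assist system takes this tick."""
    if p.obstacle_ahead and p.obstacle_confidence >= BRAKE_CONFIDENCE:
        # The behavior FiskFisk33 asks about: commit to emergency braking.
        return "emergency_brake"
    if p.obstacle_ahead and p.speed_mps < 1.0:
        # The behavior T156 describes Rober suggesting: an object is
        # pressed against the sensors at near-zero speed, so the system
        # decides it is parked and simply drops out of cruise control.
        return "disengage_cruise_control"
    return "continue"

# A low-confidence wall detection at speed neither brakes nor disengages;
# only after impact (speed near zero, object against the sensors) does
# the parked-car branch fire and the system switch itself off.
print(driver_assist_step(Perception(True, 0.4, 30.0)))   # continue
print(driver_assist_step(Perception(True, 0.4, 0.5)))    # disengage_cruise_control
print(driver_assist_step(Perception(True, 0.95, 30.0)))  # emergency_brake
```

Under this toy model, both observations from the thread can be true at once: a painted wall that never clears the confidence threshold produces no braking at all, and the disengagement seen in the footage is just the low-speed parked-car branch firing after the collision has already happened.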