
In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also plowing into a mannequin of a child. The Tesla was also fooled by simulated rain and fog.

50 comments
[–] [email protected] 142 points 1 week ago (23 children)

I hope some of you actually skimmed the article and got to the "disengaging" part.

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.
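
To make that concern concrete, here's a toy Python sketch of the kind of filter a regulator could run over crash logs. The record format and field names below are invented for illustration; this is not Tesla's actual telemetry or any real log schema.

```python
# Toy sketch (made-up log format): flag crashes where driver assistance
# disengaged only a fraction of a second before impact.
from dataclasses import dataclass

@dataclass
class CrashRecord:
    crash_id: str
    disengage_time_s: float  # hypothetical: when the ADAS reported itself off
    impact_time_s: float     # hypothetical: when the collision occurred

def suspicious_disengagements(records, window_s=1.0):
    """Return crashes where the ADAS shut off within `window_s` seconds of impact.

    A disengagement that close to impact leaves the driver no realistic
    time to react, which is the concern raised in the article.
    """
    return [
        r for r in records
        if 0.0 <= r.impact_time_s - r.disengage_time_s <= window_s
    ]

crashes = [
    CrashRecord("a", disengage_time_s=99.7, impact_time_s=100.0),  # 0.3 s before impact: flagged
    CrashRecord("b", disengage_time_s=80.0, impact_time_s=100.0),  # 20 s before impact: not flagged
]
print([r.crash_id for r in suspicious_disengagements(crashes)])  # ['a']
```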

[–] [email protected] 24 points 1 week ago* (last edited 1 week ago) (7 children)

To be fair, if you were to construct a wall and paint it to look exactly like the road, people would run into it as well. That being said, Tesla shouldn't rely on cameras alone.

Edit: having just watched the video, that was a very obvious fake wall. You can see the outlines of it pretty well. I'm also surprised it failed other tests when not on Autopilot; that seems pretty fucking dangerous.

[–] [email protected] 29 points 1 week ago* (last edited 1 week ago) (3 children)

To be fair, if you were to construct a wall and paint it to look exactly like the road, people would run into it as well.

This isn't being fair. It's being compared to other, better autopilot systems that use both LIDAR and radar, in addition to visible-light and infrared cameras, to sense the world around them.

Teslas use only visible-light and infrared cameras. Neither LIDAR nor radar would have been deceived by a painted wall.
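
A rough sketch of why that matters: a camera-based depth estimate infers distance from appearance, so a flat image of open road can fool it, while LIDAR and radar measure actual time-of-flight to the physical surface. Everything below (function names, the fusion rule, the numbers) is invented to illustrate the idea, not how any real autonomy stack works.

```python
# Illustration only: why an active range sensor catches a painted wall
# that a camera-only depth estimate can miss.

def camera_depth_estimate_m(scene) -> float:
    # A vision model infers depth from appearance; a wall painted to look
    # like open road can yield a spuriously large depth estimate.
    return scene["apparent_depth_m"]

def lidar_range_m(scene) -> float:
    # LIDAR times reflected laser pulses, so it measures the physical
    # surface regardless of what is painted on it.
    return scene["true_distance_m"]

def should_brake(scene, threshold_m=30.0) -> bool:
    # Toy fusion rule: trust the closest obstacle reported by any sensor.
    nearest = min(camera_depth_estimate_m(scene), lidar_range_m(scene))
    return nearest < threshold_m

painted_wall = {"true_distance_m": 20.0, "apparent_depth_m": 200.0}
print(should_brake(painted_wall))                    # True: LIDAR sees the wall
print(camera_depth_estimate_m(painted_wall) < 30.0)  # False: camera alone is fooled
```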

[–] [email protected] 16 points 1 week ago* (last edited 1 week ago) (5 children)

The video does bring up human perception too, with the fog test ("Optically, with my own eyes, I can no longer see there's a kid through this fog. The lidar has no issue."). But, as they show, this wall is extremely obvious to a human driver.

[–] [email protected] 7 points 1 week ago (1 children)

Yeah, the Road Runner could easily skip past such barriers, frustrating the Coyote to no end. Tesla is not a Road Runner.

[–] [email protected] 46 points 1 week ago (1 children)

It's a highly questionable approach that has raised concerns over Tesla trying to evade guilt by automatically turning off any possibly incriminating driver assistance features before a crash.

So, who's the YouTuber that's gonna test this out, now that Elmo has pushed his way into the government to quash any investigation into it?

[–] [email protected] 14 points 1 week ago (4 children)

It basically already happened in the Mark Rober video: it turns off by itself less than a second before impact.
