this post was submitted on 18 Oct 2024
782 points (98.5% liked)

Technology


The U.S. government’s road safety agency is again investigating Tesla’s “Full Self-Driving” system, this time after getting reports of crashes in low-visibility conditions, including one that killed a pedestrian.

The National Highway Traffic Safety Administration says in documents that it opened the probe on Thursday with the company reporting four crashes after Teslas entered areas of low visibility, including sun glare, fog and airborne dust.

In addition to the pedestrian’s death, another crash involved an injury, the agency said.

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”

(page 4) 50 comments
[–] [email protected] 89 points 1 week ago (14 children)

Tesla, which has repeatedly said the system cannot drive itself and human drivers must be ready to intervene at all times.

how is it legal to label this "full self driving"?

[–] [email protected] 5 points 1 week ago

If it took them this long to look at Full Self Driving, I don't have a lot of hope. But I'd like to be pleasantly surprised.

[–] [email protected] 3 points 1 week ago

Investigators will look into the ability of “Full Self-Driving” to “detect and respond appropriately to reduced roadway visibility conditions

They will have to look long and hard...

[–] [email protected] 6 points 1 week ago

Does anyone else find this enraging ?

It’s a decade too late.

[–] [email protected] 2 points 1 week ago (1 children)

This is why you can’t have an AI make decisions on activities that could kill someone. AI models can’t say “I don’t know”; every input is forced to be classified as something they’ve seen before, so they effectively hallucinate when the input is unknown.

[–] [email protected] 3 points 1 week ago (1 children)

I'm not very well versed in this but isn't there a confidence value that some of these models are able to output?

[–] [email protected] 3 points 1 week ago (1 children)

All probabilistic models output a confidence value, and it's very common and basic practice to gate downstream processes around that value. This person just doesn't know what they're talking about. Though, that puts them on about the same footing as Elono when it comes to AI/ML.
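Gating on a confidence value can be as simple as thresholding the top class probability before acting on it. A minimal sketch (the function name, threshold, and fallback value here are illustrative, not from any real driving stack):

```python
def classify_with_gate(probs, threshold=0.9):
    """Return the index of the top class only if the model is
    confident enough; otherwise return "unknown" so a downstream
    fallback (slow down, alert the driver) can take over."""
    # Index of the highest-probability class.
    best = max(range(len(probs)), key=probs.__getitem__)
    if probs[best] < threshold:
        return "unknown"  # low confidence: refuse to classify
    return best

# A confident prediction passes through; an ambiguous one is gated.
classify_with_gate([0.02, 0.03, 0.95])  # → 2
classify_with_gate([0.40, 0.35, 0.25])  # → "unknown"
```

The point is that "I don't know" is just another branch: nothing about a probabilistic classifier forces the system to act on its best guess.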

[–] [email protected] 1 points 1 week ago (3 children)

Right, which is why that marvelous confidence value got somebody run over.

[–] [email protected] 5 points 1 week ago (3 children)

Maybe have a safety feature that refuses to engage self drive if it's too foggy/rainy/snowy.

[–] [email protected] 8 points 1 week ago

Inb4 someone on TikTok shows how to bypass that sensor by jamming an orange in it -__-

[–] [email protected] 4 points 1 week ago

I wonder if they will now find the Emperor has no clothes.

[–] [email protected] 116 points 1 week ago (3 children)

The National Highway Traffic Safety Administration is now definitely on Musk's list of departments to cut if Trump makes him a high-ranking swamp monster.

[–] [email protected] 94 points 1 week ago (1 children)

Why do you think Musk is dumping so much cash into boosting Trump? The plan all along has been to get kickbacks: stopped investigations, dropped lawsuits, gutted regulations against him. Plus subsidies.

Rich assholes don't spend money without an expectation of ROI.

He knows Democrats will crack down on shady practices so Trump is his best bet.

[–] [email protected] 27 points 1 week ago* (last edited 1 week ago)

He's not hoping for a kickback; he's been offered a position as secretary of cost-cutting.

He will be able to directly shut down everything he doesn't like under the pretense of saving money.

Trump is literally campaigning on the fact that government positions are up for sale under his admin.

"I’m going to have Elon Musk — he is dying to do this... We’ll have a new position: secretary of cost-cutting, OK? Elon wants to do that," the former president said.

[–] [email protected] 13 points 1 week ago* (last edited 1 week ago) (6 children)

I thought it was illegal to call it full self driving, so I thought Tesla had something new.
Apparently it's the moronic ASSISTED full self driving the article is about. So, nothing new.
Tesla does not have a legal full self driving system, so why do articles keep pushing the false narrative, even after it's been deemed illegal?

[–] [email protected] 3 points 1 week ago (3 children)

so why do articles keep pushing the false narrative, even after it’s deemed illegal?

The same reason the press insists on calling simple quadcopters "drones". You can't manufacture panic and outrage with an innocuous name.

[–] [email protected] 2 points 1 week ago

It was called by that name at the time the deaths happened.

[–] [email protected] 16 points 1 week ago (2 children)

Assisted full self driving is an oxymoron.

[–] [email protected] 24 points 1 week ago (1 children)

If anyone was somehow still thinking RoboTaxi is ever going to be a thing: no, it's not, because of reasons like this.

[–] [email protected] 26 points 1 week ago (6 children)

It doesn't have to not hit pedestrians. It just has to hit fewer pedestrians than the average human driver.

[–] [email protected] 7 points 1 week ago

The average human driver is tried and held accountable

[–] [email protected] 5 points 1 week ago* (last edited 1 week ago)

That is the minimal outcome for an automated safety feature to be an improvement over human drivers.

But if everyone else is using something you refused to use that would likely have avoided someone's death, while misnaming your feature to mislead customers, then you are in legal trouble.

When it comes to automation you need to be far better than humans because there will be a higher level of scrutiny. Kind of like how planes are massively safer than driving on average, but any incident where someone could have died gets a massive amount of attention.

[–] [email protected] 13 points 1 week ago (1 children)

It needs to be way way better than ‘better than average’ if it’s ever going to be accepted by regulators and the public. Without better sensors I don’t believe it will ever make it. Waymo had the right idea here if you ask me.

[–] [email protected] 1 points 1 week ago (3 children)

But why is that the standard? Shouldn't "equivalent to average" be the standard? Because if self-driving cars can be at least as safe as a human, they can be improved to be much safer, whereas humans won't improve.

[–] [email protected] 3 points 1 week ago (1 children)

I'd accept that if the makers of the self-driving cars can be tried for vehicular manslaughter the same way a human would be. Humans carry civil and criminal liability, and at the moment, the companies that produce these things only have nominal civil liability. If Musk can go to prison for his self-driving cars killing people the same way a regular driver would, I'd be willing to lower the standard.

[–] [email protected] 6 points 1 week ago (6 children)

Sure, but humans are only criminally liable if they fail the "reasonable person" standard (i.e. a "reasonable person" would have swerved out of the way, but you were distracted, therefore criminal negligence). So the court would need to prove that the makers of the self-driving system failed the "reasonable person" standard (i.e. a "reasonable person" would have done more testing in more scenarios before selling this product).

So yeah, I agree that we should make certain positions within companies criminally liable for criminal actions, including negligence.

[–] [email protected] 21 points 1 week ago (4 children)

Exactly. The current rate is 80 deaths per day in the US alone. Even if we had self-driving cars proven to be 10 times safer than human drivers, we’d still see 8 news articles a day about people dying because of them. Taking this as 'proof' that they’re not safe is setting an impossible standard and effectively advocating for 30,000 yearly deaths, as if it’s somehow better to be killed by a human than by a robot.

[–] [email protected] 1 points 1 week ago

The problem with this way of thinking is that there are solutions to eliminate accidents even without eliminating self-driving cars. By dismissing the concern you are saying nothing more than it isn't worth exploring the kinds of improvements that will save lives.

[–] [email protected] 9 points 1 week ago (1 children)

If you get killed by a robot, it simply lacks the human touch.

[–] [email protected] 7 points 1 week ago (3 children)

If you get killed by a robot, you can at least die knowing your death was the logical option and not a result of drunk driving, road rage, poor vehicle maintenance, panic, or any other of the dozens of ways humans are bad at decision-making.

[–] [email protected] 2 points 1 week ago

or a flipped comparison operator, or a "//TODO test code please remove"

[–] [email protected] 7 points 1 week ago* (last edited 1 week ago)

It doesn't even need to be logical, just statistically reasonable. You're literally a statistic anytime you interact w/ any form of AI.

[–] [email protected] 62 points 1 week ago (5 children)

Humans know to drive more carefully in low visibility, and/or to take actions to improve visibility. Muskboxes don't.

[–] [email protected] 8 points 1 week ago (4 children)

I'm not so sure. Whenever there are crappy weather conditions, I see a ton of accidents because so many people just assume they can drive at the posted speed limit safely. In fact, I tend to avoid the highway altogether for the first week or two of snow in my area, because so many people get into accidents (the rest of the winter is generally fine).

So this is likely closer to what a human would do than not.

[–] [email protected] 3 points 1 week ago (1 children)

low visibility, including sun glare, fog and airborne dust

I also see a ton of accidents when the sun is in the sky or if it is dusty out. /s

[–] [email protected] 48 points 1 week ago (1 children)

They also decided to use only cameras and visual cues for driving, instead of also using radar, thermal cameras, or something like that.

It's designed to be launched asap, not to be safe

[–] [email protected] 12 points 1 week ago

I mean, that’s just good economics. I’m willing to bet someone at Tesla has done the calcs on how many people they can kill before it becomes unprofitable

[–] [email protected] 19 points 1 week ago

Muskboxes

like that
