this post was submitted on 02 Apr 2025
1131 points (98.9% liked)
Technology
you are viewing a single comment's thread
Let's get this out of the way: Felon Musk is a Nazi asshole.
Anyway, it should be criminal to do these comparisons without showing human drivers' statistics for reference. I'm so sick of articles that leave out hard data. Show me deaths per billion miles driven for Tesla, competitors, and humans.
Then there's shit like the Boca Raton crash, where they mention the car going 100 in a 45 and killing a motorcyclist, then go on to say the only way to do that is to physically use the gas pedal, and that doing so disables emergency braking. Is it really a self-driving car at that point, when a user must actively engage to disable portions of the automation? If you take an action to override stopping, it's not self-driving. Stopping is a key function of how self-driving tech self-drives. It's not like the car swerved into another lane and nailed someone; the driver literally did this.
Bottom line, I look at the media around self-driving tech as sensationalist. Danger drives clicks. Felon Musk is a Nazi asshole, but self-driving tech isn't made by the guy; it's made by engineers. I wouldn't buy a Tesla unless he had no stake in the business, but I do believe people are far more dangerous behind the wheel in basically all typical driving scenarios.
In Boca Raton, I've seen no evidence that the self-driving tech was inactive. According to the government, it is reported as a self-driving accident, and according to the driver in his court filings, it was active.
Insanely, you can slam on the gas in Tesla's self-driving mode, accelerate to 100 MPH in a 45 MPH zone, and strike another vehicle, all without the vehicle's "traffic aware" automation effectively applying a brake.
That's not sensationalist. That really is just insanely designed.
FTFA:
If the guy smashes the gas, just like in cruise control, I would not expect the vehicle to stop itself.
The guy admitted to being intoxicated and held the gas down... What's the self-driving contribution to that?
I know what's in the article, boss. I wrote it. No need to tell me FTFA.
TACC stands for Traffic Aware Cruise Control. If I have a self-driving technology like TACC active, and the car's sensor suite detects traffic immediately in front of me, I would expect it to reduce speed (as is its advertised function). I would expect that to override gas pedal input: in cruise control, the pedal only raises your maximum speed, and the software should still function as advertised rather than just run at that maximum.
I would not expect it to fail to detect the motorcyclist and plow into them at speed. I think we can all agree that is a bad outcome for a self-driving system.
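To make that expectation concrete, here's a rough sketch of the arbitration I'm describing. This is purely illustrative Python: the function, names, and numbers are all made up, and it has nothing to do with Tesla's actual software.

```python
# Illustrative sketch only: the arbitration I'd expect from a traffic-aware
# cruise controller. All names, units, and thresholds are hypothetical;
# this is not Tesla's code or API.
from typing import Optional

def commanded_speed(set_speed_mph: float,
                    pedal_request_mph: float,
                    lead_vehicle_detected: bool,
                    safe_follow_speed_mph: Optional[float]) -> float:
    """Speed the car should actually drive at."""
    # The pedal (or the set speed) only raises the ceiling...
    requested = max(set_speed_mph, pedal_request_mph)
    # ...but detected traffic should cap it, regardless of pedal input.
    if lead_vehicle_detected and safe_follow_speed_mph is not None:
        return min(requested, safe_follow_speed_mph)
    return requested

# Hypothetical Boca-style scenario: set speed 45, driver flooring it to 100,
# motorcycle detected ahead with a safe following speed of, say, 20 mph.
print(commanded_speed(45, 100, True, 20))     # -> 20: slow down, don't plow through
print(commanded_speed(45, 100, False, None))  # -> 100: no traffic ahead, pedal wins
```

From the reporting, the pedal override wins outright instead (and apparently disables the automatic braking), which is exactly the design I'm calling insane.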
Here's the manual, if you're curious. It doesn't work in bright sunlight, fog, on excessively curvy roads (???), in situations with oncoming headlights (!?!), or if your cameras are dirty or covered with a sticker. They also helpfully specify that "The list above does not represent an exhaustive list of situations that may interfere with proper operation of Traffic-Aware Cruise Control," so it's all that shit, plus anything else: if you die or kill somebody, you have just found another situation that may interfere with proper function of the TACC system.
https://www.tesla.com/ownersmanual/2012_2020_models/en_us/GUID-50331432-B914-400D-B93D-556EAD66FD0B.html#%3A%7E%3Atext=Traffic-Aware+Cruise+Control+determines%2Cmaintains+a+set+driving+speed.
So do you expect self-driving tech to override human action, or do you expect human action to override self-driving tech?
I expect the human to override the system, not the other way around. Nobody claims to have a system that requires no human input, aside from limited and experimental implementations that are not road legal nationwide. I kind of expect human input to override the robot, given the fear of robots making mistakes, even though the humans behind them get into them drunk and hold down the throttle until they turn motorcyclists into red mist. But that's my assumption.
With the Boca one specifically, the guy got in his car inebriated. That was the first mistake, and it caused a problem that never should have happened. If the car were truly self-driving and automated, with no user input, this wouldn't have happened. It wouldn't have gone nearly 2.5x the speed limit. It would have braked long before hitting someone in the road.
I have a Ninja 650. We all know the danger comes from things we cannot control, such as other people. I'd trust an actually automated car over a human driver every time, even with limited modern tech. The second the user gets an input, though? Zero trust.
The driver being drunk doesn't mean the self-driving feature should not detect motorcycles. The human is a fallback to the tech. The tech had to fail for this fatal crash to occur.
If the system is advertised as overriding human speed inputs (Traffic-Aware Cruise Control is supposed to brake when it detects traffic, regardless of pedal inputs), then it should function as advertised.
Incidentally, I agree; I broadly trust automated cars to act more predictably than human drivers. In the case of Teslas specifically and motorcycles specifically, though, it looks like something is going wrong. That's what the data says, anyhow. If the government were functioning how it should, the tech would be disabled during the investigation, which is ongoing.
He may not be an engineer, but he's the one who made the decision to use strictly cameras rather than lidar, so yes, he's responsible for these fatalities that other companies don't have. You may not be a fan of Musk, but it sounds like you're a fan of Tesla.
"Critical Thinker" Yikes. Somehow the right made that a forbidden word in my mind because they hide behind that as an excuse for asking terrible questions etc.
Anyway. Allegedly the statistics are rather mediocre for self-driving cars, but sadly I haven't seen a good statistic on that either. The issue here is that the automatable tasks are the lower-risk driving situations, so getting a good statistic is near impossible. E.g., miles driven are heavily skewed when the system is only engaged on highways. There are no simple numbers that will tell you anything of worth; see the toy numbers below.
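As a toy example of that skew (every number below is invented, purely to show the effect):

```python
# Toy example of why raw "deaths per mile" comparisons mislead when the
# automation only runs in the easiest conditions. Every number here is made up.

human = {
    "highway": {"miles": 2_000_000_000, "deaths": 10},  # low-risk miles
    "city":    {"miles": 1_000_000_000, "deaths": 30},  # high-risk miles
}
automated = {
    "highway": {"miles": 500_000_000, "deaths": 2.5},   # same per-mile rate as humans
    "city":    {"miles": 0,           "deaths": 0},     # never engaged where it's hard
}

def deaths_per_billion_miles(data):
    miles = sum(v["miles"] for v in data.values())
    deaths = sum(v["deaths"] for v in data.values())
    return deaths / miles * 1e9

print(f"human:     {deaths_per_billion_miles(human):.1f}")      # ~13.3
print(f"automated: {deaths_per_billion_miles(automated):.1f}")  # 5.0
```

Per highway mile the two are identical here (5 deaths per billion miles), yet the aggregate number makes the automated system look almost three times safer, simply because it skips the hard miles. That's why a single headline number tells you nothing.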
That being said, the title should be about the specific mistake that happened, without sweeping statements (i.e., "self-driving is bad because motorcyclists die").
Did I ask a terrible question, or do you just not like anything objective being said about the issue? I'm so far over on the left side ideologically that you'd be hard-pressed to find an issue that I'm conservative on. I don't fit the Dem mold, though; I'm more of a Bernie type... though I am very critical in general. I don't just take things at face value. Anywho...
Saying that the statistics aren't great just lends credence to the point that we can't objectively determine how safe or unsafe anything is without good data.