This post was submitted on 16 Jun 2025
1 point (100.0% liked)

Fuck Cars

12311 readers
110 users here now

A place to discuss the problems of car-centric infrastructure and how it hurts us all. Let's explore the bad world of Cars!

Rules

1. Be Civil: You may not agree on ideas, but please do not be needlessly rude or insulting to other people in this community.

2. No hate speech: Don't discriminate or disparage people on the basis of sex, gender, race, ethnicity, nationality, religion, or sexuality.

3. Don't harass people: Don't follow people you disagree with into multiple threads or into PMs to insult, disparage, or otherwise attack them. And certainly don't doxx any non-public figures.

4. Stay on topic: This community is about cars, their externalities in society, car-dependency, and solutions to these.

5. No reposts: Do not repost content that has already been posted in this community.

Moderator discretion will be used to judge reports with regard to the above rules.

top 49 comments
[–] [email protected] 0 points 1 week ago

That's pretty good, they're fairly small targets.

[–] [email protected] 0 points 1 week ago (2 children)

I'll give you my uneducated findings: self-driving cars are not ready.

I doubt they will ever be truly ready; they'll eventually just be considered "ready enough." No software will ever be flawless, and when that software controls a car, a minor flaw might mean 20 deaths.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago)

40,000 deaths from traffic accidents per year (in the US). Only 20 deaths would be a major improvement. Obviously "cars" is a highly irrational discussion, though.

And it's not just the victims whose lives could be spared; there's also the mental toll on those who kill people by accident. Being able to blame it on a flaw in the software that can be improved and permanently fixed is great.

I say let the mechanized reduced slaughter begin!

[–] [email protected] 0 points 1 week ago (3 children)

Isn't Waymo in San Francisco completely self-driving? And if their own recently released data is anything to go by, it would seem self-driving cars are more ready than manually driven cars. Because people are absolutely awful at driving.

[–] [email protected] 0 points 1 week ago

Comparing self-driving cars to American driving standards is kind of a moot point, because American safety standards are so low that death and injury are considered the cost of doing business.

I'd be curious to see how well Waymo performs compared to a country with far safer road designs and drivers who are better trained and respect the rules of the road more consistently.

[–] [email protected] 0 points 1 week ago

Waymo cars use much better technology than Teslas do.

Nobody is disputing that a machine that is never distracted and has reaction times down to fractions of a second would make a better driver than even the most skilled human, but Tesla's FSD hardware and software aren't there yet and probably never will be.

[–] [email protected] 0 points 1 week ago

Waymo is also operating in a fairly small, fixed area that is extensively mapped.

Not saying that's a bad thing; they're doing it the right way, slowly and cautiously.

[–] [email protected] 0 points 1 week ago

Great, they've invented Christine.

[–] [email protected] 0 points 1 week ago

It's fine, they'll fix these issues in time for the robotaxi rollout ten years from now.

What's that? They're planning on launching the robotaxis at the end of this month? Well then.

[–] [email protected] 0 points 1 week ago (1 children)

It's not that the car is programmed poorly, it just really hates children.

[–] [email protected] 0 points 1 week ago

Hunter-Seeker Mode: School Buses. Easy Prey.

[–] [email protected] 0 points 1 week ago (1 children)

Well, there's the problem right there: FSD shouldn't be doing these tests in the first place! How else is Tesla supposed to get their amazing cyber taxi out if it has to follow all these dumb rules?

[–] [email protected] 0 points 1 week ago

Not American, but I think FSD stands for Full Self-Driving, not an organization.

[–] [email protected] 0 points 1 week ago (1 children)

Pretty normal for a Tesla.

[–] [email protected] 0 points 1 week ago (1 children)

I'm a school bus driver. This is also totally normal for human-driven vehicles.

[–] [email protected] 0 points 1 week ago

That makes sense; Tesla uses real driver data to train the cars. The cars ignore the traffic controls humans ignore and follow the rules humans follow.

They try to fix bad behaviour, but I bet there haven't been enough human-driven Teslas illegally passing school buses and causing a collision for Tesla to notice that FSD ignores a rule it shouldn't.
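
To put that in toy form (nothing to do with Tesla's real training stack; the situations, actions, and counts below are invented): a policy learned by imitation just copies whatever the demonstrations contain, and a situation that shows up rarely and inconsistently can end up with the wrong learned behaviour without anyone noticing.

```python
from collections import Counter, defaultdict

# Invented demonstration log: (situation, what the human driver did).
# Common situations are well covered; the school-bus case is rare and noisy.
demos = (
    [("red_light", "stop")] * 990
    + [("red_light", "run")] * 10
    + [("bus_flashing_stop_sign", "pass")] * 3
    + [("bus_flashing_stop_sign", "stop")] * 2
)

# "Training" by imitation: for each situation, copy the most common human action.
counts = defaultdict(Counter)
for situation, action in demos:
    counts[situation][action] += 1
policy = {s: c.most_common(1)[0][0] for s, c in counts.items()}

print(policy["red_light"])               # 'stop' -- plenty of clean examples
print(policy["bus_flashing_stop_sign"])  # 'pass' -- only 5 noisy examples, wrong behaviour slips through
```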

[–] [email protected] 0 points 1 week ago (1 children)

Didn’t I just read this like a few weeks ago? But there’s a Jun 15 date in the article. So did this happen again?

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago)

It's the same story making the rounds.

Edit: They also did it in Austin and somewhere else, so it's the same situation in two different spots, generating like 4-5x the stories as each one gets repeated through the news cycle.

[–] [email protected] 0 points 1 week ago (1 children)

If I worked at Tesla, I would very much be doing a crappy job and slipping bad ideas into what looks like good code. The Lord's work.

[–] [email protected] 0 points 1 week ago

How would you know where to put it among all the other shitty code?

[–] [email protected] 0 points 1 week ago (3 children)

I'm reading the comments about this video, and I think people are missing the point.

It's not about the Tesla running into the kid. It's about the Tesla completely ignoring the FLASHING FUCKING STOP SIGN at the side of the bus, which resulted in it hitting the kid dummy.

This could have been a pedestrian crossing, railroad stop, intersection, etc.

These vehicles aren't "smart" and should not be allowed on the road. Any idiot can have greater awareness than a Tesla.

[–] [email protected] 0 points 1 week ago (1 children)

Oh, now I get it. I didn't know it's not allowed to pass the bus even when it's on the other side of the street. In our country we teach kids not to run across the street when they get off the bus.

[–] [email protected] 0 points 1 week ago

Kids will do stupid things sometimes, no avoiding that. In Germany you can pass a stopped bus on the other side of the road, but if it has its hazards on, you can't go faster than walking speed.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago)

Yeah, it might kill the kid, it might not.

I'm still gonna stick to my Ford F50000 Fleshreaper (BLOOD FOR THE CAR GOD!™), driven by a good old-fashioned human, to get the job done.

Besides, it avoids the whole mess of theological issues about who gets Moloch's love.

[–] [email protected] 0 points 1 week ago (1 children)

No 'smart' device is smart.

[–] [email protected] 0 points 1 week ago (1 children)

you wouldn't say that to his face, would you? 🥺

[–] [email protected] 0 points 1 week ago

With a hammer to the camera

[–] [email protected] 0 points 1 week ago (2 children)

Still safer than a human driver tbh

[–] [email protected] 0 points 1 week ago

I've killed 0 children in 25 years.

[–] [email protected] 0 points 1 week ago (1 children)

That's shrimply not true. The numbers Tesla releases are heavily cooked.

I had a quick look around, but I didn't manage to find any numbers that weren't either based on Tesla's own numbers or just guessing.

But it's pretty well known that FSD sucks (I've been in a car using it... terrifying af) and that it'll turn itself off before an accident to pass accountability to the driver.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (2 children)

I love how this keeps getting repeated by everyone everywhere

it’ll turn itself off before an accident to pass accountability to the driver.

But both Tesla (5 seconds) and NHTSA (30 seconds) count any incident where an L2 system was on within that window before the accident as having happened with the system active. So no, they do not use it for that purpose.

You know that video going around a few weeks ago where some dude with FSD on darted across the road into a tree? Well, he got the car's data, and it turns out FSD had disengaged because of torque on the steering wheel, which is one of the ways you disable it. He probably nudged the wheel too hard by mistake, or there was a mechanical failure that disabled it, but in the report he got from Tesla the accident still counted as happening with FSD ON, even though it was OFF by the time he started drifting out of his lane.
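
To make that counting rule concrete, here's a rough sketch (plain Python; the record format, field name, and the exact disengagement gap are made up for illustration, this isn't Tesla's or NHTSA's actual reporting schema) of how a crash gets attributed to an L2 system under a 5-second versus a 30-second window:

```python
from dataclasses import dataclass
from typing import Optional

TESLA_WINDOW_S = 5.0   # Tesla: crash counts if the system was active within 5 s of impact
NHTSA_WINDOW_S = 30.0  # NHTSA: 30 s window for L2 crash reporting

@dataclass
class CrashRecord:
    # Seconds between the L2 system disengaging and impact; None = still engaged at impact.
    seconds_since_disengagement: Optional[float]

def attributed_to_l2(crash: CrashRecord, window_s: float) -> bool:
    """The crash counts against the L2 system if it was engaged at impact
    or disengaged less than window_s seconds before it."""
    if crash.seconds_since_disengagement is None:
        return True
    return crash.seconds_since_disengagement <= window_s

# The tree-crash example above: FSD dropped out shortly before impact
# (the 1.5 s gap here is invented), yet it counts under both windows.
crash = CrashRecord(seconds_since_disengagement=1.5)
print(attributed_to_l2(crash, TESLA_WINDOW_S))  # True
print(attributed_to_l2(crash, NHTSA_WINDOW_S))  # True
```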

So please just stop it with that nonsense.

[–] [email protected] 0 points 1 week ago (1 children)

I may be buying the foolishness of the masses, but your anecdotes are only as good as mine.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago)

Which part don't you like?

I could source the crash report and a video explaining what likely happened, but if you simply don't believe that Tesla truly stands by the 5-second rule in their self-reported data even with the crash report, then that's another matter entirely.

[–] [email protected] 0 points 1 week ago (1 children)

what about this one? https://youtu.be/V2u3dcH2VGM

I don't know anything about self-driving, but I can't imagine why it would turn off right before a crash instead of keeping the brakes held.

(Also, I know the driver is a total idiot and it's 100% their fault, I just want to know why it turned off.)

[–] [email protected] 0 points 1 week ago

I imagine the driver was startled and pressed the brake or turned the steering wheel, either of which will cancel FSD

[–] [email protected] 0 points 1 week ago (3 children)

Sounds like a typical driver in the US

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago) (1 children)

I'm a school bus driver, and this year we finally got the automatic cameras that catch people going past our red flashers and stop signs. My camera has captured about two to three drivers per day doing this. I would rather have had the automatic machine guns, but the camera is a fine second choice.

Edit: the funniest thing I've had happen with the camera so far is one person that came flying past my reds, noticed the lights and stop sign as they were passing me, slammed on their brakes and then backed up past me again while mouthing "I'm so sorry" to me. Yes, they received two tickets for this - and I had nothing to do with it as the cameras are completely automated.

[–] [email protected] 0 points 1 week ago (1 children)

do they get tickets in the mail?

[–] [email protected] 0 points 1 week ago (2 children)

Yeah, that's how it works. A lot of people just never pay them, though.

[–] [email protected] 0 points 1 week ago

That must impact their insurance even if they try to avoid the tickets though?

Glad you got cameras for your bus!

[–] [email protected] 0 points 1 week ago (1 children)

Do people in the US just get away with not paying tickets?

Over here, if you don't pay fines, it will get escalated until the point of seizure, and if you have nothing else to seize, they will take your car.

Not paying isn't really an option.

[–] [email protected] 0 points 1 week ago

Apparently, the issue with mail-in tickets specifically is that while the camera can catch the license plate number, it can't really prove who was driving the car. So whereas an in-person ticket from a cop for passing a school bus will result in points on your license, a mail-in ticket from a camera like this won't. The same problem applies to people that just don't pay the mail-in tickets - the state doesn't really know who to go after specifically.

[–] [email protected] 0 points 1 week ago

The only difference is that a driver would get out of their car, check for damage to their vehicle, and then get mad at the kid! /s

[–] [email protected] 0 points 1 week ago (1 children)
[–] [email protected] 0 points 1 week ago (2 children)

What the hell is wrong with the author of that article? Jesus Christ.

[–] [email protected] 0 points 1 week ago (1 children)
[–] [email protected] 0 points 1 week ago (1 children)

It predates ChatGPT, so I doubt it. This is organic incompetence.

[–] [email protected] 0 points 1 week ago

Bot-written articles were a thing before ChatGPT.

[–] [email protected] 0 points 1 week ago* (last edited 1 week ago)

Seriously, it was written like a 1980s interview with Boy George. No thought, missing words, and even sentences with no subject.