this post was submitted on 22 Oct 2024
89 points (94.9% liked)

[Dormant] Electric Vehicles

3206 readers
1 user here now

We have moved to:

[email protected]

A community for the sharing of links, news, and discussion related to Electric Vehicles.

Rules

  1. No bigotry - including racism, sexism, ableism, casteism, speciesism, homophobia, transphobia, or xenophobia.
  2. Be respectful, especially when disagreeing. Everyone should feel welcome here.
  3. No self-promotion.
  4. No irrelevant content. All posts must be relevant and related to plug-in electric vehicles — BEVs or PHEVs.
  5. No trolling.
  6. Policy, not politics. Submissions and comments about effective policymaking are allowed and encouraged in the community; however, conversations and submissions about parties and politicians, or those devolving into general tribalism, will be removed.

founded 1 year ago

A fan of Tesla might think that the automaker just can't catch a break when it comes to its autonomous driving tech. It’s already subject to several federal investigations over its marketing and deployment of technologies like Autopilot and Full Self-Driving (FSD), and as of last week, we can add another to the list involving around 2.4 million Tesla vehicles. This time, regulators are assessing the cars' performance in low-visibility conditions after four documented accidents, one of which resulted in a fatality.

The National Highway Traffic Safety Administration (NHTSA) says this new probe is looking at instances when FSD was engaged in fog, airborne dust, or sun glare that blinded the car's cameras and caused a problem.

What the car can "see" is the big issue here. It's also what Tesla bet its future on.

all 32 comments
[–] [email protected] 11 points 3 weeks ago

Sure, just use the public for beta testing automobile safety.

Any reasonable adult should be fine with that.

Whoops. Sorry for your loss. Hey v2.0 is out!

[–] [email protected] 4 points 3 weeks ago (4 children)

Unlike the vast majority of its competitors that are giving their cars with autonomous driving capabilities more ways to “see” their surroundings, Tesla removed ultrasonic and other types of sensors in favor of a camera-only approach in 2022.

This means there isn’t really any redundancy in the system, so if a Tesla with FSD enabled drives through dense fog, it may not have an easy time keeping track of where the road is and staying on it. Vehicles that have not only cameras but also radar and lidar can make more sense of their environment even through dense fog, although those systems are affected by the elements too. Inclement weather seems to sometimes make FSD go rogue.

I didn’t realize they were using other sensors in the past and dropped them on newer models.

Older Teslas had a combination of radar and cameras for Autopilot and driver assistance systems. With newer software versions launched after Tesla went down the "Pure Vision" route, it disabled the sensors in the older cars that had them from the factory. So even if you have FSD enabled in an older Tesla that has more than just cameras, only the cameras will be used when the car is driving itself.

🤦‍♂️

Didn’t want to develop two different versions of software I guess?

[–] [email protected] 1 points 3 weeks ago

If FSD notices poor weather conditions, it will prompt you to take over; it will not just drive you off the road.

[–] [email protected] 1 points 3 weeks ago (1 children)

Aren't vision cameras the only sensors we have that can recognize lane markings? This article is bunk, making it seem like that's not industry standard. Radar can't see paint on the road, and my understanding is that lidar can't either, at least not well enough for real-time lane detection at highway speeds.

[–] [email protected] 1 points 1 week ago

It's not only about seeing the markings. It's also about recognizing potential colliding objects in less than ideal scenarios.

[–] [email protected] -1 points 3 weeks ago (1 children)

The problem was the different sensors could sometimes disagree. Like, vision sees an obstacle but radar isn't picking it up...which one does the software believe?

And if you think vision has problems with things like rain and fog, try radar or lidar!

Not mentioning the downsides of the other sensors always makes me suspicious of an article.

The key point of going vision-only is that it's what humans do every day. Articles that leave that out also disappoint me.

[–] [email protected] 1 points 3 weeks ago

It's called consensus: have three sensors and give each a vote. Typically these sensors are identical, so they can detect a failure or incorrect reading in one of them. The same idea shows up in IT around data backups and RAID configurations, as well as in aviation. And I personally would just favor the radar: if vision says go and radar says stop, stop and avoid hitting that firetruck parked on the highway. Or that motorcyclist. Or any of the other bizarre vision-only fatal crashes this system has wrought.

Also humans can hear things. So, not just vision.
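The voting scheme described above can be sketched in a few lines of Python. This is a toy illustration, not anything resembling a real perception stack; the sensor names and the "favor the radar" fail-safe rule are just the examples from this thread:

```python
def fuse(readings, conservative=("radar",)):
    """Fuse boolean obstacle detections from multiple sensors.

    readings: dict mapping sensor name -> True if that sensor
    reports an obstacle ahead. Any sensor listed in `conservative`
    overrides the vote when it reports an obstacle (fail-safe:
    when in doubt, stop). Otherwise a simple majority decides.
    """
    # Fail-safe override: always trust a "stop" from the conservative sensor(s)
    if any(readings.get(name, False) for name in conservative):
        return "stop"
    # 2-of-3 style majority vote among all sensors
    votes = sum(readings.values())
    return "stop" if votes > len(readings) / 2 else "go"

# Camera misses the parked firetruck but radar sees it: stop anyway.
print(fuse({"camera": False, "radar": True, "lidar": False}))   # stop
# All three agree the road is clear: go.
print(fuse({"camera": False, "radar": False, "lidar": False}))  # go
```

With only cameras there is nothing to vote against, which is the redundancy point being made upthread.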

[–] [email protected] 3 points 3 weeks ago

I thought they canceled a contract for an outsourced system

[–] [email protected] 11 points 3 weeks ago (2 children)

Maybe Tesla shouldn't be allowed to call their enhanced cruise control "autopilot." Everyone knows how "autopilots" are supposed to work.

[–] [email protected] 5 points 3 weeks ago

Well, actually, that's kind of the problem: it already does more than a real autopilot does. Autopilot in a plane can't keep the plane from hitting moving objects; it's not context-aware at all. It just flies a pre-programmed route and executes pre-programmed maneuvers. Literally the first release was already better than what autopilot really is.

Planes are only safe because there is never supposed to be anything else anywhere near them, which makes autopilot super easy. That's why planes have had it since long before we had any context-aware machines.

Also, if "roadspace" were treated the same as "airspace", with the amount of training and practice pilots have, as well as "road traffic controllers" like air traffic controllers, self-driving would have had no trouble right from the get-go: pre-programmed routes, someone making sure there is a generous minimum space between each vehicle, and any violation being immediately and harshly reprimanded.

Autopilot is relatively easy compared to self-driving; if anything, calling it "autopilot" was underambitious.

[–] [email protected] 0 points 3 weeks ago* (last edited 3 weeks ago)

Everyone thinks they know.

But the autopilot on an aircraft or ship is often just a cruise control, maintaining a constant heading, speed, and (for aircraft) altitude. The pilot or skipper remains 100% responsible for course changes and collision avoidance.

[–] [email protected] 2 points 3 weeks ago

"Pure vision" is just "too cheap to buy LiDAR"

[–] [email protected] 46 points 3 weeks ago (4 children)

Seriously though, wtf is up with Elon not liking LIDAR? I think pretty much every other manufacturer incorporates it into their higher-end driver assist stuff at this point.

[–] [email protected] 42 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

First of all, Elon isn't that smart.

Second, it would cost more money to put multiple types of sensors on the car. Spending money bad!

Personal speculation based on Elon's past behavior follows:

Plus he wanted to focus on visual recognition, likely because it offers multiple possible income streams compared to a sensor that is just good at keeping a car from running into things. Focusing on the visible-light spectrum opens up possibilities like facial recognition and data collection by a fleet of Teslas, including the ones people bought, taking pictures, etc.

Basically he wanted to focus on the one thing that seemed more profitable and didn't want to spend money on that stupid thing that just kept the car from crashing.

[–] [email protected] 4 points 3 weeks ago

I can tell you the real reason: cameras are cheap for the amount of stuff you can kinda-sorta manage with them. That's literally it. There's no 4D chess game of data collection or anything else. They're cheap to add and integrate, and adequate for object detection in typical scenarios. No need to worry about the shape of the bumper or paint affecting the radar, no need to have a bunch of individual ultrasonics integrated into the bumper and the associated wiring/labor costs.

I worked with them, and there were numerous times where they came to us asking for new sensors because their cameras were too shitty for what they wanted to do, then once they got a quote, they miraculously didn't need them and figured it out. It happened with corner radars on the Y, it happened with them removing the front radars on everything, and it happened with the ultrasonics.

They bet it all on cameras to mislead consumers and defraud investors into believing their cheap shit-boxes would be income-generating Robotaxis. Even worse, their own engineers had hard data showing that removing the radar would directly result in pedestrian and motorcyclist deaths, but they had to keep those bullshit production numbers going, so they took it out, and it has directly resulted in dozens of likely preventable deaths.

Anyone who's ever worked with Tesla directly knows they're an absolute fucking nightmare, and even compared to the shitshow of GM or Stellantis, the absolute blatant disregard for human life at that company is disgusting.

[–] [email protected] 18 points 3 weeks ago (1 children)

He's probably stuck on his decision to cut LIDAR and compensate with machine learning on camera inputs alone. That hasn't brought him the edge he wanted. Still, he doubles down, as he's not risking anything besides being proclaimed wrong about that decision.

[–] [email protected] 27 points 3 weeks ago (1 children)

It’s hilarious because every single time he speaks about some unique aspect of starship that goes against conventional rocketry wisdom, like “we don’t need flame trenches. You get more efficiency on flat ground”, we just have to wait a year or two and all of a sudden they’re adding back the thing they tried to do without (see tower 2 flame trench going in as we speak).

[–] [email protected] 20 points 3 weeks ago (2 children)

Then they brag about doing the thing everyone else was already doing as if it was some new concept and his Muskrats eat it up.

[–] [email protected] 8 points 3 weeks ago

He needs to risk something to care. As long as his bubble keeps floating, he can sell anything to institutions, businesses, and consumers. With the kid gloves he's handled with, he can openly scam people and burn money with a flamethrower without any repercussions.

[–] [email protected] 7 points 3 weeks ago

It sucks, because the talent and skill on display over there is insane and incredible; they work so hard to achieve things never done before. But he has to speak, play key-man PR idiot, and diminish those amazing accomplishments.

[–] [email protected] 8 points 3 weeks ago (4 children)

"Humans only need visible light"

[–] [email protected] 2 points 3 weeks ago

If the Wright Brothers used the same logic their flying machines would have all flapped…

[–] [email protected] 6 points 3 weeks ago

Humans can move their heads to avoid glare. They can shield glare from their eyes with visors.

Tesla cameras currently can't do either.

[–] [email protected] 11 points 3 weeks ago (1 children)

Humans are bad drivers as well. Technology should try to do better than humans, not accept human limitations. When radar, lidar, and other sensors (possibly including things not invented yet) exist, we should use them to make cars safer.

[–] [email protected] 1 points 3 weeks ago* (last edited 3 weeks ago)

You can still do better than human drivers with only visible-light cameras by using more of them, at different heights and angles than a person could pay attention to. I think mixing in other sensors and data sources would be better still, but they're already getting more data than a human could.

[–] [email protected] 6 points 3 weeks ago (2 children)

Musk is of course right. The "only" thing he forgot is that his vision-only model needs full human-level artificial intelligence behind it to work.

Very genius.

[–] [email protected] 5 points 3 weeks ago (1 children)

Musk is also forgetting that humans use other senses when driving, not just their sight.

[–] [email protected] 9 points 3 weeks ago

I use echolocation by screaming at other drivers.

[–] [email protected] 4 points 3 weeks ago

And even then, it would only be able to drive as well as a human, and humans kill tons of people on the highways.

[–] [email protected] 12 points 3 weeks ago (1 children)

I know a lot of companies go with radar over lidar because of reliability issues. Radar is much more reliable because it can be solid-state, whereas lidar either has moving parts or is subject to IR bleed. However, solid-state lidar is finally becoming a thing, so lidar will start to become more commonplace in the next few years.

[–] [email protected] 6 points 3 weeks ago

How long till cars blind each other?