this post was submitted on 30 Oct 2024
647 points (89.3% liked)


OK, it's just a deer, but the future is clear. These things are going to start killing people left and right.

How many kids is Elon going to kill before we shut him down? What's the number of children we're going to allow Elon to murder every year?

[–] whotookkarl@lemmy.world 8 points 6 months ago (5 children)

It doesn't have to not kill people to be an improvement, it just has to kill fewer people than people do

[–] Kbobabob@lemmy.world 24 points 6 months ago (5 children)

Is there video that actually shows it "keeps going"? The way that video loops, I can't tell what happens immediately after.

[–] Sam_Bass@lemmy.world 18 points 6 months ago (1 children)

the deer is not blameless. those bastards will race you to try and cross in front of you.

[–] WoahWoah@lemmy.world 12 points 6 months ago (2 children)

Finally someone else familiar with the most deadly animal in North America.

[–] Sam_Bass@lemmy.world 2 points 6 months ago

yeah well I've hit about $15k worth of them over the years

[–] werefreeatlast@lemmy.world 2 points 6 months ago

Just a small clarification.... Teslas only kill forward or backwards. Hardly ever has a car killed left or right 😂.

[–] NutWrench@lemmy.world 26 points 6 months ago (4 children)

For the 1000th time, Tesla: don't call it "autopilot" when it's nothing more than cruise control that needs constant attention.

[–] GoodEye8@lemm.ee 11 points 6 months ago

It is autopilot (a poor one but still one) that legally calls itself cruise control so Tesla wouldn't have to take responsibility when it inevitably breaks the law.

[–] Nytixus@kbin.melroy.org 8 points 6 months ago (1 children)

I roll my eyes at the dishonest, bad-faith takes in the comments about how people do the same thing behind the wheel. Like that's going to make autopiloting self-driving cars an exception. At least a person can react, can slow down, can do anything that an unthinking, going-by-the-pixels computer can't do at a whim.

[–] Lets_Eat_Grandma@lemm.ee -2 points 6 months ago (2 children)

How come human drivers have more fatalities and injuries per mile driven?

Musk can die in a fire, but self driving car tech seems to be vastly safer than human drivers when you do apples to apples comparisons. It's like wearing a seatbelt, you certainly don't need to have one to go from point A to point B, but you're definitely safer with it - even if you are giving up a little control. Like a seatbelt, you can always take it off.
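The per-mile comparison above is just rate arithmetic; a back-of-envelope sketch (all figures hypothetical, chosen only to illustrate the normalization; a real apples-to-apples comparison would also have to control for road type, weather, and disengagement handoffs):

```python
# Illustrative only: hypothetical figures, NOT real crash statistics.
# Normalizing fatalities per 100 million vehicle miles traveled (VMT)
# is how per-mile safety comparisons between driver pools are usually framed.

PER_100M_MILES = 100_000_000

def fatality_rate(fatalities: float, miles: float) -> float:
    """Fatalities per 100 million vehicle miles traveled."""
    return fatalities / miles * PER_100M_MILES

# Hypothetical pools: humans drive far more miles, so raw counts mislead;
# only the normalized rates are comparable.
human_rate = fatality_rate(fatalities=13_000, miles=1_000_000_000_000)
av_rate = fatality_rate(fatalities=5, miles=1_000_000_000)

print(f"human:      {human_rate:.2f} per 100M miles")  # 1.30
print(f"autonomous: {av_rate:.2f} per 100M miles")     # 0.50
```

Even with the normalization, the road-mix caveat matters: if the autonomous miles skew toward easy highway driving and the human miles include city streets at night, the lower rate doesn't yet prove the system is safer.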

[–] Semi_Hemi_Demigod@lemmy.world 2 points 6 months ago

I honestly think it shouldn't be called "self driving" or "autopilot" but should work more like the safety systems in Airbuses, by simply not allowing the human to make a decision that would create a dangerous situation.

[–] burgersc12@mander.xyz 2 points 6 months ago

I thought the deer would be running or something, but no, it's just straight on from the car, doesn't move at all! How the fuck does a deer standing dead center in front of you not get caught by the camera?

[–] Grangle1@lemm.ee 4 points 6 months ago (2 children)

I know a lot of people here are/will be mad at Musk simply for personal political disagreement, but even just putting that aside, I've never liked the idea of self-driving cars. There's just too much that can go wrong too easily, and in a 1-ton piece of metal and glass moving at speeds up to near 100 mph, you need to be able to have the control enough to respond within a few seconds if the unexpected happens, like a deer jumping in the middle of the road. Computers don't, and may never, have the benefit of contextual awareness to make the right decision as often as a human would in those situations. I'm not going to cheer for the downfall of Musk or Tesla as a whole, but they do severely need to reconsider this idea or else there will be a lot of people hurt and/or killed and a lot of liability on them when it happens. That's a lot of risk to take on for a smaller auto maker like them, just thinking in business terms.

[–] billiam0202@lemmy.world 6 points 6 months ago

An FSD car that makes perfect decisions would theoretically be safer than a human driver who also makes perfect decisions, if for no other reason than the car could do it faster.

Personally, I would love to see autonomous cars see widespread use. They don't have to be perfect, just safer mile-for-mile than human drivers. (Which means that Teslas, with Musk's gobsmackingly stupid insistence on only using cameras, will never reach that threshold).

[–] dependencyinjection@discuss.tchncs.de 8 points 6 months ago (1 children)

I mean, we do let humans drive cars, and some of them are as dumb as bricks and some are malicious little freaks.

Not saying we are anywhere near FSD, and Elon is a clown, but I would support a future with this technology if we ever got there. The issue is it would have to be all or nothing. Like you can't have a mix of robots and people driving around.

[–] VonReposti@feddit.dk 2 points 6 months ago (1 children)

The problem is that with dumb drivers you can easily place blame on the driver and make him pay for his idiocy. FSD is a lot more complicated. You can't really blame the driver, since he wasn't driving the car, but neither was the engineer or the company itself. We'd have to draw up entirely new frameworks to define and place criminal negligence, if any should exist. Is the company responsible for a malicious developer? Is the company responsible for a driver who ignores a set guideline and sits impaired behind the emergency stop? Is the driver responsible for a software fault?

All of these questions and many more need to be answered. Some probably can't be, and must remain a so-called "act of God" with no blame to place. And people are not fond of blaming just the software; they're out for blood when an accident happens, and software doesn't bleed. Of course the above questions might be the easiest to answer, but the point still stands.

[–] WoodScientist@lemmy.world 2 points 6 months ago

Full self driving should only be implemented when the system is good enough to completely take over all driving functions. It should only be available in vehicles without steering wheels. The Tesla solution of having "self driving" but relying on the copout of requiring constant user attention and feedback is ridiculous. Only when a system is truly capable of self-driving 100% autonomously, at a level statistically far better than a human, should any kind of self-driving be allowed on the road. Systems like Tesla's FSD officially require you to always be ready to intervene at a moment's notice. They know their system isn't ready for independent use yet, so they require that manual input. But of course this encourages disengaged driving; no one actually pays attention to the road like they should, ready to intervene at a moment's notice. Tesla's FSD imitates true self-driving, but it pawns off the liability to drivers by requiring them to pay attention at all times. This should be illegal. Beyond mere lane-assistance technology, no self-driving tech should be allowed except in vehicles without steering wheels. If your AI can't truly perform better than a human, it's better for humans to be the only ones actively driving the vehicle.

This also solves the civil liability problem. Tesla's current system has a dubious liability structure designed to pawn liability off to the driver. But if there isn't even a steering wheel in the car, then the liability must fall entirely on the vehicle manufacturer. They are after all 100% responsible for the algorithm that controls the vehicle, and you should ultimately have legal liability for the algorithms you create. Is your company not confident enough in its self-driving tech to assume full legal liability for the actions of your vehicles? No? Then your tech isn't good enough yet. There can be a process for car companies to subcontract out the payment of legal claims against the company. They can hire State Farm or whoever to handle insurance claims against them. But ultimately, legal liability will fall on the company.

This also avoids criminal liability. If you only allow full self-driving in vehicles without steering wheels, there is zero doubt about who is in control of the car. There isn't a driver anymore, only passengers. Even if you're a person sitting in the seat that would normally be the driver's seat, it doesn't matter. You are just a passenger legally. You can be as tired, distracted, drunk, or high as you like; you're not getting any criminal liability for driving the vehicle. There is such a clear bright line - there is literally no steering wheel - that it is absolutely undeniable that you have zero control over the vehicle.

This actually would work under the same theory of existing drunk-driving law. People can get ticketed for drunk driving for sleeping in their cars. Even if the cops never see you driving, you can get charged for drunk driving if they find you in a position where you could drunk drive. So if you have your keys on you while sleeping drunk in a parked car, you can get charged with DD. But not having a steering wheel at all would be the equivalent of not having the keys to a vehicle - you are literally incapable of operating it. And if you are not capable of operating it, you cannot be criminally liable for any crime relating to its operation.

[–] Imgonnatrythis@sh.itjust.works 1 points 6 months ago

I wouldn't be against using teslas to clean up the deer overpopulation problem in the US. I'm in favor of rolling this code into all Tesla models in the next update.

[–] brbposting@sh.itjust.works 24 points 6 months ago (1 children)

Tesla’s approach to automotive autonomy is a unique one: Rather than using pesky sensors, which cost money, the company has instead decided to rely only on the output from the car’s cameras. Its computers analyze every pixel, crunch through tons of data, and then apparently decide to just plow into deer and keep on trucking.

[–] Demdaru@lemmy.world 9 points 6 months ago* (last edited 6 months ago) (1 children)

I mean, to be honest... if you are about to hit a deer on the road anyway, speed up. Higher chance the scrawny fucker will get yeeted over you after meeting your car, rather than get juuuuust perfectly booped into the air to crash through the windshield and into your face.

Official advice I heard many times. Prolly doesn't apply if you are going slow.

Edit: Read further down. This advice is effing outdated, disregard. -_- God I am happy I've never had to put it to the test.

[–] daniskarma@lemmy.dbzer0.com 4 points 6 months ago* (last edited 6 months ago)

People are well known for never ever running over anything or anyone.
