The obvious answer is to autopilot into ICE and Trump Regime officials. Elon pays the fine, the world is ridden of MAGATs, and one less Tesla on the road. D, D, D.
/s.
I'd imagine you are always responsible for what you do when you're driving, even if a system like autopilot is helping you drive.
If you are in the driver's seat, you are responsible for anything the car does unless there was a provable mechanical failure.
Especially because autopilot disengages right before the accident, so it's technically always your fault.
Yup gotta read the fine print
Just let it happen we are all going to die anyway
In my country it's always your fault. And I'm very glad.
Here in the US, even if the driver is found responsible it is often only a small fine ($500) and maybe a 30 day suspension for killing someone.
What if you kill a CEO?
Except the autopilot will modify its data to show it was turned off right at the moment it hits people...
Nah, it just disengages a fraction of a second before impact so they can claim "it wasn't engaged at the moment of impact, so not our responsibility."
There were rumours about this for ages, but I honestly didn't fully buy it until I saw it in Mark Rober's vision vs lidar video and various other follow-ups to it.
It's not about responsibility, it's about marketing. At no point do they assume responsibility, like any level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does while on autopilot (or the so-called "full self driving"). It's just a lane keeping assistant.
If you trust your life (or the lives of others) to a lane keeping assistant you deserve to go to jail, be it Tesla, VW, or BYD.
For fucking real??
It turns off, but it's likely so the AEB system can kick in.
AP and AEB are separate things.
Also, all L2 crashes that involve an airbag deployment or fatality get reported if the system was on within something like 30 seconds beforehand, assuming the OEM has the data to report, which Tesla does.
Rules are changing to lessen when it needs to be reported, so things like fender benders aren't necessarily going to be reported for L2 systems in the near future, but something like this still would be and always has been.
What's AEB? Automatic Energetic Braking?
I'm guessing automatic emergency braking
Ok but if Tesla's using that report to get out from under liability, we've still got a damn problem
If it's an L2 system the driver is always liable. The report just makes sure we know it's happening and can force changes if patterns are found. The NHTSA made Tesla improve their driver monitoring based on that data, since it was the main problem. The majority of accidents (almost all) involved drunk or distracted drivers.
If it's an L4 system Tesla is always liable; we'll see that in June in Austin, in theory, for the first time on public roads.
The report never changes liability, it just lets us know what the state of the vehicle was for the incident. Tesla can't claim the system was off because it disengaged one second before impact, because we'll know it was on prior to that. But that doesn't change liability.
Oh. A fine. How will Musk survive that financially?
He won't be fine...
In the US it's still on you if you had the last clear chance to avoid an accident while you were in the driver's seat. Fun to pay for a billionaire's mistakes.