https://abcnews.go.com/Business/wireStory/driver-stand-trial-deadly-tesla-crash-california-84863184 “There is enough evidence to try Kevin George Aziz Riad, 27, on two counts of vehicular manslaughter, a Los Angeles County judge said.” Tesla said sensors showed the driver had his hand on the wheel but didn’t use the brakes. The issue I have with too much automation is knowing where the automation stops and when you have to take over. Bottom line: this could be the beginning of the end of self-driving cars and trucks. Even if the drivers deserve the blame, Tesla is the one with the money. We don’t put robots in jail, just their owners.
People will evaluate the risks vs. the benefits and decide. IMO, when the self-driving AI is good enough to let the driver sleep in the back seat, it will become irresistible even if there are a few casualties.
The driver should be entirely responsible for everything a self-driving car does, because the self-driving option is not mandatory. If you're going to trust your life and the life of every motorist around you to unreliable technology, you can face the full consequences when it fails. The other drivers, passengers, and pedestrians who will be hurt or killed will have to face the consequences of your actions too, except they won't get a choice.
True. This dovetails into tractor-trailer rigs being self-driving. I’m against it for two reasons: A: Truck driving is the best job some people will ever have. B: 80,000 pounds on autopilot sharing the road with squishy humans creates a sea change in the driving experience. For example, they could all line up one behind the other, making an impenetrable barrier that prevents you from merging to get off the freeway.
Insurance lobbyists are going to be all over Congress greasing palms for damage-minimization (to their bottom line) legislation.
Stalin and Robespierre both apparently said it. So, there is a lot to be said for not even trying to make omelettes.
Just no way. Now I will state that for 40 years of my working career I was in industrial sales and traveled the Southeast primarily, with management duties in other regions, and put anywhere from 40K-60K miles a year behind the wheel. Even now in retirement my fun is to take my MX-5 Mazdaspeed on a backroad out there. Manual for me. NO WAY I'm going to ride in a car on automatic and just sit there. I hate it when other people are driving, let alone an autonomous machine. And to have to do it while YOU have to stay on alert with hands on the wheel, ready to take over? Well, what's the point then? Yep, you are the operator of that vehicle and you are responsible for what happens with it. Hope this sends a chill down everyone's back about not being in control of your vehicle on a public road.
I'm sure Tesla's lawyers made sure that all potential car buyers signed off on not being an idiot and were told they're responsible for maintaining control of the car. The GPS in my car tells me not to be an idiot every time I turn it on, and I have to agree to it before it does any navigating. Also, in my city of Las Cruces and down in El Paso, anyone who hits anything is presumed to be at fault for not taking precautions against hitting something, and they always get a ticket for reckless driving. The guy who was driving this Tesla is an idiot. I'll bet he was wearing a mask.
I try to interact with other drivers when I am driving to make sure we see each other and know what each is going to do. When it is just an autonomous machine, you had better give it the right of way. Going into automated factories and having to watch the safety film before you could go in, they made it clear any robotic vehicle has the right of way and it is your responsibility to get out of its way. They can't be trusted to make that decision.
Isn't there some system where, if your hands aren't on the wheel for so many seconds, it sounds an alarm? I heard it from some sales guys: "Oh, I'll be able to do my reports and office work while the car drives itself, it'd be great." NO, keep your eyes on the road; you have to pay attention and be prepared to take over.
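For what it's worth, the alarm mechanism described above is basically a timeout timer: if the hands-on sensor reads false for longer than some threshold, warn the driver. Here's a minimal sketch of that logic; the class name, the `hands_on` flag, and the 10-second limit are all invented for illustration and don't reflect any manufacturer's actual implementation (Tesla, for instance, infers hands via steering-wheel torque).

```python
# Hypothetical sketch of a hands-off-wheel warning timer. All names and
# the 10-second threshold are assumptions, not any real vendor's design.

HANDS_OFF_LIMIT_S = 10.0  # assumed seconds allowed before the alarm

class HandsOffMonitor:
    def __init__(self, limit_s=HANDS_OFF_LIMIT_S):
        self.limit_s = limit_s
        self.hands_off_since = None  # timestamp when hands left the wheel

    def update(self, hands_on, now):
        """Feed one sensor sample; return True if the alarm should sound."""
        if hands_on:
            self.hands_off_since = None  # hands back on: reset the timer
            return False
        if self.hands_off_since is None:
            self.hands_off_since = now   # hands just came off: start timing
        return (now - self.hands_off_since) >= self.limit_s

monitor = HandsOffMonitor()
print(monitor.update(hands_on=False, now=0.0))   # timer starts, no alarm yet
print(monitor.update(hands_on=False, now=11.0))  # past the limit: alarm
print(monitor.update(hands_on=True, now=12.0))   # hands back on: reset
```

The point of the timer-reset design is exactly the complaint in the thread: the system only checks that hands return periodically, not that the driver is actually paying attention.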
I don't know about that, but let's say you're an airline pilot flying at 35,000 feet where there are no traffic or red lights. If you put the plane on autopilot and start messing around with a stewardess and things go south, you are in a lot of trouble. You may never work again and you might be arrested.
I fly flight sims, and just learning how to work those flight systems is an education, but I prefer hands-on flying in civilian aircraft rather than the heavies. Making those bush landings or island hopping!
Only an utter FOOL (at this early stage) would be stupid enough to trust the "autonomous" self-driving car marketing game from these same penny-pinching automobile manufacturers, and their parts suppliers who can't even make their airbags safely.
Not a bug. There is no system marketed or licensed in the US as an "autopilot" for cars. This one is entirely on the driver.
If you sell it and it's buggy, you bear some responsibility, especially if you said it was safe, which confuses people. That said, I agree, I would not trust them. "Tesla insists Autopilot and Full Self-Driving are safe, but US senators aren’t buying it" https://www.theverge.com/2022/3/9/22969297/tesla-autopilot-fsd-letter-safety-blumenthal-markey
I call it “automation that does too much”. My new car does all kinds of “assists”. With my old cruise control, I knew my “role”: “brakes are on you, buddy”. On my new car, I can let it speed up, brake, slow down, speed up, then brake again, until something happens and it is suddenly my turn to be a driver again. It’s like playing tag, but ending in “Oh Bleep!”