I don't know whether this has gotten much discussion, but I just have to say... I don't like this autopilot nonsense. As a safe and responsible driver, and one who knows that people will tend to let down their guard and trust too much in automation if given a chance, I think this technology is a step in the wrong direction. Computers make stupid mistakes that a human would not, as seen here:

http://money.cnn.com/2018/03/31/technology/tesla-model-x-crash-autopilot/index.html

Tesla said autopilot was activated during a fatal Model X crash last week in California.

The driver's hands were not on the steering wheel for about six seconds before the Model X collided with a highway median on Highway 101 in Mountain View, the company said in a statement issued late Friday. And the driver had received "several visual and one audible" cue from the vehicle to grab the wheel "earlier in the drive," it added.

The autopilot feature is not fully autonomous. It handles some driving functions, but not all, and drivers are expected to stay engaged when the feature is activated.

In a blog post, Tesla called the crash, which killed the driver, "devastating." It also said that drivers are safer when they have autopilot activated than when they do not.

"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur," the post reads.

The company also pointed to a government report from January 2017 that found autopilot reduced crash rates for Tesla by 40%.

"The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe," Tesla said.

The incident is being investigated by the National Transportation Safety Board, and the news came amid a brutal week of headlines for Tesla as the company continues to struggle with production issues of its new mass-market car, the Model 3.
And a recall for 123,000 luxury Model S sedans was issued Thursday due to an issue that can make the cars difficult to steer at low speeds. The company's shares are down more than 22% so far this month.

The Tesla crash followed a high-profile crash in Tempe, Arizona, in which a fully autonomous Uber car struck and killed a pedestrian. ...

****

In both of these crashes, we saw a similar set of circumstances - an autopilot failed to register its environment properly, and a driver failed to pay attention and react in time to prevent a tragedy. Well, you're not going to change people to make them all react better, and these systems encourage people to do the opposite. These systems may be safer than the worst drivers out there, but I think they will turn more people into bad drivers if this becomes a common, widespread technology.
This **** should not be on the highways. I would prefer it not even be on low-speed roads, that's my opinion, but why is unproven tech being thrust onto 70 mph highways when it's clearly still in its infancy?
Self-driving cars will never be safe unless ALL cars are computer driven: either the human element must be removed, or the computer. Obviously the human version has been extremely destructive, so my choice is complete automation.
Self-driving vehicles will always lull the emergency fail-safe into a dangerous state of inattentiveness.
You may have misgivings about the technology, but it is here and will eventually be prolific on the roads. FYI, people make mistakes too. It will start off as driver assistance, some of which is in cars now, e.g., anti-slip, parallel parking, and crash avoidance are just a few, and it will only increase from there.
At the very least, I think it's too early to be allowing them on public roads. I'm glad they aren't being tested where I live. I've seen plenty of stupid driving, but humans still have more sense than a dumb computer with sensors and logic.
Well, not all drivers are equal, so that dumb computer and its sensors may be able to compete. To start, pilotless cars will help the aged and the infirm.
And another showing a fatal flaw in the system.

Video Shows Tesla On Autopilot Nearly Crashing On Hwy 101

MOUNTAIN VIEW (KPIX 5) — New video shows Tesla’s autopilot feature failing at the same location where a Tesla driver crashed and died just a couple weeks ago in Mountain View. The autopilot feature appears to have a fatal flaw.

Driver Shantanu Joshi was commuting on Highway 101 near Highway 85 and testing out what his Tesla would do on autopilot. “I low key freaked out, but the car definitely starts swerving left without giving me any warnings, right into that divider,” Joshi said.

A video of the incident shows the Tesla begins to veer to the left, straight into the divider, and the car never gives the driver any warnings. When the video is slowed down, you can see parts of the white lanes are faded and the car seems to think the left side of the lane — is the right.

http://sanfrancisco.cbslocal.com/20...esla-on-autopilot-nearly-crashing-on-hwy-101/
I would pay for a subscription app that shows where self-driving cars are on the road. There's simply no way in hell I want to be on the same road as these experiments.
I don't get it - you still have to have your hands on the wheel and watch what the car is doing and where it is heading. Driving long distance is part of my job, and I love it and would never give up that control.
Same here. I love taking drives on the weekends when the weather is nice. Just fun to be part of the machine, rowing through the gears on a nice stretch of road with a growling V8. So exhilarating, and I never want to give it up.
Self-driving vehicles are here to stay; better get used to it. They will mostly be used for deliveries, and then for the aged and the infirm.
I don't want to see this junk try to operate on winter roads. Or in any kind of storm, come to think of it.
I think for those people it may have a good use, but I'm still not sure that a cab wouldn't be a better choice. Perhaps there should be "AI Only" roadways, and let people who want to risk it do so.
There have already been driver-assist vehicle subsystems in production for some time now, e.g., anti-skid control, parallel parking, collision detection and braking, et al.
Where they're permitted to be tested. They're not out in numbers and not in trickier conditions, and I would never, ever trust them in some conditions, for basically the same reason that AI can't fully take over human language translation: it's more A than I. It does not "understand" things the way a human does, and this invariably leads to critical mistakes. Humans, on the other hand, tend to lack a machine's perfect attention span and responsiveness. They fail for very different reasons.
Now there it can do well as things are today. It's just not the same thing as being in full control of a vehicle.
Absolutely, and I think those are good...but that's a far cry from the Human just closing his eyes or reading a book lol.
Automation is great. My wife's new car will actually sense an inevitable collision coming from the front, rear, or sides and will activate airbags and other defenses before the crash actually happens. As for self-driving cars, I've driven cars that have an active cruise control, and it was a constant fight between me and the computer - a nerve-wracking experience!
They do not work in a storm; they actually need to see the road surface, lines, etc. They revert to human control.
It should work better on highways as the roads are well mapped in GPS systems and the roads are one direction and usually are well lined. My car has a system to keep the car within the lines as well as a radar adaptive cruise control that will adjust speed and keep a safe following distance from the car in front. It can alert me when it no longer can make out the lines. I guess one good thing is that it detects when your hands are not on the wheel. It would be better if it used a camera system and facial recognition to make sure your eyes are on the road.
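The radar adaptive cruise behavior described above - hold the driver's set speed on an open road, then slow to keep a safe following distance behind a lead car - can be sketched as a simple time-gap controller. This is purely illustrative: the function name, the proportional gain, and the two-second gap are my own assumptions, not any manufacturer's actual algorithm.

```python
# Minimal sketch of time-gap adaptive cruise logic (illustrative only; real
# systems fuse radar/camera data and use far more robust control loops).

def cruise_speed(set_speed, lead_distance_m, own_speed_mps,
                 time_gap_s=2.0, gain=0.5):
    """Return a target speed (m/s) that keeps roughly a fixed time gap
    behind the lead car, capped at the driver's set speed."""
    if lead_distance_m is None:                 # no car detected ahead
        return set_speed
    desired_gap = own_speed_mps * time_gap_s    # e.g., the "2-second rule"
    error = lead_distance_m - desired_gap       # positive = too far back
    # Proportional adjustment: speed up if the gap is large, slow if small.
    target = own_speed_mps + gain * error
    return max(0.0, min(set_speed, target))

print(cruise_speed(30.0, None, 28.0))    # open road: hold the set speed
print(cruise_speed(30.0, 40.0, 28.0))    # 40 m gap vs 56 m desired: slow down
```

At 28 m/s a two-second gap is 56 m, so a 40 m gap produces a negative error and the controller backs off below the set speed; with no lead car detected it simply holds the set speed.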
Here are some search links. Soon they will be ubiquitous. https://www.google.com/amp/s/www.th...self-driving-cars-map-testing-bloomberg-aspen https://www.google.com/amp/s/www.wired.com/story/embark-self-driving-truck-deliveries/amp
It may not be in full control, but that is the direction: an increase in driver assistance to the point of full auto-drive.