I'm writing this post in the context of the recent Tesla Model X crash. But there have been other Autopilot failures here, here, and here.
Let me explain what I mean by starting with driver-assist technologies such as lane departure warning and pedestrian detection, where the driver does most of the work, but is assisted by the car.
Machine watching human ==> Great idea
Having a machine watch over a human and assist when needed is a great idea. Why? A machine never tires of doing the same thing over and over: monitoring conditions and making sure the human driver is doing the right thing. Examples are technologies such as lane departure warning, pedestrian detection, and alertness detection. Should these fail, the human is still responsible, because the human was supposed to be alert and watching for these problems anyway. The computer can step in and save the day should the human make an error.
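To make the pattern concrete, here is a minimal Python sketch of the "machine watching human" loop. The sensor fields, thresholds, and interventions are all hypothetical illustrations, not any carmaker's actual logic; the point is only that the machine checks the same conditions on every frame and stays silent while the driver is doing fine.

```python
# A minimal sketch (not any vendor's implementation) of the
# "machine watching human" pattern: the driver drives, the
# computer monitors and intervenes only when the driver errs.
from dataclasses import dataclass

@dataclass
class SensorFrame:
    lane_offset_m: float      # distance from lane center (hypothetical sensor)
    pedestrian_ahead: bool    # pedestrian-detection flag
    eyes_on_road: bool        # alertness-detection flag

def assist(frame: SensorFrame) -> list[str]:
    """Return the warnings/interventions for one sensor frame.

    The human stays in control and responsible; the machine just
    never tires of re-checking the same conditions, frame after frame.
    """
    actions = []
    if abs(frame.lane_offset_m) > 0.5:   # hypothetical drift threshold
        actions.append("lane-departure warning")
    if frame.pedestrian_ahead:
        actions.append("pedestrian alert")
    if not frame.eyes_on_road:
        actions.append("alertness chime")
    return actions  # empty list: driver is doing fine, machine stays silent

# Example: a drifting, distracted driver triggers two assists.
print(assist(SensorFrame(lane_offset_m=0.8, pedestrian_ahead=False, eyes_on_road=False)))
```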
Now let's look at the flip side: autopilot, where most of the work is done by the car and the human is expected to take over when things go wrong.
Human watching machine ==> Bad idea
Having a human waiting to take over when the machine makes a mistake is a bad idea. Why? Humans have a really short attention span. Unless we are totally involved in a task, our minds wander off into the boonies, and often even audio or visual alerts take a while to get our attention; we learn to tune out external stimuli when we are engrossed in something. This is why autopilot systems should never be sold unless the car is capable of controlling itself in all situations, including total system failure, without any input from the user. Whoever designed this system expecting the human to take over in any reasonable amount of time doesn't understand how the human mind works.
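Some back-of-the-envelope arithmetic shows why a slow takeover is so dangerous. The reaction times below are illustrative assumptions, not measured data, but the kinematics are just speed times delay:

```python
# Sketch of why "human takes over on alert" fails: distance covered
# between the takeover alert and an effective human response.
# Reaction times are illustrative assumptions, not measured data.

def distance_before_takeover(speed_kmh: float, reaction_s: float) -> float:
    """Meters traveled between the takeover alert and the driver acting."""
    speed_ms = speed_kmh / 3.6
    return speed_ms * reaction_s

for reaction_s in (1.0, 3.0, 7.0):  # fully alert .. mind wandering
    d = distance_before_takeover(speed_kmh=110, reaction_s=reaction_s)
    print(f"{reaction_s:4.1f} s to react at 110 km/h -> {d:5.1f} m of blind travel")
```

Even a few seconds of wandering attention means the car covers the length of a football field or more with nobody effectively in control.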
Autopilot in airplanes
Let's talk about autopilot in airplanes, which has been around for years. There are many cases of airline pilots falling asleep while the plane is on autopilot. For example:
More than half of pilots have fallen asleep while in charge of a plane, a survey by a pilots' union suggests. Of the 56% who admitted sleeping, 29% told Balpa that they had woken up to find the other pilot asleep as well.

Fortunately, there have not been any fatalities due to that. But I should note that airline pilots are people who receive training and are paid to stay alert while watching the machine, and even they are unable to do so.
Back to cars
How many drivers actually go through training for a car's autopilot system, training that gives them experience with the various ways the system can fail and the consequences of those failures?
Just because we say "the driver should be fully engaged even when autopilot is in use" doesn't mean the driver will actually be able to do so. Further, as mentioned in this article, there are situations where the car fails to alert the driver altogether, which means the driver has to be 100% engaged throughout autopilot use; yet that is not realistic given how the human mind works.
So by putting full responsibility on the driver, Tesla is marketing a car with an autopilot feature while, in reality, having the driver pay Tesla to be a beta tester of its autonomous driving technology.