Engineer: “Aha! Self-driving cars are finally here!”
Industrial Psychologist: “Really? You mean fully autonomous vehicles that require no human intervention? You know, the ones you engineers have been promising for years?”
Engineer: “Yes, indeed. They’re finally here!”
Psychologist: “That’s great! My eyesight isn’t so good these days. It would be nice to get a ride whenever I want. So, are they safe?”
E: “Our data clearly indicates that self-driving cars are very safe. The difference in fatal accident rates between autonomous vehicles and humans is significant. The vast majority of motor vehicle accidents are due to human error. Frankly, I don’t understand the controversy, other than human ignorance and fear of the unknown. The current motor vehicle accident rate for autonomous vehicles is less than 1 per 100,000 miles driven.”
P: “I’m not sure how impressive that is. There are millions of drivers on the road with an accident rate far lower. I’m a 58-year-old industrial psychologist, and I haven’t had an accident since I was a dumb teenager 40 years ago. In that time, I’ve driven over one million miles without an accident. And I have plenty of friends whose only accidents came courtesy of some clueless driver who was drunk, texting, or eating snacks. Many drivers are going to view their own accident rate as essentially zero. That means every time they hear about a self-driving car blowing through a stop sign or rear-ending an emergency vehicle, they’re going to cringe. And when you claim that your fatality rate is lower, you’re ignoring the huge number of drivers who avoid the most common risk factors for fatal accidents: distracted driving, alcohol or drug abuse, failure to wear a seat belt, and excessive speeding, especially at night or in challenging weather. If you remove all of those careless drivers, your fatality rate doesn’t look so good anymore.”
E: “Well, the data is the data. Overall, self-driving cars have a far better safety record than human drivers.”
P: “Hmm. Is it my imagination, or did you just ignore everything I said? And, by the way, the lower accident rate is currently only true for fatalities. Fender-benders are actually quite common with self-driving cars. And to repeat, you need to take into consideration which human drivers you are talking about. Not all drivers are created equal. You also have the Wobegon Effect to contend with. Everybody thinks they’re above average.”
E: “Well, the irrationality of humans believing that they’re better drivers than they really are is not something that I can control. As self-driving cars become increasingly safe, many people will utilize them: business commuters, people with poor vision or other disabilities, the elderly, etc. Many people will jump at the opportunity!”
P: “I agree. But there are still important issues that you can’t avoid. You have to consider all the drivers on the road, not just those who are passengers in self-driving cars. For example, what speed will your self-driving cars travel on the highway?”
E: “Why, the speed limit, of course.”
P: “Then you’re going to piss off almost everybody else on the road, including those business commuters riding in your self-driving cars, because nobody drives at the speed limit. You’ve got to be realistic.”
E: “Sorry, but if our autonomous vehicles travel above the posted speed limit, it will expose our company to all sorts of legal problems. If there’s an accident, we’ll be accused of traveling at an unsafe speed. And government authorities wouldn’t be too pleased if we programmed our vehicles to exceed the posted speed limit. I mean, the speed limit is there for a reason, right?”
P: “Yes, but as a psychologist I can assure you that the posted speed limit on many highways is there to informally tell people that they can safely travel 5 to 7 miles per hour above that speed without getting a ticket. Have you looked at the track record of red light and speed limit cameras? They’re not too popular. If you force everybody to travel at the exact speed limit, your self-driving cars may not be so popular, either.”
E: “Well, I’m merely an engineer. These sorts of psychological things are your domain, not mine.”
P: “My domain, but your problem. You are the one building the cars. What’s the point of spending billions of dollars on self-driving vehicles if you’re not going to address the social issues? Do you know what happened when the United States federal government imposed a 55-mile-per-hour maximum speed limit in the 1970s to save gasoline? Many western states revolted. Montana’s land area is double that of all six New England states combined, and 50% larger than the entire United Kingdom. People in these western states have to travel long distances just to go to the grocery store. But they were forced to adopt the lower speed limit because the national government threatened to withhold their federal highway funds.”
E: “Well, it was all for the greater good, I suppose.”
P: “They didn’t see it that way. So, they came up with their own solution. They lowered the speeding fines to five or ten dollars. Even in the eastern states with higher traffic densities, over 80% of drivers ignored the speed limit, despite heavy fines and even potential jail time. In other words, the speed limit mandate didn’t work. Neither will self-driving cars that refuse to go over the speed limit.”
E: “Hmph. Well, there’s no accounting for human behavior. It’s above my pay grade to make complex social decisions. My job is to build a reliable car that will safely transport humans.”
P: “I agree that it would be a wonderful thing, but I see all sorts of issues with widespread implementation. Human behavior needs to be taken into account when developing any new technology. Otherwise, things might not turn out so great. Take the Internet. It’s opened up incredible new vistas for accessing information and increasing the knowledge base of society. But it’s also used for a lot of nasty stuff.”
E: “Well, you have to take the good with the bad. Again, let me repeat that my job is to design the cars. These social issues are not my problem.”
P: “Then whose problem are they?”