Wednesday 27 August 2014

The Moral Case for Self-Driving Cars

Guest post by Ronald Bailey

Tesla, Nissan, Google, and several other companies have declared that they will have commercial self-driving cars on the highways before the end of this decade. Experts at the Institute of Electrical and Electronics Engineers predict that 75% of cars will be self-driving by 2040. So far California, Nevada, Florida, Michigan, and the District of Columbia have passed laws explicitly legalizing self-driving vehicles, and many other states are looking to do so.

The coming era of autonomous autos raises concerns about legal liability and safety, but there are good reasons to believe that robot cars may outperform human drivers in practical and even ethical decision making.

More than 90% of all traffic accidents are the result of human error. In 2011, there were 5.3 million automobile crashes in the United States, resulting in more than 2.2 million injuries and 32,000 deaths. Americans spend $230 billion annually to cover the costs of accidents, accounting for approximately 2 to 3% of GDP.

Proponents of autonomous cars argue that they will be much safer than vehicles driven by distracted and error-prone humans. The longest-running safety tests have been conducted by Google, whose autonomous vehicles have traveled more than 700,000 miles so far with only one accident (when a human driver rear-ended the car). So far, so good.

Bryant Walker Smith, a legal scholar at Stanford Law School's Center for Internet and Society, correctly observes, however, that no engineered system is perfectly safe.

Smith has roughly calculated that “Google’s cars would need to drive themselves more than 725,000 representative miles without incident for us to say with 99% confidence that they crash less frequently than conventional cars.”
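That threshold is roughly what a simple rare-event calculation yields. Below is a minimal back-of-envelope sketch (not necessarily Smith's exact method), assuming crashes arrive as a Poisson process and that human drivers average about one police-reported crash per 157,000 miles; that rate is an illustrative assumption.

```python
# Back-of-envelope sketch (not necessarily Smith's published calculation):
# how many crash-free miles must a fleet log before we can claim, at a given
# confidence level, that it crashes less often than human drivers?
#
# Assumptions: crashes follow a Poisson process, and human drivers have
# roughly one police-reported crash per 157,000 miles (illustrative figure).
import math

HUMAN_MILES_PER_CRASH = 157_000  # assumed human crash rate (illustrative)

def crash_free_miles_needed(confidence: float) -> float:
    """Crash-free miles required to reject 'crashes at the human rate'.

    P(zero crashes in m miles at the human rate) = exp(-m / HUMAN_MILES_PER_CRASH),
    so m must be large enough that this probability falls below 1 - confidence.
    """
    return -HUMAN_MILES_PER_CRASH * math.log(1.0 - confidence)

for conf in (0.95, 0.99):
    print(f"{conf:.0%} confidence: ~{crash_free_miles_needed(conf):,.0f} miles")
# With these assumptions, 99% confidence comes out near 723,000 miles,
# the same order of magnitude as Smith's ~725,000-mile figure.
```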

Given expected improvements in sensor technologies, algorithms, and computation, it seems likely that this safety benchmark will soon be met.

Still, all systems fail eventually. So who will be liable when a robot car, however rarely, crashes into someone?

An April 2014 report from the Brookings Institution, a good-government think tank, argues that the current liability system can handle the vast majority of claims that might arise from damages caused by self-driving cars.

A similar April 2014 report from the free-market Competitive Enterprise Institute (CEI) largely agrees: "Products liability is an area that may be able to sufficiently evolve through common law without statutory or administrative intervention."

A January 2014 RAND Corporation study suggests that one way to handle legal responsibility for accidents might be to extend a no-fault liability system, in which victims recover damages from their own auto insurers after a crash. Another RAND idea would be to legally establish an irrebuttable presumption of owner control over the autonomous vehicle.

Legislation could require that “a single person be responsible for the control of the vehicle. This person could delegate that responsibility to the car, but would still be presumed to be in control of the vehicle in the case of a crash.”

This would essentially leave the current liability system in place. To the extent that liability must be determined in some cases, the fact that self-driving cars will be embedded with all sorts of sensors, including cameras and radar, will provide a pretty comprehensive record of what happened during a crash.

Should we expect robot cars to be more ethical than human drivers? In a fascinating March 2014 Transportation Research Record study, Virginia Tech researcher Noah Goodall wonders about “Ethical Decision Making During Automated Vehicle Crashes.”

Goodall observes that engineers will necessarily install software in automated vehicles enabling them to “predict various crash trajectory alternatives and select a path with the lowest damage or likelihood of collision.”

To illustrate the challenge, Stanford’s Smith considers a case in which you are driving on a narrow mountain road between two big trucks. “Suddenly, the brakes on the truck behind you fail, and it rapidly gains speed,” he imagines. “If you stay in your lane, you will be crushed between the trucks. If you veer to the right, you will go off a cliff. If you veer to the left, you will strike a motorcyclist. What do you do? In short, who dies?”

Fortunately, such fraught situations are rare. Moral or not, most human drivers will react in ways they hope will protect themselves and their passengers. So as a first approximation, autonomous vehicles should be programmed to choose actions that aim to protect their occupants.
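Goodall's observation about crash-trajectory software can be made concrete with a toy version of that selection logic. The sketch below is purely illustrative: the candidate maneuvers, harm scores, and occupant weighting are invented for the mountain-road scenario above, and a real system would reason over continuous trajectories and uncertain predictions.

```python
# Deliberately simplified sketch of the kind of trajectory selection Goodall
# describes: enumerate candidate maneuvers, score each on expected harm, and
# pick the lowest-cost option. The maneuvers, harm estimates, and occupant
# weighting below are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_harm: float  # expected harm to the vehicle's occupants (0 to 1)
    other_harm: float     # expected harm to people outside the vehicle (0 to 1)

def choose(maneuvers, occupant_weight=2.0):
    """Pick the maneuver with the lowest weighted expected harm.

    occupant_weight > 1 encodes the 'first approximation' in the text:
    the vehicle gives extra weight to protecting its own occupants.
    """
    cost = lambda m: occupant_weight * m.occupant_harm + m.other_harm
    return min(maneuvers, key=cost)

options = [
    Maneuver("stay in lane", occupant_harm=0.9, other_harm=0.0),
    Maneuver("veer right",   occupant_harm=0.8, other_harm=0.0),
    Maneuver("veer left",    occupant_harm=0.2, other_harm=0.7),
]
print(choose(options).name)
```

Notice that the "answer" flips as the occupant weighting changes; choosing that weight is precisely the kind of ethical decision Goodall argues engineers are already making.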

Once the superior safety of driverless cars is established, they will dramatically change the shape of cities and the ways in which people live and work.

Roadway engineers estimate that typical highways now accommodate a maximum throughput of 2,200 human-driven vehicles per lane per hour, utilizing only about 5% of roadway capacity. Because self-driving cars would be safer and could thus drive closer and faster, switching to mostly self-driving cars would dramatically increase roadway throughput.

One estimate by the University of South Florida's Center for Urban Transportation Research in November 2013 predicts that a 50% autonomous fleet would boost highway capacity by 22%, an 80% robot fleet would goose it by 50%, and a fully automated highway would see throughput zoom by 80%.
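Pairing those percentages with the 2,200-vehicle-per-lane-per-hour baseline quoted above gives a feel for the absolute numbers. The quick sketch below is my own illustrative arithmetic; neither study reports these combined figures.

```python
# Apply the capacity gains cited above to the ~2,200 vehicles/lane/hour
# baseline for human-driven traffic. Pairing the two sources' numbers this
# way is illustrative only.
BASELINE_VPLPH = 2200  # human-driven vehicles per lane per hour

capacity_gain = {      # share of autonomous vehicles -> cited capacity gain
    "50% autonomous": 0.22,
    "80% autonomous": 0.50,
    "100% autonomous": 0.80,
}

for scenario, gain in capacity_gain.items():
    print(f"{scenario}: ~{BASELINE_VPLPH * (1 + gain):,.0f} vehicles/lane/hour")
# 50%: ~2,684   80%: ~3,300   100%: ~3,960
```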

Autonomous vehicles would also likely shift the way people think about car ownership. Currently, most automobiles sit idle most of the day in driveways or parking lots as their owners go about their lives. Truly autonomous vehicles could be on the road much more of the time, essentially providing taxi service to users who summon them via mobile devices.

Once riders are done with the cars, the vehicles can be dismissed to serve other patrons. Self-driving cars will also increase the mobility of the disabled, elderly, and those too young to drive.

Researchers at the University of Texas, using a realistic simulation of vehicle usage in cities that accounts for congestion and rush-hour patterns, found that if all cars were driverless, each shared autonomous vehicle could replace 11 conventional cars. In their simulations, riders waited an average of 18 seconds for a driverless vehicle to show up, and each vehicle served 31 to 41 travelers per day. Fewer than one half of one percent of travelers waited more than five minutes for a ride.
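The Texas results come from a detailed model, but the basic mechanics (trip requests arrive over the day, the soonest-available vehicle is dispatched, and waits fall out of the geometry) can be illustrated with a toy simulation. Everything in the sketch below is an assumption chosen only to show the shape of such a model: the city size, speed, fleet size, and demand are invented, and it ignores congestion, rush-hour peaks, and smarter dispatching.

```python
# Toy shared-fleet dispatch simulation (illustrative only, not the University
# of Texas model). Assumptions: a 10 km x 10 km city with uniform demand,
# constant 30 km/h travel speed, Manhattan distances, and greedy
# "soonest pickup" dispatch.
import random

GRID_KM = 10.0            # assumed city size
SPEED_KMH = 30.0          # assumed average travel speed
FLEET_SIZE = 500          # assumed number of shared autonomous vehicles
TRIPS = FLEET_SIZE * 35   # assumed demand: ~35 trips per vehicle per day
SERVICE_HOURS = 18.0      # assumed service window

def dist_km(a, b):
    """Manhattan distance as a crude proxy for street travel."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def simulate(seed=1):
    rng = random.Random(seed)
    rand_pt = lambda: (rng.uniform(0, GRID_KM), rng.uniform(0, GRID_KM))
    # Each vehicle: [last drop-off location, time it becomes free (hours)]
    fleet = [[rand_pt(), 0.0] for _ in range(FLEET_SIZE)]
    requests = sorted((rng.uniform(0, SERVICE_HOURS), rand_pt(), rand_pt())
                      for _ in range(TRIPS))
    waits = []
    for t, origin, dest in requests:
        # Dispatch whichever vehicle can reach this rider soonest.
        v = min(fleet,
                key=lambda veh: max(veh[1], t) + dist_km(veh[0], origin) / SPEED_KMH)
        pickup = max(v[1], t) + dist_km(v[0], origin) / SPEED_KMH
        waits.append(pickup - t)
        v[0], v[1] = dest, pickup + dist_km(origin, dest) / SPEED_KMH
    share_long = sum(w > 5 / 60 for w in waits) / len(waits)
    print(f"mean wait: {60 * sum(waits) / len(waits):.1f} minutes; "
          f"share waiting over 5 minutes: {share_long:.1%}")

simulate()
```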

By one estimate in a 2013 study from Columbia University’s Earth Institute, shared autonomous vehicles would cut an individual’s average cost of travel by as much as 75% compared to now. There are some 600 million parking spaces in American cities, occupying about 10% of urban land.

In addition, 30% of city congestion originates from drivers seeking parking spaces close to their destinations. A fleet of shared driverless cars would free up lots of valuable urban land while at the same time reducing congestion on city streets. During low demand periods, vehicles would go to central locations for refueling and cleaning.

Since driving will be cheaper and more convenient, demand for travel will surely increase. People who can work while they commute might be willing to live even farther out from city centers.

But more vehicle miles traveled would not necessarily translate into more fuel burned. For example, safer autonomous vehicles could be built much lighter than conventional vehicles and thus consume less fuel. Smoother acceleration and deceleration would reduce fuel consumption by up to 10%.

Optimized autonomous vehicles could cut both the fuel used and pollutants emitted per mile. And poor countries could “leapfrog” to autonomous vehicles instead of embracing the personal ownership model of the 20th century West.

If driverless cars are in fact safer, every day of delay imposes a huge cost. People a generation hence will marvel at the carnage we inflicted as we hurtled down highways relying on just our own reflexes to keep us safe.


Ronald Bailey is the award-winning science correspondent for Reason magazine and Reason.com, where he writes a weekly science and technology column. Bailey is the author of the book Liberation Biology: The Moral and Scientific Case for the Biotech Revolution (Prometheus, 2005), and his work was featured in The Best American Science and Nature Writing 2004.
This post appeared previously at Reason.com, and Laissez Faire Today.

4 comments:

Dinther said...

As a techie I'd jump to embrace self-driving cars. However, as a libertarian I am not quite as excited, as it should be clear that a large amount of autonomy and privacy will be surrendered when automated cars become the norm.

It is easy to see how government would take control of your vehicle because they'd want a word with you, or how unpaid bills could result in you being refused travel. Lifting the logs would tell authorities where you have been and when. (Yes, I know this is largely already the case with license plate scanners.)

A few years back, Electronic Stability Control was the new novel feature on cars. Now government is making Electronic Stability Control compulsory on new vehicles. It is not hard to imagine what will happen with self-driving cars.

Anonymous said...

"So as a first approximation, autonomous vehicles should be programmed to choose actions that aim to protect their occupants."

This brings up some issues. The programmer is now the one making the moral decisions, albeit from a distance in time and location. He is the one whose choices in coding cause the actions of the self-driving car. Remember, the car makes no decisions; it can only do as it is programmed to do. It is a mindless automaton. So, in reality the programmer is responsible for what happens. That is where the responsibility and liability lie. Watch these mercantilist swine duck out from that one by using political pull to have the laws written to poke liability onto the "little people"- the ones who have not got deep enough pockets to fight city hall.

As for the claim that these types of "moral choice" incidents are quite rare, the author is fooling himself.

If you do not own and control your own car, then what you are dealing with is a version of public transport under a central authority with all the issues and problems for you that necessarily entails.

As for this increasing of the road capacity by reducing headway between vehicles and having them cross busy intersections without hesitation at speed (the idea is that they do not stop but are instead timed such that each misses the other), is it worth the chance? Is it something that you'd put to the test on a regular basis with your life? What about your children's? How lucky do you feel today? After all, this is all only about as good as the nerd who wrote the software. Hopefully he wasn't having a blue screen of death day. But just imagine the scene. Three-second headways at 50 km/h. A busy crossroads. The cars coming into the intersection are intended to continue on their trajectories through the intersection at speed. The timing, as one crosses directly in front of another, has to be perfect. Perfect every time. Perfect. Nothing else will do. Perfect. Now think about that "moral choice" coding again.

This is not the best way to go. It would be far preferable to place autonomous vehicles on their own separated grade.

One more point. The publicly available vehicles (the ones that the author expects to serve 30 or more "travellers" a day) will end up looking exactly as you'd expect: interiors cursed with litter, excrement, rubbish, junk, vomit, graffiti, unpleasant odours, vandalism, sticky stuff on the seats and all that sort of nastiness. In one or two design iterations the interior design will have degenerated into the grotty anti-vandal public toilet style, complete with inescapable advertising and audio public service messages for your safety, citizen. Meanwhile the exteriors will be ugly to the point of offensive, intended to be as garish as possible to call attention to the "service": think franchise, think low-rent, front-of-mind attention seeking. Take a look at the Google "car" thing. It IS ugly.

And this is all justified on the appeal to safety! So was communism.

Amit

Unknown said...

Thanks a lot for sharing. You have done a brilliant job.

alia52nalie said...

I am not quite sure how successful self-driving cars will be, as when you imagine thousands of them on the road, you can't simply expect everything to be normal. My brother, who works for a DUI lawyer, was telling me though that the number of drunk or distracted driving cases will be significantly reduced.