SDCs: Legal Issues

From CS378H Public Policy and the Digitally Native Technologist

In the United States

In the past few years, an increasing amount of legislation has been enacted to allow the testing of autonomous motor vehicles. However, with the possibility of self-driving cars hitting the streets coming closer to reality, the United States will have to speed up the process of revamping its outdated legislation in support of this new era. The main considerations are legality, liability, patents, and security.

In the United States, regulation of autonomous vehicles begins at the state level. Rather than issuing technical specifications directly, legislatures are instructing state agencies to oversee the introduction of autonomous vehicles to state roads. In five states, a person operating one of these vehicles must have a specific endorsement on their driver’s license from the DMV or its equivalent in that state.

At the national level, the biggest contribution that the National Highway Traffic Safety Administration (NHTSA) has made in the world of autonomous cars is standardizing a system to measure levels of autonomy[1]. The five levels span from no assistance to complete autonomy. The NHTSA suggests requiring a special license for such vehicles because safety concerns differ for autonomous vehicles.

The NHTSA road rules refer specifically to human operation of the braking and steering controls, so a human must be present while an autonomous vehicle is driving and able to perform a manual override[2]. As a result, the United States allows driverless cars but stipulates the presence of a human occupant. However, the NHTSA granted exemptions to Google: Google cars now drive themselves without a human occupant on roads with speed limits under 20 mph.

The NHTSA wants to rework the existing legal code to allow for self-driving vehicles. At present, the NHTSA recommends restricting the open-road legality of level 3 (driver must be present for occasional control) and level 4 (fully autonomous) vehicles. The NHTSA also wants to pass federal regulations that standardize the legality of autonomous vehicles across the nation.

The International Community

Few existing international laws and treaties govern autonomous vehicles. The documents that do exist address who can drive a car, and they have been updated to accommodate driverless cars. The 1949 Geneva Convention on Road Traffic[3], which outlines international traffic law, requires under Article 8 that every car have a driver who “at all times… [is] able to control it.” A 2011 amendment to Article 8 clarified this requirement: driver assistance programs are allowed as long as a driver can manually override them at any time.

The 1968 Vienna Convention on Road Traffic established international highway safety laws. The Convention’s Article 8 explicitly stated that only a human driver could legally drive a car; in 2012, Article 8 was amended[4] to allow a car to drive itself, provided that a driver can override or switch off the system, and take the wheel, at any time.

The United Kingdom and Germany have been reviewing their domestic traffic safety and road laws and are moving towards making driverless car testing legal on public roads, but the European Union has no comprehensive autonomous vehicle legislation.

Liability

With the looming adoption of partially or fully automated vehicles, lawyers and manufacturers must address the question of liability in the case of serious accidents. Today, most accidents result from operator error, so the operator is responsible for the damages, injuries, and deaths they cause. However, if the car had a defect in design or manufacture - if a tire blew or the axle snapped, for example - then the manufacturer can be held liable. The relevant standard is called strict liability: the manufacturer is strictly liable for any and all damages caused by faults in its product, and there is no requirement to prove negligence or ill intent[5].

For driverless cars, if the software driving the car has an error and causes a lethal accident, strict liability applies to the manufacturer of the software. The closest existing precedent is the law governing the software that runs pacemakers and other critical medical devices. Unfortunately, almost no case law exists for such software in cases of serious injury or death - the cases tend to be resolved with out-of-court settlements. Nonetheless, the cases that were heard in court have followed the standard of strict liability: if users can prove that an error exists in the software, the manufacturer can be held strictly liable[6].

Manufacturers contend that the burden of strict liability will stifle innovation by forcing businesses to spend too much time and money in court. However, this is strict liability working as designed: if a business truly believes that the liability claims paid out for a particular product would exceed the profit from selling it, then it ought not produce that product. Although manufacturers shoulder the entire liability burden, the overall reduction in injuries, deaths, and resultant lawsuits lessens the weight of strict liability. Unfortunately, due to the general lack of case law, there is no informed estimate of how much a self-driving car manufacturer would be required to pay in the event that it sold a buyer a self-crashing car.

One likely way forward involves the manufacturer creating a partially automated vehicle that turns over control to the user when the vehicle may experience an error (or at any time). With these modifications, manufacturers may be able to place liability back on the operator; however, the value proposition of a self-driving car would be much diminished if the buyer must pay attention to the road at all times for liability reasons.

Patents

As driverless cars come closer to mainstream adoption, companies are furiously filing patents to ensure they will receive the majority of the profits to be made[7]. When one thinks of driverless cars, Google is often the first company to come to mind because it was one of the first companies to investigate the space. In fact, though, automakers, not technology companies, have the most patents; Japan’s Toyota leads the race.

With that being said, will all these patents make the autonomous vehicle market less competitive? Many experts think they will have little to no impact on the industry, for a few reasons[8]. First, software has been one of the most difficult inventions to patent: software is built upon others’ software, can have a short lifecycle due to continued innovation, and can be patented at too high a level of abstraction[9]. Another reason patented technologies won’t have a large impact on the industry is that patents typically aren’t respected across international boundaries[10]. This may make some companies reluctant to patent a technology, since doing so releases technical specifications for international companies to copy.

Empirically, patents have not been a large concern for car companies. Tesla, the electric car company, has already openly stated that it is releasing its current patents in good faith[11]. This means that any person or company can use Tesla’s patented technology without Tesla’s permission. However, some argue that this is a strategic decision to increase adoption of electric car technologies.

Security

In 1994, car manufacturers started using black box data to test their cars’ crash safety. The data has since been used in numerous ways by insurance companies, police investigations, and regulatory agencies. However, it would not sit well with consumers if anyone could access this information, because it can reveal driving habits and frequently visited places. So far, 15 states have passed laws regulating who can pull this information with or without the owner’s permission. These laws set a precedent that could be extended to black boxes in self-driving cars. Self-driving cars that employ machine learning to drive efficiently in specific areas could accrue far more information than the original 1994 black boxes tracked, making this information even more sensitive.

When a computer drives a car, personal information like license plates and location can be stored in the cloud or elsewhere. This issue has already appeared in the context of in-vehicle communication systems. One such system, OnStar, provides security and safety features as well as turn-by-turn navigation. The service also stores detailed information about the state of the vehicle, such as whether a driver or passenger is wearing a seatbelt. Autonomous vehicles would likely collect even more information, and hackers could look to exploit it. Existing punishments for criminal behavior in this domain include prison time and fines, depending on the degree of theft. However, the law has not yet considered, for autonomous vehicles, liability for manslaughter committed through hacking.