

Legislative Update - April 2017

Ma! No Hands!

We have seen sentient vehicles such as KITT from the TV show Knight Rider, the Batmobile, and the Audi from I, Robot. These are just a few of the futuristic technological advancements that allow automobiles to “drive themselves”. This technology is here, and it is coming at us like a freight train. The best we can do as motorcyclists is to ensure our interests are protected as it develops and hope to stay out of its way. Throughout history, technological advancements have been seen as both a betterment and a hazard to society, depending on the viewpoint.

What is an Autonomous Vehicle?

An autonomous car (also known as a driverless car, self-driving car, or robotic car) is a vehicle that is capable of sensing its environment and navigating without human input. In 2014, SAE International, an automotive standardization body, published a classification system based on six different levels. This classification system is based on the amount of driver intervention and attentiveness required, rather than on the vehicle's capabilities, although the two are closely related. In the United States, the National Highway Traffic Safety Administration (NHTSA) released its own formal classification system in 2013, but abandoned it when it adopted the SAE standard in September 2016.

SAE automated vehicle classifications (summarized in the short code sketch that follows the list):
  • Level 0: Automated system has no vehicle control, but may issue warnings.
  • Level 1: Driver must be ready to take control at any time. Automated system may include features such as Adaptive Cruise Control (ACC), Parking Assistance with automated steering, and Lane Keeping Assistance (LKA) Type II in any combination.
  • Level 2: The driver is obliged to detect objects and events and respond if the automated system fails to respond properly. The automated system executes accelerating, braking, and steering. The automated system can deactivate immediately upon takeover by the driver.
  • Level 3: Within known, limited environments (such as freeways), the driver can safely turn their attention away from driving tasks, but must still be prepared to take control when needed.
  • Level 4: The automated system can control the vehicle in all but a few environments such as severe weather. The driver must enable the automated system only when it is safe to do so. When enabled, driver attention is not required.
  • Level 5: Other than setting the destination and starting the system, no human intervention is required. The automatic system can drive to any location where it is legal to drive and make its own decisions.
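For readers who like to see the taxonomy spelled out, here is a minimal sketch of the six levels in Python. The names and the "human is the fallback" rule are our own shorthand for the list above, not official SAE wording.

    from enum import IntEnum

    class SAELevel(IntEnum):
        """Shorthand for the six SAE J3016 automation levels listed above."""
        NO_AUTOMATION = 0           # warnings only, no vehicle control
        DRIVER_ASSISTANCE = 1       # ACC, parking assist, lane keeping; driver stays ready
        PARTIAL_AUTOMATION = 2      # system accelerates, brakes, and steers; driver monitors
        CONDITIONAL_AUTOMATION = 3  # driver may look away in limited environments (e.g., freeways)
        HIGH_AUTOMATION = 4         # no driver attention needed once enabled where safe
        FULL_AUTOMATION = 5         # human only sets the destination

    def human_is_fallback(level):
        """At Levels 0-2 the human driver is still the fallback for detecting hazards."""
        return level <= SAELevel.PARTIAL_AUTOMATION

The dividing line at Level 2 is the one that matters most for the reaction-time concerns discussed below: up to that point, the human is expected to catch whatever the system misses.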
Technology vs. The Reasoning and Judgment of a Human Being

There is good and there is bad to this technology. In automobiles, perhaps it can prevent the right-of-way violations that have taken so many lives in the motorcycling community.

Technology can be a good thing, but just as we saw in the radical computerized stock market sell-offs a few years ago, it still hasn't replaced the reasoning and judgment of a human.

It has become increasingly clear over the last couple of years that the era of the autonomous car is upon us, and with it come increasing concerns over whether self-driving cars are properly equipped to ‘see’ motorcycles on the road and react accordingly. We have recently seen what appears to be the first documented accident involving a self-driving car and a motorcycle. It happened on a Norwegian motorway and involved a Tesla electric car – the brand leading the way in production semi-autonomous vehicles – and a young female rider. Reports suggested the car, with its ‘autopilot’ mode switched on, hit the back of the bike at high speed, leaving its rider seriously injured. Although it is impossible to say whether the autopilot contributed to the accident, perhaps by failing to ‘see’ the bike with its array of cameras and radars, the accident raised concerns among European motorcycle groups, and it should raise the same concerns here in the United States.

Research also shows that drivers in autonomous cars react later when they have to intervene in a critical situation than when they are driving manually. Do we really want to allow a slower reaction time from a driver? FEMA (the Federation of European Motorcyclists' Associations) suggests we look at US research by John F. Lenkeit of Dynamic Research, Torrance, CA, which finds that existing forward collision warning systems give “inadequate results” for motorcycles in 41 percent of test cases, versus under 4 percent for cars.

Self-driving vehicles have been promoted to us as having the potential to eliminate most or all of the more than 30,000 annual traffic deaths in the US. If such self-driving vehicles can fail to detect motorcycles, they can presumably fail to detect pedestrians as well. Then this question comes to mind: when a ball bounces into the street, does the autonomous car detect it? Does it then anticipate the child who may run after it?

Legislative Issues

In June 2011, the Nevada Legislature passed a law authorizing the use of autonomous cars, making Nevada the first jurisdiction in the world where autonomous vehicles could be legally operated on public roads. Under the law, the Nevada Department of Motor Vehicles (NDMV) is responsible for setting safety and performance standards and for designating areas where autonomous cars may be tested. The legislation was supported by Google in an effort to legally conduct further testing of its driverless car. The Nevada law defines an autonomous vehicle as "a motor vehicle that uses artificial intelligence, sensors and global positioning system coordinates to drive itself without the active intervention of a human operator." The law also acknowledges that the operator will not need to pay attention while the car is operating itself. Google had further lobbied for an exemption from the ban on distracted driving to permit occupants to send text messages while sitting behind the wheel, but this did not become law. Furthermore, Nevada's regulations require one person behind the wheel and one in the passenger’s seat during tests.

Ethical Issues

With the emergence of autonomous cars, various ethical issues arise. While the introduction of autonomous vehicles to the mass market seems inevitable, given projections of a reduction in crashes of up to 90% and their accessibility to disabled, elderly, and young passengers, some ethical issues have not yet been fully resolved. These include, but are not limited to: the moral, financial, and criminal responsibility for crashes; the decisions a car must make immediately before a (fatal) crash; privacy issues; and potential job losses.

Setting aside the questions of legal liability and moral responsibility, the question arises how autonomous vehicles should be programmed to behave in an emergency in which either passengers or other road users are endangered. A vivid example of the moral dilemma a software engineer or car manufacturer might face in programming the operating software is described in an ethical thought experiment, the trolley problem: the operator of a trolley has the choice of staying on the planned track and running over five people, or turning the trolley onto a track where it would kill only one person, assuming there is no traffic on it. Two main considerations need to be addressed. First, on what moral basis should the decisions an autonomous vehicle has to make be based? Second, how could those decisions be translated into software code?

In the motorcycle scenario developed by Noah Goodall of the Virginia Transportation Research Council, we can see the ethics of crash optimization at work. Suppose the car is limited to three available options: it can be programmed to “decide” between rear-ending the truck, injuring you, the owner/driver; striking a helmeted motorcyclist; or hitting one who is helmetless. At first it may seem that autonomous cars should privilege their owners and occupants. But what about the fact that research indicates 80 percent of motorcycle crashes injure or kill the motorcyclist, while only 20 percent of passenger car crashes injure or kill an occupant? Although crashing into the truck will injure you, you have a much higher probability of survival and reduced injury compared to the motorcyclists.
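To make the crash-optimization point concrete, here is a deliberately simplified sketch in Python. The option labels, the injury probabilities (loosely based on the 80 percent and 20 percent figures above), and the extra penalty assumed for the helmetless rider are illustrative assumptions, not anything a manufacturer has published.

    # Toy "crash optimization" chooser -- illustrative assumptions only.
    # Values are rough stand-ins for the probability of serious injury or death,
    # loosely based on the 80%/20% motorcycle vs. car-occupant figures cited above.
    CRASH_OPTIONS = {
        "rear-end the truck (car occupant takes the hit)": 0.20,
        "strike the helmeted motorcyclist": 0.80,
        "strike the helmetless motorcyclist": 0.90,  # assumed worse outcome without a helmet
    }

    def least_harmful(options):
        """Return the option with the lowest estimated injury probability.

        This single comparison is where the ethical weight lives: deciding whose
        risk counts, and by how much, is a value judgment hidden inside the code.
        """
        return min(options, key=options.get)

    print(least_harmful(CRASH_OPTIONS))  # prints the truck option under these assumed numbers

Under these assumed numbers the car “chooses” the truck, which is the outcome described above; change the weights and it chooses someone else, which is exactly why who writes those weights matters to motorcyclists.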

Conclusion

The benefit of technology does not overwhelm the value of human life in any circumstance. Until autonomous vehicle manufacturers can definitively prove that their automobiles would not be at fault and can properly protect a motorcyclist from a computer-generated crash, autonomous vehicles should not be allowed to share public roadways with motorcyclists. It is ABATE of Arizona’s position that, until technological advancements are enough to keep motorcyclists out of computer-generated harm’s way, we as an organization are against allowing autonomous vehicles on the public roadway.

Mike Infanzon
State Designated Lobbyist, ABATE of Arizona

Gehrig, Stefan K.; Stein, Fridtjof J. (1999). "Dead reckoning and cartography using stereo vision for an autonomous car." IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 3, Kyongju, pp. 1507–1512.

"U.S. Department of Transportation Releases Policy on Automated Vehicle Development". National Highway Traffic Safety Administration. 30 May 2013. SAE. (2014). [ SAE INTERNATIONAL STANDARD J3016]. Retrieved March 3, 2017, from http://www.sae.org/misc/pdfs/automated_ driving.pdf

Bike Social. "Autonomous cars to be tested with motorcycles." December 2016. Retrieved March 3, 2017, from https://www.bennetts.co.uk/bikesocial/news-and-views/news/2016/december/autonomous-cars-to-be-tested-with-motorcycles

Merat, Natasha; Jamson, A. Hamish. "How do drivers behave in a highly automated car?" Institute for Transport Studies, University of Leeds.

Cameron, K. (2016, October 31). "Can autonomous cars detect motorcycles?" Cycle World. Retrieved March 3, 2017, from http://www.cycleworld.com/can-autonomous-cars-detect-motorcycles

"Bill AB511" (PDF). Nevada Legislature, 2011.

"Preparing a nation for autonomous vehicles: Opportunities, barriers and policy recommendations." Transportation Research Part A: Policy and Practice.

Lin, Patrick (op-ed). "The Robot Car of Tomorrow May Just Be Programmed to Hit You." WIRED, May 14, 2014. Available at https://www.wired.com/2014/05/the-robot-car-of-tomorrow-might-just-be-programmed-to-hit-you/

Kirkpatrick, Jesse. "The Ethical Quandary of Self-Driving Cars." Slate (2016). http://www.slate.com/articles/technology/future_tense/2016/06/self_driving_cars_crash_optimization_algorithms_offer_an_ethical_quandary.html (last visited March 8, 2017).
