Monday, June 12, 2017

In Self-Driving Cars We Trust?

     While the idea of a car driving itself is fascinating, almost absurd, development of these vehicles has been underway for nearly a decade, and testing continues today. These cars can perform actions much like the human brain, sometimes even faster; but just how much trust can we put into a piece of technology? At what point do we as people say, “No thanks, I can do it myself”? While I am ecstatic to admire the equipment at work, these self-driving cars are a hazard in the making. Driving is one aspect of life that would be better off without robots.

     “During the week ending March 8 [2017], the 43 active [self-driving] cars on the road only drove an average of close to 0.8 miles before the safety driver had to take over for one reason or another” (Bhuiyan). While these partially self-driving cars are still being developed and tested, how can developers such as Uber predict every circumstance that may occur during a drive? People are putting their full trust, and their lives, in the hands of these car manufacturers, so these cars must perform perfectly; yet that level of safety can never be guaranteed, because improbable events and environmental factors cannot all be calculated. Even though people incapable of driving may finally be able to use a car, the advancement of self-driving cars should be halted because of the cars’ inability to adapt, their potential security issues, and their impending technology issues.

The basic blueprint for a self-driving car

     First, self-driving cars should be banned from the roads because of their lack of adaptability. As of now, there is no algorithm in place to handle human interaction. One of the top ways to make human-automation interaction safer is to “design automated systems that cooperate, coordinate, and collaborate with human operators” (Sethumadhaven). This means, for example, that if there were a road closure and an officer were directing traffic instead of the usual stoplights or stop signs, the car would be incapable of proceeding properly. Depending on the extent of the “self-driving,” the driver would have to take over the car if that were even an option; this means drivers would still need a license, leaving people incapable of driving unable to use these cars after all. In addition, it is unknown how quickly the GPS could be updated (Sethumadhaven), which could lead to further problems when trying to reach the desired destination.

Would self-driving cars be able to make judgment calls?
     Additionally, the risk of the car being hacked cannot be ignored. Better safe than sorry here, and the security for these cars is simply not where it needs to be. Charlie Miller, a security researcher for Uber, says that the security team “will need to consider everything from the vast array of automation in driverless cars that can be remotely hijacked, to the possibility that passengers themselves could use their physical access to sabotage an unmanned vehicle” (Greenberg). Cars have already been hacked, prompting car companies to increase their security; but a level of security where the car is completely safe from hackers seems impossible. A hacker with malicious intent who takes control of a self-driving car could cause severe accidents. Uber faces an even bigger challenge, having to protect its cars from hackers who may try to do their work from inside the vehicle.

     Furthermore, many more problems may arise with the car’s technology. Protecting the massive camera mounted on top of the car may cause further difficulties along with the ones previously mentioned. Not only is that giant, expensive piece of equipment exposed to damage by people, but also by environmental factors. Heavy rain and hail can directly damage the equipment, while events such as sandstorms or blizzards can disrupt the cameras. If even one camera malfunctions, the whole car and everyone in it is in trouble. There does not even need to be a storm: what if the cameras simply fail to recognize that something is happening? To add to these problems, 99% of all roads cannot yet be driven on because the proper data has not been collected (Bhuiyan), so how long will it take to lower that percentage? While these companies collect more data to make their cars more universal, they should hold off on the release, further analyze the downsides, and make the right call to postpone the project, since stopping production is inevitable anyway.



This video is a news report on a self-driving car crash

     Aside from the technological problems of these self-driving cars, there are the thoughts of the general public. I interviewed my dad to see what he had to say about the safety of these cars. “I for one would not want to own a self-driving car,” he says. “I would rather be in control than put all my trust into a computer. While the idea is mind-boggling, the reliability and maintenance required is too much.” He concludes by saying that, as a driver himself, he would feel less safe driving a normal car around self-driving cars than around regular cars. Is our trust something we will have to give whether we like it or not? Whether you get a self-driving car or not, their performance will still have an effect on you.


     As of the writing of this blog, cars are not allowed to self-drive without a licensed driver behind the wheel. It is not clear whether the DMV will ever permit a car to drive with only a child inside, or a person too ill to drive at the moment. The rule should not change, as self-driving cars should not be given that much faith, because at the end of the day that is what this whole argument is about. That is all there is left to do at this point: trust. Trust that everything works out and there are no problems; or trust that the DMV opens its eyes to the problems surrounding self-driving cars and makes them illegal, since companies like Uber will not stop development themselves.


Works Cited

Bhuiyan, Johana. "Uber's autonomous cars drove 20,354 miles and had to be taken over at every
     mile, according to documents." Recode.net, 16 Mar. 2017,
     https://www.recode.net/2017/3/16/14938116/uber-travis-kalanick-self-driving-internal-metrics-slow-progress

Greenberg, Andy. "Securing Driverless Cars From Hackers Is Hard. Ask the Ex-Uber Guy Who
     Protects Them." Wired.com, 12 Apr. 2017,
     https://www.wired.com/2017/04/ubers-former-top-hacker-securing-autonomous-cars-really-hard-problem/

Sethumadhaven, Arathi. "Self-Driving Cars: Enabling Safer Human-Automation Interaction."
     Research Digest, Apr. 2017, http://journals.sagepub.com/doi/pdf/10.1177/1064804617697283