It’s easy to adapt laws to new technologies when the ultimate person in control is a human being. We are now stepping into new territory as we move into a future with more automated systems and smarter devices. We are allowing machines to make decisions for us, leaving a big gap in the way we handle responsibility, liability and property.
One day robots and automated services will take care of many of the tasks we are now responsible for. Robots are not perfect, meaning some of their choices will not be the best. What will happen when a robot takes a neighbor’s property (broom, mop, whatever) and starts using it at your place, for example? Is it theft? And if it is, who is to blame?
Let’s take this debate down a notch and relate it to a topic we are now more accustomed to – self-driving cars like Google’s. These driverless vehicles have been shown to be safer and more reliable than human drivers. We are flawed individuals who get easily distracted, and human error is as common as blinking.
Despite being better drivers than us, these self-driving vehicles live in a world full of chaos. Accidents are inevitable, and we will have to somehow bring these robotic vehicles into society. (Hell, the FBI is even worried terrorists will take advantage of them!) If one of these cars gets in an accident, who is at fault?
It would be irresponsible to say the other driver is always at fault. It’s even more complicated if two driverless cars happen to crash into each other.
My vision of self-driving cars under the law
It makes the most sense that self-driving cars are treated somewhat like pets. If your dog destroys someone’s property, for example, you are inherently liable for such damage. The same applies to your kids – these are living, deciding beings that are under your responsibility.
Of course, sometimes animals are put down, as well. Should the government put a car “to sleep”? It doesn’t work quite like that, but sometimes it may very well be the manufacturer’s fault (Google, in this case). That would warrant a recall (and some lawsuits against the company).
Google is taking things slow with this technology. Truth be told, these cars no longer need you to do anything but tell them where to go, yet the Search Giant’s first batch will still feature manual controls and require you to handle some of the driving functions.
Some say Google is not going all out right away because people would be scared, but I believe the real reason lies in legal issues. Giving you some control over the driving makes you liable for anything that may go wrong.
Do you have any other ideas of how driverless cars will be handled in our society? Let us know in the comments below.