Wednesday, December 10, 2014

The Ethics of Autonomous Cars

    The Wired.com article we read in class discusses autonomous cars. In one of its scenarios, your autonomous car is out of control and can either crash into a Mini Cooper or an SUV. At first glance I would say the SUV, because the SUV is bigger and its driver would therefore receive less damage than if I had crashed into the Mini Cooper. But if we were to program a car to deliberately crash into someone just because their car is bigger, then we're choosing one life over another. Imagine a mother buying a large van to have room for her kids and in hopes of being safer. She goes out on the road with no idea that if the autonomous car next to her were to go out of control, it would choose to crash into her over the Mini Cooper on the other side.
    In another scenario, there is a motorcyclist without a helmet and one wearing a helmet, and you have to choose which one to crash into. Once again, at first glance I would most likely choose to crash into the one wearing the helmet, because he would statistically have a higher chance of living. But if we were to program a car to hit the man wearing a helmet rather than the one without, we would be punishing him for following the rules. That doesn't seem fair. In the end, more people would stop wearing helmets, thinking it would make them safer.
    One idea that could be viewed as a solution is to have the car make its decision through a random number generator. For example, just before a crash the car would generate a random number; if the number is even the car turns right, and if odd it turns left. Personally, the whole concept of cars driving themselves is strange to me, and I would rather not have the idea altogether. But it will eventually happen, and therefore a crash-handling system needs to be put in place. Having the car randomly choose what or where to crash is like a person making a split-second decision at the last moment. I honestly think this idea is logical in a way that isn't: logically we would program the car to hit whatever would cause the least damage, but since that raises personal and ethical problems, the logical answer is to have the car choose randomly.
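    To make the idea concrete, here is a minimal sketch (in Python) of what such a random tiebreaker might look like. This is purely illustrative; the function name and the even/odd rule are my own assumptions based on the example above, not anything a real car would actually run.

        import random

        def choose_swerve_direction():
            # Pick a direction at random so the car doesn't
            # systematically favor one target over another.
            # Even number -> turn right, odd number -> turn left,
            # following the even/odd rule described above.
            n = random.randint(0, 1_000_000)
            return "right" if n % 2 == 0 else "left"

        # The decision happens only at the last moment, much like a
        # human driver reacting without time to deliberate.
        print(choose_swerve_direction())

    The point of the sketch is that neither target is singled out in advance, so no one can say the car was programmed to choose their life over someone else's.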
    This brings us to the question asked by Alexander Karasulas: if the driver is not making control decisions, should the driver be responsible for any outcomes at all? Clearly the person cannot be held accountable if the car swerves and cuts someone off or causes an accident due to a malfunction. But if the person got into the car and interfered with the controls, then yes, they can be held accountable for the outcome. For example, say someone who is drunk gets into an autonomous car and doesn't use it properly, and an accident occurs. The driver should be held accountable for not allowing the car to drive safely.
