AI and the Drive of the Future

With Tesla the darling of electric and self-driving cars, and many manufacturers following its lead, AI's role in how we will be driven, rather than drive, has become significant. AI itself is at an early stage and far from able to analyze the full environment around a car in motion. Not only is it questionable whether the environment can be fully processed; attention is now also being given to the decision-making that affects passengers and pedestrians.

In the article “Building a Moral Machine: Who Decides the Ethics of Self-Driving Cars?” Thomas Hornigold raises the question of how a Moral Machine is programmed. He notes large surveys that ask many thousands of respondents what they would do in a given situation. From these responses, the thinking goes, an answer can be derived, e.g. save the child running after a ball in the street by swerving into a pole.
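The survey approach described above amounts to majority-vote aggregation: collect many answers per scenario and adopt the most common one as policy. A minimal sketch, with scenario and action labels invented purely for illustration:

```python
# Hypothetical sketch: aggregate survey responses into a single
# "moral" policy by majority vote. All scenario and action names
# here are invented for illustration, not from any real dataset.
from collections import Counter

def majority_policy(responses):
    """Map each scenario to the action chosen by the most respondents."""
    by_scenario = {}
    for scenario, action in responses:
        by_scenario.setdefault(scenario, Counter())[action] += 1
    return {s: counts.most_common(1)[0][0]
            for s, counts in by_scenario.items()}

survey = [
    ("child_in_street", "swerve_into_pole"),
    ("child_in_street", "swerve_into_pole"),
    ("child_in_street", "brake_straight"),
]
policy = majority_policy(survey)
# policy["child_in_street"] -> "swerve_into_pole"
```

The sketch also makes the essay's worry concrete: whatever the majority answered becomes the single rule for everyone, and minority answers simply disappear from the policy.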

For the moment there does not seem to be another method, but this approach creates a grand dilemma. If we build one Moral Machine, e.g. the first truly self-driving car, would we build another Moral Machine differently? I think naturally we would not. It would be a job done, with no alternative: the study of what the majority would do was already completed. The problem is that a single moral answer becomes set in stone.

In my own car, I may want it to see things according to my morals rather than the cookie-cutter answer that replaces my view with set guidance. If a passenger says something, e.g. a relative offering advice, I clearly do not need to follow it and can go my own way instead. What buy-in will there be for a Moral Machine whose morals do not necessarily reflect a person's own, and in many instances would not? There is an unfounded presumption here that a Moral Machine will think better for us than we think for ourselves. Perhaps, down the road, we will each share our opinions and the driving will reflect them. For now, I think everyone is in the dark. The truth, though, is that some form of imposing values will likely be necessary, and that is troublesome.

Something has to be offered in exchange for our existing views and values, perhaps safety or another essential human need. If nothing is offered, the result may be a public that simply does not accept this new Moral Machine, if we are to call it that.
