The Ethical Issues of Autonomous Cars
December 26th, 2022 By Paul Bettini


Whether we like it or not, we are witnessing the birth of a new automotive era. With concern over global warming and environmental awareness at an all-time high, the automotive landscape is changing rapidly, much to the dismay of some SEMA Show exhibitors who depend on the deep-seated custom car culture and fervent racing enthusiasts in the USA to sell their products.

First, there was the electric car boom. You may think it has something to do with recent advances in electric motors and batteries, and you'd be right to a certain extent, but the truth of the matter is that electric vehicles have been around for as long as internal-combustion cars, if not slightly longer. Why, then, did electric vehicles see so little improvement until now? Some would speculate it is because the technology didn't benefit the oil companies and, indirectly, the powerful countries exporting oil.

With electric vehicles on the brink of replacing conventional internal-combustion cars, significant improvements are expected, but every change inevitably has both positive and negative aspects.

There will eventually come a point where cars become so sophisticated that they'll be able to drive themselves with no human intervention. For example, Tesla's "Autopilot" mode (which is technically still in "beta testing") can nearly drive the car by itself; it just needs the driver to keep an eye on it and be ready to take over if necessary. At what point exactly does a self-driving car stop being just that, a car, and instead become an artificial intelligence that has to make life-and-death decisions?


Tesla Model S With AutoPilot At CES

Putting aside the joy driving brings to many people, and the number of enthusiasts who will be disappointed once driving your own car is relegated to race tracks and off-road adventures, let's look at it from a different angle.


2020 VW Atlas CES Show Off-Road Concept Car

Currently, the driver is the primary party held responsible for any actions he or she undertakes with the car. For all intents and purposes, it's under their control and, ultimately, their responsibility. If they crash or take an innocent life due to carelessness or substance abuse, it's their fault.

Let’s imagine the following situation: a lethal threat appears in your lane, and it requires you to either crash into it, swerve right into a motorcyclist, or swerve left into a large SUV. If there’s a person operating the vehicle, the outcome depends on the circumstances: it’s a natural reaction over which you have no control; your instincts simply take over, and what happens, happens. But what if the car in question is autonomous and there are occupants inside it? How does it choose what’s right and what’s wrong?


BMW i8 Autonomous Car

The simple fact of the matter is that it can’t, at least not without assistance. It will have been programmed to act one way or another: either protecting the occupants inside or swerving and hitting someone else on the chance that they survive. It’s no longer a reaction, but a decision. The car calculates the odds and outcomes of every possible scenario and does whatever it’s been programmed to do. When you think about it that way, you might say the outcome of that particular situation was decided months or even years beforehand, when the car was being programmed. It’s kind of creepy to think of it that way, isn’t it?
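
To make the idea concrete, here's a deliberately simplified sketch in Python of what such a pre-programmed choice could look like. Everything here is hypothetical: the weights, harm scores and maneuver names are invented for illustration, and no real autonomous car is known to work this way. The point is that the "decision" boils down to a couple of constants someone chose at build time:

    # Hypothetical illustration only -- not any real autonomous-car system.
    # The "ethics" live in two constants fixed by the programmer in advance.
    OCCUPANT_WEIGHT = 1.0  # how much the car values its own occupants
    OTHERS_WEIGHT = 0.5    # how much it values everyone else on the road

    def choose_maneuver(options):
        """Pick the maneuver with the lowest weighted expected harm.

        Each option is (name, expected_occupant_harm, expected_harm_to_others),
        with harm expressed on an arbitrary 0-to-1 scale.
        """
        def weighted_harm(option):
            _, occupant_harm, others_harm = option
            return OCCUPANT_WEIGHT * occupant_harm + OTHERS_WEIGHT * others_harm

        return min(options, key=weighted_harm)[0]

    # The scenario from this article: hit the obstacle, swerve right into a
    # motorcyclist, or swerve left into a large SUV.
    options = [
        ("brake and hit the obstacle", 0.9, 0.0),
        ("swerve right into motorcyclist", 0.1, 0.9),
        ("swerve left into SUV", 0.3, 0.4),
    ]
    print(choose_maneuver(options))  # -> "swerve left into SUV"

Change OTHERS_WEIGHT from 0.5 to 2.0 and the very same code brakes into the obstacle instead, putting its own occupants at risk. Same situation, different outcome, and both were settled long before the car ever left the factory.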

But who’s to blame in a situation like that? Is it you, the owner? Highly unlikely; you weren’t operating the vehicle. Is it the programmer? Not really; the car did exactly what it was programmed to do. That leaves the manufacturer, but we all know they won’t take the fall for it. And if a manufacturer shifts responsibility over to the owners and makes them sign a contract stating that they assume liability for everything, is anyone really going to buy a vehicle from them?

Then how do we decide who lives and who dies? If the car is programmed to protect its occupants, it may deliberately crash into another person, or several, to protect its owner. If it’s designed to save as many lives as possible at all costs, even at the expense of the occupants’ lives, is anyone really going to purchase it? Competition would arise between manufacturers, with the majority of sales going to those offering cars that protect their occupants.

Perhaps some manufacturers will choose to target consumers who believe in the principle of doing the least harm, even if it means their own lives would be sacrificed for the greater good. On the other hand, I can imagine luxury automotive manufacturers going the opposite route and pledging to save the lives of their well-heeled owners at all costs.

The only real solution to this issue is to make all autonomous vehicles save as many lives as possible, even if it means risking the occupants inside. The statistics show that more than 90% of all crashes happen because of human error, which makes them, in principle, avoidable. Therefore, the more autonomous vehicles we have on the roads, the safer the roads get. If only self-driving cars were on the streets, the number of accidents would plummet, and the only unpredictable things on the road would be pedestrians, bicyclists, children and animals.

However, a small risk will always remain, and the vehicle gets to decide what happens. Robots may never be capable of genuine cognition, which leaves them dependent on data. Self-driving cars rely on road-condition, weather, distance and speed data collected from radar, cameras, LiDAR and other sensors. A robot may hold more data than a human being and solve problems faster, but that’s all it can do. Given a new problem it doesn’t have enough data for (or hasn’t been programmed to solve), it will simply freeze, not knowing what to compute. By contrast, a person can react in every situation, whether the decision turns out to be right or wrong.


Ford Fusion LiDAR Sensors - CES Show
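
To illustrate that data dependence, here's a toy Python sketch. The scenario labels, sensor fields and responses are all invented, and real perception systems are vastly more complex; the point is that the car can only respond to situations somebody anticipated, while everything else falls through to a canned default:

    # Hypothetical sketch of scenario-based decision making.
    # Scenario names, sensor fields and responses are invented for illustration.
    KNOWN_RESPONSES = {
        "vehicle_cut_in": "brake and increase following distance",
        "pedestrian_in_crosswalk": "stop before the crosswalk",
        "stationary_obstacle": "change lane if clear, otherwise stop",
    }

    def classify_scenario(radar, camera, lidar):
        """Stand-in for perception: fuse sensor readings into a scenario label."""
        if lidar["object_ahead"] and radar["closing_speed_mps"] > 0:
            if camera["object_moving"]:
                return "vehicle_cut_in"
            return "stationary_obstacle"
        return "unrecognized"

    def decide(radar, camera, lidar):
        scenario = classify_scenario(radar, camera, lidar)
        # A human improvises here; the machine can only fall back on a
        # pre-programmed minimum-risk maneuver -- the "freeze" described above.
        return KNOWN_RESPONSES.get(scenario, "minimum-risk maneuver: controlled stop")

    print(decide(
        radar={"closing_speed_mps": 12.0},
        camera={"object_moving": False},
        lidar={"object_ahead": True},
    ))  # -> "change lane if clear, otherwise stop"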

Imagine the following situation: a sudden obstacle appears in the middle of the road, blocking your lane. For a human, the solution is simple: if there are cars close behind, the driver will swerve around the obstacle, even if it means crossing a double yellow line and breaking the law. For the computer, it might not be so easy. If we insist on strictly law-abiding autonomous cars, crossing the line and swerving may not be among its options at all. Its only remaining choice will be a sudden stop, which risks the cars behind crashing into it. This is just one example, but you get the idea: because a computer can’t think “outside the box,” so to speak, it will never be able to make every decision a human can.
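
Continuing the sketch above, here is what a hard legality constraint does to that choice (again, the maneuvers and risk numbers are made up). The illegal swerve is filtered out before the car ever weighs it, so the risky sudden stop is the only thing left:

    # Hypothetical: candidate maneuvers with a legality flag and a crude risk score.
    candidates = [
        {"name": "cross the double yellow and swerve around", "legal": False, "risk": 0.2},
        {"name": "emergency stop in lane", "legal": True, "risk": 0.6},  # cars behind may rear-end us
    ]

    # A strictly law-abiding car never even considers the illegal option...
    legal_only = [m for m in candidates if m["legal"]]
    print(min(legal_only, key=lambda m: m["risk"])["name"])   # -> emergency stop in lane

    # ...while a human driver weighs everything and takes the safer, illegal swerve.
    print(min(candidates, key=lambda m: m["risk"])["name"])   # -> cross the double yellow and swerve around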


VW BUDD-E - Lounge Seating Hints At Self Driving Future

One possible solution is to build moral principles into the computer’s algorithm, rather than purely logical ones, to make its choices more similar to human reactions. Easier said than done, we know. But what if it has to choose between swerving into a motorcyclist wearing a helmet and one who isn’t? Does it go for the one with the helmet because he has a better chance of surviving? In that case, we’re effectively penalizing the law-abiding rider. Or what if it swerves into the one without a helmet? The car would then be dealing out street justice, becoming something it was never meant to be. Either way, it has to “target” someone, and that’s really the biggest issue in all of this.
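
To see why this feels so wrong, consider what "minimize expected harm" actually computes in the helmet scenario. The survival probabilities below are invented, but the perverse incentive they produce is exactly the one described above:

    # Hypothetical harm-minimizing target selection. The numbers are made up;
    # the point is the perverse incentive, not the probabilities themselves.
    riders = [
        {"name": "rider with helmet", "survival_probability": 0.85},
        {"name": "rider without helmet", "survival_probability": 0.40},
    ]

    # "Do the least harm" means swerving toward whoever is most likely to survive...
    target = max(riders, key=lambda r: r["survival_probability"])
    print(target["name"])  # -> "rider with helmet"
    # ...which systematically punishes the rider who followed the law.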

How do we make autonomous cars behave, in dangerous and reaction-based situations, as if human drivers were operating them? Perhaps we aren’t supposed to. If accidents really do drop by a massive 90% in a fully autonomous world, maybe that’s all it takes. With far fewer casualties overall, it may matter much less that the cars, and not we, are the ones deciding who gets to live.

Ask yourself this question: would you ever get into your self-driving vehicle, fully entrusting it with your life, knowing full well that it’s designed to save as many lives as it can rather than just protect you at all costs? 

As a person living with Multiple Sclerosis, which might take me out of the driver's seat much earlier in life than my parents or grandparents, I'm probably far more willing to take that risk than the average person.

I'm looking forward to discussing these issues with other self-driving car enthusiasts at the upcoming 2023 SEMA Show, held from October 31st through November 3rd, 2023, at the Las Vegas Convention Center in Las Vegas, Nevada.
