Ethical
Alice buys an autonomous vehicle from Bob, who sells
them. The autonomous vehicle has different settings, some more aggressive
(where the autonomous vehicle drives faster and brakes harder), and some
less. Alice sets the autonomous vehicle to its most aggressive setting.
One night on a dark and wet road, Alice hits a pedestrian, Carlos, who was
jaywalking. Carlos is badly hurt.
Bob: the person who sells the car
Alice: the car's owner, who set the car to its most aggressive setting
Carlos: jaywalked on the road; badly hurt by Alice's car
When an autonomous vehicle crashes or has an accident of some sort, who is
responsible, and to what degree?
Because the legal framework for autonomous vehicles does not yet exist,
designers can only write programs to ensure the ethical conduct of autonomous
vehicles. However, not all ethical issues can be prevented by writing code,
as programs are simply instructions, and vehicles cannot think as humans do.
Based on the case above, there are several scenarios for dividing the
responsibility:
If Alice was driving the car on a street with many people, she should not
have switched the vehicle to aggressive mode, and she should bear full
responsibility. As the passenger of an autonomous vehicle, Alice should have
been aware that the vehicle did not yet have the ability to react to
emergencies such as jaywalkers, and she should have set the vehicle to the
appropriate mode based on the street conditions.
If Carlos was walking in a zone where jaywalking is restricted, he should be
responsible for his injury. In such a zone, the autonomous vehicle may be
programmed to drive at a higher speed and may be less sensitive to objects,
leaving it less time to react. Carlos was supposed to follow the law and not
jaywalk.
Bob is not responsible for this accident if he explained the "aggressive
mode" clearly to Alice and illustrated the conditions for using the mode and
the corresponding consequences. However, the vehicle's programmers should
shoulder part of the responsibility if the law did not prohibit jaywalking in
that zone. The programmers should have set certain conditions for turning on
the aggressive mode, such as a pedestrian flow rate under a certain number,
or an automatic sensitive mode that engages after the vehicle has been in
aggressive mode for a long time, so that the vehicle periodically has the
opportunity to clearly detect its surroundings.
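The gating logic suggested above could be sketched roughly as follows. This is only a minimal illustration: the function, mode names, thresholds, and inputs are all assumptions for the sake of the example, not part of any real vehicle's software.

```python
# Hypothetical sketch of the proposed safeguards: aggressive mode is only
# honored when pedestrian flow is low, and the vehicle periodically drops
# into a more sensitive mode to re-scan its surroundings. All names and
# threshold values here are illustrative assumptions.

def select_mode(requested_mode, pedestrian_flow_rate, seconds_in_aggressive,
                max_flow_rate=5, max_aggressive_seconds=600):
    """Return the driving mode the vehicle should actually use."""
    if requested_mode != "aggressive":
        return requested_mode
    # Condition 1: pedestrian flow rate must be under a certain number.
    if pedestrian_flow_rate >= max_flow_rate:
        return "normal"
    # Condition 2: after a long stretch in aggressive mode, switch to a
    # sensitive mode so the vehicle can clearly detect its surroundings.
    if seconds_in_aggressive >= max_aggressive_seconds:
        return "sensitive"
    return "aggressive"
```

For example, a passenger requesting aggressive mode on a crowded street (`select_mode("aggressive", 12, 0)`) would be kept in normal mode, overriding the passenger's choice.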
I would suggest that the government put up clearer signs to restrict
jaywalking and to protect the safety of pedestrians. Also, if the pedestrian
flow rate is high in a certain area, regulators should set rules preventing
autonomous vehicles from switching to aggressive mode there. Passengers
should be aware of street conditions and carefully decide which mode to use.
The vehicle's programmers and designers should set conditions for using
aggressive modes to ensure safety.
Thank you for your thoughts. If you had to divide the responsibility into percentages for each person, what might they be? - The CDIO Academy Team