
Risk perception in traditional and autonomous driving scenarios

OZCELIK, ARMAGAN FIRAT
2023/2024

Abstract

One important turning point in the development of transportation and technology is the progress made in autonomous driving. As these machines will be integrated into our lives in the near future, interest in the moral evaluation of their decisional algorithms has substantially increased (Evans et al., 2020). Morality in Autonomous Vehicles (AVs) is a tangled, novel topic that is still approached through traditional moral standards. This technology bears the burden of having to make complex decisions, some of which might involve unavoidable harm (Bonnefon et al., 2016), a condition in which at least one party is negatively affected due to accidents or manufacturing errors. Although it may seem improbable, unlike humans, AVs have to be programmed beforehand on the basis of behavioral rules and social norms (Faulhaber et al., 2018). Potential scenarios include deciding whether to favor the vehicle itself, its passengers, or third parties, all situations that might force the AV to choose between two or more unpleasant and unsatisfactory options. The fact that moral decisions have to be predetermined opens up the question of whether AVs should protect passengers or other parties, such as the AV owner or the majority of the individuals involved.
Keywords: Morality, Autonomous Driving, MFQ, MFT
File: Armagan_Ozcelik_Final_Thesis.pdf (Adobe PDF, 747.42 kB, open access)


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12608/69728