The Self-Driving Car's “Trolley” Dilemma … Moral Machine

Hello, we’re FUNFUN. When we talk about self-driving cars, the ethics of artificial intelligence often comes up, and with it the trolley dilemma. Do you know what the trolley dilemma is?

Reference: Trolley Dilemma – Namu.wiki

A trolley with failed brakes is running down the track. If you leave it alone, it will go straight and kill five people; if you switch the track, it will kill one person instead. What will you do? The trolley dilemma asks: “When I am forced to sacrifice either a few people or many, which should I choose?”

What should a self-driving car do in a situation like this? And how would you feel if the answers you gave to these questions could influence the judgments a self-driving car makes?


I have attached the YouTube video that appears on the Moral Machine site; it shows the kind of problem you will face there. Working through the scenarios can be heart-wrenching, and you may find yourself wondering whether you really have to answer. You don’t have to worry too much, though: if you complete everything and then choose “No” in the questions that follow, your decisions will not affect anything.

The Moral Machine is often used in classes about self-driving cars, and I’m sharing it because I think it makes good class material. Issues of artificial intelligence and ethics have been raised a lot recently; they are important, but easy to overlook.

Here is the link: https://www.moralmachine.net/hl/kr

The Moral Machine is a space where the general public can participate in and discuss human perspectives on the ethical judgments made by machine intelligence.
