Moral Machine

Moral Machine: a platform for public participation in and discussion of the human perspective on machine-made moral decisions.

Read in full here:

This thread was posted by one of our members via one of our news source trackers.

1 Like


This was a bit daft, as it's such a limited set of options (surely there are more), but it does make an interesting point.

Personally, I think the people using the car should always be the ones at risk, because they are willingly taking that risk…

2 Likes

Yep, this exactly. We need better public transportation options in general, especially rail.

3 Likes

Or bike paths and more people working remotely :003:

2 Likes

Yeah, screw bike paths where I live. It's either 14°F, like it is now, or it's 130°F. Riding 5 miles each way every day sounds like death, and that's even before considering the dust storms and, quite literally, fire devils (dust devils that spin up from a grassfire and carry it along)!

2 Likes

That's insane. 80°F to 90°F would be perfect for me; anything lower than 80 would be cold :lol:

1 Like

Yeah it’s a fun place to live, lol.

1 Like

I tend to agree with Aston, and think that machines should follow road rules and, especially, avoid trying to ascribe different value to different kinds of people. No, granny shouldn't be weighted differently than a 20-year-old triathlete. Generally, I think the car should not swerve into a "less bad" outcome: it should not actively steer onto a new accident path that it expects to cause one death instead of continuing on the original accident path and causing two. I don't want cars "choosing" who lives or dies; that seems to lead toward a dystopian vista. But it does seem like there has to be a cutoff at some point. At the absurd extreme, should the car hit the last fertile/pregnant woman on earth, or actively swerve to hit the divorce lawyer next to her?

You probably recognize these as versions of the classic philosophical moral quandary, the trolley problem (the runaway-train question). I think where machines are concerned, we'd do well to add more Immanuel Kant and less Jeremy Bentham, which is to say more following of the moral law/duty-based ethics (like "don't kill, period"), and less utilitarianism/focusing on consequences (like "kill in order to save a greater number").

I really wasn't sure what to think when other species were involved. Ultimately I chose not to swerve into the pets to avoid hitting the barrier, again based on something of a non-interventionist principle, like what Aston was saying. But here we really do have to do some triage, right? Like, a dog needs to be weighted higher than a pigeon, which is higher than a caterpillar.

Poor bugs. :bug:

3 Likes