Moral Machine

I tend to agree with Aston: machines should follow the road rules and, above all, avoid ascribing different values to different kinds of people. No, granny shouldn’t be weighted differently than a 20-year-old triathlete. Generally, I want to think the car should not swerve into a “less bad” outcome. It should not actively steer onto a new accident path that it expects to cause one death instead of continuing on the original path and causing two. I don’t want cars “choosing” who lives or dies; that seems to lead toward a dystopian vista. But it does seem like there has to be a cutoff at some point. At the absurd extreme, should the car hit the last fertile/pregnant woman on Earth, or actively swerve to hit the divorce lawyer next to her?

You probably recognize these as versions of the classic trolley problem, the runaway-train moral quandary. Where machines are concerned, I think we’d do well to add more Immanuel Kant and less Jeremy Bentham, which is to say more duty-based ethics that follow the moral law (“don’t kill, period”) and less utilitarianism that weighs consequences (“kill one in order to save a greater number”).
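To make that contrast concrete, here’s a toy sketch of the two rules as decision policies. Everything in it is hypothetical: the `Scenario` fields and the death counts are invented for illustration, not anything resembling how a real autonomous vehicle is programmed.

```python
# Toy contrast between duty-based and utilitarian swerve policies.
# All names and numbers are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Scenario:
    deaths_if_stay: int    # expected deaths on the current path
    deaths_if_swerve: int  # expected deaths on the new path

def kantian_policy(s: Scenario) -> str:
    # Duty-based rule: never actively steer into a new accident path,
    # regardless of the body count on the current one.
    return "stay"

def benthamite_policy(s: Scenario) -> str:
    # Utilitarian rule: pick whichever path minimizes expected deaths,
    # even if that means actively choosing who gets hit.
    return "swerve" if s.deaths_if_swerve < s.deaths_if_stay else "stay"

two_vs_one = Scenario(deaths_if_stay=2, deaths_if_swerve=1)
print(kantian_policy(two_vs_one))     # stay   ("don't kill, period")
print(benthamite_policy(two_vs_one))  # swerve (kill one to save two)
```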

I really wasn’t sure what to think when other species were involved. Ultimately I chose not to swerve into the pets just to avoid hitting the barrier, again on something like the non-interventionist principle Aston described. But here we really do have to do some triage, right? A dog needs to be weighted higher than a pigeon, which in turn is higher than a caterpillar.

Poor bugs. :bug:
