Link to module

Evaluated December 2021

This module is a reasonably straightforward group programming assignment for Computer Science I and is based on the so-called “Moral Machine” from MIT, which gamifies traffic fatalities and lets players choose which groups of individuals to kill. The scenario is framed as the classic “trolley problem” from ethics, but it introduces additional demographic elements, many of which would not be obvious to a driver about to run into individuals at a speed likely to kill them. The assignment asks students to play the game repeatedly to establish a set of moral priorities, then to attempt to replicate those priorities in code, and finally to verify that they have done so. It fits naturally into a Computer Science I course, giving students an early introduction to the essential role of ethical reflection in the software development process. The lessons in this module can be completed in a single lab session.
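For instructors who want a concrete picture of the coding portion, the following is a minimal sketch of the kind of decision function a student might write to encode the priorities they observed in their own gameplay. The choice of Python, the attribute names, and the scoring rule are illustrative assumptions, not part of the module's actual materials.

```python
# Hypothetical sketch: encoding a student's moral priorities as a decision
# function over two groups of individuals from a Moral Machine-style scenario.
# The attributes ("species", "age") and the priority ordering are assumptions
# made for illustration only.

def choose_group_to_spare(group_a, group_b):
    """Return 'A' or 'B' for the group the car should spare.

    Each group is a list of dicts describing individuals,
    e.g. {"species": "human", "age": "child"}.
    """
    def score(group):
        # Assumed priority ordering: spare humans over pets, children over
        # adults, and larger groups over smaller ones, compared in that order.
        humans = sum(1 for p in group if p.get("species") == "human")
        children = sum(1 for p in group if p.get("age") == "child")
        return (humans, children, len(group))

    return "A" if score(group_a) >= score(group_b) else "B"


if __name__ == "__main__":
    pedestrians = [{"species": "human", "age": "adult"},
                   {"species": "human", "age": "child"}]
    passengers = [{"species": "human", "age": "adult"}]
    print(choose_group_to_spare(pedestrians, passengers))  # expected: "A"
```

The verification step of the assignment could then amount to re-running such a function on the scenarios the student actually played and checking whether its choices match theirs.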

The focus of the assignment is algorithmic fairness, as the suggested related readings make clear. The readings raise issues of biased data (e.g., COMPAS), power structures perpetuated by carceral justice, and other social concerns. It is best if the instructor does all the suggested readings, and it is helpful if they are knowledgeable about both definitions of algorithmic fairness and some social theory. A discussion of the assignment with someone from Philosophy and someone from a social science, such as Sociology, would be useful.

It directly covers material in Software Development Fundamentals/Fundamental Programming Concepts and Software Development Fundamentals/Fundamental Data Structures.

From the students’ point of view, this is a self-contained module, though they are asked to do at least one additional reading. It is likely that many of the students doing this assignment hope to have autonomous cars and will be interested in the topic of safety.

The instructor needs to develop their own assessment tools and learning objectives. The module is built on top of a piece of software that encourages students to think about moral priorities based on physical fitness and other potentially bias-related attributes of individuals, while the related readings are about real-life software that perpetuates (or, in the case of one Pittsburgh-based instance, perhaps avoids perpetuating) historical inequities. Justin Li’s blog post suggests ways to ameliorate some of the issues that arise with this assignment.


The evaluation of this module was led by Judy Goldsmith and Patrick Anderson as part of the Mozilla Foundation Responsible Computer Science Challenge. Emanuelle Burton, Colleen Greer, Darakhshan Mir, Jaye Nias, Evan Peck and Marty J. Wolf also made contributions. These works are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.