Link to module

Evaluated December 2021

The module prompts Computer Science I students to consider how their algorithm might affect people from different socioeconomic backgrounds and people who experience life emergencies, extenuating circumstances that might deviate from an algorithm designer’s expectations or assumptions. The first half of the lesson is built around an employer’s desires; the second half introduces those marginalized perspectives. Though the decision making considered here happens at the individual and business level, the reflection questions are structured so that students can consider the broader implications of these decisions. The module fits naturally into a Computer Science I course, giving students an early introduction to the essential role of ethical reflection in the software development process. The lessons in this module can be completed in a single lab session.

It directly covers material in Software Development Fundamentals/Fundamental Programming Concepts and Software Development Fundamentals/Fundamental Data Structures.

The teaching materials are highly developed and available in several forms, and the creator includes links to their own write-ups about using the module, which offer advice for implementation. The module also links to supplemental readings on algorithmic bias so that instructors and students alike can understand the social and ethical importance of the lessons.

Instructors need not have extensive interdisciplinary training to use the module, and while the module does not explicitly call for collaboration, it does not preclude it. Connections with other scholars who study social inequalities could assist with further development. Instructors should prepare to introduce the lesson and to facilitate discussion after students create their algorithms.

The module expects that students are proficient enough with Python to use a template to build their algorithm. The module is divided into two steps: a first step in which students construct an algorithm and a second step in which they think about the possible shortcomings of their creation, including socioeconomic exclusion and bias. Students with more experience studying social issues may anticipate problems that other students do not, but the lesson is dynamic enough that every student will learn something new about how their technical creations relate to social and ethical issues. For students who will go on to design algorithms or work as programmers in industry, the lesson touches on something that nearly all of them will experience in their professional lives.
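To give a concrete sense of the exercise, the following is a minimal Python sketch of the kind of rule-based screening algorithm a student might build from a template. The `Applicant` fields, the scoring criteria, and the function names here are illustrative assumptions, not material from the module itself; the point is that seemingly neutral rules like these are exactly what the reflection questions ask students to revisit.

```python
from dataclasses import dataclass


@dataclass
class Applicant:
    """Hypothetical applicant record; the module's own template defines its fields."""
    name: str
    gpa: float
    months_unemployed: int       # gap since last job
    has_reliable_transport: bool
    available_weekends: bool


def score_applicant(a: Applicant) -> int:
    """Toy rule-based score reflecting an employer's stated desires.

    Each rule looks neutral on its face, but the reflection questions ask
    how rules like these affect applicants from different socioeconomic
    backgrounds or those facing life emergencies (for example, a long
    employment gap caused by caregiving or illness).
    """
    score = 0
    if a.gpa >= 3.5:
        score += 2
    if a.months_unemployed <= 3:
        score += 2               # penalizes long gaps regardless of cause
    if a.has_reliable_transport:
        score += 1               # correlates with income and location
    if a.available_weekends:
        score += 1               # harder for caregivers or second-job holders
    return score


def rank_applicants(applicants: list[Applicant]) -> list[Applicant]:
    """Return applicants sorted from highest to lowest score."""
    return sorted(applicants, key=score_applicant, reverse=True)


if __name__ == "__main__":
    pool = [
        Applicant("A", 3.8, 1, True, True),
        Applicant("B", 3.9, 14, False, False),  # long gap, no car, weekdays only
    ]
    for applicant in rank_applicants(pool):
        print(applicant.name, score_applicant(applicant))
```

None of the individual rules in a sketch like this is obviously malicious, which is what makes the second, reflective step of the module instructive: students must articulate who their own criteria quietly exclude.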

The instructor needs to develop their own assessment tools and learning objectives. Once students create their algorithm, they are required to reflect on their creation by responding to a series of open-ended questions; they could do this in a class discussion, in small-group discussions, or in written homework. One strong feature of the module is that it does not require students to have an extensive background in ethical and social theory, and it does not suggest that there are easy answers to the problems posed in the lessons. In fact, the creator of the module explains that they tell their students that the answers are not easy and that even the instructor does not necessarily know a “right” answer.

The module’s activities give students practice working through issues they are likely to encounter in their professional lives. The lessons also introduce current industry concerns such as fairness, inclusiveness, accountability, and Value Sensitive Design.

The evaluation of this module was led by Patrick Anderson and Jaye Nias as part of the Mozilla Foundation Responsible Computer Science Challenge. Emanuelle Burton, Judy Goldsmith, Colleen Greer, Darakhshan Mir, Evan Peck and Marty J. Wolf also made contributions. These works are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.