Predictive Policing

Link to module

Evaluated December 2021

This three-part module is designed to serve as a final project (or rather, a scaffolded series of projects) for a data structures or Computer Science II course. The first two portions, which are almost entirely technical, ask students to analyze crime data with an eye to predictive policing; the third portion includes a portfolio of readings that furnish sociopolitical context on racist policing practices (and the role of tech in exacerbating or laundering disparities in how members of different racial groups are treated) and asks students to revise and reconsider their earlier work. The entire project is team-based, and student teams are held accountable for working cooperatively rather than dividing up the work and completing their shares independently.

Rather than occupying a small, discrete chunk of the course calendar, this module will work best threaded through the back half of a semester-long course and undertaken primarily on the students’ own time, rather than during class meetings.

The one aspect of this module that does demand significant in-class time (or, at least, small-group discussion among each team of students) is the final, ethics-oriented portion. Devoting some class time to discussing the assigned readings for this third and final portion will maximize the module's impact for ethics education. That discussion lays the foundation for a more general conversation about the intersection of tech and (in)justice, grounded in but not wedded to policing.

It directly covers material in Software Development Fundamentals/Fundamental Data Structures.

This module is designed to be taught as-is by the instructor and does not explicitly require any specialized knowledge beyond that which is contained in the readings for part 3. Due to the nature of the presentations and discussions, an instructor should have some experience integrating ethics into the curriculum. The sociopolitical issues at its center are tangled and politically fraught and tackling them successfully with students requires some confidence on the part of the instructor. There are student-facing rubrics for all parts of the assignments. The instructor who adopts this module will need to develop a method to evaluate students on, e.g., addressing specific sociotechnical issues in their revised design.

This module provides an opportunity for collaboration with a colleague (and potentially one of their classes) in criminology, critical race studies, history, or sociology to lead a day’s discussion toward the end of the semester. That colleague may have advice on how to evaluate students.

This module does not presume any special knowledge base or analytical skill set of its students, although some students may find value in discussions that help them understand the arguments in the readings. Students with a robust background in the humanities and social sciences will likely grasp the readings' core arguments on their own but would still benefit from discussing the readings with others; this could be achieved by requiring each student team to discuss the readings among themselves.

This module slots cleanly and easily into a data structures course. Only the third portion is additive, on top of what a typical final project for DS would require, and that third portion builds so directly on the first two portions that it serves as a very efficient way to integrate substantive ethical reflection into a course.

This module is exemplary in the way it tackles a complex issue in American life. By moving progressively from a complex coding exercise to a substantive social analysis and then back into the world of coding to try to ameliorate some of the inevitable harms of the first round, it has students address a major set of structural injustices and offers them a practical crash course in why those problems are so intractable: because structural harms are easy to reproduce or exacerbate when one’s focus is purely technical, because non-technical issues are so easy to overlook when one is confronted with a technical problem, and because the technical problems we end up tackling are so often presented to us by those who oversee the existing harmful systems, and whose goals are often deeply at odds with those whom the system most harms.

An additional strength of this module is that it addresses a dimension of experience that is very common for many people in the US, but often invisible to those whom it does not affect directly. A further strength is that it initially engages all the students as programmers and experts, rather than as potential targets of policing: although some students will have much more direct experience of racist policing than others, no student is required to make themselves vulnerable by testifying to that experience directly. Rather, the multi-stage structure of the module establishes all the students as equals with respect to the technical work of the course, and then supplies outside sources to ground the sociopolitical discussion, to which they will each bring their own perspectives and experiences. So long as this third portion is given its due in terms of discussion among students (instructor-led or otherwise, as the students' needs dictate), this module is very likely to reorient many students' worldviews.
The evaluation of this module was led by Emanuelle Burton and Evan Peck as part of the Mozilla Foundation Responsible Computer Science Challenge. Patrick Anderson, Judy Goldsmith, Colleen Greer, Darakhshan Mir, Jaye Nias, and Marty J. Wolf also made contributions. These works are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.