Transparency in Decision-Making Interfaces

Link to module

Evaluated December 2021

This module, Fairness, Accountability, Transparency, Privacy, and Surveillance in the Times of Covid, is part of a complete course on data: how it has been used, how it is used to gain insight, and how it is used to support decisions. The module has students study privacy policies from various social media companies through the lenses of risk, transparency, and accountability. It is suitable for a Computer Science 0 class, an introductory data science class, a cross-disciplinary class on data or technology policy, and possibly a security class. The module introduces students to the issue of risk and encourages them to identify how they would more clearly interpret key issues related to privacy.

It addresses knowledge areas Human-Computer Interaction/Human Factors and Security; Information Assurance and Security/Security Policy and Governance; Information Management/Information Management Concepts; and Information Management/Information Storage and Retrieval.

This module, taken as is, is a good choice for someone new to the delivery of responsible computer science. The readings in the module provide sufficient background for the instructor and students. An instructor with some familiarity with standard ethical theories will find delivering this module more straightforward, but such familiarity is not essential for using it. Additional background reading on risk and security from financial, social, and ethical points of view would assist in delivering this assignment.

An instructor may wish to assemble more recent links, but this is not necessary. There is a clear liberal bias in the write-up of this assignment and in the choice of media sources, which the instructor will have to navigate with students, especially those from more conservative backgrounds.

The instructor will find it helpful to develop learning outcomes for this module, both for students and for the development of grading and assessment criteria. The module can be used to raise issues of fairness, privacy and security, inclusiveness, transparency and accountability, and value-sensitive design, as well as issues related to social class, race/ethnicity, gender, and so on. Developing an interpretation of what kinds of details students should provide in their write-ups would benefit students and facilitate the evaluation process. This will require preparation by the instructor, potentially in collaboration with faculty from other disciplines such as philosophy, psychology, and sociology.

This module also serves as a solid base for a faculty member looking to enhance their ability to develop responsible CS modules. This can be done by developing more questions to guide students as they attempt to reframe language and approach. Further, identifying readings at an appropriate level to help students interpret privacy policy interests and legislative concerns would be a good exercise.


The evaluation of this module was led by Judy Goldsmith and Colleen Greer as part of the Mozilla Foundation Responsible Computer Science Challenge. Patrick Anderson, Emanuelle Burton, Darakhshan Mir, Jaye Nias, Evan Peck, and Marty J. Wolf also made contributions. These works are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.