Evaluated December 2021
This module, Transparency in Decision-Making Interfaces, and Fairness, Accountability, Transparency, Privacy are part of a complete course on how data has been used, how data is used to gain insight, and how data is used to support decisions. The module integrates public-facing documents from the surveillance tools it examines (such as public-education materials about how a COVID contact-tracing program works and Palantir’s own self-description) and discusses how these systems operate in the world. Central elements of the assignment drive home the point that even simple calculations can reveal information that is quite damaging to certain people, and it demonstrates that possessing basic data is a form of power, especially when those who hold the data also hold computing power.
This module covers material in Information Assurance and Security/Foundational Concepts in Security, Platform-Based Development/Mobile Platforms, and Human-Computer Interaction/Human Factors and Security.
Instructors adopting this module will find that small programming projects are associated with it. As this is “Lab 14” (of 14) from that course, considerable expertise is expected of both the instructor and the students. Some of that expertise is technical (writing the Python needed to generate the results), and some of it involves understanding nuances of terms like “rights” and “justice.” The module includes links to pertinent resources for both faculty and students to help establish that background. Even though the module does not rely heavily on knowledge of ethical and social theories, faculty adopting it may find colleagues in sociology and philosophy helpful, or even necessary, prior to delivering it. With appropriately prepared students, the module could be used over a couple of class periods. An instructor with some flexibility in an upper-division Security or Human-Computer Interaction course would be able to fit it in. It could also be used in a Data Structures course, though the level of support students would need increases significantly. Instructors using this module will have to develop their own discussion questions to encourage students to engage actively with the implications of the issues it raises.
Assessment for this module will need to be developed by the instructor. Instructors can deepen the learning experience by exposing solutionism as a problem and having students consider alternative non-technical solutions to problems. They may find this a good place to direct discussion questions and, subsequently, a good location for an assessment. Students completing this module ought to understand more deeply the influence that those in control of technology have on how society changes.
The evaluation of this module was led by Marty J. Wolf and Colleen Greer as part of the Mozilla Foundation Responsible Computer Science Challenge. Patrick Anderson, Emanuelle Burton, Judy Goldsmith, Darakhshan Mir, Jaye Nias, and Evan Peck also made contributions. These works are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.