On the Bias: Understanding Error, Skew, and Systemic Harm in Machine Learning Algorithms
“If it is the case that inequity and injustice [are] woven into the very fabric of our societies, then that means each twist, coil, and code is a chance for us to weave new patterns, practices, and politics. The vastness of the problem will be its undoing once we accept that we are pattern makers.”
This project-based course examines the merging of data science with arts and design practices. In particular, we draw from artistic practices and programs to examine the concept of bias in data practices that shape ranking operations such as credit scoring, search engines, and news trends. We focus especially on data practices that depend on computational algorithms, such as machine learning algorithms.

In everyday language, we might refer to algorithmic bias as the false or negative assumptions a computational algorithm makes about someone's behavior. But when we look closer at algorithms, what counts as biased becomes less clear. Does bias stem from the uneven selection of data used to train the algorithm? Does it depend on algorithmic decisions that reinforce identity-based stereotypes? Are we talking about a developer making decisions based on racial or gender stereotypes? Or are we talking about something more complex and subtle—something working not only within the algorithm, but also outside it?

While definitions of bias vary, two fields give the concept a technical, specific meaning. In statistics, bias refers to a systematic error. In textiles, bias describes an intentional skew (as in the phrase "cut fabric on the bias"). Drawing from a range of theoretical texts and artistic works, and weaving together the textile and statistics literatures, we will read, analyze, and theorize bias. This coursework takes the fields of art and design as playgrounds for the mingling of ideas, multiple interpretations, and translations. We encourage students to experiment with different materials and methods to critically represent, express, and challenge biased datasets and skewed machine learning systems.
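To make the statistical sense of bias concrete, here is a minimal illustrative sketch (not part of the course materials): an estimator is biased when its average value differs systematically from the true quantity it estimates. A classic example is the variance estimator that divides by n, which systematically underestimates the population variance, while dividing by n − 1 corrects the bias.

```python
import random

# Statistical bias: an estimator is biased when its expected value
# differs systematically from the true quantity it estimates.
# Classic example: dividing summed squared deviations by n (rather
# than n - 1) systematically underestimates the population variance.

random.seed(0)

def sample_variance(xs, ddof):
    """Variance of xs, dividing by len(xs) - ddof."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - ddof)

n, trials = 5, 20000  # small samples drawn many times
true_var = 1.0        # variance of the standard normal we sample from

biased = unbiased = 0.0
for _ in range(trials):
    xs = [random.gauss(0, 1) for _ in range(n)]
    biased += sample_variance(xs, ddof=0)    # divide by n
    unbiased += sample_variance(xs, ddof=1)  # divide by n - 1

biased /= trials
unbiased /= trials

# The /n estimator's average falls well below 1.0 (around (n-1)/n = 0.8);
# the /(n-1) estimator's average sits close to the true value of 1.0.
print(f"biased estimator average:   {biased:.3f}")
print(f"unbiased estimator average: {unbiased:.3f}")
```

The error here is not random noise that washes out with more trials; it is a skew built into the procedure itself, which is the sense of "bias" the statistics literature shares with the course.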
Personal Statement by Instructors:
We believe that arts and design practices offer an important and uniquely critical perspective on bias in our everyday encounters with data. We see these practices as integral to data science education—offering vital tools for grappling with the structural impacts, limitations, and opportunities of AI systems. We designed this course to unpack some of the worldviews and assumptions baked into our everyday data-driven environments. We hope that—even in its hybrid format—you will feel the passion we have for what we do.
- Assistant Professor Afroditi Psarra, Digital Arts and Experimental Media (DXARTS)
- Associate Professor Daniela Rosner, Human Centered Design and Engineering (HCDE)
The course is synchronous and offered in a hybrid format: the lectures are remote and meet every Monday, and the studio portion is a combination of online and in-person meetings every Wednesday. You will receive a schedule of all the in-person meetings at the beginning of the course. The class will be divided into two groups (groups A and B) for the in-person studio meetings.