Summary of: Data Feminism
By: Catherine D’Ignazio and Lauren F. Klein

Introduction

Dive into Data Feminism, a book by Catherine D’Ignazio and Lauren F. Klein that challenges the status quo of data-driven systems, revealing how they often perpetuate systemic oppression, privilege, and discrimination. The book explores intersectionality, the matrix of domination, and the ways in which algorithms and data projects can reinforce harmful stereotypes. You will learn why context, history, and classification decisions matter when working with data, and find inspiration in the stories of groups and activists who have used data to challenge established power and drive positive change.

A Feminist Lens on Big Data

Feminists advocate for “co-liberation,” stressing that no one is free until everyone is free, and they use data to show how oppressive systems harm individuals. Big data and AI are hailed predominantly by men and cast in a skewed, technoheroic light; the time has come for a feminist viewpoint to reframe this narrative. The term “intersectionality,” coined by legal scholar Kimberlé Crenshaw, describes how multiple marginalized identities overlap and compound. It demonstrates how oppression and privilege intersect, resulting in systemic discrimination. Acknowledging racism, sexism, homophobia, ableism, and classism as forms of systemic oppression opens the way for progressive change.

Matrix of Domination

In her book Black Feminist Thought, sociologist Patricia Hill Collins introduces the concept of the “matrix of domination,” the interrelated domains of oppression that uphold societal systems of power. These domains include the “structural domain,” which codifies oppression through laws and institutions, and the “disciplinary domain,” which enforces hierarchies through bureaucracy. The “hegemonic domain” upholds oppressive systems through media and culture, while the “interpersonal domain” shapes the everyday experiences of individuals.

D’Ignazio and Klein argue that modern forms of oppression often target people of color and those living in poverty. They highlight how algorithms, often created by teams that do not represent the world’s diversity, suffer from what the authors call the “privilege hazard.” For instance, a simple Google image search for “three Black teenagers” versus “three white teenagers” reveals how search algorithms encode racism and perpetuate stereotypes.

Furthermore, the collection of massive datasets by well-funded organizations deepens social asymmetry, serving the purposes of science, surveillance, and selling. The authors contend that without firsthand experience of the lived realities of marginalized communities, members of dominant groups are limited in their ability to prevent harm, identify problems, and envision solutions.

Data Projects That Challenge Oppression

The use of data and technology is never neutral; both can perpetuate discriminatory practices and power imbalances. The Detroit Geographic Expedition and Institute (DGEI) and a ProPublica investigation are two examples of data projects that challenge existing power structures. The DGEI created a map of the deaths of young Black children in a Detroit neighborhood to push those in power to take action, whereas the Residential Security Map created by the Detroit Board of Commerce had been used to discriminate against Black neighborhoods.

In the ProPublica case, risk-assessment software used to inform sentencing and bail decisions was found to be biased against Black defendants, contributing to longer sentences and higher bail. The investigation helped prompt New York City to pass a bill addressing bias in the algorithms the city uses.

To challenge oppression, it is crucial to name oppression itself, not mere “bias,” as the problem, and to work within a framework of “co-liberation” that breaks down power structures. Understanding the historical context of algorithms is also necessary to fully comprehend their potential impact on marginalized communities.
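ProPublica’s central finding rested on a straightforward statistical comparison: among defendants who did not go on to reoffend, Black defendants were roughly twice as likely as white defendants to have been labeled high risk. A minimal Python sketch of that kind of fairness audit follows; the toy records and field names are hypothetical illustrations, not ProPublica’s actual code.

```python
# Minimal sketch of a disparate false-positive-rate audit, in the spirit of
# ProPublica's COMPAS analysis. Field names here are hypothetical.
from collections import defaultdict

def false_positive_rates(records):
    """For each group, the share of non-reoffenders who were labeled high risk."""
    flagged = defaultdict(int)  # non-reoffenders flagged high risk, per group
    total = defaultdict(int)    # all non-reoffenders, per group
    for r in records:
        if not r["reoffended"]:            # restrict to people who did NOT reoffend
            total[r["group"]] += 1
            if r["labeled_high_risk"]:
                flagged[r["group"]] += 1
    return {g: flagged[g] / total[g] for g in total}

# Toy records only; the real analysis drew on thousands of court cases.
records = [
    {"group": "Black", "labeled_high_risk": True,  "reoffended": False},
    {"group": "Black", "labeled_high_risk": False, "reoffended": False},
    {"group": "white", "labeled_high_risk": False, "reoffended": False},
    {"group": "white", "labeled_high_risk": False, "reoffended": False},
]
print(false_positive_rates(records))  # {'Black': 0.5, 'white': 0.0}
```

Unequal false-positive rates across groups are exactly the kind of disparate impact the investigation surfaced: the tool can be “accurate” on average while its errors fall disproportionately on one group.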

The Power of Emotion in Data Presentation

Periscopic, a data visualization firm, created an interactive chart of U.S. gun deaths that represents each death with an animated point tracing the arc of a life cut short. The approach irked some experts who believe emotion has no place in data presentation. But data can only be interpreted by humans, who are emotional beings, not merely rational machines. The ideal of objectivity and neutrality rests on a false binary between reason and emotion, one that supports existing hierarchies. Researchers have found that people understand and remember information better when it is presented in multisensory ways that tap into emotion. Displaying data with emotion is therefore powerful, and necessary, if people are to comprehend the realities behind the numbers.

Counting the Uncounted

Systems of data classification can marginalize nonbinary individuals and reinforce existing hierarchies. Patriarchy, or male-dominated society, is upheld through classification schemes that define what counts as “normal” and exclude marginalized groups such as people of color, disabled people, gay people, and women. Counting transgender individuals can put them at risk, yet failing to count them is not a solution either. The authors argue that data collection by communities, on their own terms, can be empowering and even healing; still, the complexities of classification systems show why the values and judgments encoded within them must be continually questioned and reevaluated.
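The point about classification can be made concrete: even a single database field encodes judgments about who counts. The sketch below is a hypothetical illustration (not from the book) contrasting a rigid binary gender field with a schema that treats gender as self-described and optional.

```python
from dataclasses import dataclass
from typing import Optional

# A rigid schema: the allowed categories themselves decide who is countable.
BINARY_GENDER = {"male", "female"}

def validate_binary(gender: str) -> str:
    """Reject anything outside the enumerated categories: exclusion by design."""
    if gender not in BINARY_GENDER:
        raise ValueError(f"{gender!r} is not a recognized category")
    return gender

# A more open schema: gender is self-described, optional, and never inferred.
@dataclass
class Respondent:
    gender: Optional[str] = None    # free-text self-identification, if volunteered
    pronouns: Optional[str] = None  # likewise optional

nb = Respondent(gender="nonbinary", pronouns="they/them")  # representable here...
# validate_binary("nonbinary")  # ...but would raise ValueError under the binary schema
```

The design choice is the value judgment: an enumerated field silently erases anyone outside its categories, while an optional, self-described field leaves room for people to count themselves.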
