
Journal of Contemporary Archival Studies, Vol. 6 (2019), Article 8

Review of Algorithms of Oppression: How Search Engines Reinforce Racism

Yvonne C. Garrett, Drew University

Published by EliScholar – A Digital Platform for Scholarly Publishing at Yale.

Recommended citation: Garrett, Yvonne C. (2019). "Review of Algorithms of Oppression: How Search Engines Reinforce Racism." Journal of Contemporary Archival Studies: Vol. 6, Article 8.

Safiya Umoja Noble. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.

Serious researchers and reference librarians often express frustration that access to information has moved from the reference desk to search results ranked by relevancies linked to the advertising dollars paid into the for-profit behemoth that is Google Search. In recent years, scholars, journalists, and industry insiders have heavily critiqued the idea that the algorithms behind for-profit search engines are somehow neutral or unbiased, but for those who still hold onto an assumption of objectivity and accuracy, Safiya Umoja Noble presents a clear and well-researched argument against such naiveté.

The algorithms and the searches they drive are instead, Noble argues, a part of systemic structural oppression around race and gender. For Noble, Google Search’s algorithms are structured in a way that supports dominant narratives reflecting hegemonic frameworks, and these same frameworks are an integral part of the structured oppression of women and people of color. In other words, there is nothing neutral about the net. As Noble details in her text, her investigation of the politics of search engines began with a simple search for information on young black women. When she typed “black girls” into Google Search, the results she received were horrifyingly racist and primarily pornographic in nature.

Noble suggests the term “technological redlining” as a way to describe the racial- and gender-based profiling enacted by the algorithms that run Google Search (1). Those who work with the mathematics that formulate these algorithms are human and thus fallible. In her introduction, Noble details the various elements and public cases of sexual bias and harassment at Google, including James Damore’s anti-diversity manifesto (2). For Noble, the bias exhibited by Google engineers and ongoing issues at Google around equitable treatment of women and people of color are part of the central problem with assuming any neutrality in algorithms created by these same engineers. Noble argues that the solution is not only to have a broader public conversation focused on the marginalization by “artificial intelligentsia” of people who are already systematically oppressed but also to recognize and regulate the monopolies over information created by Google and Facebook (3).

Throughout the text, Noble presents evidence to support her assertion that faulty results from search engines are traceable to both human and machine errors, and result in dire consequences. Gender and race bias has become part of the foundational architecture of technology, creating significant additional marginalization for groups already suffering under hegemonic systems of oppression. In her introduction, Noble states that she considers her work a “practical project, the goal of which is to eliminate social injustice and change the ways in which people are oppressed with the aid of allegedly neutral technologies” (13).

The text is broken down into six chapters moving through topics including “Searching for Black Girls,” “Searching for People and Communities,” “Searching for Protections from Search Engines,” “The Future of Knowledge in the Public,” and “The Future of Information Culture.” Noble provides a succinct and detailed introduction giving the reader a clear guide to her arguments and how they are presented. Her first chapter focuses on the relationship between Google and public access to information; the implications of various Google Search results, particularly as they relate to historical and societal oppression of marginalized populations; and the ease with which search results can be manipulated.

A point Noble initially makes in this chapter, which is repeated throughout the text, is the way Google shifts the responsibility for negative or “bad” searches onto searchers, refusing to acknowledge fault in the technology or those creating the algorithms central to Google Search (44). Noble presents significant research illustrating the predominance of the white male gaze on the internet leading to the objectification of women, particularly women of color. Her first chapter concludes with a critique of the concept of the internet as a “cybertopia” or an intangible space and place that is open to all without fear of marginalization or critique. For Noble, the internet, and particularly that aspect of it presented through use of Google Search’s algorithms, is far from utopian; instead, it is rife with gender and racial bias and actively takes part in structures of oppression.

Continuing the conversation in her second chapter, Noble discusses the ways Google Search reinforces racial and gender stereotypes. She presents several case studies using different keywords including “black girls,” “Latinas,” and “Asian girls.” In addition, she offers a critique of Google’s PageRank search protocols particularly as associated with searches focused on Trayvon Martin and #BlackLivesMatter. The third chapter features a complicated argument around the importance of not-for-profit search engines and equal access to accurate information. In her continuing critique of for-profit search engines, Noble discusses the dangers of the preponderance of false news available on Google Search using as an example the white supremacist/domestic terrorist Dylann Roof and his mass murder of African Americans at an African Methodist Episcopal church in Charleston, South Carolina, in 2015.

Noble also highlights the European Union’s enactment of “right to be forgotten” legislation and the importance of allowing individuals and social groups the ability to create their own digital narratives. Throughout this wide-ranging and difficult chapter, Noble strives to support her central argument that information is not neutral and suggests that we all must work to reimagine an information culture focused around social equity. Chapters 4 and 5 focus on the ways the work of library and information professionals often participates in the broader oppression of marginalized people.

The author shows how ongoing library classification projects are foundational to many of the bias issues in Google Search. But Noble also sees library and information professionals as essential in building and cultivating more equitable classification systems. To this end, she suggests a move toward broadening the scholarship within the field of library and information science to include those fields that focus on marginalized populations including gender studies and black/African American studies.

This move would, Noble suggests, allow library and information professionals to better understand the ethical concerns involved in decision making in all aspects of knowledge production from classification projects to questions around digitization and access. Noble is particularly concerned about ongoing digitization projects that present information without any consideration for the privacy of individuals involved, particularly when that information represents marginalized communities whose participants are given no say in the release of their information to the global internet (132). This section of the book should prove particularly helpful to special librarians and archivists.

Collections are selective by nature, but we may not be aware that this act of selection can have a role in the broader project of oppression that Noble addresses. And while Noble calls for increased regulation around digitization projects and search engines, she also reminds us that research professionals (and by extension archives professionals) play a role in the collection, cataloguing, and dissemination of information. In chapter 5, Noble cites the actions taken by students and librarians at Dartmouth College that eventually led the Library of Congress to change the subject heading “illegal aliens” to “noncitizens,” but this is only a small example in a vast field that needs our individual and collective work (134).

While the bulk of the text explores the various issues detailed above and sustains a consistent critique of Google Search, Noble also presents some ideas for solutions. In addition to expanding the scholarship within library and information sciences to include critical perspectives around social equity and marginalized groups, she also suggests more rigorous regulation of information environments and a move away from “the neoliberal capitalist project of commercial search” (133). Noble is deeply troubled not only by the rise of for-profit search engines but also by assumptions made about the neutrality of artificial-intelligence decision making. Without the nuance of ethically balanced human decision making, information becomes a part of the larger structures of oppression that continue to harm marginalized populations.

Overall, Noble makes a largely successful case for her argument that search engines (particularly Google) are built on inherently biased algorithms. This code is written primarily by white or Asian male engineers without any training in ethics or critical thinking that confronts hegemonic structures of racism and sexism. This lack of awareness on the part of engineers, and their refusal to move beyond inherently biased views, leads to data failures by search engines that create algorithmic oppression instead of working to provide neutral access to accurate and comprehensive knowledge. Noble’s claims are supportable and generally clearly presented in jargon-free language. Although her arguments against digitization projects are somewhat limiting, her calls for the right to be forgotten certainly merit the consideration of any archives or collections professional. In addition, one obvious gap in her text is the absence of any conversation with those actively creating algorithms for Google Search. It would be interesting to hear their voices, if only to give additional dimension to Noble’s well-researched argument.