Blog Entry #5

In “Queerying Homophily,” Wendy Hui Kyong Chun explores pattern discrimination in the form of “homophily,” which they describe as the “axiom that similarity breeds connection.” The author also examines network science, noting that “at the most basic level, network science captures – that is, analyzes, articulates, imposes, instrumentalizes, and elaborates – connection.” Chun goes on to formally define it as “the study of the collection, management, analysis, interpretation, and presentation of relational data.” Chun analyzes the relationship between homophily and network science, arguing that homophily grounds contemporary network science. Regarding this problematic duo: “Homophily reveals and creates boundaries within theoretically flat and diffuse networks…it is a tool for discovering bias and inequality and for perpetuating it in the name of ‘comfort,’ predictability, and common sense. Network and data analyses compound and reflect discrimination embedded within society.” Chun seeks to bring to light how behavioral patterns are surveilled and how they not only reflect current society but are also used to segregate and discriminate against individuals, or “neighbors.” To break this performative and dramatic pantomime, the author proposes that we must “embrace network analyses and work with network sciences to create new algorithms, new hypotheses, new grounding axioms.” Chun calls for encouraging and welcoming back discussions of critical theory (feminism, ethnic studies, psychoanalysis, etc.).

In Algorithms of Oppression, Safiya Noble also analyzes the issue of racial profiling, or what they call “technological redlining,” on the Internet. The author explains how this translates online: “On the Internet and in our everyday uses of technology, discrimination is also embedded in computer code and, increasingly, in artificial intelligence technologies that we are reliant on, by choice or not.” The book seeks to “make sense of the consequences of automated decision making through algorithms in society.” Objectivity via online platforms is more or less a lie. Many may assume these technologies are relatively objective, but Noble argues they are anything but “neutral” or “benign.” Noble reveals the intentions of their work, stating the need for “interdisciplinary research and scholarship in information studies and library and information science that intersects with gender and women’s studies, Black/African American studies, media studies, and communications to better describe and understand how algorithmically driven platforms are situated in intersectional sociohistorical contexts and embedded within social relations.” The other intention is the need for “experts in the social sciences and digital humanities to engage in dialogue with activists and organizers, engineers, designers, information technologists, and public-policy makers before blunt artificial-intelligence decision making trumps nuanced human decision making.”

As open as digital humanities aspires to be, both Chun’s and Noble’s analyses recognize that, through technological redlining and homophily, digital projects might not reach as wide an audience as one hopes or believes. The enforced sameness of network science might show their work only to a select (“academic”) few. It also prejudges everyone who visits certain sites and sorts them into boxes with labels they are unknowingly placed under. It uproots the idea that the Internet is a vast neutral zone where anyone can explore anything; instead, people with varied interests have their race, gender, and socioeconomic status predetermined by the sites they visit and interact with. The Internet reflects society in more ways than most realize, and so far for the worse, not the better. And that must change.
