Week 2: Teaching Bias in Technology

The illusion of neutrality and machine-like objectivity certainly permeates the use of technology. Still, our discussion of the prejudice enmeshed in the algorithms of search engines like Google made perfect sense to me. I don’t know much about algorithms or the way search engines function, but Dr. Noble’s explanation of “the man behind the curtain” was easy enough for me to understand and evaluate. I am therefore under the impression that the difficulty of teaching undergraduates about bias in Google may be overestimated.

That being said, I was also reminded of the first time I encountered scholarly work on the idea that technology is not neutral: in Theory of Teaching, a mixed class of undergraduates and graduates, we read Cynthia L. Selfe and Richard J. Selfe, Jr.’s 1994 article “The Politics of the Interface: Power and Its Exercise in Electronic Contact Zones.” The Selfes argue that computer interfaces create and perpetuate divides within society. They write:

The rhetoric of technology obscures the fact that, within our current educational system — even though computers are associated with the potential for great reform — they are not necessarily serving democratic ends. Computer interfaces, for example, are also sites within which the ideological and material legacies of racism, sexism, and colonialism are continuously written and re-written along with more positive cultural legacies (743). 

They go on to say that interfaces “order the virtual world according to a certain set of historical and social values that make up our culture” in part because they have “grown out of the predominately male, white, middle-class, professional cultures associated with the military-industrial complex” (Selfe 745). The interfaces, as the Selfes have identified, reflect this demographic. They give examples of terminology associated with modern capitalism and the “white-collar inhabitants” of corporate culture, such as “desktop,” “files,” and “documents.” They argue that the use of these terms to identify functions of the computer is a distinct reflection of the powerful ideology capitalism values and perpetuates.

For me, the authors’ arguments were generally convincing and impactful, but I took issue with their suggestion that a “domestic” interface might better serve “women in the home” (746). The undergraduate students in the class, however, were not persuaded at all, and openly laughed off the authors’ critique of “the white pointer hand . . . ubiquitous in the Macintosh primary interface.” The students couldn’t seem to get past the dated nature of the article and some of its examples, and they missed the important message. This instance is representative of the major difficulty in asking undergraduates to be critical of, and take seriously, the racial and gender bias and capitalistic imperatives wrapped up in technology.

It’s possible that, because it is dated, the Selfe article isn’t the best way to get the message of biased technology across to younger students. I think the key is to stay very current and to use live examples that hit home for students. For instance, Dr. Noble’s example of typing the word “beautiful” into Google Images and receiving results that are almost exclusively white women is unquestionably problematic. Alternatively, spending class time examining search suggestions, like the ones in Dr. Noble’s presentation, would be equally impactful.

While I wouldn’t use it to teach undergraduates, the takeaway from the Selfe article, that instructors must actively take every opportunity to teach their students and their colleagues to be critical of technology, is incredibly important and something for all instructors to follow: “We have to educate them to be technology critics as well as technology users. This recognition requires that composition teachers acquire the intellectual habits of reflecting on and discussing the cultural and ideological characteristics of technology — and the implications of these characteristics — in educational contexts” (743). 

a.) Category: Technology 

b.) Original Keywords: “teaching” “technology” “pedagogy” “algorithms” “google”
LC Terms: (I’m not sure if I was unable to work the website properly or if something was wrong with the search engine, my computer, or otherwise, but I didn’t get any results when I searched any of these terms.) 

One comment on “Week 2: Teaching Bias in Technology”
  1. Alex, I’m so glad you’ve afforded me the opportunity to revisit the Selfes’ article from our Theory of Teaching and Writing class. If you remember, I was deeply skeptical of some of its claims, not because I disagreed with their argument that the political, the interface, and the digital were bound up with one another, but because I found some of the examples dated or problematic, e.g. the “white” pointer hand or their objection that “The interface does not, for example, represent the world in terms of a kitchen counter top… which would constitute the virtual world in different terms according to the values and orientations of, respectively, women in the home” (486-87). I also remember being resistant to this section in the text:

    This way of representing knowledge within computer environments, although not essentially limiting or exclusive by itself, becomes so when linked to a positivist value on rationality and logic as foundational ways of knowing that function to exclude other ways of knowing, such as association, intuition, or bricolage. This validation of positivism, rationality, hierarchy, and logic as the only authorized contexts for “knowing” and representing knowledge continues to inform and limit many formal aspects of computer programming and technology design.

    I remember thinking “Well of course this is how computers are ‘organized’ — both in their interface and in their coding. What is the problem with ordering material in such a way that follows a ‘logical,’ hierarchical pathway?” Though it is still a challenge to understand how Lévi-Strauss’s bricolage might be used as an ordering principle in computer design and programming, I now see that the Selfes — two voices that emerged quite early in critical technology studies — were already critiquing the very “language” and practice of computer coding. I see this critique in light of our discussion of algorithmic ordering — that a string of numbers might somehow contain an ideological value in an “alphabet” that operates under the guise of being “logical” and implicitly neutral.

    The Selfes also deconstruct the “technology as progress” narrative that pervaded (still pervades?) any discussion of emergent or new technologies: this narrative is undergirded by a “middle-class, corporate culture; capitalism and the commodification of information; Standard English; and rationalistic ways of representing knowledge” (494). As early as 1994, long before Google (or the internet more generally) had reached its ascendancy, the Selfes understood that the commodification of information was bound up with the emergence of a technocracy: in turning these new processes of dissemination and aggregation into capital (though, arguably, hasn’t the dissemination of knowledge always been capital?), the fiction of the computer as a democratic or neutral space becomes deeply complicated.

    So thank you, Alex, for reforegrounding this essay in my mind as I work through some of these issues, and for allowing me to revise my somewhat uncharitable opinion of their work!
