Week 2: Technohegemony and a Teen Audience

Before answering the explicit questions posed in this prompt, I’d like to devote some of my response to addressing the issues Dr Noble raised in her essay and the larger issues implicated in a discussion of technology, the hegemony, and the inscription of gender, racial, and socioeconomic identities.

To begin, while I found Dr Noble’s work on this subject brilliant and well-argued, I also found it somewhat devastating. I’d like to think that I knew, that I have always known on some level, that ideology works through all systems and all institutions, and that we are interpellated by every “voice” that speaks through these hegemonic mouthpieces. I’d like to think that I knew that the internet (and technology more generally) is not and has never been a “neutral” space inside of this. And yet, I also know that on some level I was comfortable operating under the fiction (a known fiction) that, whether in its coding or in its hardware, technology might somehow be able to exist outside of the ideologies endemic to these hegemonic systems. Dr Noble’s work threw into sharp relief how misguided and even dangerous a belief in this fiction of technological neutrality can be. I, like many of you, not only use Google but actually quite like it: its algorithms have led me to breakthroughs in my own research, and it has helped me discover some incredibly productive databases, tools, archives, etc. And I’ve only listed its academic uses here, to say nothing of the mundane tasks whose completion is predicated on Google.

All of this in itself induced a sort of existential dread at the realization that (yet again) no space is safe or devoid of the workings of the hegemony (and further, its endemic sexism, racism, classism, all manner of bigotry). I willingly participate in these systems because they are convenient and because I have been conditioned to rely on their use, i.e. I am wholly dependent on them. A sort of malaise lingered for a few days after Dr Noble’s address to the class. I am not mimicking the cynicism the prompt so rightly predicts for rhetorical effect: I felt (feel) all of this and more. So how best to combat this in a “younger, less knowledgeable audience”? I haven’t the slightest idea (though I noticed Gabe posted a very comprehensive response to this question, so I shall have to read his post after I complete mine). I do, however, have some theories as to how to “translate” these concepts and concerns into a language that young people / the technologically illiterate might better understand (though is it fair to conflate the two?). I see the biggest obstacle to this task as how “normalized” this sort of technology has become. Though I will be the first to admit that I am dependent on Google (and other instruments of the technohegemony) for so many day-to-day tasks, I can still remember a time before the internet. This is in fact not the case with the freshmen I taught last semester. I will use them as my fictional “audience” in this post, as they are the only age group with which I have had any professional teaching experience. The internet (and technology more generally) has become so enmeshed in their social, academic, and professional lives that I believe there must be a process of “defamiliarizing” those technologies.
Without exposing to them the inner workings of these platforms, attempting to convince them of their function in maintaining and perpetuating a dominant ideology would be akin to arguing that a toaster or a KitchenAid is an instrument of the hegemony (which in itself could be the subject of another blog post). This means, yes, “deconstructing” these devices: explaining that while hardware, software, and firmware coalesce in a single “object” (be it an iPhone, MacBook, or tablet), they not only serve distinct functions but each is bound up with its own implicit ideological machinations. This presupposes some knowledge of how these component parts work. I would advocate for the need to teach CompSci classes throughout a child’s education, or at least to provide some exposure to the subject. In understanding how something “works,” even at a most basic level, I believe one is better able to see and understand the “work” that it is doing.

However, were this audience unable to understand the more technologically advanced cant resultant from “deconstructing” those devices, disrupting the idea that Google is in some way “neutral” would be a necessary first step. Perhaps likening “code” to our own “genetic coding” would be a helpful metaphor: while our genome is a random assemblage of nucleobases and hydrogen bonding, whose “mutations” and “genes” are selected for based on their advantageousness or viability, Google’s “genome” is designed, i.e. its “code” has been deliberately written. And could they imagine if someone — not an infallible deity and not chance, but a flesh and blood person with their own prejudices and assumptions and biases — were to “write their own code”? Would it not stand to reason that what was being “encoded” would be the construct of an imperfect, ideologically-driven mind? Would it not stand to reason that the product of such a mind would carry, however subtly, those same prejudices in its very genes? Andrew Feenberg’s essay “Democratic Rationalization: Technology, Power, and Freedom” posits this another way: “Technologies are selected… from among many possible configurations. Guiding the selection process are social codes established by the cultural and political struggles that define the horizon under which the technology will fall. Once introduced, technology offers a material validation of the cultural horizon to which it has been preformed. I call this the ‘bias’ of technology: apparently neutral, functional rationality is enlisted in support of the hegemony” (147). It is of course not “natural selection” to which he refers but a deliberate “selection” that is guided by “social codes” which participate in and stem from the dominant ideological ordering of the world.

I think the danger here is making this all seem so nefarious, so explicit and intentional: while this audience might resist the claim that racial bias in Google’s search algorithm is deliberate, they perhaps would be more receptive to thinking of it as something operating outside of, or perhaps below, intentionality. In other words, I believe figuring Google as some deliberately manipulative “Evil Corporation” would be met with resistance because it is such an essential and productive tool, so ingrained into day-to-day processes, that to insult Google would be to insult people’s dependence upon it. Further to this, in discussing the production and reproduction of “ideology,” I think most would assume that this is enacted by means of “repressive” instruments, or that which “prohibits” or “impedes” an exchange of information or belief, whereas Google is viewed by many as an instrument of “liberation” in this regard. It is too easy to eschew subtlety in favor of making powerful, impactful claims, especially when the process of fleshing out those subtleties can prove so daunting. But framing this as a complex issue wherein Google at once participates in and reinforces the hegemony (footnote: I fully believe that the “hegemony” is a concept that can be explained to a “teen” audience, as I subjected my nineteen-year-old sister to a discussion of it last Easter break and my Rhetorical Arts class seemed to grasp the concept last semester), yet perhaps does so on a level slightly below the deliberate instantiation of hegemonic values, might better serve the audience we are attempting to address.

And if they are perhaps not so keen on the “genome” metaphor I’ve constructed here, it might be best to focus on the assumptions that they have about the way Google indexes websites, to dismantle the “vote and rank” supposition that is so prevalent in people’s minds. Regardless of the course we take, I think central to all of this is having conviction in the students’ ability to grasp this material. Complex issues, whether they are about technology or ideology or the way in which these two intersect, should not be reduced to platitudes or generalizations. Rather, they should be worked through in a language and with a patience that intervenes in a reductive “Google = Bad” narrative. Not because this claim doesn’t possess value or merit, but because attacking Google as a product or as a service might alienate those who are dependent on its use. Further to this, to suppose there are limits to what individuals are “capable” of understanding merely reinforces those same limits. As I said, I don’t think I can address the latter half of this prompt. I’m affected by the same cynicism (though I’m not sure if it’s cynicism so much as resignation and exhaustion) after having been forced to confront a fact I already knew to be true. If anyone has any ideas about how to help the “audience” we are meant to be working with here (or myself for that matter), I’d certainly be open to ideas!
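The “vote and rank” supposition could even be made concrete in a classroom with a toy version of PageRank, the link-analysis algorithm Google originally described. A sketch like the one below (illustrative only, and assuming the classic formulation: the damping factor, the even redistribution of “dangling” pages, and the tiny example web are all choices I am making here, not Google’s production system, which weighs many more signals) shows students that even the “voting” is shaped by parameters a designer deliberately picked:

```python
# Toy PageRank: pages "vote" for the pages they link to, but the outcome
# depends on designer-chosen parameters (damping factor, dangling-page
# handling). Illustrative sketch only; not Google's actual ranking system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # every page keeps a small "teleport" share -- a design choice
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # a page's rank is split evenly among the pages it links to
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # dangling page with no outlinks: spread its rank evenly,
                # one of several possible conventions
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# A hypothetical four-page web: everything eventually points at "C"
web = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
ranks = pagerank(web)
```

Running this, “C” accumulates the most rank because every other page links to it, while “D,” which nothing links to, sinks to the bottom; change the damping factor and the magnitudes shift. The pedagogical point is that every one of those outcomes flows from deliberate, human-written choices.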

Category: Technology

Keywords: pedagogy; programming; ideology; algorithms

(interestingly, the Library of Congress uses Google to search their own cache of Subject Headings)

3 comments on “Week 2: Technohegemony and a Teen Audience”
  1. In reading everyone’s blog posts, it seems Sarah, Alex, Allison, and I have all circled around certain issues in critical technology studies and pedagogy around Google: we want our students to critically engage in discussions about the far-reaching negative effects Google can have without dismissing it out of hand, but we also don’t want to hammer home any overly simplistic narrative about how Google is evil incarnate. I always try to set the bar for any lesson below “blowing minds” (it gets messy with all that grey matter splattered on the walls) and strive for more modest critical engagement rather than cynicism among my students (although the malaise you describe, Sarah, is certainly to be expected when discussing this heavy topic). I think one key point you bring up, Sarah, in chipping away at common perceptions of Google is the well-marketed narrative of Google as an “instrument of ‘liberation.’ ” The entire Silicon Valley tech sector has cloaked its largely capitalistic endeavors in this revolutionary rhetoric about how a new app on our phones contains limitless potential and will usher in a utopian era in which free time is boundless, communication is effortless, and all the information we seek is right at our fingertips. I think that narrative of ‘liberation’ is a great starting point from which to reflect on how our dependence on our devices becomes much more of a limiting construct in which hegemonic ideologies can be propagated.
    I think a nuanced approach to Google outside of the good-evil tech binary doesn’t need to be overly complex or difficult for high schoolers/undergrads to grasp, though. I think your ‘genome’ example reflects Siva Vaidhyanathan’s assertion (cited in Dr. Noble’s article) that “Rendering web content (pages) findable via search engines is an expressly social, economic, and human project,” not at all a process natural or innate to some biological standard. Furthermore, no matter how much of Dr. Noble’s thesis we speculate our students will be likely to accept, I think this to be true: Google is far more irresponsible, naive, myopic, or even delusional than it is nefarious. Many of the examples Dr. Noble presented us (like the “black girls” or “n**** house” searches) seemed to me to reveal the unexamined biases written into algorithms and coding by a very specific, non-diverse group of programmers (83% male, 60% White, 3% Black and Latino, according to figures released in 2014), rather than a conscious, deliberate plot to reinforce hegemonies. Google, Facebook, Twitter, and other social media entities have pushed to make their products as broadly integrated into everyday use as possible, but with very little consideration for the ways in which their products could be used to do great harm within societies. Facebook only recently announced a ‘crackdown’ on illegal gun sales on their website after extensive pressure from the Attorney General, and as Dr. Noble repeatedly pointed out, Google only addresses the symptoms of a search engine that reinforces systemic bias when certain examples make for bad PR, and even then they chalk it up as a “glitch,” not at all indicative of a larger problem (pay no attention to the man behind the curtain!).
I’ve heard many different NPR pieces questioning the ways the makers of these massive digital platforms generally launch new technologies on a global scale, but then repeatedly eschew responsibility when these same platforms are used explicitly for crimes, much less when they implicitly reinforce systemic biases for large swaths of tech users.
    I think the takeaway here for students can be simple: transparency and self-reflection from these corporations, and a healthy critical lens from consumers, are the bare-minimum requirement in the 21st-century social-digital-info-economic marketplace. Students can easily connect to feelings of outrage over an irresponsible, tone-deaf response from a government/corporation to criticism; one easily accessible example is the NFL’s inadequate response to domestic violence charges, players’ mental health risks, and a slew of other issues. If they can understand the egregious ways in which so many of Google’s hegemonic practices go completely unchallenged, they can start to see how the dangers of Google’s ever-present, blind, capitalistic self-interest can greatly outweigh the possibility of any conscious, nefarious plots.

  2. Sarah, I too suffer from that cynicism you speak of, not simply with regards to technology but towards this world and the amount of bigotry we see embedded in any and all systems we find ourselves dependent on. Yet to speak specifically to the points you made, I think you’ve found step one, which is recognizing that these tools are normalized to the point of doing all of us a disservice. As Professor Noble alluded to, and I think we’ve touched on this as well, very few people are thinking about these systems critically, and I will add myself to that lot, as I was personally unaware that the algorithms within Google were biased and reflective of a corporation’s financial relationships, although in retrospect it makes perfect sense. Anyhow, as you’ve said, the manner in which something such as Google and technology is so normalized makes it even more difficult when your goal is to offer a counter-narrative. It is also largely detrimental when your audience is heavily reliant on said system. Although this is not to bar us from trying, and I understand the notion of not wanting to present Google in a demonized way, I also don’t entirely believe it.

    I apologize, as I will venture away from your post, primarily because it brings to mind conversations we’ve had in class about presentation and the fear, or incessant need, to create something that is tolerable for the naysayers of bigotry or anyone in general. It is difficult for me to reconcile, although I believe I understand where this concept is coming from, this idea that we have to present difficult topics in such a way that they don’t scare people off. Now, with regards to technology and attempting to reach youth, I find it a bit more reasonable, although in principle it is the same as trying to present race and racism in such a way that it isn’t “off-putting”. I don’t think I agree with either, and I find that notion rather disconcerting, especially when discussing oppression of any sort. What is a better alternative? I don’t know, but in the idea that the cause and grievances of a group of people who have suffered an insurmountable amount must be brought up in a cautious fashion, we find, once again, that said group must cater to those in power. As if it’s not okay to demand the things one should have rightfully received; rather, one should ask for them. Why? So we can spare the people that require the delicate approach the guilt, the shaming? This may just be my cynicism on the topic of oppression and bigotry, but as I have said, I understand why a “tactful/delicate” approach may be thought of as more successful, or rather more appropriate; I just don’t know that I believe it. If they don’t believe in systematic oppression, the exploitation of people of color, or white privilege, because they are of a generation that thinks everything was fixed during the era of emancipation/reconstruction or with a Black president (those that preach colorblindness), or because they are entirely invested in the mythology that is America, then the manner in which we present the issues is of no consequence.
As I have said, I know I veered off, but I struggle when we speak of needing to alter the approach in order to prevent certain groups or generations of people from automatically shutting down and refusing to listen, especially when what is being said is the truth. I can’t help but believe that those who shut down, when it comes to a topic like this, will shut down not because of the approach but because of their beliefs, so how then do you change them?

  3. In response to Sarah’s post, and because we are operating under the concept of hegemony, I find it useful to bring Raymond Williams into the conversation. Raymond Williams’s Television: Technology and Cultural Form, published in 1975, is an example of a social and political criticism of developing technology. Williams’s piece examines the power of the television, specifically as a means of mass communication and simultaneity. He warns that this type of technological power can be used for revolutionary or counter-revolutionary purposes.

    “Over a wide range from general television through commercial advertising to centralised information and data-processing systems, the technology that is now or is becoming available can be used to affect, to alter, and in some cases to control our whole social process. And it is ironic that the uses offer such extreme social choices. We could have, locally based yet internationally extended television systems, making possible communication and information-sharing on a scale that not long ago would have seemed utopian. These are the contemporary tools of the long revolution towards an educated and participatory democracy, and of the recovery of effective communication in complex urban and industrial societies. But they are also the tools of what would be, in context, a short and successful counter-revolution, in which, under cover of talk about choice and competition, a few para-national corporations, with their attendant states and agencies, could further reach into our lives, at every level from news to psycho-drama, until individual and collective response to many different kinds of experience and problem became almost limited to choice between their programmed possibilities” (Williams 151).

    Although Williams’s work precedes the computer and the internet, the way Williams identifies the social and economic motivations that are intrinsically tied to the development of technology is helpful for looking at Google and other frequently used search engines and websites. While Williams does little to allay any paralyzing cynicism over the reach of hegemonic power through technology, he does state that new technology offers the “tools of the long revolution towards an educated and participatory democracy, and of the recovery of effective communication in complex urban and industrial societies,” and I think this is a truth (maybe an obvious one) of the internet today. In other words, new technology, and even Google, can also be a place for change, especially if we are critical of the ways in which we operate within it. Because the internet is so fluid and evolving at such a rapid pace, it is not unrealistic to expect changes, or alternatives that move away from, or even undermine, hegemonic powers.
    This all being said, it is sobering to realize that the dangers Williams accurately identified in the television are the same dangers posed by Google. In fact, one can argue that the internet has superseded the television; for many people the computer IS the television. However, in the case of the computer, the in-built hegemony and the power, as Williams states, “to affect, to alter, and in some cases to control our whole social process” saturate the lives of the masses and operate under the dangerous façade of neutrality.
