
rinse, repeat . . . panty wad

July 29th, 2008 by slewfoot

Music: Apoptygma Berzerk: Harmonizer (2002)

Since we’re on the topic of the profession this week, I thought I’d share that my panties are in a wad about a review of a co-authored essay I’m revising for resubmission this week. Before I detail the character of said wad, however, the requisite preliminary wind-up: if you are a professor at a research institution, you are expected to publish books and articles to keep your job. The perils and joys of publishing have been discussed here frequently, so I won’t retread the usual grousing and encouraging tips (for beginning scholars, I’ve got some pages here that walk through the process with an example). I will say, though, that if you’re going to publish, you will have to revise. Never expect to get an article accepted at first pass. If your article is good enough, you will receive a request to “revise and resubmit” the essay from the editor. You should always regard those as a foot in the door. Revise well and things are usually promising.

I’ve only had three articles accepted at first pass. Ever. Eight years. Three articles. I think it is quite telling that all three articles were not with “Communication Studies” journals. Eight years. Three articles. Not communication. As a recent flap on a professional listserver demonstrates, for many folks in “my field,” Communication Studies is synonymous with the National Communication Association. Now, while it is true that’s where I tend to associate my professional “home,” I like to think of my work as “interdisciplinary,” straddling cultural studies and performance studies and English rhetoric/composition, among other fuzzily defined fields (right now I’m trying to get my lern on in media ecology). It always seemed to me the term “communication” was so vacuous and huge I could stick anything I want under the banner. Hell, folks publish navel-gazing in the name of communication, so why can’t I write about poop and call it “communication?” (I know what you’re thinking, and my response is: exactly!)

In addition to the extreme reviewers one sometimes gets (profiles of types here), I have to say, perhaps, my hugest most biggest pet peeve is the statement, “this has nothing to do with communication.” What the eff does that mean? I’m sure y’all get this in your various fields: “this is not political science,” or, “this is not anthropology.” I can even understand how those statements can be justified. But it just sounds ridiculous for a Communication Studies scholar. What, exactly, isn’t communication? Hell, aside from publishing about rhetoric as a way of knowing, my advisor made a name for himself by writing about silence. I remember when at LSU, one of our most talented graduate students wrote a pretty darn interesting paper on a sit-com and submitted it for review at Communication Quarterly. The editor sent the essay back, without any reviews, saying that “this is not communication scholarship.” I was so mad I sent him a letter telling him it was his responsibility to publish an editorial statement that specified what is and is not communication research (he never did).

So, I’ve sufficiently built up this post to paste in the source of my ire. So, like, ummmm . . . me and a cherished mentor wrote an essay on a film. We argue the film expresses certain social anxieties. That’s the gist (I don’t want to give too much away, as it’s still in review). Anyhoo, so one of our reviewers writes this:

First and foremost, I’m not clear on your theoretical moorings or intentions for this project. Certainly, I understand that you are situating it in psychoanalysis, but you employ that as a methodological tool: what do you hope to explore, interrogate, trouble, or contribute to in terms of theory? Along these lines but notable on its own terms as well, this project really has nothing to do with Communication—the only Communication pieces that you cite are the ______ pieces (more on those in a moment) on [the film]. This analysis could just as easily have come out of Film Studies or English or Comparative Literature. While interdisciplinarity of approaches and lenses is more than appropriate—it is often a strength in criticism—you really don’t have that here so much as a methodical psychoanalytic critique that isn’t anchored to anything substantive outside of that, and given your interest in publishing in a Communication journal, the absence of that particular foundation is especially conspicuous. There are a number of Communication scholars (other than_____) that have taken up the crisis in masculinity, [the film], and psychoanalysis, but you reference none of them here.

Ok, aside from the obvious bullshittery of “there are a number of Communication scholars [who] have taken up the crisis of masculinity, [the film], and psychoanalysis”—this is a patent falsehood—what kind of boundary pissing is this? A reading of the film as expressive of social anxiety is not good enough?

No, it’s not for this reviewer. The problem is that s/he equivocates with “Communication” as either a substantive domain or as code for some sort of citation practice. Worse, I think the underlying warrant here is a conflation of those two things: “Communication” consists of whatever communication studies scholars are publishing in communication studies journals. To wit: Communication is what communication studies people publish. It’s not clear what journals the author has in mind, but I’m willing to play bingo on the NCA square: our scholarship has nothing to do with communication because we are not citing research published within NCA journals. This is among the most stupid demands of scholarship I can imagine, especially when we’re called on the carpet for not citing scholarship that doesn’t exist.

So, if I were to amend my reviewer profiles, I would now add the Disciplinarian: no matter what you write, it’s not “communication” and is therefore deserving of rejection. The reviewer never tells you what communication is, of course, but assumes you understand it from some sort of citational protocol you apparently missed.

on catching-up

July 25th, 2008 by slewfoot

Music: Pandora: Steve Roach: Early Man (2001)

Since summer teaching and grading ended, I’ve been catching up on the “immediate to-do” list, which seems to get longer and longer every summer. This summer it has expanded to two whole index cards. Mon Dieu! Today I finished one of those items on the list:

  • agency encyclopedia entry
  • eclectic criticism chapter
  • ideology encyclopedia entry
  • review essay for journal
  • Fight Club paper revision with Tom
  • draft public address talk
  • vocalic projectic essay R&R
  • prep rhetorical criticism class
  • prep EVP talk for psychology department
  • contract with graffiti office to finish walls
  • contract to repair water damage
  • draft 6 myths of psychoanalysis essay with Chris
  • revise “Zombie Trouble” with Shaun
  • replace radiator fan
  • make public address conference arrangements
  • make NCA arrangements
  • make fall speaker series arrangements
  • write CD kitchen essays for August
  • finish textbook proposal
  • finish book chapter!

This, of course, is a lot to achieve in three weeks. I don’t predict I will succeed, but at least I know what train (wreck) lies ahead. I tend to tackle to-do lists like this as one would credit card debt: pay down/write the smaller stuff first, then move on to the bigger stuff. Correction: do the stuff with deadlines first, beginning with the smaller stuff. So the encyclopedia entries and the reviewing come first, then the R&Rs, then the draft of the public address talk.

Regardless, here’s the latest bit, a small description of invention in “eclectic criticism” for undergraduates, which I am apparently said to do:

“The Rhetoric of Exorcism: George W. Bush and the Return of Political Demonology,” by Joshua Gunn. Published in the Western Journal of Communication 68.1 (Winter 2004): 1-23.

In the months that followed the attacks on U.S. soil on September 11, 2001, “we the people” were traumatized and frightened. In the fall of that year and the spring of the next, I remember I was frequently flying across the country in search of my first academic job. Like everyone else I encountered in an airport, I was an anxious traveler. In early 2002, I vividly recall sitting in an airport bar in Dallas on a layover when President George W. Bush delivered one of his many, many speeches on Nine-eleven. All of us in the bar were staring at the television set, transfixed, disturbed, worried. Bush’s speech was designed to comfort us, especially the folks in that bar, who were waiting to board an airplane. But I was unnerved by the speech, which used a number of biblical metaphors and the kind of language I remembered in church growing up. I was also startled that, as of yet, few media commentators were discussing the religious overtones of Bush’s speechcraft.

I was raised in an evangelical Baptist church in a congregation that—I realize in retrospect—was itching toward Pentecostalism. I vividly remember the fiery sermons delivered by Brother Snooks (yes, that was his name). I remember the discussions we had in church about “spiritual warfare,” about battling demons on a daily basis, about how listening to rock music was an invitation for demonic possession. Spiritual warfare is a concept, now fully formed in many evangelical and charismatic religious systems, that refers to the idea that mortal human beings must do battle with demons on a daily basis through prayer and various religious rituals, such as exorcism. When I heard Bush deliver his speeches after Nine-eleven, I couldn’t help but think of the language of spiritual warfare.

One day, out of simple curiosity, I decided to research who Bush’s speechwriters were. I discovered that Michael John Gerson was Bush’s chief speechwriter from 2001 to 2006. Gerson is an evangelical Christian who was responsible for penning many of Bush’s more memorable religious statements. It then occurred to me that the charismatic language of spiritual warfare was quite deliberate in Bush’s speeches. Over lunch meetings and casual conversation I began sharing my ideas with friends. I argued that Bush’s speeches spoke to two audiences: those who got the religious warfare references, and those who did not. I was surprised to learn that most of those with whom I shared my ideas resisted my argument. I recall that at one lunch meeting a colleague laughed aloud at the suggestion that Bush’s speechwriters were employing the language of demonology. Few bought the claim that my personal background in evangelical religious belief gave me the authority to make such arguments. In other words, if I was going to convince people that Bush’s speeches since Nine-eleven were about spiritual warfare, I couldn’t appeal to my own authority and experience, as valid as both were to me. I had to appeal to something in the speeches themselves.

When I was thinking about these ideas—and as Bush continued to make even more religious references in his speeches—I was invited to interview for a job in California. Part of the interview consisted of a guest lecture in a rhetorical criticism class. I decided I might try out my arguments on Bush’s speeches. I wanted to offer an argument to the students about what I saw in these presidential addresses, but I also needed to teach them something about rhetorical criticism. I cannot detail exactly how or why I landed on the approach that I did: it just came to me. Genre criticism was the way to go with Bush’s speeches.

Genres are patterns that are repeated in discourse of all kinds. Most of us are familiar with film or music genres: “romantic comedy,” “hip-hop,” “horror,” and “alternative rock” are just some examples. Speeches also have genres: the eulogy and the presidential inaugural address, for example, are just two of many types of speech genres. Regardless of the kind of discourse, though, all genres have patterns that create expectations in people. So, if I went to a funeral, I would expect to hear a eulogy, and that eulogy would praise the deceased. My expectations would be violated if I went to a funeral and a speaker began attacking the dead person as a liar and a criminal.

Bush’s post-Nine-eleven speeches were designed to meet expectations, so there had to be some generic norms. I printed out and read all of Bush’s speeches from September 11, 2001 until November 11, 2002. As I read the speeches, I looked for repeated patterns that might form expectations in those who heard them. What I found were not so much formal patterns (that is, what a president should do and say to a general audience) as informal or indirect patterns that closely modeled an experience that religious people know well: the conversion experience. If you are an evangelical, you are probably familiar with this story (quite literally embedded in the song, “Amazing Grace”): a lost soul meets misfortune and woe until, having discovered Jesus and accepted him as one’s own “personal savior,” one is “born again.” Or to put the pattern alternately, “I was blind/but now I see.”

I noticed something else about Bush’s speeches too. He used a lot of demonic metaphors to describe “the enemy.” He kept talking about “smoking [the enemy] out of their holes” and purging other countries of “evil.” What I had already unconsciously noticed at the Dallas airport bar suddenly became clear to me: Bush’s speeches repeated a religious conversion narrative, but one that premised conversion on the casting out of evil. I was reminded immediately of the film The Exorcist, in which a couple of priests cast a demon out of a girl in order to save her soul. I concluded that Bush’s speeches were rhetorical exorcisms.

When I went on my job interview, I taught the students about genre, and then presented them with information about Spiritual Warfare, and in particular, about the casting out of demons. In charismatic literature, to cast out demons one has to name the demon, argue with it, battle with the demon, and then cast it out. I provided the students with a copy of Bush’s 2002 State of the Union address and asked them to locate the pattern of exorcism in the speech. They did, without much difficulty. At that moment I knew that I was on to something that others could see also if I just presented the argument in the right way. After my presentation, one of my colleagues urged me to write up the lecture as an article and publish it. So I did.

Writing criticism and teaching it, however, are two different things. I knew that the generic approach to criticism would not be enough, as more was going on in Bush’s speeches. It seemed to me that Bush’s speeches were not just patterned, but that the pattern was highly emotional and perhaps worked at levels we are not consciously aware of. So in writing up my criticism, I also decided to discuss myth (deep, culturally based patterns we all absorb as members of a community) and psychoanalysis (the idea that patterns repeat in our heads, not just out there in texts). The resulting rhetorical criticism was “eclectic” in its methodological approach: by combining close textual analysis with generic, mythic, and psychoanalytic criticism, I offered a reading of many of Bush’s speeches as an exorcism. Had I done a simple generic criticism, or not thought about how the speeches work emotionally, my reading of Bush’s rhetoric would have been much less insightful. It was only by “thinking outside of the box” and using the approach that best got at the dynamics of the speech that I could make a compelling case. To me, this is the key to eclectic criticism: you must let the object you are criticizing help you determine the method of analysis that you use.


July 23rd, 2008 by slewfoot

Music: Austin City Limits 2008 (PBS): My Morning Jacket/Deathcab FC

Writing for clarity and for an audience to which one is not accustomed is difficult. I also enjoy the challenge. I also enjoy jigsaw puzzles. I am completing a new, very difficult jigsaw puzzle on my coffee table this week, and probably the next couple of weeks too. It depicts a bunch of butterflies: one is blue; the rest are all variations of orange. 1000 pieces. Jesús has already discovered and chewed up two of those pieces. Completing the puzzle will thus lack the normal satisfaction of doing a puzzle, as two pieces will be missing. But I digress. On agency:


Agency is a concept that is generally understood as a capacity to act or cause change. The person who—or thing which—acts or causes change is termed an “agent.” In communication theory, agency is most commonly associated with people, as opposed to animals or things. To communicate, an agent must have the capacity, or agency, to do so. Consequently, most communication theories assume the existence of agency. Not all communication theories, however, require agency to be human in origin. Until the late twentieth century, agency was a relatively straightforward concept in communication studies. In light of human irrationality and evil in the past century, however, a number of scholars in the humanities have called many of our assumptions about human agency into question.

Terminological Confusion

The notion of an “agent” and the capacity of “agency” are often confused or conflated with closely related, but nevertheless distinct, concepts. Chief among them are “the subject,” a philosophical concept that refers to a typical or “paradigm,” self-conscious human being, and “subjectivity,” a concept that refers to the conscious awareness of oneself as a subject.

Originally, being a subject meant that one was ruled by, or under the legal control of, a king or prince, but gradually the term came to denote one’s status as a citizen beholden to the laws of a given government or nation state (e.g., “Josh is a subject of the United States”). In philosophical circles, however, the subject has come to denote a perceiving human being who is conscious of him or herself as a human being. In this philosophical sense, the subject is discussed in relation to “the object,” which refers to that which is perceived by the subject, or alternately, that which the subject knows he or she is not. The philosophical distinction between the subject and object as categories, however, is not stable and the meaning can change from one context to the next. In psychoanalysis, for example, the subject denotes a self-conscious person, but the object denotes another person whom the subject loves, hates, is ambivalent about, and so on (e.g., the infant subject loves the maternal object, mother).

A subject who self-consciously acts or causes change is said to possess agency. Hence, a subject with agency is an agent. An agent does not necessarily need to be a subject, however, nor does a subject necessarily possess agency. To complicate matters, agency is often confused with the term “subjectivity” as well. Whereas the subject denotes a self-conscious person, subjectivity refers to consciousness of one’s perceptions as an individual or discrete subject. Consciousness of oneself as a discrete individual (subjectivity) does not mean that one has agency or is an agent. Only an awareness of one’s ability or capacity to act imbues the subject with agency.

In sum: agency is the capacity to act; the agent is the source or location of agency; the subject is a self-conscious human being; and subjectivity is consciousness as a subject. All of these concepts are implicated in the idea of communication.

Agency and Modern Philosophy

Contemporary understandings of agency can be linked to eighteenth-century Western thought, often termed “the Enlightenment.” Although Enlightenment thought is not easily summarized, key among its goals was the use of reason to improve society and understand the natural world. In Enlightenment thought we find agency and the subject tied together in complex ways. For example, just prior to the Enlightenment the philosopher René Descartes reasoned that absent any knowledge or sensory perception whatsoever, an individual could know one thing: it thinks, therefore it exists (this argument is known as “the cogito“). Insofar as thinking is a type of action, this “it” that thinks is an “agent,” but it is not necessarily a “subject.” The it or agent that thinks is not a subject until it is conscious of itself as an agent who thinks (subjectivity). The Enlightenment thinker Immanuel Kant extended Descartes’ argument about this most basic kernel of knowledge: something exists that is thinking/acting, therefore agency and an agent exist. Yet self-conscious knowledge, he suggested, depends on exposure to the world outside our minds, or the empirical world. In other words, to be subjects we have to have sensory experience. Subjectivity, consequently, is wholly “in our heads,” but requires a confrontation with the external world. The resulting concept of the “transcendental subject” advanced by Kant consisted in the necessity of both a thinking thing independent of the outside world, and the necessity of that outside world to make the thinking thing conscious of itself (subjectivity). For Kant, all knowledge subsequent to the fact of self-existence is impossible without sensory experience. The meaning of the external world, however, is entirely dependent on the way in which the human mind works. This view implies that the paradigm, self-conscious human being, or subject, is destined to become an agent and, thus, harbors an incipient agency at birth.

After Kant, the concept of the subject emerged as the relatively stable notion of a self-conscious agent. Consequently, in the mature subject agency was understood as the ability to cause change or act by making choices. In other words, the subject was believed to have agency because he or she could cause change by choosing among alternative actions. Insofar as choosing was a key characteristic of the agency of the modern subject, Enlightenment thinkers associated agency with freedom, and by extension, individual autonomy: one became an autonomous subject by understanding and accepting his or her freedom, using reason to make choices.

Because of the influence of modern philosophy, agency became associated with self-transparency, self-knowledge, and rational choice-making. Because choice-making was understood as a component of human agency, today agency is often associated with matters of epistemology (how we come to knowledge), ethics (how we discern right from wrong), and politics (how to act collectively in the face of uncertain outcomes). In the social sciences, agency is also understood as a component of one’s self-perception as autonomous. Owing to these associations, in educational settings “giving agency” to students is often expressed as a goal of teaching: by working with students on their communication skills, it is thought, communication educators can help students to better realize their agency and become social, moral, and political actors in the public sphere and in private life.

The Posthumanist Critique of Agency

The Enlightenment view of agency and subjectivity is classically humanist, meaning that it is party to a larger perspective on the world termed humanism. In general, humanism is the view that human beings have a special status in the universe, a status that is superior to the supernatural or Divine on the one hand, and a status that cannot be reduced to scientific naturalism or biologism on the other. It is commonly assumed the “humanist subject” is an autonomous, self-transparent, fully conscious agent who acts rationally by making choices. In the nineteenth century, this view of the subject and agency was challenged by a number of thinkers. For example, Friedrich Wilhelm Nietzsche argued that humans were motivated by the “will to power” and made choices that were typically self-interested. Karl Marx argued that human choices were constrained by material circumstances and frequently animated by the interests of those in power (“ideology”). Sigmund Freud argued that the choices of human subjects were often irrational and motivated by unconscious desires. Together, the critiques of Enlightenment agency advanced by Nietzsche, Marx, and Freud laid the groundwork for what would come to be known as “posthumanism,” a view that would rigorously dispute human subjectivity as the seat of agency.

Although difficult to define, posthumanism is the idea that the human being is only one of many types of beings in the universe and, as such, has no special status or value (other than, of course, what human beings assign to themselves). More specifically, in the theoretical humanities posthumanism mounts a critique of the subject as self-transparent, autonomous, choice-making, and rational. Understandably, if the human subject is not characterized by these qualities, then the Enlightenment notion of human agency as rational choice-making is also questioned by posthumanism. Many twentieth-century thinkers associated with posthumanism, such as Judith Butler, Jacques Derrida, Michel Foucault, and Jacques Lacan, would not deny that human agency consists of choices; they would question, however, the extent to which such choices are conscious or reasoned, arguing that they are constrained by larger forces such as language, ideology, social norms, the threat of imminent death, and so on.

The frequent rationale for questioning the fully-conscious, rational, choice-making capacity of human subjects concerns world wars, torture, genocide, and other atrocities caused by human beings. For example, although it is unquestionably the case that many Nazi war criminals made conscious decisions to do evil, it is also the case that many Nazi sympathizers aided and abetted such evil without consciously doing so. In the latter instance, the status of agency in the conduct of evil is unclear. Furthermore, insofar as human reason can be used toward evil ends (e.g., the rationally calculated extermination of millions of Jewish people during the second world war), posthumanism questions the value once afforded to reason by Enlightenment thinkers.

Because the problem of evil poses complex questions about the character of agency without any clear answers, posthumanist thinkers prefer to leave the status of the human subject open, as if the concept of the subject is a question itself, never to be fully answered. Agency after the posthumanist critique in the theoretical humanities is thus disassociated from full consciousness, choice-making, freedom, and autonomy, becoming a term for the capacity to act. The agent, in turn, can be anything that causes change or action.

Agency in Rhetorical Studies

Owing to the Enlightenment legacy of agency, scholars who study persuasive speaking and writing (“rhetoric”) have traditionally taken the Enlightenment subject for granted. Since the days of Plato, Aristotle, and the ancient Greeks before the common era, rhetoricians have understood the persuasive process to involve a speaker or writer who develops his or her rhetoric by making conscious choices. A persuader would select a topic, then proceed to outline his or her essay or speech, selecting some arguments and ignoring others. The rhetor would choose the appropriate language and tone of her address, analyze her intended audience to help her adapt to their expectations, and so on. These assumptions about the persuader tend to presume a self-transparent, autonomous subject.

In the twentieth century, however, the influence of Nietzsche, Marx, and Freud on rhetorical studies began to shift focus from the agency of the rhetor to the active understanding of audiences (a psychological move). The work of Kenneth Burke was particularly influential in this regard. Burke argued that persuasion was not the result of arguments offered by a rhetor, but rather, the result of “identification,” or the ability of persuader and persuadee to understand each other as sharing a common identity in some fundamental way (“consubstantiality”). Diane Davis has even suggested that Burke’s redefinition of persuasion leads us to the domain of the unconscious and the possibility that persuasion is akin to hypnosis. If this is the case, then agency in a persuasive encounter is difficult to locate in any one individual, as it is a shared, unconscious, and dynamic relation between two or more people.

Because of the posthumanist critique of human subjectivity, one finds a variety of positions on the concept of agency among rhetorical scholars. There is no consensus among them about what agency means; some would even dispute this summary. Crudely, these positions can be reduced to three: (a) rhetoricians who continue to defend the Enlightenment subject and agency as conscious choice-making (humanistic agency); (b) rhetoricians who understand agency as a complex negotiation of conscious intent and structural limitation (dialectical agency); and (c) rhetoricians who narrowly define agency as a capacity to act, and the subject as an open question (posthumanist agency).

Agency in Social Science

Among social scientific scholars in communication studies, the concept of agency has been less controversial and the literature is decidedly larger in volume and scope. In various theories of communication from a scientific standpoint, agency is assumed to be the capacity to act and is usually associated with human subjects, as the preponderance of studies of communication concern humans. Owing to the centuries-old discussions in modern philosophy reviewed above, much of the work in social scientific communication theory associates agency with autonomy. More specifically, agency in communication theory can be traced to social scientific studies that investigate individuals’ self-perceptions of autonomy, control, and free choice with respect to a number of cognate concepts, including Piaget’s investigations of “agency,” Bandura’s studies of “self-efficacy,” and various explorations of “attributional” or “explanatory style.” These and similar studies, in turn, are indebted to classical investigations by Brehm on “reactance” and Goffman’s theory of “facework”: Brehm’s work investigated how subjects reacted to perceptions of constraint, and Goffman’s focused on the ways in which subjects tend to work to preserve perceptions of autonomy and respect for others.

Closely related to common understandings of agency in social science is the concept of “power,” and a number of studies in the area of social and management psychology have focused on how various power structures (social, cultural, economic, relational, and so on) influence one’s perception of agency and autonomy in interpersonal dynamics. This research overlaps with scholarship conducted in organizational communication studies. Because organizational environments often foreground a tension between the human subject and the housing institution, agency has been a fecund topic of discussion and debate: to what degree do organizational norms constrain the agency of the individual? To what degree do organizational structures empower an employee? Actor-Network Theory has been particularly influential among organizational scholars in answering these and related questions.

Finally, owing to the powerful role of non-human structures in organizations, it stands to reason that the agency of non-human things is an important dynamic worthy of study. Although the idea of non-human agency has been operative in the fields of linguistics and sociology for decades, it has only become a topic of concern in organizational communication studies in the twenty-first century. In this respect, François Cooren and others have argued that non-human agencies, especially what Cooren terms “textual agencies,” are crucial for understanding organizational cultures.

See also

Actor-Network Theory, Axiology, Facework Theories, Ideology, Postmodern Theory, Poststructuralism, Power and Power Relations, Relational Control Theory, Spectatorship

Further Readings

Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman.

Best, S. & Kellner, D. (1991). Postmodern theory: Critical interrogations. New York: Guilford Press.

Brehm, J. W. (1966). A theory of psychological reactance. New York: Academic Press.

Burke, K. (1969). A rhetoric of motives. Berkeley: University of California Press.

Buchanan, G.M. & Seligman, M.E.P. (1997). Explanatory style. New Jersey: Lawrence Erlbaum Associates.

Butler, J. (1999). Gender trouble: Feminism and the subversion of identity (2nd. ed.). New York: Routledge.

Campbell, K. K. (2005). Agency: Promiscuous and protean. Communication and Critical/Cultural Studies, 2, 1-19.

Castor, T. & Cooren, F. (2005). The role of human and non-human agents in problem-formulation. Paper presented at the annual meeting of the International Communication Association, New York, NY.

Cooren, F. (2006). Between semiotics and pragmatics: Opening language studies to textual agency. Paper presented at the annual meeting of the International Communication Association, Dresden, Germany.

Davis, D. (2008). Identification: Burke and Freud on who you are. Rhetoric Society Quarterly, 38, 132-147.

Geisler, C. (2004). ‘How ought we to understand the concept of rhetorical agency?’ Report from the ARS. Rhetoric Society Quarterly, 34, 9-17.

Gruenfeld, D.H., Inesi, M.E., Magee, J.C. & Galinsky, A.D. (2008). Power and objectification of social targets. Journal of Personality & Social Psychology, 95, 111-127.

Law, J. & Hassard, J. (Eds.) (1999). Actor network theory and after. Malden, MA: Blackwell.

Lundberg, C. & Gunn, J. (2005). ‘Ouija board, are there any communications?’ Agency, ontotheology, and the death of the humanist subject, or, continuing the ARS conversation. Rhetoric Society Quarterly, 35, 83-105.

Putnam, L. L. & Pacanowsky, M.E. (1983). Communication and organizations: An interpretive approach. Newbury Park, CA: Sage.

Rotter, J. (1966). Generalized expectancies for internal versus external control of reinforcement. Psychological Monographs, 80, 1-28.

Thibault, P. J. (2006). Agency and consciousness in discourse: Self-other dynamics as a complex system. New York: Continuum.

Weick, M. & Guinote, A. (2008). When subjective experiences matter: Power increases reliance on the ease of retrieval. Journal of Personality & Social Psychology, 94, 956-970.

Williams, C. (2001). Contemporary French philosophy: Modernity and the persistence of the Subject. New York: Continuum.


July 19th, 2008 by slewfoot

Music: Steve Roach: Early Man (2001)

A draft of an encyclopedia entry written with undergraduates in mind:


Ideology is a concept that refers to the collective beliefs, attitudes, and values of a given group of people, from social cliques and small communities to an audience or an entire nation. Although ideologies can be positive, most scholars who study or critique them focus on those that cause harm or suffering. For example, in Western societies the ideology of “individualism” is believed to be positive, while the ideology that promotes the idea that men are superior to women, “sexism,” is believed to be negative. Consequently, sexism is studied and critiqued more heavily than individualism, although both ideologies are operative in the United States. In general, it is believed that ideologies work largely unconsciously and tend to promote the status quo, usually by supporting those individuals who are in power. Although the concept derives from the materialist theories of Karl Marx, the use of ideology is not limited to materialist contexts. Today, the notion of “ideology” is widely assumed and referenced in a variety of communicative contexts.

Marxist Origins

Now commonly assumed in communication scholarship, Marx’s main philosophical argument is that the way the world is materially arranged determines how we think about it. Until his philosophy, it was widely assumed that society as we know it is the product of human ingenuity: a group of individuals got together and dreamed up the way society should look and function, and then went about making society in conformity with that dream. If this were truly the case, suggested Marx, then why hasn’t utopian thinking brought about a better world? When Marx was working out his philosophy in the mid- to late nineteenth century, he witnessed an increasingly prosperous class of people (capitalists) exploiting poorer people for profit. Factories were inhumane and people—sometimes even children—worked long hours for a meager wage. Despite the increasing successes and growing wealth of the individuals who owned the factories, their workers were getting poorer, even dying. Observing how willingly the working class accepted their poor conditions, Marx concluded something was wrong; thought had become “inverted” or turned upside down from what it should be. Ideology was the concept that Marx developed to help explain how this inverted thought came about.

Although it is true that one must imagine and then create a blueprint for a building before it is built, Marx argued that the ideas behind the blueprint were actually influenced by material conditions including: (a) what resources were available for building; (b) who owned the resources for building; (c) what class of individuals was ruling society; and so on. Marx argued, in other words, that the building imagined by an architect and then subsequently built would reflect the way the world was materially arranged at the time, ultimately serving the interests of those in power (e.g., those who owned the resources and means for making things).

Analogously, Marx argued that state governments tend to support the material and political interests of a dominant group of people (the “ruling class”). For example, it is often taught in American schools that the “founding fathers” of the United States of America gathered together at the Philadelphia Convention in 1787 and invented the current government system, which is designed to serve “the people.” A Marxist perspective, however, would emphasize that the government structure created at this convention only reinforced and stabilized the status quo: to this day, the government created by the founding fathers continues to support the most empowered in American society, who are wealthy white men. In sum, Marx reversed the way we think about thinking: it’s not that we dream up a better world and then create it; rather, it is that the material, concrete world pre-exists us, and that whatever we create will conform to the constraints of this preexisting, material world. This view is known as “materialism.”

What, then, continues to maintain the existing material arrangement of society? Why do governments continue to support those in power? Even though technology is constantly changing our material and communicative interactions, why does it seem the same group of people always continues to benefit? In other words, despite obvious, dynamic change, why do political and state structures seem to stay the same? Marx’s answer is “ideology.” For him, ideology was fundamentally an inversion of the materialist view. If materialism is the idea that the concrete arrangement of the world influences how we think about it, then ideology is the inverse notion that thought changes material reality. For Marx, then, ideology referred to something negative. Fundamentally, ideology referred to a kind of “inverted consciousness” that is incapable of seeing the fundamental contradictions of material reality that might lead to radical change. An individual under the sway of ideology, for example, believes that social class (e.g., rich and poor) is a natural arrangement and not the product of oppression and force. Because ideology is so powerful, argued Marx, only a violent, material disruption could change how we think about the world: revolution.

Positive and Neutral Ideology After Marx

After Marx’s death in 1883 the concept of ideology expanded to include new meanings, some of which were positive. Vladimir Lenin was most influential in shifting the negative connotation of ideology toward a more neutral connotation. If Marxism mounted a critique of the status quo and its commonly held beliefs, attitudes, and values as an inversion of material conditions, then such a critique must be coming from an alternative position with its own beliefs, attitudes, and values. In other words, Marxism is itself an ideology. Consequently, Lenin argued that ideology must be understood as the political consciousness of a given group of people, most especially that of an economic and social class. After Lenin, the concept of ideology became “neutral” when it was understood that the working class, whom Marxism champions, was ideologically opposed to the capitalist ideology of those in power, the wealthy ruling class.

After Lenin, the most influential thinker of ideology was Antonio Gramsci, who further expanded the concept to denote a set of representations or mental images of reality that is gleaned from a given culture’s legal and economic systems, as well as art and other forms of community expression. For Gramsci, this concept of the world also included codes for social behavior and action. Consequently, if a given group’s ideology was pervasive, then that group had “hegemony,” or a tacit, largely unconscious control over social behaviors, forms of art, economics, and the law. If a group’s ideology has hegemony, then their beliefs, attitudes, and values seem natural and like common sense. Like Lenin, Gramsci believed ideology was neutral and governed the political consciousness of a given group; however, he argued that a group’s ideology only achieves hegemony over others through contest and struggle. Gramsci argued, moreover, that hegemony is increasingly achieved without direct force or coercion, and often with the unwitting help of intellectuals.

Althusser and Ideology

Perhaps the most recent and influential thinker on ideology today is Louis Althusser, a French Marxist who, Jorge Larrain has argued, sought to reconcile the negative and neutral understandings of ideology. Although Althusser would agree with Gramsci that ideology is struggled over, he expanded the concept further by adding a psychological dimension: ideology concerns the imagined relationship that individuals harbor about their real, material conditions. In other words, ideology concerns how a given person thinks about his or her relationship to the “real world.” Althusser argues that we have to understand ideology as a kind of necessary illusion, which we borrow from the world outside to make sense of our identity and purpose in life. No one of us, suggests Althusser, has direct access to the real, material world; our relationship to the world is filtered through and by representations (at the very least, by language itself). Ideology is the main source of those representations. Consequently, some of us grow up and reckon with our real conditions of existence as Chinese citizens, while others of us contend with material reality as evangelical Baptists from the Southern United States. In this respect, for Althusser ideology is unavoidable and necessary because it is the very basis of identity itself.

Althusser’s contribution to the concept of ideology cannot be overestimated, for it underlies a relatively recent theoretical movement, “post-Marxism,” that has had a strong impact on communication theory. For Althusser, one needs to incorporate an ideology to become a self-conscious person. If I am a Marxist, for example, then I know material conditions directly influence what is thinkable, that my purpose in life is to uphold the ideology of the working class, and so on. If I am a Christian, then I know material reality is but an illusion of a greater, spiritual reality, that Jesus will return to earth again, and so on. In either case, ideology gives me a sense of who I am and what my relationship to the “real world” is about. Absent ideology, I cannot “know” who I am. Hence, every communicative encounter with another person is in some sense an ideological negotiation.

Another important element of Althusser’s understanding of ideology is that it is diffuse and dynamic. For an individual to assume a set of beliefs, attitudes, and values about, say, the importance of capitalism, he or she must be confronted by them in multiple venues. A given ideology is not promoted by one person or even a class of persons, but rather by multiple agencies working simultaneously and in concert: the mass media; the educational system; economic and legal structures; the family; and so on. For example, let us use the ideology of individualism, which consists in the belief, attitude, and value that every person is unique and should take personal responsibility for his or her destiny.

One is not born to value individuality, but learns it through multiple agencies over a long period of time. As a youngster one is told about one’s unique and special character by one’s parents; the family teaches individualism. At the church, synagogue, or mosque one is taught that Deity has a unique plan for her life; religion teaches individualism. On television, talk show hosts tout the virtues of individual achievement and personal responsibility; the media teaches individualism. At school, one is given her own desk, told to bring her own materials to class, and is cautioned that she should keep her eyes on her own paper, because one’s grade is determined by singular, individual effort; school teaches individualism. In this way, different agencies—the family, the media, the education system—work to instill and reinforce the ideology of individualism. Borrowing a concept from psychoanalysis, Althusser terms the way in which multiple sources perpetuate a given ideology “overdetermination.”

The Concept of Ideology Today

Since Althusser’s attempt at compromise, the concept of ideology has been freed of its Marxist origins. Absent the materialist tie, the concept of ideology differs from one context to the next. In the popular media, ideology is frequently used as a synonym for one’s political orientation. In academic work, however, the concept of ideology is associated with scholars who conduct criticism or critique culture (e.g., the mass media). Generally, it remains the case that those who study ideology are interested, as Terry Eagleton has remarked, “in the ways in which people may come to invest in their own unhappiness.” Although some ideologies—for example, the Christian ideology of loving one’s enemy—can promote good things, in general scholars are interested in the ways in which ideology can harm and oppress people, and often without their noticing it.

Owing to the psychological turn of Althusser and the more popular work of mass movement scholars such as Eric Hoffer, however, in the last half-century ideology has taken on the connotations of political brainwashing. Unwilling to believe that individuals are “dupes” of ideology, many scholars abandoned the concept. Coupled with what is sometimes termed the “poststructural turn” in theoretical debates of the late twentieth century, this negative connotation has also led some scholars to call for abandoning the concept because it is self-defeating. In a charge that recalls Lenin’s reworking of Marx’s negative conception, some critics argue that ideology critique presumes a privileged vantage external to ideology for the critique to be possible. Such a presumption is, in fact, ideological itself and, consequently, any claim to discern hidden or obscured contradictions is itself an ideological ruse. Instead, critics of ideology have argued for abandoning the concept in favor of Michel Foucault’s conception of “discourse” or “power/knowledge.” Contemporary defenders of ideology and ideological critique frequently counter by returning to Lenin or Gramsci’s more complicated notions of ideology as reflecting a deeper, material contradiction or antagonism, or by arguing that the abandonment of ideology critique is motivated by an investment in the status quo.

More recently, Slavoj Zizek has defended the utility of ideology for scholarship by offering a Leninist re-reading of the concept. He suggests that an ideology can only be known in contrast to a competing ideology. Insofar as ideology denotes the collective beliefs, attitudes, and values of a given group of people, one cannot become conscious of another set of beliefs, values, and attitudes unless there is a conflict between the sets. Consequently, ideology critique is not self-defeating, but rather self-interested.

Ideology in Communication Studies

In the field of Communication Studies, ideology is most frequently encountered as a category in rhetorical criticism and organizational communication studies. For rhetorical scholars, “ideological criticism” is a form of scholarship in which “texts” are closely scrutinized in order to uncover the hidden beliefs, attitudes, and values promoted by and/or influencing them. One popular method among rhetoricians for studying ideology across different texts is known as “ideographic criticism.” This method traces a singular term, or “ideograph,” across multiple texts; the ideograph is itself symptomatic of a much larger, external set of beliefs, attitudes, and values. In organizational communication studies, ideology has been studied among organizations in order to show how one ideology becomes dominant or hegemonic, influencing organizational cultures. Finally, symbolic convergence theory straddles both social scientific and rhetorical approaches to communication by tracking ideology in terms of “fantasy themes” or “visions” that are created and exchanged among small groups of people working toward a common goal.

See also

Critical Organizational Communication, Critical Rhetoric, Cultural Studies, Ideological Rhetoric, Marxist Theory, Materiality of Discourse, Rhetorical Theory, Symbolic Convergence Theory

Further Readings

Althusser, L. (1971). Lenin and philosophy and other essays (B. Brewster, Trans.). New York: Monthly Review Press.

Bormann, E.G. (2001). The force of fantasy: Restoring the American dream (2nd ed.). Carbondale: Southern Illinois University Press.

Eagleton, T. (1991). Ideology: An introduction. New York: Verso.

Gramsci, A. (1971). Selections from the prison notebooks. (Q. Hoare & G. N. Smith, Trans. & Eds.). New York: International Publishers.

Gunn, J. & Treat, S. (2005). Zombie trouble: A propaedeutic on ideological subjectification and the unconscious. Quarterly Journal of Speech, 91, 144-174.

Hoffer, E. (1951). The true believer: Thoughts on the nature of mass movements. New York: Harper & Row.

Larrain, J. (1979). The concept of ideology. Athens: University of Georgia Press.

Larrain, J. (1983). Marxism and ideology. London: Macmillan.

McGee, M.C. (1980). ‘The ideograph’: A link between rhetoric and ideology. Quarterly Journal of Speech, 66, 1-16.

Mumby, D.K. (1988). Communication and power in organizations: Discourse, ideology, domination. Norwood, NJ: Ablex.

Sholle, D.J. (1988). Critical studies: From the theory of ideology to power/knowledge. Critical Studies in Mass Communication, 5, 16-41.

Wander, P. (1983). The ideological turn in modern criticism. Central States Speech Journal, 34, 1-18.

Zizek, S. (1995). Introduction. In S. Zizek (Ed.), Mapping ideology (pp. 1-33). New York: Verso.

death of the movie star

July 18th, 2008 by slewfoot

Music: Let’s Active: Big Plans for Everybody (1986)

This week a network morning show aired a segment titled “death of the movie star,” in which the narrating journalist bemoaned the end of the traditional (or “classical”) Hollywood system. Perhaps as little as two decades ago, a big-name “star” could anchor a film and almost guarantee a positive return. Not so today, said the journalist, as she listed off a series of big names attached to recent Hollywood flops . . . but with one exception, of course: Angelina Jolie’s role in Wanted is proving the star is not dead yet! Of course, the story was on NBC’s Today and Wanted is produced by Universal, and Universal owns NBC. So what explains this strange contradiction? Why would a media company sound the death knell for the star system at the very same time as it attempts to fetishize a star? Thinkers associated with the Frankfurt School have an answer.

The classical Hollywood system—which took east coast theatre norms and subjected them to industrial standardization—eventually learned that it could not rely (at least initially) on the charisma of the stage star. “In the case of film,” explains Walter Benjamin, “the fact that the actor represents someone else before the audience matters much less than the fact that he represents himself before the apparatus.” By “apparatus” Benjamin means the set, the camera, and the director—an industrial set-up, as it were. This evaporates the mystique of the actor, argues Benjamin, because the actor is not making a direct connection with audiences and adapting his or her performance to them. Instead, the actor must imagine she is performing for “the masses” and must accept a kind of “self-alienation.” The actor, in other words, relinquishes his or her connection to the performance to the camera, as well as his or her immediate connection to an audience. “While he stands before the apparatus,” continues Benjamin, “he knows that in the end he is confronting the masses. It is they who will control him. Those who are not visible, not present while he executes his performance, are precisely those who control it. This invisibility heightens the authority of their control.” If this is hard to imagine, one can remember the Buggles’ “Video Killed the Radio Star,” which is premised on a similar logic: inscription in the field of vision and in the age of technical reproducibility forces a deeper self-alienation for the musician. Now he or she must “look good” for the “masses.” New Wave artists understood this, which gave rise to their phenomenal success.

The problem with Benjamin’s account, of course, is that “the masses,” like “the filmic audience,” don’t really exist. These are projections of . . . the Hollywood system. Certainly box-office draw is a measure of something—something like “the mass interest”—but this ignores the fairly limited band of choice the “masses” have with Hollywood film: Dark Knight or Hellboy II? Nevertheless, Benjamin suggests that the loss of “aura” once possessed by the theatre star is supplanted by something else, “cult value”—the value of circulation and transaction, the ability to use the shot of a film actor as a “mirror image” that can be transported—just as much as the supposed power of “the masses” (which Benjamin suggests is the camera obscura of Hitler’s rhetoric). The “cult of the movie star” helps to preserve “that magic of the personality that has long been no more than the putrid magic of its [film’s] own commodity character . . . .” What is this “cult value?” We know it today as celebrity.

Benjamin made these observations in 1936, a decade after Rudy Valentino’s untimely death caused something of a hysterical outbreak in the States, and just three years before Gone With the Wind and The Wizard of Oz would wow audiences with the magic of Technicolor, Gable, and Garland. Yet his observations were prescient, as the inevitable result—the abandonment of the cult of the star by Hollywood—would take nearly sixty years to happen. Far more insidious than the gradual disclosure of the star as just another piece of the Hollywood machine is the displacement of the actor’s control to “the audience,” which means the system itself. Benjamin argued this shift thins-out or evaporates altogether the spirit of political change (“revolutionary spirit”) and replaces it with the ideology of capitalism. Adorno is even more lucid:

Whereas today in material production the mechanism of supply and demand is disintegrating, in the superstructure it still operates as a check in the rulers’ favour. The consumers are the workers and employees, the farmers and lower middle class. Capitalist production so confines them, body and soul, that they fall helpless victims to what is offered them. As naturally as the ruled always took the morality imposed upon them more seriously than did the rulers themselves, the deceived masses are today captivated by the myth of success even more than the successful are. Immovably, they insist on the very ideology which enslaves them. The misplaced love of the common people for the wrong which is done them is a greater force than the cunning of the authorities. It is stronger even than the rigorism of the Hays Office [a censorship bureau], just as in certain great times in history it has inflamed greater forces that were turned against it, namely, the terror of the tribunals. It calls for Mickey Rooney in preference to the tragic Garbo, for Donald Duck instead of Betty Boop. The industry submits to the vote which it has itself inspired. What is a loss for the firm which cannot fully exploit a contract with a declining star is a legitimate expense for the system as a whole. By craftily sanctioning the demand for rubbish it inaugurates total harmony. The connoisseur and the expert are despised for their pretentious claim to know better than the others, even though culture is democratic and distributes its privileges to all. In view of the ideological truce, the conformism of the buyers and the effrontery of the producers who supply them prevail. The result is a constant reproduction of the same thing.

In other words, like all organisms, the Hollywood system is self-correcting with respect to its “product.” A century of self-correction has led to the “constant reproduction of the same thing,” not simply one tired plot after the next—but the same star.

The standardization (and thus liquidation) of stardom is perhaps no more conspicuous than with VH-1’s Behind the Music show, in which the following career path is usually narrated: (a) star starts small and gets his/her big break; (b) everything’s coming up roses; (c) just at the apex of making it big, tragedy ensues; (d) star almost loses it all, or does in fact lose it all (Tawny, no?). If there were a Hollywood version for the movie star (whatever happened to Debra Winger? Sean Young?) we’d likely see a similar script. At least, however, Winger and Young are strong, independent individuals in “real life” no matter how their careers are scripted. What’s frightening today is that, insofar as “real life is becoming indistinguishable from the movies,” as Adorno argued in 1944, spectators and stars alike can no longer distinguish between media-induced fantasy and “reality”—as they are, to a very real extent, one and the same (cf. Britney Spears; Baudrillard).

So what’s the story, here? I’ve been suggesting that stardom has been under dynamic transformation since the 1920s: as its cult value strengthened, its “content”—the control, individuality, uniqueness, and politics of the actor—has weakened gradually until only cult value was left. In postmodernity, pure cult value is perhaps best exemplified by Paris Hilton, a celebrity known for her circulation (in the economy of women, then on the Internet circuit, and finally, in the celebrity circuit buoyed by tabloid photographers). Movie stars are just as interchangeable. Thus the utter ridiculousness of Universal’s proclamation that somehow Angelina Jolie has escaped the evaporation of the “movie star” in their film Wanted. In a qualified sense, she’s not a movie star; she’s a baby-crazed, child-adopting celebrity who is married to another celebrity and who happens to make “movies.” Let’s not call Wanted a “film,” either.

cramped blog

July 16th, 2008 by slewfoot

Music: Sam Sparro: self-titled (2008)

Yesterday I received a new embosser that says “Library of Dr. Juice,” and I just went nuts branding in the library. I now have a seriously sore right hand, which makes blogging—and other fun things one does with the right hand—difficult. Consequently, tonight’s entry is a love of labor, writing through the pain, like Peaches does when she’s . . . you know. But I don’t have the stamina of Peaches. So I’ll keep this brief.

Enter Righeira, the phenomenal Italian synthpop group who scored it big in Europe with their most excellent “Let’s Go to the Beach.” Check out how Stefano Rota and Stefano Righi move to the beat and sing into their wrist-watch televisions:

Of course, that hit didn’t make it too far over here stateside. The real winner was “I ain’t got no money,” which, of course, Timbaland totally plagiarized for “The Way I Are”:

That 80s-hop-back-and-forth dance is most super, and brings me back to, um, elementary school. I danced that way too! . . . with the Electric Boogaloo. Still do, in fact. Woohoo! And fortunately, as my buddy Mirko alerted me in a timely email, Righeira are back to (synth)rock our world! Check out this hot shit! (note the PS3 around Stefano’s neck, a nice nod to the “Let’s Go to the Beach” video):

Aieeee, my hand hurts from typing. Must rest.

jesús wubs yew

July 13th, 2008 by slewfoot

Music: The Beautiful South: Paint it Red (2000)

losing a religion that was almost mine

July 11th, 2008 by slewfoot

Music: Van Morrison: Veedon Fleece (1974)

The Jesse Jackson flap yesterday finally motivated me to post something of substance—or toward substance abuse (hello, my friend tequila!). As anyone remotely close to a screen will know, yesterday the news broke that Jesse Jackson was “talkin’ trash” about Obama on FOX. Apparently unaware his microphone was hot, Jackson said to a colleague that Obama has “been talking down to black people” and that he wanted to “cut his nuts off.” These comments have circulated widely because, presumably, they demonstrate division among blacks about Obama. The underlying warrant here is that all black people, especially black politicians, think alike and stand in solidarity. The news also created an opportunity for Obama supporters to spin this as good news: white people don’t like Jackson, therefore, this is a nice distancing moment that will draw more whities toward the Big O.

It’s a shame, however, that Jackson’s “point” (pun intended) was eclipsed by his countless apologies. Jackson is angry with Obama for amplifying his “personal responsibility” rhetoric in recent weeks, instead of focusing on larger, structural issues, like “racial justice and urban policy and jobs and health care.” Obama has apparently been speaking on parental responsibility for years, and has a fairly standard line on absent fathers (I agree absent fathers are a problem; I would simply disagree that said fathers must be male). But Jesse’s “loving criticism” was aimed at the way in which Obama has intoned a therapeutic, Horatio Alger-style—or Oprah-style, take your pick—rhetoric that downplays the social-cultural and material causes of social ills—and the deeper reasons for single-parent households (which, let’s face it, are not the province of African Americans, but all Americans). Obama, in other words, has amplified his personal responsibility rhetoric, moving to the right, for votes. Having spent his life working toward structural change, the kind of change that one cannot create by oneself, Jackson worries that by going post-race, Obama might turn into just another white-guy, neo-liberal president.

If you’ve read my blog for any expanse of time—and in particular, my thoughts about how Obama threw Rev. Wright and African American rhetorical traditions under the bus—you’ll know why I am similarly sympathetic to Jesse Jackson. I am an Obama supporter and my vote will be for him, unquestionably, but I too am disappointed with Obama’s rhetorical drift to the right. Yes, it is true he has always been more “centrist” or “conservative” than people realize, but his votes on issues have been fairly progressive and his rhetoric has seemed to build on a Left-style romantic idiom that signified allegiance with those civil rights leaders of the past who worked toward structural change. What we’re witnessing in Obama’s rhetoric, in other words, is a retreat from the preacherly persona and civil religion style. I don’t have time this afternoon to present snippets of text, but it’s there: the move has been from a flirtatious, religious crooning toward an issues-focused, personal-responsibility “blame the individual” type rhetoric.

For a very brief moment I was about to let myself go into Obamania, but my cynical reserve and distaste for political kitsch kept me on the shores of Nader (not that I would ever vote for Nader; I just wish there were a true third or fourth party in this political life). Obama’s rhetorical drift to the right in recent weeks (not so much his stand on policy, which seems consistent) troubles me. Although Jackson is ridiculously unpopular, I’d vote for him over Obama.

That said, yes yes, I noticed that Jackson threatened castration. Of course, from a psychoanalytic vantage castration is the power of the father, what the child fears. Castration represents one’s entrance into self-consciousness and the symbolic world. He who claims the power of castration claims the agency of language. I’ve seen Jackson speak in person twice, and both times he whipped me into a frenzy. The man is an amazing rhetor. By claiming to want to cut Obama’s nuts off, Jackson is threatening to remove Obama’s rhetorical power, to muffle his speech. The motive for wanting to do so is obvious: Jackson dislikes Obama’s rhetoric.

The problem with such a sentiment, aside from its meanness, is simply that no one has the phallus. Jackson is deluded if he believes he has it, just as deluded as Obama who believes the problem among African Americans is that fathers don’t claim it. Hence, the irony of Jackson’s statement.

So I say to both Jackson and Obama: It’s the structure, stupid.

good weekend

July 6th, 2008 by slewfoot

Music: The Walkmen: Everyone Who Pretended to Like Me Is Gone (2002)

Nothing is more cheering than having five people you love descend on your home for a holiday. Family is great, don’t get me wrong, but you get to pick your friends (and they you), and when your house is full of people you picked, things tend to get cheerful. We had a great time. Thursday night Shaun and Emily arrived, and we gabbed into the night over Etouffée, Mean-Ass Joshritas, and board games. Yogita, Christopher, and Tracy arrived on Friday, and we made our way to Hays County for a slammin’ Fourth of July party (replete with live music . . . you can’t beat that!).

Yesterday I worked a bit as my friends toured Austin, then we had a nice time at Manuel’s and then, later, one of my favorite hangs, the Carousel. Today Shaun, Emily, and I toured some Austin, drove out to the Salt Lick for some BBQ, then we hit the Hindu Temple In-the-Middle-of-Nowhere-Texas.

I owe others writing; ’twas hard to get done with guests—so it’s back to the grind tomorrow. But I know what should take priority: guests. Seriously. Academic rule number Positively 4th Street: once you’re making a living, buddies come before work. Amen. So mote it be.

Gallery of our good time is here.


July 4th, 2008 by slewfoot

Music: Ulrich Schnauss: Goodbye (2007)

Shaun and Emily, Independence Day guests, are on their way to Toy Joy as I take a break to catch up on some computing this morning. This afternoon we’re driving out to Hays County for a backyard Fourth of July bash, which will feature not one, not two, but three funk bands (the new Jack in the Box commercial, “Don’t Stop the Funk,” keeps playing in my head). If I don’t hear a cover of the Ohio Players’ “Fire” I’m gonna be mighty funkapointed.

Anyway, taking out the trash this morning I noticed a bumper sticker on Shaun’s car: “I pledge allegiance, not blind obedience.” The sentiment is apt today, a time when it is relatively unfashionable to be patriotic as an academic. Same as it ever was, frankly, only on this side of Nine-eleven even more so. Thinking about this bumper-sticker defiance also helped me to remember a short essay by Richard Rorty from 1994, “The Unpatriotic Academy.”

Rorty’s argument was advanced during the last (serious) gasp of identity politics as a viable scholarly approach to social change. At first blush he is dismissing so-called “multiculturalism” as antipatriotic and promoting sectarian division. But his argument is much more complicated: “a nation cannot reform itself unless it takes pride in itself—unless it has an identity, rejoices in it, reflects upon it, and tries to live up to it.” In other words, identity is important, both particular and collective; “a sense of shared identity is not an evil. It is an absolutely essential component of citizenship, of any attempt to take our country and its problems seriously.” Assembling under the idea of a nation, in other words, is the precondition of social change. You and I cannot call for a better world, for ending hunger, for bringing our troops home unless we give a shit about being “American.” Rorty is taking identity politics at its absolute word and saying: right on, let’s get down with “the people” too, that constituted body that is composed of me in my generic sense, divested of my particulars.

The risk of patriotism is blind nationalism. The patriot is critical. The patriot demands independence from the tyranny of colonialism. The patriots said, “screw you, Britain, we’re American.” Nationalism, on the other hand, is a diluted patriotism, patriotism without critical reasoning. Patriotism requires shooting yourself in the foot every now and again; it’s not easy to be a true patriot (to say, for example, that you support the troops while you in no way support the president, and so on). Nationalism, like suicide, is painless.

I consider myself a patriot, and I think part of my job—the teaching part, to be more precise—is in a sense a patriotic duty. Teaching others (and always myself) to think more critically, to ask more from our news sources, to participate in deliberation outside the classroom, is an attempt to create the conditions of social change. Rorty speaks of figures to be proud of: Ralph Waldo Emerson and Martin Luther King. These people we claim as “ours.” But more important are the values that these figures promoted: equality for all, justice, the pursuit of happiness. Even if these ideals attract a lot of lip service, because we intone them relentlessly we get to call people on the carpet who are intolerant or who strive to make others unequal, we get to call out those who would disenfranchise others. I’m proud to be able to do that and to shamelessly teach that. You cannot call out all the evildoers all the time (sometimes you need to leave a foot un-shot), but even so, we live in a country where one can speak to and confront power. You cannot do that everywhere in the world.

Of course, the university trends toward business. My courses are focused on higher-level critical thinking, while the more basic courses of my department, of my field, are often geared toward the marketplace. And perhaps this is why it is hard to declare one is a patriot in the academy today: capitalism globalizes, it is a giant blob that erodes the boundaries of the national imaginary. Today we think of ourselves in respect to global forces, no longer a sovereign people, but a cynical body of standing stock to be toyed with by global companies in search of cheap labor. Capitalism in the academy makes every class into workforce preparedness; don’t think, just make—just speak. Lose the “um” and “like,” kid, and remember, in a boardroom you only put three ideas per PowerPoint slide!

So when this nation-less, workforce bias is the backdrop of our undergraduates’ college experiences—this workforce preparedness training (and standardized testing for “accountability” isn’t far behind, you just wait)—are we surprised to see apathetic, bleary-eyed students who can’t even say the pledge of allegiance? Students who, rather, spout off talking-point scripts from their favorite cable news television channel whenever politics is discussed in the classroom? Patriotism has become an unthinking nationalism, and so we educators start to think patriotism is evil. Patriotism has become a talking point, a push-button-click-click stump speech. Patriotism has been claimed by the talk-show right in the name of nationalist evil (the FOX flap over the Obamas’ fist-bump gesture comes to mind).

Postmodernity poses quite a pickle for patriotism: as our mediated world continues to globalize, as media companies supplant resource companies as the true power-brokers in the world, and as our universities are increasingly absorbed by their business schools and athletics programs (viz., media outlets), an engaged, deliberative citizenry is no more because Mr. Smith Goes to Washington no longer informs our national fantasy. The image of citizenship has shifted from a good person speaking well in public to an 18-to-24-year-old texting a vote for American Idol.

Folks have been saying this for decades, of course. But when we think about patriotism and the nation state, we’re dealing with an imaginary or fantasy structure. That structure is held in place by an infrastructure that is (Baudrillard be damned) increasingly virtual, spun, floating . . . . As a consequence, what it means to be an “American” has been reduced to whether you watch FOX or CNN. To be a “patriot” you have to wholly subscribe to the two-dimensional political candidates who reduce “the Nation” to a shadow puppet play on the wall of a cave.

I’m tired of non-academics and media pundits assuming that because I am a professor and decidedly left of center I am somehow unpatriotic. Not true: I pledge allegiance; I combat blind obedience. Timothy Leary was a patriot; he taught me to question authority, but he didn’t teach me to abandon a sense of national community (in fact, he relied on precisely that to get his message out). I am a patriot: I celebrate the Fourth of July because, for me, this day represents my right to speak freely and support the equality of everyone (not just legal, but social too). This day also represents my right to be critical of the implosion of politics and popular culture. As a patriot, I also claim the right to party, to grill and smoke meats, and to blow shit up with firecrackers.