Activism

Empathy and Education: Fight or Flight

“A good teacher will lead the horse to water; an excellent teacher will make the horse thirsty first.” – Mario Cortes

Inside the academic classroom, we instructors face a number of pedagogical challenges, ranging from constant apprehension regarding proper time management to confusion over how best to incorporate new media technologies into diverse lesson plans. If the many demands of our profession may be encompassed by so simple a maxim, a good portion of the effort spent leading our students toward the proverbial well of knowledge involves acknowledging the limits of our ability to engage, and of our students’ ability to stay engaged.

Try as we might to liven up lectures on nineteenth-century textual portrayals of class and gender struggles, or lead animated discussion on symbolic content and elements of stylistic form, just to name a couple of personal examples, the passion of an instructor may not always yield a similar investment from those they teach. Here, the learning curve inherent in pedagogy applies to us as well. We acknowledge that students may have chosen to take our course for the purpose of filling out credit hours, anticipate the potential difficulties of teaching the disinterested, and yet do our best to construct inclusive syllabi, encourage open discussion, and foster an environment defined by dialectical learning.

Even in the face of such apathy, within the classroom setting, an instructor retains the authority to insist on certain standards of behavior. Students are expected to pay attention to the material, whatever their personal enthusiasm for the subject (or lack thereof), and often must display their acquired knowledge through active participation.

Outside of the classroom, however, the authority to instruct has always been a tenuous thing at best, undercut by the style of one’s delivery, the power of one’s rhetoric, and the ongoing struggle to make one’s voice heard at all. There are no quantitative grades to earn in what so many have termed the “real world” outside of academic institutions; no controlled learning environment in which anyone is obligated to respect the notion of a “safe space,” and certainly no imperative to engage in critical discussion or any measure of empathetic self-reflection.

Moreover, in the wake of the U.S. Presidential election, the anti-intellectual impulse now seems to be morphing into a frightening American norm. Never mind leading horses to water – in a “post-truth” world, if words aren’t enough, what is left?

Caption: “This Is Fine”

Artist: K.C. Green, 2013 Source: Gunshowcomic.com

Empathy, many say. Following a seemingly never-ending election season distinguished early on by threatening speech, stunningly vitriolic ideological premises, and outlandish promises now turned very real dangers, those of us grieving for the loss of a democratic ideal were told to empathize with those we had grown to view with fear, anger, and even disgust. Among increasingly convoluted dissections of what the concept of empathy means,[1] voices from all over the political spectrum, mainstream news outlets, and media platforms urged those on the “losing” side to swallow the bitter pill – at least for the next four years – and unite. Accept. Get over it.

In other words: don’t fight.

But for many of us, there is no other choice. At the end of the day, we are thinkers. Letting things go unquestioned, unexamined, and unanalyzed is something we cannot do. Easy acceptance and complacency go hand in hand, joined together in a desperate flight from grappling with our own mistakes and from pushing to change what we cannot tolerate, much less endure.

Instructors, researchers, public thinkers and scholars affiliated with the academy have all been students at one point or another. As such, we consider the intellectual process as one requiring constant and self-conscious revision – not only must we often admit our own shortcomings, but we must also anticipate learning from those we may initially oppose.

Crafting a common vocabulary is perhaps the first step toward building a rapport with bored or uninterested students, but deconstructing the complexities of hegemonic ideology and the semantic battle over what has been fashionably debated and dismissed as “identity politics” takes the concentrated work of months, if not years. Effective communication becomes much more difficult with the assumption that empathy and cooperative understanding rest upon mutual mute compliance, instead of examination and accountability. Engaging in productive discussions with political opponents is far from impossible. Historically, however, such conversations have required equal measures of willingness to listen and learn from all those involved.

How do we reach those who see no reward in critical reflection, and harbor no desire for intellectual engagement? To what extent are we meant to empathize and “break bread” [2] with those who would much rather imagine the well of knowledge empty, than deign to be led anywhere?

In an Op-Ed piece from The New York Times, R. Derek Black shares another personal narrative tracing the unlearning of hatred-driven ideology through experiences at a liberal college:

“Through many talks with devoted and diverse people there – people who chose to invite me into their dorms and conversations rather than ostracize me – I began to realize the damage I had done. Ever since, I have been trying to make up for it…

People have approached me looking for a way to change the minds of Trump voters, but I can’t offer any magic technique. That kind of persuasion happens in person-to-person interactions and it requires a lot of honest listening on both sides. For me, the conversations that led me to change my views started because I couldn’t understand why anyone would fear me…

I never would have begun my own conversations without first experiencing clear and passionate outrage to what I believed from those I interacted with. Now is the time for me to pass on that outrage by clearly and unremittingly denouncing the people who used a wave of white anger to take the White House.”[3]

On one hand, there are no easy answers. But on the other, admittedly, easy answers aren’t our forte. We press for deeper truths than that.

Buck up, Academics. We have our work cut out for us.


[1] In this short interview promoting his new monograph, Against Empathy: The Case for Rational Compassion, Yale psychologist Paul Bloom attempts to distinguish between what he terms “cognitive empathy” and “emotional empathy.” The former, he argues, is a mental exercise based upon rational thought; the latter is based solely in affective feeling, and actually “distorts goodness” in “direct[ing] our moral decision-making [and] reflects our biases.” Bloom’s argument, as presented in this interview, contradicts itself when he disparages empathetic feeling, yet then doubles back and claims “We need love, compassion and kindness.”

[2] In what has since been criticized as a short-sighted commentary reflecting a lack of knowledge on the lived experiences of Black (and fellow minority) Americans, Trevor Noah’s Op-Ed piece boldly states, “We should give no quarter to intolerance and injustice in this world, but we can be steadfast on the subject of Mr. Trump’s unfitness for office while still reaching out to reason with his supporters. We can be unwavering in our commitment to racial equality while still breaking bread with the same racist people who’ve opposed us.” (“Trevor Noah: Let’s Not Be Divided. Divided People Are Easier to Rule.” The New York Times. 5 December 2016.)

[3] “Why I Left White Nationalism.” R. Derek Black. The New York Times. 26 November 2016.


Vicky Cheng is a fourth-year Ph.D. student whose research and teaching interests center on nineteenth-century British literature and culture, with a specific focus on queer and feminist readings of Victorian texts. Her proposed dissertation project finds its structure through queer methodology, and will investigate Victorian novels and conflicting representations of gendered bodies within. Other scholarly interests include mediations between textual description and visualization, the structures of power surrounding the interplay of non-normative bodies and disruptive desires, and the complexities of embodied sexualities.

Empathy and Education: The Double Burden (Part II)

In the numerous disciplines comprising that artistic and cultural field we call “the humanities,” we who self-identify as scholars must constantly be on the defensive regarding our own choice of profession. An increasingly corporatized world sees banks encouraging ballerinas and actors to become engineers and botanists instead, and federal agencies such as the CBO actively suggesting reductions in federal funding for the Arts and Humanities, since “such programs may not provide social benefits that equal or exceed their costs.”

This cacophony joins with countless other voices in our own lives: those who caution us about the shrinking opportunities of the academic job market, who gently chastise us for dabbling in a passion instead of pursuing a career that will prove economically viable, and who otherwise remind us that the humanities are not where the dollars – or pounds or euros, among other currencies – lie. There is no Wall Street of literature, no actual stock market of philosophical ideas, and little funding to be found in dusty bookshelves and puzzling over words, ideas, and their meanings.

Why even bother?

As the old adage goes, “Those who don’t study history are doomed to repeat it.” A bastardized proverb, perhaps, with uncertain origins, appropriated right and left – often by the political and ideological Left and Right – for various ends. The myth of linear progress haunts us with these lessons of the not-so-distant past. Especially aware of the unavoidable pitfalls, regressions, and obstructions in the hard-fought effort forward and upward, we take into consideration the wisdom of looking over our shoulders and consulting voices that tell tales of suffering and horror that must never happen again.

For those of us who work at analyzing literature and encouraging critical thought, our reasons for choosing to engage with such materials on a day-to-day basis have long found ethical expression in empathy. We aim to broaden awareness of self and others, and to celebrate multicultural differences by considering multiple avenues of theoretical exploration. This is why we construct syllabi with an eye toward incorporating more writers from outside the realms of canonical literature, the majority of these names belonging to women writers and writers of color. For many of us teaching at the collegiate level, or in higher education in general, critiquing the norms of institutions, modeling thoughtful self-reflexivity, and teaching students how to close-read all go hand in hand.

On some level, whether privately or with boisterous confidence, we all wish to believe in our role to “Make America Smart Again.” Our faith in education fueled our optimism about a future defined by intelligence and inclusivity, and many a liberal-leaning Op-Ed piece declared that the one advantage of Britain’s recent referendum to leave the European Union was its value as both instruction and cautionary tale:

“One of the few good things about Britain’s vote to leave the European Union is the rich curriculum of lessons it offers leaders and electorates in other democracies…

Across Europe and in the United States, politicians can either respond to these cries of protest or face something worse than Brexit.”[1]

Was such belief a stroke of overconfidence?

Following November 8th, with electoral results and statistics rushing in from all sides, bleak disappointment followed closely by crushing realization began to settle in. These gut-reactions mingled with irritation at the instantaneous, yet contradictory impulse to assign blame:

“Why Did College-Educated White Women Vote for Trump?” (The New York Times)

“Blame Trump’s Victory on College-Educated Whites, Not the Working-Class” (New Republic)

“Trump Won Because College-Educated Americans are Out of Touch” (The Washington Post)

Such was, and still is, enough to shake one’s faith in purposeful education. In the face of all this, what is the point of what we teach? These are the questions that haunt us now: does the work of our lives actually take root? Should intellectuals shoulder the blame for having morphed into snobbish cultural elites?

Does investment in efforts toward empathy really yield any ideological change?

Source: Merriam-Webster

 

In the days and weeks that have followed the 2016 Presidential Election, attempting to navigate and teach in this new reality has proven unsettling. All of a sudden, we have swerved from the academic postmodern into a maelstrom of media-influenced misinformation, Twitter rants, and unprecedented threats against freedom of speech, critique,[2] and intellectual or creative expression.

Welcome to the new American age, where everything about knowledge is made up and, apparently, points of truth and facts no longer matter. While Merriam-Webster is still weighing its top result of 2016, Oxford Dictionaries has chosen “post-truth” as its word of the year. As NPR reports, “The word has been around for a few decades or so, but according to the Oxford Dictionary, there has been a spike in frequency of usage since Brexit and an even bigger jump since the period before the American presidential election…feelings, identifications, anxieties and fantasies, that’s what actuated the electorate. Not arguments. Not facts.”

Perhaps this struggle we now face started long before Election Day; now, it seems more urgent than ever. From a fake news epidemic of so virulent a strain that Pope Francis felt compelled to condemn the “sin” of perpetuating misleading information, to a linguistic battle over how to address the Ku Klux Klan-backed “Alt-Right” White Supremacy movement, words, ideas, and the ideological weight they hold have become weapons and flashpoints.

Caption: “Hey! A Message to Media Normalizing the Alt-Right”

Source: Late Night with Seth Meyers, 7 December 2016

Speaking truth to power has never been an easy task, and the struggle against the normalization of silencing dissent is, and will remain difficult. While we elegize and self-reflect, we also turn to writers such as Zadie Smith to remind us that “history is not erased by change…progress is never permanent, will always be threatened, must be redoubled, restated, and reimagined if it is to survive.”[3] Likewise, Chimamanda Ngozi Adichie speaks of the dangers of complacency and neutrality – and goes a step further to remind us of the boundaries of empathy:

“Now is the time to resist the slightest extension in the boundaries of what is right and just. Now is the time to speak up and to wear as a badge of honor the opprobrium of bigots. Now is the time to confront the weak core at the heart of America’s addiction to optimism; it allows too little room for resilience, and too much for fragility. Hazy visions of ‘healing’ and ‘not becoming the hate we hate’ sound dangerously like appeasement. The responsibility to forge unity belongs not to the denigrated but to the denigrators. The premise for empathy has to be equal humanity; it is an injustice to demand that the maligned identify with those who question their humanity.”[4]

Words can obfuscate, enlighten, and entrap – and these complexities are elements we anticipate and enjoy when working with literary texts and critical theories. Although the questions surrounding a liberal or humanities-affiliated education may still haunt us, nowhere else can one find a space more prepared for the deconstruction of flashy rhetoric and the unpacking of ideology. Beyond the humanities, critical engagement with disparate voices, texts, and the ideas they represent pertains to disciplines all across the board, and to intellectual endeavors of all stripes. We have many more lessons to teach, and much left to learn. This is our task, and may we rise to meet it.

[1] “Learning from Britain’s Unnecessary Crisis.” E.J. Dionne Jr. The Washington Post. 26 June 2016.

[2] Most recently, the union president representing workers at the Indianapolis branch of Carrier Corp. criticized the business deal the President-elect brokered late last month. Chuck Jones, the leader of United Steelworkers Local 1999, challenged Trump to authenticate his claims, and soon afterwards began receiving anonymous death threats.

[3] “On Optimism and Despair.” Zadie Smith. The New York Review of Books. 22 December 2016 Issue.

[4] “Now is the Time to Talk About what we are Actually Talking About.” Chimamanda Ngozi Adichie. The New Yorker. 2 December 2016.


Vicky Cheng is a fourth-year Ph.D. student whose research and teaching interests center on nineteenth-century British literature and culture, with a specific focus on queer and feminist readings of Victorian texts. Her proposed dissertation project finds its structure through queer methodology, and will investigate Victorian novels and conflicting representations of gendered bodies within. Other scholarly interests include mediations between textual description and visualization, the structures of power surrounding the interplay of non-normative bodies and disruptive desires, and the complexities of embodied sexualities.

Empathy and Education: The Double Burden (Part I)

A couple of weeks ago, toward the end of our class’s unit on “Thrills, Sensations, and the Ethics of Nonfiction,” I assigned my students the University of Chicago’s Welcome Letter to the Class of 2020 alongside Sara Ahmed’s thought-provoking “Against Students” (June 2015). The former, a document decried by some as patronizing and oppressive and praised by others as timely and appropriate, comes from a private university that prides itself on being “one of the world’s leading and most influential institutions of higher learning,”[1] and that has a notorious reputation among academics for fostering an ultra-competitive – and potentially hazardous – environment for its students.

Following a word of congratulations, the letter states:

“Our commitment to academic freedom means that we do not support so-called ‘trigger warnings,’ we do not cancel invited speakers because their topics might prove controversial, and we do not condone the creation of intellectual ‘safe spaces’ where individuals can retreat from ideas and perspectives at odds with their own.

Fostering the free exchange of ideas reinforces a related University priority – building a campus that welcomes people of all backgrounds. Diversity of opinion and background is a fundamental strength of our community. The members of our community must have the freedom to espouse and explore a wide range of ideas.”

A number of think pieces had their say, and the talking heads weighed in. In response, educators and administrators from various institutions defended their policies of creating safe spaces and giving trigger warnings; using the letter’s own terminology, they argued for the same purposes: academic freedom and “moral responsibility.” Proponents of the University of Chicago’s pedagogical stance lauded this strike against so-called “political correctness,” insisting that incoming students should stop expecting a protective safety net to cushion controversial speech and difficult issues. Safe spaces, it was implied, or outright declared, are a cocoon of muffled sensitivities freshmen ought to have outgrown by their first semester of college.

Ahmed’s piece, while predating the University of Chicago’s letter by almost a year, exposes similar “sweeping” generalizations made in critiques of higher education, while laying bare the contradictions within the ideology the letter espouses. Students, often blamed as oversensitive, coddled, and otherwise too entitled to address “difficult issues,” bear the brunt of critique in the wider battle over, and backlash against, the dreaded brand of PC-neoliberalism. In actuality, those who oppose trigger warnings often do so at the expense of marginalized groups and students as a whole, and not in service of a wider range of critical discussion.

“The idea that students have become a problem because they are too sensitive relates to a wider public discourse that describes offendability as a form of moral weakness and as a restriction on “our” freedom of speech. Much contemporary racism works by positioning the others as too easily offendable, which is how some come to assert their right to occupy space by being offensive…

This is how harassment can be justified as an expression of academic freedom.”

Rhetorically, those who use this toxic, masculinist mantra to “man up and quit being so offended” imagine its intended audience as a bunch of whiny, thin-skinned spoiled brats. It has become a “no guts, no restriction of hateful speech, no glory” approach modified for instructional spaces. Unsurprisingly, it represents yet another attack upon us Millennials of the generation of participation trophies; us special snowflakes-turned-Social Justice Warriors; us who dare protest for a minimum wage of $15/hour, refuse to consider any human being “illegal,” and demand equal rights under the law for an ever-expanding catalogue of identities, intersectionalities, and sexualities.


The thing about those of us who make it our job to deal in words is that we know what they say about us. Sometimes, we respond with sarcasm and memes.

Apparently, to many, intellectual boldness – or the tricky concept of free speech in general – is incompatible with thoughtfulness, compassion, or the necessity of imagining and reflecting upon the consequences of such speech. But at their core, intellectual efforts rest upon a foundation of empathetic engagement, curiosity, and responsible efforts to give voice to those who have previously been silenced.

For the most part, we who teach are expected to keep personal politics out of the classroom. Each student ought to have their say, and must not fear their grade may suffer due to a difference of religious, political, or personal ideological belief. The classroom is a place for critical engagement and analytical inquiry, but it should not act as a place of conversion, or the base of any particular soapbox.

On the other hand, we introduce students to the concept of ideology, and invite them to critically question previously held beliefs; we encourage students to critique ideas, and not the individual espousing them. Disagreement should not deter discussion, so long as speech remains respectful and productive. We are all here to learn, is the unspoken catchphrase of the liberal arts education, and we learn best when we question what it is we think we know.

I presented the University of Chicago’s welcome letter to my class without trepidation – not because I expected every student to agree with the material, or to contest it straight away; rather, their job was to consider the rhetorical strategies being employed, and to develop an interpretive reading based upon textual evidence. Thus far, we had studied texts through the framework of social critique and purposeful writing, interrogating the usefulness of nonfiction texts that have outlived their writers. We questioned the boundaries of truth and fiction, fantasy and reality, and spent a good portion of the semester discussing the importance of readers’ ethical responses to texts presenting themselves as unproblematic, factual, and objective. The students held productive class discussions on tone-policing, white privilege, and the conflation of violence with sensational journalism and the commodification of wartime horror. These students, most of them incoming freshmen, rose quickly to the challenge of tackling these subjects, with vigor and great respect for the material and one another.

The students of this generation “aren’t snowflakes, and they don’t melt,” Yale professor Steven Berry writes, in admiration of the resiliency of students who were still able to attend class and complete an exam the morning of November 9th. The same resiliency we admire in our students becomes so much more difficult to embody when we, students and scholars and educators alike, consider how much more dangerous our world has suddenly become.

In the ten days following the U.S. election, eight hundred sixty-seven hate incidents were reported to the Southern Poverty Law Center, the majority of them occurring in K-12 schools. Since then, an organization named Turning Point USA, which purports to “fight for free speech and the right for professors to say whatever they wish,” has created a Professor Watchlist, with profiles of “professors that advance a radical agenda in lecture halls” – the majority of those listed being women and persons of color.


“Ten Days After: Harassment and Intimidation in the Aftermath of the Election” Source: Southern Poverty Law Center, https://www.splcenter.org/20161129/ten-days-after-harassment-and-intimidation-aftermath-election

Without giving in to paranoia, the project of providing safe spaces appears more daunting than ever. Despite this, while the classroom may not be a pulpit or a soapbox, it nevertheless remains a platform for instruction. Our determination to forge ahead despite fear and anger represents both the privilege and the burden of educating with empathy, and an ethical responsibility we owe to ourselves and those we aim to instruct.

[1] This quote comes from the University of Chicago’s Wikipedia page (https://en.wikipedia.org/wiki/University_of_Chicago); the university’s homepage and admissions materials proudly greet visitors with a description of “a private, nondenominational, culturally rich and ethnically diverse coeducational research university…committed to educating extraordinary people regardless of race, gender, religion, or financial ability.” (http://www.uchicago.edu/)


Vicky Cheng is a fourth-year Ph.D. student whose research and teaching interests center on nineteenth-century British literature and culture, with a specific focus on queer and feminist readings of Victorian texts. Her proposed dissertation project finds its structure through queer methodology, and will investigate Victorian novels and conflicting representations of gendered bodies within. Other scholarly interests include mediations between textual description and visualization, the structures of power surrounding the interplay of non-normative bodies and disruptive desires, and the complexities of embodied sexualities.

Hidden mental health troubles in the ivory tower (13 Nov. 2015)

An initial reason for not sharing my experiences with depression was a persistent fear that people would think I was not strong enough for academia. My identity was so tightly wrapped up in my productivity, my latest department seminar, and my C.V. that the very thought of someone questioning my academic grit was enough to keep me from seeking treatment or even admitting to myself that something was wrong.


Fig 1: photo credit: D.A. Sonnenfeld

But I did have enough grit to excel in academia; I was tough as nails, strong as diamond, but that had very little bearing on my being strong enough to care for myself. Fortunately, around this time, I ran across a post by a favorite scientist blogger. He queried how many of his readers took a prescription drug, any drug, to enable successful academic performance. One in three of the over 150 reader responses in his unscientific, yet illuminating, poll confirmed the professional need for prescription drugs. One in three. These results were posted when I still shied away from talk therapy, let alone medication. It dawned on me that muscling through mental illness wasn’t the only option. Moreover, pushing through might not be a very good option.

A trip through academic blogs suggests not only that mental illness is pervasive in academia, but also that there is a paucity of research on mental health in the ivory tower. Being a scientist myself, I tried to find some nice, tidy statistics about the prevalence of mental illness in academia versus the general public, but repeatedly came up empty-handed. The best evidence comes from two studies from the U.K. and Australia. A survey from the U.K. indicates that nearly half of all academics report high or very high stress levels, though specific connections to anxiety, depression, or other disorders were not explored. Additionally, the magazine New Scientist reports that an Australian study found three to four times the incidence of mental illness among academics compared to the general population. Unfortunately, the Australian study is behind a paywall that, even with my University credentials, I can’t access to explore further.

There are a few factors that I propose contribute to the frequency of mental illness in academia, particularly among graduate students. Anyone who’s spent time in a graduate program or has loved someone working on their graduate degree knows the pressure to achieve can be intense. Graduate research can be an isolating experience as you zoom in on an ever-narrowing topic of study. Academia is also filled with rejection. Rejection of manuscripts, unfunded grant proposals, failed experiments, tenuous committee meetings, poorly received presentations, and the list continues. Unless you have a supervisor dedicated to championing academia’s infrequent successes, which I fortunately did, all the perceived failures can lead to a demoralizing collection of years.


Fig 2: photo credit: Greg Dunham

Another factor that’s less discussed, but I think is important to consider, is the predisposition of academics. I can only speak specifically to my observations in my little corner of Biology, but I suspect there is great overlap with other disciplines. We’re a detail- and data-oriented bunch, trained to engage the rational rather than the emotional side of our brains. We tend to be over-achievers, the highest achieving of whom can still feel their contributions to science are not enough. Partitioning off important emotions, or even ignoring them in favor of the path to achievement, certainly did not help me with self-awareness.

In my experience, I used the academic pursuit to deny myself care. I tried to logic my way out of depression – I had a great partner and friends, I was successful in my work; it was simply illogical that I felt the way I did. In my last attempts to ignore that something was very wrong, I turned harder into my research, attempting to fill my emptiness with data collection. It didn’t work.

One of the initially perplexing aspects of my depression was the timing. Depression didn’t follow a series of rejections, arise at a period of particularly high stress, or spring from a volatile relationship with my advisor and colleagues; depression hit when things were going well. After much discussion with my therapist, we decided that it was precisely the lack of academic or professional pressures to fixate on that unveiled the trouble underneath. My depression was not situational in the sense of a stressful external event causing my symptoms. It was clinical. It was major depressive disorder.


Fig 3: photo credit: Fresaj

In my case, genetics and early family environment most influenced my depression. Depression shows up on both sides of my family tree, for certain in at least the most recent generations, as it has become more societally acceptable to discuss mental health. I’d prefer not to delve into the early family environment portion, but I will say that overt abuse isn’t the only thing that can compromise a secure childhood. In short, many factors insidiously aligned to lead to my depression.

I continue to be frustrated at the lack of discussion of mental health in academia, despite its pervasiveness. At no point during any of the orientations I attended as a graduate student was there mention of coping with mental illness while in grad school. If the existence of mental health facilities on campus was discussed, it was brief enough to be promptly forgotten. Discussions with fellow graduate students revealed that I am certainly not the only one to deal with depression. I’m also not the only one who has been hospitalized while in grad school. I can’t help but think that if I had been aware of how common mental illness is in academia, and had known that there is no shame in obtaining treatment, then I might have sought help much sooner.


As a Biology Ph.D. candidate, Liz Droge-Young studies the incredibly promiscuous red flour beetle. When not watching beetles mate, she covers the latest science news on campus for Syracuse University’s College of Arts & Sciences communication department. She is also a mental health advocate, a voracious consumer of movies, and a lover of cheese.

“The Illusion of Choice”: Forced Freedom in Mr. Robot and Late Capitalist Society (30 October 2015)

I experience a fleeting feeling of freedom whenever I go to the grocery store.  It offers me a reprieve from the stress and anxiety that creeps up on a daily basis as I worry about deadlines approaching or what I’ll do next after I finish graduate school. And then there’s always the peripheral flutter of unending concerns about issues that most people are able to accept as out of their control––rampant deforestation; the falling pH of acidifying oceans; increasingly endangered coral reefs, polar bears, and countless other species; the 50 million people in the U.S. who experience food insecurity; the factory workers in third-world countries without decent rights or wages making my clothes; the innocent victims of wars perpetuated by military-industrial complexes; the staggering racial injustice of the U.S. prison-industrial complex…the list could literally go on forever.

It’s no wonder that I get in a rut sometimes as I encounter more staggering statistics and tragic stories. I tend to feel debilitated in these moments when I must confront the fact that I’m just one individual who does not have the time, talent, or resources to combat all evil at once, and so it must be time to calm down.  So I go out of doors and, when it’s too cold to appreciate nature, I go to a grocery store looking for comfort food, clearing my head by distracting myself with, ironically, more stacks of stuff.

It’s a habit I’m not proud of and one I want to remediate, and so the first thing I have to do is understand it.  It seems to me that what is tantalizing about the experience of shopping is the ability to exercise some kind of control through the act of consumer choice.  Perhaps for someone who constantly feels like her life is barely under control, the ability to swipe a card to pay for stuff somehow is empowering, stemming inevitably from the sordid allure of ownership.  But of course it’s only a temporary feeling.  Once the chocolate bar is gone, it’s back to square one, and I then realize I don’t own the things that I buy:  the things that I buy own me…

*

It’s not very often that one can turn to a network television show in order to illustrate just how vice-like global capitalism’s grip is on everyday life, at least in any way that’s meaningful, yet this is exactly what I have recently discovered in USA’s new show Mr. Robot.  Its main character, Elliot, is a genius hacker who suffers from social anxiety and craves world revolution.  Although he works as a techie at a cybersecurity firm to pay the bills, in his free time he hacks into the various accounts of people he suspects to be petty criminals and, like a digital Batman, anonymously tips the police or blackmails the evil-doers into righting their wrongs if he stumbles across illegal or immoral conduct.  But what the show is predominantly about is Elliot and a group of fellow hackers known as “fsociety” who are attempting to do the impossible:  completely overthrow the corporate overlords, redistribute the wealth entirely, and usher in a new era freed from the systemic acts of injustice perpetrated by the greed of the excessively wealthy.

It would be impossible for me to summarize here even just the main plot points of the first season, and at any rate what I want to talk about is the second episode in particular, in which Elliot grapples with the question all progressively-minded millennials like yours truly battle with daily: Do any of our choices really matter?  At this point in the show, Elliot has already been inducted into fsociety but remains timid and wary of the revolutionary candor of its leader, Mr. Robot, who has proposed that their next exploit involve blowing up a facility where all of the crucial servers for E Corp (also derogatorily referred to as “Evil Corp”) are located.  The problem with the plan, like so many violent acts of rebellion, is that the destruction from the blast would also inevitably entail the deaths of many people in the town adjacent to the facility, something Mr. Robot insists is merely a price they have to pay for the revolutionary cause. Elliot refuses to endanger the lives of innocent civilians.  Mr. Robot rolls his eyes.  He tells Elliot that in life, as in computer code, there are people who are “ones” and people who are “zeroes”––people who act vs. people who don’t; heroes vs. cowards. Elliot shrugs him off in the moment but clearly remains vexed as he attempts to return to a normal life. While sitting through a therapy session in which he usually remains silent, when asked how he’s feeling, Elliot uncharacteristically decides to oblige his therapist’s request for specifics by launching into a slow, melancholy monologue:

How do we know if we’re in control? That we’re not just making the best of what comes at us and that’s it, and trying constantly to pick between two shitty options… Coke and Pepsi. McDonald’s or Burger King. Hyundai or Honda…It’s all part of the same blur, right? Just out of focus enough.  The illusion of choice.  And half of us can’t even pick our own cable––our gas, electric, the water we drink, our health insurance.  Even if we did, would it matter?  Our only option is Blue Cross or Blue Shield.  What the fuck is the difference?  Aren’t they the same? Nah, man… Our choices are prepaid for us.  A long time ago…

What’s the point, right?  Might as well do nothing.

This is not an unfamiliar attitude; articles are written about millennial malaise more and more these days, as moments of activism like Occupy Wall Street rear their heads for an exciting moment only to dissipate while the status quo continues.  Scholars have weighed in on the cause of hesitation among young people like Elliot who know that injustice exists but nevertheless believe there’s little to nothing they can do about it.  There are many explanations, primary among them the fact that fear and anxiety are at an all-time high for millennials, for whom “student debt is at its highest” with a “fear of unemployment and poverty” as a result.  It’s no wonder America’s youth is afraid of challenging the establishment when what they’re worried most about is putting food on a table for one.  I myself have suffered from similar fears, although my own therapy via career counseling has begun to allay some of my anxiety about entering soon into “the real world”––but the fact that I, and so many others, need reassurance is telling in itself.  My counselor has told me time and again, “I wish you would be more confident.” I wish I could be too.


Enough said.

What Elliot expresses above, and continuously throughout Mr. Robot, is an implicit awareness of existing within what the critical theorist Jean Baudrillard called “simulacra”––that is, when “reality” disappears as it is subsumed by the models or maps that seek not only to represent reality, but to overtake it, in effect becoming “hyperreal.” What was once the representation of reality becomes reality, and this then means the two cannot be separated nor distinguished from one another.  We no longer travel, for example, without consulting Google Maps. In fact, we locate ourselves in relation to this digital representation of streets and addresses to the point that we can no longer navigate without it; the little red pin on the map and the actual place are one and the same.  When Elliot laments that the choices we make are “illusions” already predetermined for us, he is expressing the anxiety of living within simulacra wherein “we are confronted with a precession of simulacra; that is, the representation [that] precedes and determines the real.”  How many of us choose to deviate from the path determined by GPS, or feel anxious when we seem to have taken a wrong turn?  We only go where maps will lead us. Hence Elliot’s observation that, in reality, our options are limited and so is our power, which is why he concludes that one “might as well do nothing.”

Yet because we are implicated in a system, there is no choice that can be made that will not impact another person somewhere in the world. If Elliot decides to “do nothing” and let the corporations continue to exist with impunity, he will likewise have agreed to let others’ lives be negatively affected when he had the option (as his therapist reminds him) to do something. Contrary to Mr. Robot’s dismissal of his moral compass, Elliot’s fear of hurting others in the pursuit of revolution is a real fear that should be taken seriously, for it is the quintessential dilemma for people of conscience throughout the world who are painfully aware and wary of the fact that their actions will inevitably affect someone, somewhere, somehow.  For example, in the election season right now, though I am a die-hard supporter of Bernie Sanders’s campaign, I nevertheless wonder what might happen if we tax Wall Street speculation so ruthlessly.  Will they move their operations elsewhere, to countries whose governments have abysmal labor laws, thus exploiting potentially even more third-world workers than we already do now? The answer seems to me to be, honestly, “Maybe.”

In fact, there are infinite possibilities when it comes to the consequences of our actions, which is what makes the precautionary contemplation of worst-case scenarios cease to be useful after a certain point, especially when it inhibits further action.  In Absolute Recoil, Slavoj Žižek discusses the notion of “radical acts of freedom,” which he insists “are possible only under the condition of predestination” wherein we “know we are predestined, but we don’t know how we are predestined, i.e., which of our choices is predetermined,” and yet paradoxically it is in “this terrifying situation in which we have to decide what to do, knowing that our decision is decided in advance, [which] is perhaps the only case of real freedom, of the unbearable burden of a really free choice––we know that what we will do is predestined, but we still have to take a risk and subjectively choose what is predestined” or, if considering the “simulacra,” what is predetermined (68).


Oxymorons are popular in critical theory, as is staring gravely into space.

The beauty of Mr. Robot and of critical theory is that both force us to see our incessant anxieties about the efficacy or consequences of our own actions as ultimately anxieties that come from fear of our own freedom.  To run in the other direction, to “do nothing,” or to do what is safe or neutral, inevitably perpetuates the violence that, today, is mostly hidden from us as the simulacra distorts the reality lying just underneath its veil.  The question of whether or not anything we do actually “matters” often comes from the fearful suspicion, as it does for Elliot, that what we do will matter in a harmful way.  While the simulacra may predetermine the parameters of our reality, that does not mean we are without power to intervene.

Which leads me back to my own initial questions for my blog series as I wrap up my time with Metathesis this month:  Do they “matter,” the messages popular culture sends us? Do we need to spend our time deciphering texts or television shows for hidden ideologies?  Why should we keep English departments around? Why bother with critical theory?  With the help of Mr. Robot, I’ve come to the following conclusion: To be able to decipher cultural “codes” is itself a kind of hacking.  It is a project that, when done seriously and with the intention of changing the world, has real power, just as Elliot does so long as he chooses to recognize it.  There is one crucial difference, though: Whereas not all of us have the gift of deciphering code and understanding complex data, we do have the gift of thought and critical thinking.  The most tantalizing belief of our global capitalist, “post-modern” world is that our choices do not matter, a belief that prevents thinking too much out of fear of futility––i.e., “What’s the point, right? Might as well do nothing…”

But if there’s one thing critical theory teaches us it is that what is “true” is not objective, nor is it relative, nor is it a given.  What is “true” is tied to power relations and therefore to systems that create logics.  If all there is, then, is power, and if we are here to empower the disempowered, then that must mean we have to begin to interrupt the program to bring a more important message and, most importantly of all, not be afraid to.  We are in control of more than what we choose to eat or wear, maybe more in control than many of us want to admit. But if that’s the price we pay for our freedom, might as well do something.


Liana Willis is a second-year English M.A. student genuinely interested in all branches of critical theory, but in particular traditional Marxist and neo-Marxist cultural materialisms.  When not teaching, reading, consulting, or writing, she can be found somewhere nearby discreetly practicing yoga asanas and wishing she could be sleeping right now.

“Show me a good time”?: Madonna, Drake, and Police Brutality

If you’re fortunate enough to have the self-control to avoid even moving your cursor over the “trending” links on Facebook: apparently, Madonna kissed Drake at Coachella, and to paraphrase Drake, “it was it was [sic] not the best.” I base that reading on Drake’s body language: stunned immobility, a wide what is happening gesture, and then hands on his lips, hunched over. Expertise in affect theory seems a bit unnecessary here; his response could hardly be more overt.


I’m interested in this kiss not for the celebrity gossip, but because I see it as an important piece of the current conversation about racism in the United States—and, most importantly, as a site for thinking through the intersectionality of oppression.

 

Walter Scott’s murder two weeks ago should dispel any lingering doubt about the reality of violence against black men. As I listened to the NPR story, the hosts announced that they were going to play an audio clip of the protesters, whom I fully expected to chant something about the police, or “black lives matter.” Instead, they chanted a different activist slogan and hashtag: All lives matter. This particular chant rose to prominence in response to the slogan “black lives matter,” as a way to call attention to the broad oppression that marginalized populations face. In its brief life, “all lives matter” has received due criticism from private bloggers all the way up to Judith Butler, who sums up the critique with a succinctness that should shock anyone who has ever read Gender Trouble:

 

It is true that all lives matter, but it is equally true that not all lives are understood to matter— which is precisely why it is most important to name the lives that have not mattered, and are struggling to matter in the way they deserve.

 

To chant “all lives matter” in response to what is perhaps the most blatantly obvious of a series of state-perpetrated crimes that specifically target black men fundamentally misses the point: these murders happen because black lives are readily swept aside in the flows of power that permeate American culture. Affirming life through mutual respect (a la Appiah) is a perfectly laudable ethics, but it does not address the tangible legal, institutional, and cultural issues that contribute to the systematic assault on black bodies. “All lives matter” is a positive message—but it offers a philosophical abstraction in response to a political problem.

 

More importantly, “all lives” flattens bodies through equivalence. In other words, in its attempt to find commonality, “all lives” erases difference. Cut back to Drake and Madonna. As the internet is wont to be, it was very confused about how to respond. Of course, many people suggested that Drake enjoyed it. Drake himself even posted an image on Instagram, with the caption “Don’t misinterpret my shock!! I got to make out with the queen.” The picture Drake chose offers a brief moment that appears consensual in an event that seemed predominantly nonconsensual.

 


 

Some objected that Drake’s reaction implied that Madonna is disgusting, and so reinforced the idea that women cease to be attractive after they reach a certain age. The Huffington Post pointed toward John Travolta’s sexual harassment of Scarlett Johansson at the Oscars, and asked why Madonna received less criticism than Travolta. All of these responses are part of the same discourse: a discourse that flattens black bodies into mere intensities of violence and sexuality, and through that flattening, dismisses their bodies as bodies that do not matter.

 

Madonna’s kiss is hardly the first direct exploitation of black musicians by white musicians in recent (let alone longer) memory. I don’t mean the exploitation of culture, like Iggy Azalea’s bizarre code-switching (which Saturday Night Live fabulously lampoons), or the fact that every song Meghan Trainor sings is a poor rendition of doo-wop. I mean the exploitation of black bodies as sex-objects—the transformation of black bodies into just lumps of sexual matter. Think Miley Cyrus’s VMA performance, or Taylor Swift’s music video for “Shake it off” (intentionally not linked to images), which transform the black background dancers into mere ciphers for sex.

 

And here, we come to the sticking point. The Huffington Post’s article points fingers at an apparent gender bias, and asks: what if Madonna were a man, and Drake a woman? This is precisely the wrong question, driven by an impulse similar to the one behind “all lives matter.” Contrary to the impulse behind the discourses of sexual assault that have circulated around Madonna and Drake, one sexual assault does not equal all sexual assaults. Feminists, Madonna included, have struggled against the physical and emotional violence patriarchy directs at them; but that violence is fundamentally different from the violence directed at black men and women (which, of course, fundamentally differ from one another).

 

Madonna’s kiss was not sexual assault in the same way John Travolta’s kiss was: it was sexual assault in a different way. Violence against black men like Walter Scott is not the same as violence against black women, or Hispanic men or women: these violences differ. To argue that people should or should not be more or less upset because Madonna is a woman misses the critical intersection of race and gender. Drake is not merely a man; he is a black man in a culture that insists on coding black bodies as objects of pure violence and sex. Where a kind of pop-liberalism draws equivalence through common struggle, intersectionality underlines the political and pragmatic differences in the application of oppression.

History’s Fiction Problem: “Selma” and the Value of Fictionalized History

In a recent piece for Salon, Andrew Burstein and Nancy Isenberg take aim at Selma, the newly released film about the activism of Martin Luther King, Jr., and, through Selma, critique Hollywood more broadly for its lack of anything truly meaningful to say about history.  In the process, they also dismiss seemingly all (or at least most) historical fiction. They suggest that there is a measure of historical truth that historical fiction can obtain—but only if it remains firmly ensconced in the responsible, well-trained hands of those housed in the discipline of history.  Fiction’s tendencies to romanticize and to provide narrative closure, they seem to suggest, work against a nuanced appreciation of history.

Skepticism from trained historians is nothing new; historical fiction has increasingly earned the ire of many historians.  Such critiques almost invariably revolve around questions of “accuracy,” as historians ruthlessly pick apart the novels, films, and television series for every incident that is not “how it really was.”  Burstein and Isenberg voice a common desire among many of those who study history, for they suggest that in films “romantic truthiness supplants history.”

Such a critique overlooks so much of the richness and complexity that fiction, whether in film, television, novels, or poetry, can offer to readers trained to see it.  True, there are many flaws in these expressions of history, but isn’t it time to stop pretending that they don’t have any historical value, or that they don’t have a particular vision of the truth to offer?  Isn’t it more productive to study the ways in which these texts work, to look at conventions of narrative and other aesthetic considerations, to situate them in their political moments—not just to find out what they say about their present moment, but about how that moment understands history?  Work like Burstein’s and Isenberg’s poses the danger of foreclosing on any possibility of appreciating and studying these texts in all of their complexity, and shores up the already incredibly tenuous distinction between fiction and truth, as if one does not have something to say about the other.

I currently teach a course entitled “Race and Literary Texts.”  Part of my intention while designing my syllabus was to include fiction that helped make clear to my students the ways in which history, the accumulated sediment of past actions and processes, continues to intrude on the present.  Utilizing texts ranging from Toni Morrison’s novel A Mercy to Richard Wright’s Native Son, my pedagogy emphasizes reading literary texts as theoretical texts. We take them seriously as theories of history, and draw out the ways in which they articulate historical visions. This is an incredibly rewarding experience, as we negotiate the ways in which writers, poets, directors, and studios grapple with how to engage with the intractable problems posed by the past.

For our first close reading activity, we read the vexing poem “The Change,” by Tony Hoagland.  I love and hate this poem, for it represents so much of what I will attempt to convey to my students this semester.  In this poem, the speaker observes a tennis match between a white European and a young black woman from Alabama, secretly hoping that the former will win. Through the match, he wrestles with the intractable nature of history, of momentous (and, to the speaker at least, cataclysmic) social change.  While I condemn the poem’s obvious racism and white paranoia, I can’t help but acknowledge the ways in which it seeks to articulate a theory of history, to wrench a measure of intelligibility out of the chaos and terror of historical change (to riff slightly on Philip Toynbee’s famous statement about good writers grappling against the intractableness of modern English).  When the speaker says:

There are moments when history

passes you so close

you can smell its breath,

you can reach your hand out

and touch it on its flank

one can almost feel him grappling with the idea of history as experience, of the individual come face to face with the terrifying nearness of forces over which he has no control.  The line breaks struggle formally to come to terms with the effects of history, with the sense that a moment is simultaneously passing and has already passed.  Indeed, by the end of the poem he seems to have done so: the last phrase “we were changed” echoes like the closing of some door. The mantra forms a powerful reminder not only of the contradictions of history–as both ongoing process and recollection of the past–but also of the exclusionary power of “we.”  This is in many ways an elegy for white hegemony, and while I find it personally repugnant, I acknowledge that it does offer truth about history—even if it’s one with which we vehemently disagree.

Fiction, whether in the form of the printed word or the moving image, can offer us meaningful and powerful insights into the workings of history.  As Brittney Cooper puts it so forcefully in her own Salon take on the question of historical storytelling in Selma:  “being more accurate does not mean one has told more truth.  Read any Toni Morrison novel and you’ll learn that novels often tell far more truth than autobiography. DuVernay tells us many truths in this film about the affective and emotive dimensions of black politics, about the intimacy of black struggle, about the spirit of people intimately acquainted with daily assaults on their humanity.”  To continue to overlook these texts’ engagements with the past is to do both the texts and us a grave disservice. This shouldn’t stop us from critiquing those theories of history that continue to marginalize and disenfranchise those who have long been excluded from power, of course.  But it’s time that, instead of constantly critiquing and wringing our hands, we move toward something more interesting and more fruitful: a more thoughtful and nuanced exploration of the relationship between fiction and history.



T.J. is a Ph.D. Candidate in Film and TV Studies in the Department of English. His dissertation examines theories of history as articulated in epic films and TV series set in antiquity. He teaches courses on film, popular culture, race, and gender, and in his free time enjoys watching The Golden Girls and nerding out over the works of J.R.R. Tolkien and their various adaptations. He frequently blogs at Queerly Different. You can follow him on Twitter @tjwest3.

Feminism doesn’t (t)werk that way: “Booty Culture,” race, and pop feminism

As Pippa Middleton recently remarked, “What is it with this American booty culture? It seems to me to be a form of obsession.”

[Image: pirate booty]
Who doesn’t love the booty?**

Whether we’re talking about Miley Cyrus’s twerking, Nicki Minaj’s “Anaconda,” Meghan Trainor’s “All About That Bass,” Jennifer Lopez and Iggy Azalea’s “Booty,” Kim Kardashian’s “break the internet” photos, Rihanna and Shakira’s “Can’t Remember to Forget You,” or even Taylor Swift crawling between the legs of her mostly black twerking dancers (whose faces we never see) in “Shake It Off,” the discourse of the “booty” is currently almost everywhere in mainstream American culture. One half expects to see mainstream television programs take up the issue in a bid for ratings. Next week on Modern Family: the token angsty teen girl is even more angsty than usual because her step-grandmother has a better butt than she does.

[Image: sad booty]
Image credit: A-Little-Kitty

Vogue has declared, in fairly jejune fashion, that the booty obsession is just the fulfillment of discourses we weren’t ready for 13 years ago — we weren’t “ready for the jelly” in 2001 when Destiny’s Child came out as “Bootylicious,” but we are ready now that J.Lo and white rapper Iggy Azalea are asking us to “Throw up your hands if you love a big booty.” Nicki Minaj is the fulfillment of the promise of J.Lo’s green Versace dress.

Others, including Yomi Adegoke and Susana Morris (of Auburn University, and a co-founder of the generally awesome Crunk Feminist Collective), have discussed the booty obsession as cultural appropriation of what has been a desirable body type within black American culture. This appropriation was made perhaps most clear when Miley Cyrus, who has been donning the trappings of black ratchet culture for several years now, photoshopped Nicki Minaj’s “Anaconda” cover photo, literally whitening Minaj’s skin and replacing Minaj’s face with her own. (The racism in the Kim Kardashian photos is also strikingly bald-faced.) The racial appropriation within “booty culture” is more than troubling, particularly at a time when the most pervasive images of black people within mainstream culture are photos of men like Michael Brown and Eric Garner, or of the young Cleveland boy with a toy gun who was shot seconds after police arrived on the scene. What has come to be thought of as the “black body type” in our culture has become acceptable and celebrated in the mainstream, but this (and other) cultural mainstreaming has not affected the systemic racism that oppresses black Americans. Mainstream culture incorporates and makes equally available the body type without truly incorporating or making equal the bodies themselves.

The sudden pervasiveness of “booty culture” also seems suspicious given how it has taken focus off of the previous, somewhat overlapping female pop star controversy: contemporary feminism. Regardless of whether we personally think Beyonce’s, Taylor Swift’s, Miley Cyrus’s, Pharrell’s, Rihanna’s, or Nicki Minaj’s feminism is substantively advancing equality or just substantively cashing in on millennial desire for commodity activism, conversations were taking place. There were daily opportunities to discuss *feminisms* and break out of the post-feminist backlash discourse of “one man-hating, sexually-repressed feminism only for women who are just angry that they can’t consume their way to pretty.” The everydayness of pop feminism, and yes, the trendiness, created space for these conversations to be framed as relevant and timely.

[Image: Beyoncé]

Over the summer, articles were calling 2014 the “year of pop feminism.” Now, it is the year of the booty. Yes, the booty obsession has emphasized “different body types.” But the focus remains on the body. For women, the body is an asset, a marketable commodity, but that also makes women, to some extent, objects, playing into the traditional “to-be-looked-at-ness,” the “desire to desire” (to use Mulvey’s and Doane’s phrases).

Thus, Beyonce’s last album, which contains a voiceover of Chimamanda Ngozi Adichie explaining the definition of feminism, discursively becomes the album on which she starts a song with the lyric “Let me sit this ass on you.” The conversation returns to its cultural comfort zone — not how we could achieve gender equality so that neither women NOR men are disciplined or punished into outmoded and damaging gender roles, but how women can empower themselves by, in bell hooks’ words, playing into “tropes of the existing, imperialist, white supremacist, patriarchal capitalist structure of female sexuality.”


** This post contains no photos of booty. The writer does not wish to participate in the continued objectification of other women by including gifs or images that turn those women into faceless body parts or mark out their bodies as exchangeable.


Lindsey Decker is a fifth-year Ph.D. candidate studying Film and Television in the Department of English.  Her dissertation examines questions of transnational cinema in self-reflexive British horror films.

Leave your Message, not your Trash

On a frigid yet sunny day in January 2014, I happened to find myself a couple of blocks away from the annual March for Life in Washington, DC. I was in the capital visiting the Folger Shakespeare Library for some research, and had arrived early in the morning for a long day of archival exploration (or, let’s face it, geeking out over old books). As the day went on and I occasionally stepped out for food or sunlight, I slowly realized what else was happening that day on the Hill. It was a special year for the March—the 40th anniversary—and thousands had managed to show up despite the 10-degree weather and recent city-stalling snowstorm. I myself am avidly pro-choice (and have been since I read The Cider House Rules in high school), so I will admit I was less than pleased to find myself among the throng of pro-life advocates. But I tried not to begrudge them their right to free speech, and instead went about my day just hoping that by the time I exited the archive for my evening commute, the hullabaloo would be over.

When I finally left the Folger, the march had finished and individuals were making their way out of DC. Yet what remained in their wake was the trash. Heaped in garbage bins up and down the streets were mounds of signs, flyers, stickers, and other protest paraphernalia from that day’s rally.  I first encountered the one below on the corner of 2nd and C Street SE, a block away from Independence Avenue. As I continued making my way to the Capitol South Metro stop, I came upon a large, discarded mass of signs apparently left by protestors afraid or unwilling to take them into the Metro station. There, gleaming under the setting winter sun, they lay abandoned. As I made my descent down the escalator, I could see signs and flyers littered across the tiled floor, soaked in snow and mud from the previous day’s snowstorm; an overall-clad Metro employee worked diligently to pick up the signs and place them in an already overflowing trash can.

[Image: Welshans 21.1]

I am positive that the amount of trash left by this protest is not unique.  In fact, the conservative internet was abuzz with critiques of similar trash heaps left behind by climate protesters in New York City in September. Those critiques highlight the apparent hypocrisy of a protest that championed environmental stewardship, yet left masses of trash in its wake.  Upon seeing the litter left by those attending the March for Life, I was taken aback by a similar sense of hypocrisy. A mere two weeks before the protest, Pope Francis had delivered his New Year’s Address to the Vatican Diplomatic Corps, which included, among other things, a critique of “the throwaway culture.” This culture, wherein individuals frequently throw away “food and dispensable objects” with impunity, upholds the value system that encourages women to discard unborn fetuses like food waste, the Pope claimed.

In this same address, the Pope noted that “the greedy exploitation of environmental resources” is also a “threat to peace,” and that Catholics are called to pursue “policies respectful of this earth which is our common home.” In short, Pope Francis called for an end to a culture of excessive trash and an increase in environmental activism. On that January day, I could not help but read the streets around me, littered with the snow-soaked signage of that day’s protest, as symbolic of the contradiction between the protestors’ message and its aftermath. If the individuals present were protesting the “throwaway culture” that can lead to abortions, they were doing so in a way that no doubt provided local landfills with an influx of trash.

The current protestors in Hong Kong have been praised, among other things, for their demonstration of environmental stewardship. As one protestor told the New York Times, “In this protest, we want to show our citizenship and our will to have a democratic government. Although this cleanup is a small thing, it is something that shows the values that all Hong Kong citizens should have.” For demonstrators in Hong Kong, their commitment to reducing conspicuous waste underscores their activist commitments; they see the connection between environmental rights and human rights.

Whatever the cause, it is worth considering the message conveyed by protest paraphernalia both during the demonstration and after it. The trash left by those marching against global warming in effect fueled the right’s criticism of the movement. Similarly, I could not take seriously a march that championed the sacredness of life, yet seemed to care so little for the planet on which future lives will live—or for the lives of those who would spend overtime hours restoring the city to its pre-march condition.  Yes, posters and signs are an effective means of communicating a message at a particular moment in time. But it behooves us to consider where those signs end up when we are done with them.



Melissa Welshans is a PhD Candidate in English at Syracuse University and is currently working on her dissertation The Many Types of Marriage: Gender, Marriage and Biblical Typology in Early Modern England. Melissa’s research is concerned with issues of gender and sexuality in early modern England, especially as they pertain to the institution of marriage. In her free time Melissa practices her nail art skills and snuggles with her husband and their two cats.