
Special Edition: How I Misplaced my Faith


Last month, when teaching from a Metathesis post I had previously written about being a Catholic scholar, I felt like a bit of a fraud. My intention in using the post was to give my students a look at my research on a rare book they had examined for class. However, when one of my students immediately remarked that the book smelled “you know, like when you’re at Easter Mass, and the priest is using incense,” my response was one of disconnect rather than recognition. Between submitting my syllabus for approval in April and teaching the content in September, I had misplaced my faith somewhere.

Somewhere, I say, but I know exactly where I misplaced it. I left it in the run-down Amtrak station in Schenectady, New York: a tiny room with a manual train schedule, a contaminated drinking fountain, and an air freshener that whined every quarter hour. I know I left it there because I spent my layover from Syracuse to Montréal in an airport-style seat bank, squished between my piles of luggage, reading Kaya Oakes’s The Nones Are Alright.


My unglamorous road from Damascus

Oakes, a freelance writer and lecturer at the University of California, Berkeley, who will be giving the annual Borgognoni lecture on Monday, compiled this collection of first-hand narratives to represent the faith processes of those who belong to (as the subtitle describes) “A New Generation of Believers, Seekers, and Those in Between.” The book finds its premise in the 2012 Pew report on American religion, which identified that nearly a fifth of American adults (and a third of those under thirty) have no religious affiliation. Some are “nones” — spiritual but not religious, they might be seeking a religion where they feel at home, or they might not. Some are “dones” — spiritually burned by their previous religious affiliation, they seek no association with formal religion. Some have never had religious affiliation; some had it, but found themselves unable to believe anymore. Whatever their motivations, a large number of Americans do not identify with religion as an institution, or as we previously knew it.

An updated version of the Pew Research Center’s findings from 2014.

With her book, Oakes looks beyond the numbers of the report to compile and showcase the stories of these “nones.” The pages are populated by lifelong nonbelievers, sudden converts to atheism, and exploratory practitioners of multiple faiths, as well as exiled divorcees, gay ex-Jesuits, and women scalded by institutional sexism. But as I sat in the chilly station, one story about a Jewish seminarian-turned-Jewish atheist almost seemed to be talking about me. This man had built his life and his career around institutional Judaism. But although he was able to negotiate a personal agreement whereby he would teach nontraditional classes in Hebrew school and observe Jewish holidays, after reflection, he discovered that he could not bring himself to worship a being in whose existence he could no longer believe.

Kaya Oakes’s The Nones Are Alright (Orbis Books, 2015)

Although I started my degree as a (technically) non-practicing Catholic and described myself to colleagues as more intellectually than spiritually interested in Catholicism, within a year I was fully embedded in my research on Early Modern Catholicism, both academically and personally. I felt like I’d finally embraced — with a few provisos and quid pro quos — the faith I’d grown up in as my own. I was a Catholic scholar writing about Catholicism with aspirations of tenure at a hippie Catholic college. Sometimes it all seemed a little excessive; the other Catholic scholars I interacted with (who weren’t Jesuits) led much more diverse lives. But I had a brand, a kind of a fandom, and the symmetry made so much sense.

Yet here I was, beneath the dingy fluorescent lights of the train station, where the phrase “agnostic Catholic” struck me with such a resonance that I felt as if the text had directly addressed me. I’d never been able to completely buy into large chunks of the catechism. In the meanwhile, I practiced. The rites and rituals, but also the leadership positions and committee work — I practiced and participated in these because they seemed meaningful, because I could, and because I should. I’d always just assumed, or hoped, that someday, someone would explain it all to me in a way that I could believe in. I realized now that I’d confused faith with trust: the more I distrusted the systems of oppression embedded in the Church (or that the Church was in bed with), the less I could truly believe that it all was true. I didn’t know what I believed anymore. And so I found myself in a little city, in a tiny Amtrak station, in a kind of long-distance communion with these “nones” — these people with whom I’d sympathized, but never empathized, before — now, my new fellow travelers.

I say I’ve misplaced my faith, because I wonder if it’s still around here somewhere. Like the Winnie-the-Pooh headband I misplaced as a child, which I knew must be in my childhood bedroom somewhere, and which I’m still half-convinced is in one of the boxes my family never unpacked after our big move nineteen years ago. Maybe someday I’ll find that headband; maybe one day I’ll stop feeling like an imposter when I go to Mass, or write for religious magazines.

Let me know if you see my faith in the lost-and-found.

Schenectady tore down its Amtrak station a few weeks after I passed through. An artist’s rendering of the future new station depicted an elegant, white, modern building, ostensibly with computerized schedules and clean drinking water. Maybe I’ll find my faith still there when I next pass through. Maybe I’ll be a believer again. Or maybe I won’t: maybe I’ll always be a seeker. Or, maybe, I’ll be somewhere in between.


Kaya Oakes will be leading a discussion for graduate students about her work on Monday, October 9, 10:30 AM – 12:30 PM in Hall of Languages 504, Syracuse University. RSVP to Ashley O’Mara (amomara@syr.edu) for readings.

Ashley O’Mara is a PhD student and teaching associate in the Syracuse University English program. She studies asexuality, celibacy, and the queer politics of Catholicism after the Reformation in Early Modern English literature. In her down time, she writes creative nonfiction and listens to Mashrou’ Leila. She has very strong opinions about hummus.


Dear Diary…

Dear Diary,

Today I find myself in graduate school; I look around and still wonder how it is that I came to be here. In the fourth grade I cried while reading The Lord of the Rings because I believed that one of my favorite characters had died. I would sneak out of the lunchroom to read The Wheel of Time in middle school, escaping to a future world in which the moon landing was known as the time people learned to fly in the stomach of firebirds. Chuck Palahniuk nursed me through high school anxieties, Bukowski through post-bachelor part-time coffee shop employment. Some time later I interned at a Fortune 500 company, and Woolf taught me that a cubicle was not a room. Arthur had a vassal who disrupted the court after obtaining the love of a Fair Queen; I compared labor strategies of multinational companies between liberal and coordinated market economies – every mythos has its own magic.

Mythos are comforting; they provide a sense of stability that belies chaos.
A narrative of elisions asserting its authority over origin that must be taken on belief.

What little evidence remains of a body’s passage through time and space would do little to comfort an empiricist, but I choose to dream. In time I will come to question their authenticity, were they ever my dreams or an overexposure to fantasy novels as a child? This is really an anxiety over whether or not I have an interiority – a crack in my phone renders the seamless continuity between body and technology an illusion. Were the avant-garde the last of the humanists?  

…legs wrapped around your stomach kissing the back of your neck…despondent and watching little flakes of gold twirling in the wind – 50 degrees on 9th of November…

I found myself in graduate school, lucid enough to know that I was not dreaming. A semester spent discussing the permeation of melancholy, mornings spent at the diner down the street reading over coffee and hash browns. A car full of strangers traveled six hours to make their voices heard; nihilism would not be revolutionary.

I will feel like a pastiche of the materials I confront, and take comfort in that we are all hybrids. I will grow sick of melancholy, consider returning to it for my next paper, settle on the fact that affect is separate from materiality and so it becomes a question of mediation.

Then I laugh.

I spend time pulling from the stacks, and although at times I have emitted a small growl, I find excitement when I discover more texts than I had expected. I cross paths with graduate students in the physics department; we discuss the stars. I find myself confronting new stories, reading for materials and energies that shape, and cannot shape, our bodies.

Today I am in graduate school, the humanist project has not ended.

Dear Diary,

Today I find myself in graduate school, unsure if it is the translation or the theory that doesn’t make sense. I’m sitting in a class surrounded by people I just met. I’m wondering at what point I’ll feel like a graduate student—if I can even define “graduate student”? Graduate students look like the people around me. Allegedly, I look a lot like them.

Someone once told me individuals who hesitate when talking in a room full of people are afraid because everyone else looks like a complete human being, like they are in control of their bodies. I realize first-person perspective is nerve-wracking because I do not see a composed body. I can only see hands, gestures, flailing limbs that, I hope, are somehow clarifying my point. I can only hear how weak words sound when they are mumbled into my lap.

One day, we will talk about identity politics, about identification, and debate whether or not words have power. I don’t know yet that this will become relevant all too quickly. One Wednesday in November, I will walk onto campus and feel the tired breathing of bodies, like mine, that were up until 4 a.m. the night before.

I will spend this day and the coming weeks waiting for, hoping for, dreading the moment someone wants to talk. This anxiety will be more than just a product of introversion. I will interrogate the expectations attached to this side of the desk. There’s a frail aura of authority that comes with being the one already seated when someone enters a room.

Eventually, I will need to learn how to handle the guilt of looking away to get things done, to decompress, to not lose hope. I will fight back the feeling of sickness, the stomach acid associated with the privilege of being able to think about decompressing.

I will learn that so much of graduate school feels like learning how I’m probably being irresponsible. Why new historicism? Look what happens if you combine feminist criticism with that. Didn’t you have an interest in class at one point? If you’re just looking at the feminist individual, are you inadvertently “reproducing the axioms of imperialism” in nineteenth-century British literature? I’m so uncomfortable with the idea of siphoning off problematic portions of texts to read other points I have personal investments in. How close is this to paranoia?

But then, I breathe.

One day, I will relish the feeling of breaking ground, of fingers flying over keys, the paradox of excited exhaustion. I will remember the way strangers’ smiles became familiar fixtures, and how I learned to read and laugh again.

Today, I find myself in graduate school. I say it is okay to feel fulfilled while still fulfilling.

Empathy and the Danger(s) of Disengagement


For the past couple of years, I’ve been keeping a list.

Admittedly, it’s not an original concept, being a mental exercise adapted from one of many optimistic Pinterest boards encouraging meditative mindfulness and gratitude in the upcoming New Year. Instead of coming up with a soon-to-be neglected resolution, this effort at self-improvement requires little more than keeping a record of positive memories, noteworthy events, or otherwise “good things.”

In addition to brown paper packages tied up with strings, my list of “Good Things to Remember from 2016” ranged from personal achievements to exciting sports victories, cultural and artistic high points, and celebrated milestones: in February, the Carolina Panthers – my home state’s football team – made it to Super Bowl 50, where a spectacular halftime performance by Beyoncé Knowles-Carter called attention to the Black Lives Matter activist movement on the biggest stage in televised sports. In April, Knowles-Carter released her powerful visual album, Lemonade, an unflinching tribute to black women, honoring their voices, and acknowledging the struggle of living while black in the United States. My sister was married in May, my brother graduated from high school in June, and Lin-Manuel Miranda’s transformative musical, Hamilton, was nominated for sixteen Tony awards, and won eleven. After nearly eight months of intensive study, at the end of September I successfully passed my department’s Ph.D. Oral Qualifying Exam, and I subsequently took an impromptu celebratory trip to visit an old friend in Halifax.

Looking back, however, it’s easy to see the gaps in the record. Sometime around early June, the number of items in the list began to dwindle, and around mid-November, the documentation completely stops.


Unsurprisingly, as pieces of cultural commentary, Internet memes are more productive and illuminating than many realize.

To say that the year 2016 has been fraught with tension is a tremendous understatement.[1] As Thomas Paine wrote, these are the times that try men’s [and women’s] souls, and in these past twelve months, it seems like we’ve run the gauntlet, a hundred times over. This is the year that Taiwan may become the first East Asian nation to achieve marriage equality, and the year that the deadliest shooting in American history was carried out against LGBTQ+ people at Pulse nightclub in Orlando. This was the year of the United Kingdom’s decision to withdraw from the European Union, of the spread of far-right populist fervor across Europe, and the rise of white supremacist ideologies in the highest political offices and pulpits in the United States. The 2016 Summer Olympics in Rio de Janeiro saw, for the first time, a Refugee Olympic Team competing as independent participants, and this is the year that the Syrian Refugee Crisis reached its most desperate peak.

Political forces and governmental stratagems seemingly out of control dominated the domestic and international landscape, plaguing media outlets with misinformation and fake news. We watched tragedies unfold in real time,[2] counted the deaths of too many beloved and inspiring figures, and anxiously waited for the other shoe to drop, and keep on dropping.

In the face of all this, we have prepared to resist, and continue to call others and ourselves to higher standards of vigilance and accountability. We must continue to read, to think, to create, to teach and engage. This month’s series on empathy and education has attempted to provide a space for admitting our fears, confronting difficult questions regarding possible failures, and supplying encouragement for the task now, and ahead.

Every winter, my family stages a viewing of Peter Jackson’s Lord of the Rings trilogy, and the scene captured above, from The Two Towers, has always proven to be enormously compelling. Coming at the end of one of the film’s two climactic battle scenes, Frodo’s haggard vulnerability and Sam’s motivational speech resonate with pathos, and display the power of oral tradition, the written word, and the driving force of narrative in general.

While stories may drive us, oftentimes, “most fantasy provides an excursion from the normal order of things, in the same way that carnival and Saturnalia were an inversion of the normal order, a letting-off of steam in order to facilitate a return to business-as-usual.”[3] Following the Electoral College’s dispiriting conformity to historical tradition, and several weeks after the initial shock, we find ourselves now couched in the festive spirit of holiday celebrations, and all-too-ready to turn over a new leaf. It may be tempting to “get on with our lives,” as the president-elect lately urges, and to pull back from the front lines, and not necessarily forget, but forgive and quietly disengage.

In times like these, although stories remain important, I think more often of the impassioned plea Merry issues to the Ents on their decision to abstain from action, to “weather such things as we have always done.”

“You are young and brave,” the hobbit is told, by much elder and wiser folk, then cautioned, “But your part in this tale is over. Go back to your home.” His friend Pippin tries to reason with him and says, “It’s too big for us. What can we do in the end?”

Fiction can no longer serve only as an escape from reality; academics can no longer afford to distance themselves from that which appears too startling, too surreal,[4] too beyond our capabilities to successfully engage. My list of “Good Things to Remember from 2017” may be a bit more difficult to attend to, but one of the first things at the top of that list will be the opportunity to keep on teaching, and to lead students through learning about race and literary texts, to seek out difficult yet productive discussions, and to foster communication and understanding.

There is good to look after, and our part in this tale is never too big to fight for.

[1] For those in need of hopeful optimism, it is equally important to recall that a lot of positive changes have been put into effect this year. To begin, here is another list, this one detailing “99 Reasons 2016 was a Good Year” (https://medium.com/future-crunch/99-reasons-why-2016-has-been-a-great-year-for-humanity-8420debc2823#.6zrnibfvu)

[2] In an insightful piece on the consciousness of language use and suicide, Chinese author Yiyun Li complicates the concept of a tragedy in terms of private pain and public acknowledgement: “That something is called a tragedy, however, means that it is no longer personal. One weeps out of private pain, but only when the audience swarms in and claims understanding and empathy do people call it a tragedy. One’s grief belongs to oneself; one’s tragedy, to others” (“To Speak is to Blunder.” The New Yorker: Personal History. 2 January 2017 Issue).

[3] This fascinating article analyzes the differences of empathetic and intellectual effort necessary when engaging in the genres of science-fiction versus fantasy, and analyzes the models of resistance offered up by key texts from each genre: https://godsandradicals.org/2016/12/03/models-for-resistance/

[4] Ultimately, instead of “fascism,” Merriam-Webster selected “surreal” as the 2016 word of the year.


Empathy and Education: The Double Burden (Part II)

In the numerous disciplines comprising that artistic and cultural field we call “the humanities,” we who self-identify as scholars must constantly be on the defensive regarding our own choice of profession. An increasingly corporatized world sees banks encouraging ballerinas and actors to become engineers and botanists instead, and federal agencies such as the CBO suggesting reductions in federal funding for the Arts and Humanities, since “such programs may not provide social benefits that equal or exceed their costs.”

This cacophony joins with countless other voices in our own lives: those cautioning us about the shrinking opportunities of the academic job market, who gently chastise us for dabbling in a passion instead of pursuing a career that will prove economically viable, and otherwise reminding us that the humanities are not where the dollars – or pounds or euros, among other forms of financial credit – lie. There is no Wall Street of literature, no actual stock market of philosophical ideas, and little funding to be found in dusty bookshelves and puzzling over words, ideas, and their meanings.

Why even bother?

As the old adage goes, “Those who don’t study history are doomed to repeat it.” A bastardized proverb, perhaps, with uncertain origins, and appropriated right and left – often by the political and ideological Left and Right – for various ends. The myth of linear progress haunts us with these lessons of the not-so-distant past. Especially in the awareness of unavoidable pitfalls, regressions, and obstructions in the hard-fought effort forward and upwards, we take into consideration the wisdom of looking over our shoulders and consulting voices that tell tales of suffering and horror never to happen again.

For those of us working in the fields of analyzing literature and encouraging critical thought, our reasons for choosing to engage with such materials on a day-to-day basis have long found ethical expression in empathy. We aim to broaden awareness of self and others, and to celebrate multicultural differences by considering multiple avenues of theoretical exploration. This is why we construct syllabi with an eye toward incorporating more writers outside the realms of canonical literature, the majority of these names belonging to women writers, and writers of color. For many of us teaching at the collegiate level, or in higher education in general, critiquing the norms of institutions, modeling thoughtful self-reflexivity, and teaching students how to close-read all goes hand-in-hand.

On some level, either personally or with boisterous confidence, we all wish to believe in our role to “Make America Smart Again.” Our faith in education fueled our optimism in a future defined by intelligence and inclusivity, and many a liberal-leaning Op-Ed piece declared that the one advantage of Britain’s recent referendum to leave the European Union was its value as both instruction and cautionary tale:

“One of the few good things about Britain’s vote to leave the European Union is the rich curriculum of lessons it offers leaders and electorates in other democracies…

Across Europe and in the United States, politicians can either respond to these cries of protest or face something worse than Brexit.”[1]

Was such belief a stroke of overconfidence?

Following November 8th, with electoral results and statistics rushing in from all sides, bleak disappointment followed closely by crushing realization began to settle in. These gut-reactions mingled with irritation at the instantaneous, yet contradictory impulse to assign blame:

“Why Did College-Educated White Women Vote for Trump?” (The New York Times)

“Blame Trump’s Victory on College-Educated Whites, Not the Working-Class” (New Republic)

“Trump Won Because College-Educated Americans are Out of Touch” (The Washington Post)

Such was, and still is, enough to shake one’s faith in purposeful education. In the face of all this, what is the point of what we teach? These are the questions to haunt us now: does the work of our lives actually take any root? Should intellectuals shoulder the blame of having morphed into snobbish cultural elites?

Does investment in efforts toward empathy really yield any ideological change?


In the days and weeks that have followed the 2016 Presidential Election, attempting to navigate and teach in this new reality has proven unsettling. All of a sudden, we have swerved from the academic postmodern into a maelstrom of media-influenced misinformation, Twitter rants, and unprecedented threats against freedom of speech, critique,[2] and intellectual or creative expression.

Welcome to the new American age, where everything about knowledge is made up, and apparently, points of truth and facts no longer matter. While Merriam-Webster considers its top result of 2016, Oxford Dictionaries has chosen “post-truth” as its word of the year. As NPR reports, “The word has been around for a few decades or so, but according to the Oxford Dictionary, there has been a spike in frequency of usage since Brexit and an even bigger jump since the period before the American presidential election…feelings, identifications, anxieties and fantasies, that’s what actuated the electorate. Not arguments. Not facts.”

Perhaps this struggle we now face started long before Election Day; now, it seems more urgent than ever. From a fake news epidemic of so virulent a strain that Pope Francis felt compelled to condemn the “sin” of perpetuating misleading information, to a linguistic battle over how to address the Ku Klux Klan-backed “Alt-Right” White Supremacy movement, words, ideas, and the ideological weight they hold have become weapons and flashpoints.

Caption: “Hey! A Message to Media Normalizing the Alt-Right”

Source: Late Night with Seth Meyers, 7 December 2016

Speaking truth to power has never been an easy task, and the struggle against the normalization of silencing dissent is, and will remain difficult. While we elegize and self-reflect, we also turn to writers such as Zadie Smith to remind us that “history is not erased by change…progress is never permanent, will always be threatened, must be redoubled, restated, and reimagined if it is to survive.”[3] Likewise, Chimamanda Ngozi Adichie speaks of the dangers of complacency and neutrality – and goes a step further to remind us of the boundaries of empathy:

“Now is the time to resist the slightest extension in the boundaries of what is right and just. Now is the time to speak up and to wear as a badge of honor the opprobrium of bigots. Now is the time to confront the weak core at the heart of America’s addiction to optimism; it allows too little room for resilience, and too much for fragility. Hazy visions of ‘healing’ and ‘not becoming the hate we hate’ sound dangerously like appeasement. The responsibility to forge unity belongs not to the denigrated but to the denigrators. The premise for empathy has to be equal humanity; it is an injustice to demand that the maligned identify with those who question their humanity.”[4]

Words can obfuscate, enlighten, and entrap – and these complexities are elements we anticipate and enjoy when working with literary texts and critical theories. Although the questions surrounding a liberal or humanities-affiliated education may still haunt us, nowhere else can one find a space more prepared for the deconstruction of flashy rhetoric and the unpacking of ideology. Beyond the humanities, critical engagement with disparate voices, texts, and the ideas they represent pertains to disciplines all across the board, and intellectual endeavors of all stripes. We have many more lessons to teach, and much left to learn. This is our task, and may we rise to meet it.

[1] “Learning from Britain’s Unnecessary Crisis.” E.J. Dionne Jr. The Washington Post. 26 June 2016.

[2] Most recently, the union president representing workers at the Indianapolis branch of Carrier Corp. criticized the business deal the President-elect enacted late last month. Chuck Jones, the leader of United Steelworkers Local 1999, challenged Trump to authenticate his claims, and soon afterwards began receiving anonymous death threats.

[3] “On Optimism and Despair.” Zadie Smith. The New York Review of Books. 22 December 2016 Issue.

[4] “Now is the Time to Talk About what we are Actually Talking About.” Chimamanda Ngozi Adichie. The New Yorker. 2 December 2016.


Vicky Cheng is a fourth-year Ph.D. student whose research and teaching interests center on nineteenth-century British literature and culture, with a specific focus on queer and feminist readings of Victorian texts. Her proposed dissertation project finds its structure through queer methodology, and will investigate Victorian novels and conflicting representations of gendered bodies within. Other scholarly interests include mediations between textual description and visualization, the structures of power surrounding the interplay of non-normative bodies and disruptive desires, and the complexities of embodied sexualities.

Empathy and Education: The Double Burden (Part I)

A couple of weeks ago, toward the end of our class’s unit on “Thrills, Sensations, and the Ethics of Nonfiction,” I assigned my students the University of Chicago’s Welcome Letter to the Class of 2020 alongside Sara Ahmed’s thought-provoking “Against Students” (June 2015). The former, a document separately decried or praised as patronizing and oppressive or timely and appropriate, comes from a private university that prides itself as “one of the world’s leading and most influential institutions of higher learning,”[1] and has a notorious reputation among academics for fostering an ultra-competitive – and potentially hazardous – environment for its students.

Following a word of congratulations, the letter states:

“Our commitment to academic freedom means that we do not support so-called ‘trigger warnings,’ we do not cancel invited speakers because their topics might prove controversial, and we do not condone the creation of intellectual ‘safe spaces’ where individuals can retreat from ideas and perspectives at odds with their own.

Fostering the free exchange of ideas reinforces a related University priority – building a campus that welcomes people of all backgrounds. Diversity of opinion and background is a fundamental strength of our community. The members of our community must have the freedom to espouse and explore a wide range of ideas.”

A number of think pieces had their say, and the talking heads gave comment. In response, educators and administrators from various institutions defended their policy of creating safe spaces and giving trigger warnings; using the same terminology, they all argued for the same purpose: academic freedom and “moral responsibility.” Proponents of the University of Chicago’s pedagogical stance lauded this strike against so-called “political correctness,” insisting that incoming students should stop expecting a protective safety net to cushion controversial speech and difficult issues. Safe spaces, it was implied, or outright declared, are a cocoon of muffled sensitivities freshmen ought to have outgrown by their first semester of college.

Ahmed’s piece, while predating the University of Chicago’s letter by almost a year, exposes similar “sweeping” generalizations made in critiques of higher education, while laying bare the contradictions in the ideals the letter claims to espouse. Students who are often blamed as oversensitive, coddled, and otherwise too entitled to address “difficult issues” bear the brunt of critique in the wider battle over, and backlash against, the dreaded brand of PC-neoliberalism. In actuality, those who oppose trigger warnings often do so at the expense of marginalized groups and students as a whole, and not in service of a wider range of critical discussion.

“The idea that students have become a problem because they are too sensitive relates to a wider public discourse that describes offendability as a form of moral weakness and as a restriction on “our” freedom of speech. Much contemporary racism works by positioning the others as too easily offendable, which is how some come to assert their right to occupy space by being offensive…

This is how harassment can be justified as an expression of academic freedom.”

Rhetorically, those who use this toxic, masculinist mantra to “man up and quit being so offended” imagine its directed audience as a bunch of whiny, thin-skinned spoiled brats. It has become a “no guts, no restriction of hateful speech, no glory” approach modified for instructional spaces. Unsurprisingly, it represents yet another attack upon we Millennials of the generation of participation trophies; we special snowflakes-turned-Social Justice Warriors; we who dare protest for a minimum wage of $15/hour, refuse to consider any human being “illegal,” and demand equal rights under the law for an ever-expanding catalogue of identities, intersectionalities, and sexualities.

PC.png

The thing about those of us who make it our job to deal in words is that we know what they say about us. Sometimes, we respond with sarcasm and memes.

Apparently, to many, intellectual boldness – or the tricky concept of free speech in general – is incompatible with thoughtfulness, compassion, or the necessity of imagining and reflecting upon the consequences of such speech. But at their core, intellectual efforts rest upon a foundation of empathetic engagement, curiosity, and responsible efforts to give voice to those who have previously been silenced.

For the most part, we who teach are expected to keep personal politics out of the classroom. Each student ought to have their say, and must not fear their grade may suffer due to a difference of religious, political, or personal ideological belief. The classroom is a place for critical engagement and analytical inquiry, but it should not act as a place of conversion, or the base of any particular soapbox.

On the other hand, we introduce students to the concept of ideology, and invite them to critically question previously held beliefs; we encourage students to critique ideas, and not the individual espousing them. Disagreement should not deter discussion, so long as speech remains respectful and productive. We are all here to learn, is the unspoken catchphrase of the liberal arts education, and we learn best when we question what it is we think we know.

I presented the University of Chicago’s welcome letter to my class without trepidation – not because I expected every student to agree with the material, or to contest it straight away; rather, their job was to consider the rhetorical strategies being employed, and foster an interpretive reading based upon textual evidence. Thus far, we had studied texts through the framework of social critique and purposeful writing, interrogating the usefulness of nonfiction texts that have outlived their writers. We questioned the boundaries of truth and fiction, fantasy and reality, and spent a good portion of the semester discussing the importance of readers’ ethical responses to texts presenting themselves as unproblematic, factual, and objective. They held productive class discussions on tone-policing, white privilege, and the conflation of violence with sensational journalism and the commodification of wartime horror. These students, most of them incoming freshmen, rose quickly to the challenge of tackling these subjects, with vigor and great respect for the material, and one another.

The students of this generation “aren’t snowflakes, and they don’t melt,” Yale professor Steven Berry writes, in admiration of the resiliency of students who were still able to attend class and complete an exam the morning of November 9th. The same resiliency we admire in our students becomes so much more difficult to embody when we, students and scholars and educators alike, consider how much more dangerous our world has suddenly become.

Ten days after the U.S. election, eight hundred sixty-seven hate incidents were reported to the Southern Poverty Law Center, the majority of these occurring in K-12 schools. Since then, an organization named Turning Point USA, which purports to “fight for free speech and the right for professors to say whatever they wish,” has created a Professor Watchlist, with profiles of “professors that advance a radical agenda in lecture halls” – the majority of those listed professors being women and persons of color.

post-election-hate

“Ten Days After: Harassment and Intimidation in the Aftermath of the Election” Source: Southern Poverty Law Center, https://www.splcenter.org/20161129/ten-days-after-harassment-and-intimidation-aftermath-election

Without giving in to paranoia, the project of providing safe spaces appears more daunting than ever. Despite this, while the classroom may not be a pulpit or a soapbox, it nevertheless remains a platform for instruction. Our determination to forge ahead despite fear and anger represents both the privilege and the burden of educating with empathy, and an ethical responsibility we owe to ourselves, and those we aim to instruct.

[1] This quote comes from the University of Chicago’s Wikipedia page (https://en.wikipedia.org/wiki/University_of_Chicago); the university’s homepage and admissions pages proudly greet visitors as “a private, nondenominational, culturally rich and ethnically diverse coeducational research university…committed to educating extraordinary people regardless of race, gender, religion, or financial ability.” (http://www.uchicago.edu/)


Vicky Cheng is a fourth-year Ph.D. student whose research and teaching interests center on nineteenth-century British literature and culture, with a specific focus on queer and feminist readings of Victorian texts. Her proposed dissertation project finds its structure through queer methodology, and will investigate Victorian novels and conflicting representations of gendered bodies within. Other scholarly interests include mediations between textual description and visualization, the structures of power surrounding the interplay of non-normative bodies and disruptive desires, and the complexities of embodied sexualities.

“Of Course You Know…”: Deconstructing the Privilege of Knowledge

Some time ago, a colleague of mine was leading discussion in class, and he offhandedly remarked that, of course, we all knew that Aristotle had spoken of the same issue we were discussing in his Nicomachean Ethics. The way in which he made the utterance made it clear that, if we did not, in fact, know this reference, we were somehow lacking, that we had clearly missed out on some key part of being a truly educated person and that, equally clearly, graduate students in an English department should certainly be conversant with these sorts of (seemingly offhand) references.

Now, as a Classics major in undergrad, I was passingly familiar with Aristotle’s works (though I will admit that I had not read the Nicomachean Ethics in approximately ten years, so my recollection of it was rusty, to say the least). However, even I felt that this was a thinly-veiled attack on those in the classroom who, for whatever combination of socio-economic and educational reasons, might not have had access to the same store of shared knowledge that my colleague was referencing. Whether or not the attack was malicious is impossible to say, but there was no question that many in the classroom felt alienated by this comment–and, just as importantly, by its delivery–and that a valuable moment of shared learning was therefore compromised.

What distressed me the most, however, was how built into that moment of not-so-subtle shaming was a profound sort of privilege of which my colleague seemed to be utterly unaware. It no doubt never occurred to him that some of us may have come from high schools or undergraduate institutions that did not place such an emphasis on the Western canon, or that emphasized other important works of Western philosophy that were not dominated by dead white men. So embedded was my colleague in both his class and knowledge privilege that any alternative to his ways of knowing seemed to exist beyond the pale of acceptability.

Nor is this sort of privileged posturing and knowledge shaming limited to graduate students (who, it must be said, often face their own challenges; the pressure to perform one’s expertise is particularly acute in the graduate classroom). I have, on numerous occasions, heard faculty from various universities and departments dismiss the level of “basic knowledge” that today’s undergraduate students possess, implying that the students have somehow fallen down on the job in preparing themselves for their college education. This is not to say that the faculty actually think this, mind you, only that it is often heavily implied in the way in which these critiques of students are delivered.

This is not to say that there aren’t real deficiencies in the preparation that many high school students undergo as they prepare for their academic futures in college. What troubles me is the implication that somehow the students are to blame and, relatedly, that our privilege as learners and knowers is somehow natural and that this renders us somehow superior to the students we teach. Rather than attempting to understand the unique perspectives that students bring to the classroom–including and especially their socioeconomic status–these assumptions presume that there is a standard to which everyone should be held, regardless of their background.

Periodically, I will catch myself making assumptions about the body of knowledge that my students bring into the classroom. I have become so entrenched in the world of academia–in particular, so accustomed to being around my graduate school colleagues at a private, well-funded institution–that it sometimes doesn’t occur to me that not everyone has had the same privilege that I have. When I lose track of that privilege, when I assume that my students have a certain knowledge and then shame them when they don’t, I lose a valuable sharing opportunity.

As a result, I have begun making a conscious effort to meet my students where they are and to help them access and share the same love of knowledge and learning that I have always possessed. I encourage them to ask me if they do not understand something or if I make a reference (or even use a word) that they do not grasp, because only by doing so can I ensure that we are all learning and engaging with knowledge together. Rather than ensconcing myself in my privilege, I actively work to deconstruct it.

This more nuanced understanding of socio-economic and knowledge privilege allows me, I believe, to be a more compassionate and effective educator. I can use my knowledge, accrued and developed through years of undergraduate and graduate training, to meet students on their own terms and show them new ways of thinking and engaging, even as they also educate me. Rather than viewing their lack of knowledge as a problem to be corrected, I see it instead as an opportunity.

And that, I think, benefits both myself and my students.

 

Hidden mental health troubles in the ivory tower (13 Nov. 2015)

An initial reason for not sharing my experiences with depression was a persistent fear that people would think I was not strong enough for academia. My identity was so tightly wrapped up in my productivity, my latest department seminar, and my C.V. that the very thought of someone questioning my academic grit was enough to keep me from seeking treatment or even admitting to myself that something was wrong.

Fig 1

Fig 1: photo credit: D.A. Sonnenfeld

But I did have enough grit to excel in academia; I was tough as nails, strong as diamond, but that had very little bearing on my being strong enough to care for myself. Fortunately, around this time, I ran across a post by a favorite scientist blogger. He queried how many of his readers took a prescription drug, any drug, to enable successful academic performance. One in three of the over 150 reader responses in his unscientific, yet illuminating, poll confirmed the professional need for prescription drugs. One in three. These results were posted when I still shied away from talk therapy, let alone medication. It dawned on me that muscling through mental illness wasn’t the only option. Moreover, pushing through might not be a very good option.

A trip through academic blogs suggests not only that mental illness is pervasive in academia, but that there is a paucity of research on mental health in the ivory tower. Being a scientist myself, I tried to find some nice, tidy statistics about the prevalence of mental illness in academia versus the general public, but repeatedly came up empty-handed. The best evidence comes from two studies, from the U.K. and Australia. A survey from the U.K. indicates nearly half of all academics report high or very high stress levels, though specific connections to anxiety, depression, or other disorders were not explored. Additionally, the magazine New Scientist reports that an Australian study found three to four times the incidence of mental illness among academics compared to the general population. Unfortunately, the Australian study is behind a paywall that, even with my university credentials, I can’t access to explore further.

There are a few factors that I propose contribute to the frequency of mental illness in academia, particularly among graduate students. Anyone who’s spent time in a graduate program or has loved someone working on their graduate degree knows the pressure to achieve can be intense. Graduate research can be an isolating experience as you zoom in on an ever-narrowing topic of study. Academia is also filled with rejection. Rejection of manuscripts, unfunded grant proposals, failed experiments, tenuous committee meetings, poorly received presentations, and the list continues. Unless you have a supervisor dedicated to championing academia’s infrequent successes, which I fortunately did, all the perceived failures can lead to a demoralizing collection of years.

Fig 2

Fig 2: photo credit: Greg Dunham

Another factor that’s less discussed, but I think is important to consider, is the predisposition of academics. I can only speak specifically to my observations in my little corner of Biology, but I suspect there is great overlap with other disciplines. We’re a detail- and data-oriented bunch, trained to engage the rational rather than the emotional side of our brains. We tend to be over-achievers, the highest achieving of whom can still feel their contributions to science are not enough. Partitioning off important emotions, or even ignoring them in favor of the path to achievement, certainly did not help me with self-awareness.

In my experience, I used the academic pursuit to deny myself care. I tried to logic my way out of depression – I had a great partner and friends, I was successful in my work, it was simply illogical that I felt the way I did. In a last grasp at ignoring that something was very wrong, I turned harder into my research, attempting to fill my emptiness with data collection. It didn’t work.

One of the initially perplexing aspects of my depression was the timing. Depression didn’t follow a series of rejections, arise at a period of particularly high stress, or spring from a volatile relationship with my advisor and colleagues; depression hit when things were going well. After much discussion with my therapist, we decided that it was precisely the lack of academic or professional pressures to fixate on that unveiled the trouble underneath. My depression was not situational in the sense of a stressful external event causing my symptoms. It was clinical. It was major depressive disorder.

Fig 3

Fig 3: photo credit: Fresaj

In my case, genetics and early family environment most influenced my depression. Depression shows up on both sides of my family tree, certainly in at least the most recent generations, now that it’s become more societally acceptable to discuss mental health. I’d prefer not to delve into the early family environment portion, but I will say that overt abuse isn’t the only thing that can compromise a secure childhood. In short, many factors insidiously aligned to lead to my depression.

I continue to be frustrated at the lack of discussion of mental health in academia, despite its pervasiveness. At no point during any of the orientations I attended as a graduate student was there mention of coping with mental illness while in grad school. If the existence of mental health facilities on campus was discussed, it was brief enough to be promptly forgotten. Discussions with fellow graduate students revealed that I am certainly not the only one to deal with depression. I’m also not the only one who has been hospitalized while in grad school. I can’t help but think that if I had been aware of how common mental illness is in academia, and if I had known that there is no shame in obtaining treatment, I may have sought help much sooner.


As a Biology Ph.D. candidate, Liz Droge-Young studies the incredibly promiscuous red flour beetle. When not watching beetles mate, she covers the latest science news on campus for Syracuse University’s College of Arts & Sciences communication department. She is also a mental health advocate, a voracious consumer of movies, and a lover of cheese.