Public Humanities

“I come to bury Caesar, not to praise him”: Shakespeare and the Politics of Interpretation

[5-7 minute read]

During my last month writing for Metathesis, I talked about the contemporary desire to find political meaning in Shakespeare’s plays. Then in June, Shakespeare in the Park staged a production of Julius Caesar in which the actor playing Caesar consciously invoked the image of President Trump, mimicking his vocal affectations and his mannerisms. The performance was met with public backlash, as voices responded angrily to the idea of a publicly funded arts institution staging the assassination of the sitting President. As someone who studies early modern drama, I found it surreal to watch the nation spend a few days in the middle of summer debating how to properly interpret Act 3 of Julius Caesar. For a moment in June 2017, the text of a play from 1599 about the death of a Roman consul in 44 BC was at the heart of a public debate over the relationship between art and politics.

Per the performance, this was a Caesar who could stab a man on Fifth Avenue and not lose a supporter.

Most surprising to me was the outpouring of reactions to the controversy that framed it as a dispute over interpretations of the play. These responses attempted to announce, as clearly as possible, that Julius Caesar is not a play that endorses political violence – and they were built upon textual arguments and close readings.[1] These responses, from sources like The Guardian and The New York Times to The AV Club and The Atlantic, centered on the idea that a sufficiently skillful reading of the text of Julius Caesar would clear up any confusion over whether or not the production supported the actions of the Roman conspirators. By extension, this assumption meant a skillful reading would also appropriately address – and perhaps deflate – any anger over what the play was perceived to say about President Trump. For these responses, the portion of the public angry about the performance was simply missing the point of the play, or, as The Atlantic frames it, indulging in “[m]isplaced [o]utrage.” The Guardian piece brings in Stephen Greenblatt to explain how dissenters are missing “the point of the play.” Even the statement by the theater itself is built partially on this premise, stating, “Shakespeare’s play, and our production, make the opposite point: those who attempt to defend democracy by undemocratic means pay a terrible price and destroy the very thing they are fighting to save.” Invoking the authorial voice of Shakespeare alongside its own production decisions, the statement reads as not only a defense of artistic integrity, but also a pointed claim: at the heart of the controversy is a misreading of Julius Caesar.

These responses also seem intent on producing a singular interpretive lens through which to view the play. They gloss over the fact that while one can read Julius Caesar as a play deeply skeptical of the conspiratorial action of figures like Cassius and Brutus, it can also be read as a play in which a demagogue exploits a mob of Roman citizens, preying upon their anger and resentment to compel them to destructive violence – a reading made equally prescient by the scene in which the mob tears a poet to shreds because they dislike his verses. For me, however, the fascinating aspect of these responses lies less in the specific interpretations they provide for Julius Caesar than in the underlying assumption that the entire ordeal stemmed from a debate over the textual meaning of Act 3, with the accompanying suggestion that the matter would be cleared up by the authoritative voices of individuals who were simply better readers. This move signals an important divide in how the various voices in the conversation conceptualize the place of the stage (and the other arts) in public discourse. Shakespeare, these responses seem to imply, is more in danger of being misread than anything else. The political undercurrents of the play are not dangerous; rather, the possibility that they will be misunderstood is dangerous, and it is that possibility which must be warded against.

Central to this conversation is the implication that the theater is a site of political tension and that the interpretation of this tension can be, and often is, a deeply political act. This is certainly not a new debate. For another examination of the relationship between theater and the present administration, see Ashley O’Mara’s Persuasive Performance: Theater and Conversion. Tensions surrounding the theater and the role of drama in the Anglophone world date back to the foundation of the first public theaters, and in my next post, I’m going to explore how debates over the place of the theater in public political life have evolved since Shakespeare’s works were first performed on the London stage.


[1] Putting my own personal interpretative cards on the table: Julius Caesar is not a play that endorses political violence. Also, it should be noted that the original story that generated anger around the performance neglected to mention that the play in question was Julius Caesar.

Evan Hixon is a third-year Ph.D. student in the English Department. His studies focus on Early Modern British theater with an emphasis on Shakespeare, political theory and Anglo-Italian relations. His current research work examines the rise of English Machiavellian political thought during the reign of Elizabeth I.


Empathy and the Danger(s) of Disengagement

For the past couple of years, I’ve been keeping a list.

Admittedly, it’s not an original concept, being a mental exercise adapted from one of many optimistic Pinterest boards encouraging meditative mindfulness and gratitude in the upcoming New Year. Instead of coming up with a soon-to-be neglected resolution, this effort at self-improvement requires little more than keeping a record of positive memories, noteworthy events, or otherwise “good things.”

In addition to brown paper packages tied up with strings, my list of “Good Things to Remember from 2016” ranged from personal achievements to exciting sports victories, cultural and artistic high points, and celebrated milestones: in February, the Carolina Panthers – my home state’s football team – made it to Super Bowl 50, where a spectacular halftime performance by Beyoncé Knowles-Carter called attention to the Black Lives Matter activist movement on the biggest stage in televised sports. In April, Knowles-Carter released her powerful visual album, Lemonade, an unflinching tribute to black women, honoring their voices, and acknowledging the struggle of living while black in the United States. My sister was married in May, my brother graduated from high school in June, and Lin-Manuel Miranda’s transformative musical, Hamilton, was nominated for sixteen Tony Awards and won eleven. After nearly eight months of intensive study, at the end of September I successfully passed my department’s Ph.D. Oral Qualifying Exam, and I subsequently took an impromptu celebratory trip to visit an old friend in Halifax.

Looking back, however, it’s easy to see the gaps in the record. Sometime around early June, the number of items in the list began to dwindle, and around mid-November, the documentation completely stops.

2016

Unsurprisingly, as pieces of cultural commentary, Internet memes are more productive and illuminating than many realize.

To say that the year 2016 was fraught with tension is a tremendous understatement.[1] As Thomas Paine wrote, these are the times that try men’s [and women’s] souls, and in these past twelve months, it seems like we’ve run the gauntlet a hundred times over. This is the year that Taiwan moved toward becoming the first East Asian nation to achieve marriage equality, and the year that the deadliest mass shooting in modern American history was carried out against LGBTQ+ people at the Pulse nightclub in Orlando. This was the year of the United Kingdom’s decision to withdraw from the European Union, of the spread of far-right populist fervor across Europe, and of the rise of white supremacist ideologies in the highest political offices and pulpits in the United States. The 2016 Summer Olympics in Rio de Janeiro saw, for the first time, a Refugee Olympic Team competing as independent participants, and this is the year that the Syrian refugee crisis reached its most desperate peak.

Political forces and governmental stratagems seemingly out of control dominated the domestic and international landscape, plaguing media outlets with misinformation and fake news. We watched tragedies unfold in real time,[2] counted the deaths of too many beloved and inspiring figures, and anxiously waited for the other shoe to drop, and keep on dropping.

In the face of all this, we have prepared to resist, and we continue to call others and ourselves to higher standards of vigilance and accountability. We must continue to read, to think, to create, to teach and engage. This month’s series on empathy and education has attempted to provide a space for admitting our fears, confronting difficult questions about possible failures, and supplying encouragement for the task now and ahead.

Every winter, my family stages a viewing of Peter Jackson’s Lord of the Rings trilogy, and the scene captured above, from The Two Towers, has always proven enormously compelling. Coming at the end of one of the film’s two climactic battle scenes, Frodo’s haggard vulnerability and Sam’s motivational speech resonate with pathos and display the power of oral tradition, the written word, and the driving force of narrative in general.

While stories may drive us, oftentimes “most fantasy provides an excursion from the normal order of things, in the same way that carnival and Saturnalia were an inversion of the normal order, a letting-off of steam in order to facilitate a return to business-as-usual.”[3] Following the Electoral College’s dispiriting conformity to historical tradition, and several weeks after the initial shock, we find ourselves now couched in the festive spirit of holiday celebrations, and all too ready to turn over a new leaf. It may be tempting to “get on with our lives,” as the president-elect lately urges: to pull back from the front lines and, if not forget, then forgive and quietly disengage.

In times like these, although stories remain important, I think more often of the impassioned plea Merry issues to the Ents on their decision to abstain from action, to “weather such things as we have always done.”

“You are young and brave,” the hobbit is told by much elder and wiser folk, who then caution, “But your part in this tale is over. Go back to your home.” His friend Pippin tries to reason with him and says, “It’s too big for us. What can we do in the end?”

Fiction can no longer serve only as an escape from reality; academics can no longer afford to distance themselves from that which appears too startling, too surreal,[4] too beyond our capabilities to successfully engage. My list of “Good Things to Remember from 2017” may be a bit more difficult to attend to, but one of the first things at the top of that list will be the opportunity to keep on teaching, and to lead students through learning about race and literary texts, to seek out difficult yet productive discussions, and to foster communication and understanding.

There is good to look after, and our part in this tale is never too big to fight for.

[1] For those in need of hopeful optimism, it is equally important to recall that a lot of positive changes have been put into effect this year. To begin, here is another list, this one detailing “99 Reasons 2016 was a Good Year” (https://medium.com/future-crunch/99-reasons-why-2016-has-been-a-great-year-for-humanity-8420debc2823#.6zrnibfvu)

[2] In an insightful piece on the consciousness of language use and suicide, Chinese author Yiyun Li complicates the concept of a tragedy in terms of private pain and public acknowledgement: “That something is called a tragedy, however, means that it is no longer personal. One weeps out of private pain, but only when the audience swarms in and claims understanding and empathy do people call it a tragedy. One’s grief belongs to oneself; one’s tragedy, to others” (“To Speak is to Blunder.” The New Yorker: Personal History. 2 January 2017 Issue).

[3] This fascinating article analyzes the differences of empathetic and intellectual effort necessary when engaging in the genres of science-fiction versus fantasy, and analyzes the models of resistance offered up by key texts from each genre: https://godsandradicals.org/2016/12/03/models-for-resistance/

[4] Ultimately, instead of “fascism,” Merriam-Webster selected “surreal” as the 2016 word of the year.


Machiavelli’s “Small Volume”: The Legacy of the Stage Machiavel (29 April 2016)

“Bearing in mind all the matters previously discussed, I ask myself whether the present time is appropriate for welcoming a new ruler in Italy, and whether there is matter that provides an opportunity for a far-seeing and able man to mold it into a form that will bring honour to him and its inhabitants.”

-Machiavelli

As we’ve been considering the seemingly timeless quality of the figure of the stage Machiavel, it is worth remembering that the archetype is drawn from a series of highly specific moments in history. The quote at the top of the page reminds us that Machiavelli was writing during a period of intense civil unrest in Italy, following a major foreign invasion and the dissolution of a number of seemingly stable governments, and that The Prince was written as a gift for a single man—Lorenzo de’ Medici.[1] Even so, while English audiences found themselves largely uninterested in Machiavelli’s specific appeals to Italian cultural history or his interest in the maintenance of armies and auxiliaries, there was something about the Florentine that caught fire in the cultural imagination of England. Through stage representations, his political ideas were spread to a population that would have otherwise had little access to them,[2] and the staging tropes that helped to disseminate a basic overview of Machiavellian thought have remained with us ever since.

Over the last few weeks, I’ve been looking at popular representations of Machiavellian politics with an eye turned towards the ways in which contemporary audiences share the same fascination with Machiavelli that defined early modern representations. For the last 400 years, Anglophone audiences have been fascinated by attempts to understand Machiavelli’s political beliefs, and I have only touched upon a small sample of the most popular contemporary representations. The goal here has been less to say anything about Machiavelli’s actual politics than to examine the process by which cultural understandings of those politics end up in our popular fiction. The stage Machiavel offers an interesting case study for examining the ways in which popular representations of political philosophy can make those theories more accessible, and the ways in which those same representations can participate in shaping public discourse concerning those theories. While printers would eventually receive license to legally print The Prince in England, decades of Machiavelli’s representation as a ruthless stage villain certainly colored the reading practices of English audiences.

This in turn has dramatically impacted our cultural perception of virtually everything connected to Machiavelli. Period fiction set during the early 16th century frequently turns to him as a ready-made villain, in the same way that Christopher Marlowe utilized Machiavelli to introduce The Jew of Malta.[3] He has appeared as a character in texts ranging from Showtime’s The Borgias to Ubisoft’s Assassin’s Creed II.


Machiavelli in The Borgias

Just as his name became shorthand for a duplicitous schemer, his person has entered into the stable of stock historical villains.  Just as stage representations of Machiavellianism would brand any act that was remotely morally questionable as Machiavellian, modern pop culture representations label any act of political scheming as inherently connected to Machiavellian thought.  Even though the characters that I examined in the last few weeks of posts frequently display a number of profoundly non-Machiavellian beliefs,[4] the image of the stage Machiavel still informs the way in which we understand those characters.

In closing out my month of blog posts, I hope to have demonstrated the ways in which the tropes of the early modern stage have remained with us over the intervening centuries. In the wake of the 400th anniversary of Shakespeare’s death, it is worth considering the ways in which it isn’t simply the texts of the early modern theatre that have stuck in our imaginations. While we certainly imagine Machiavellianism differently than audiences did in the 16th century, many of the same questions and concerns still exist in the fiction that we create. We may not be interested in the complex history of English kingship that exists in The History of Henry IV, Part 1, but we do still have an investment in the questions that the play asks about how a ruler should act. While representations of Machiavellianism are not the only entry point into understanding the continuities that exist between early modern and contemporary practices of representation, the stage Machiavel does provide a fairly clear example of an early modern stage trope that continues to capture our imagination well into the 21st century.

[1] The Prince was not published until 1532, five years after Machiavelli’s death.

[2] The Prince could not be legally published in England during the 16th century and literacy rates were fairly low.

[3] This habit of making Machiavelli a central character in narratives about 16th century Florence dates back to the mid-19th century at the latest, as George Eliot’s Romola features extended cameos by a pre-Prince Machiavelli.

[4] I noted last week that Machiavelli would likely have hated Frank Underwood for being a self-invested conspirator. Beyond this, Cersei Lannister would likely be chided for her absolute disregard for the opinions of the populace, and the fact that so few people actually trust Petyr Baelish suggests that he lacks the fox-like qualities that Machiavelli lauds in his schemers.


Evan Hixon is a first year PhD student in the English Department.  His studies focus on Early Modern British theater with an emphasis on Shakespeare, political theory and Anglo-Italian relations.  His current research work examines the rise of English Machiavellian political thought during the reign of Elizabeth I.

Privileged Positions: House of Cards and Frank Underwood’s Machiavellian Monologues (22 April 2016)

“Since a ruler, then, must know how to act like a beast, he should imitate both the fox and the lion, for the lion is liable to be trapped, whereas the fox cannot ward off wolves…[b]ut foxiness should be well concealed: one must be a great feigner and dissembler.  And men are so naïve…that a skillful deceiver always finds plenty of people who will let themselves be deceived.”

-Machiavelli

At the conclusion of Act 4, Scene 3 of Hamlet, after convincing Hamlet to sail to England, the stage is cleared for Claudius to address the audience. Though the speech is not marked as an aside, Claudius uses these eleven lines to announce that he has sealed letters “conjuring to that effect / The present death of Hamlet” (4.3.62-63). By this point in the play, audiences have little reason to trust the words of Claudius, but at this moment, he utilizes the empty stage as an opportunity to pull back the curtain of his deception and reveal to the audience the machinations of his plot. This was a common theatrical device on the early modern stage, in which the soliloquy or the aside would offer characters a chance to directly address the audience. In this particular example, Claudius drops the façade of the Machiavellian liar to reveal his true intentions. In doing so, he reveals truths about himself that he had kept hidden from the rest of the characters within the play, confirming what audiences already knew—that Claudius could not be trusted.

Turning to modern representations of Machiavellian villains, this is a device employed with frequency by Frank Underwood in Netflix’s House of Cards, a political thriller that owes a great deal to the tradition of the stage Machiavel.


Machiavellianism, American style

Frank Underwood, the Democratic House Majority Whip, is introduced to audiences as a ruthless pragmatist, directly addressing his audience to explain the principles that guide his philosophy. In this moment of revelation, it is not only important that audiences witness Underwood’s actions, but also that he shows himself capable of pulling back the veil that is assumed to exist between his character and his viewing audience.

Here he, like Claudius, is revealing truths about himself to which only his audience will have access. Through the later use of these asides, Underwood is presented as a consummate liar, a man capable of sabotaging from within the very administration in which he works, and he is often heralded as a prime example of a modern Machiavel.[1] He represents what modern writers understand to be an idealized form of Machiavelli’s fox-lion politician, capable of crushing those he feels have wronged him while deceiving the world into believing that he remains loyal to their cause.

Frank Underwood, like Claudius, participates in affirming for audiences what they already believe to be true. In Hamlet, the moments in which Claudius reveals himself to be a treacherous usurper confirm what audiences could only speculate upon prior to his confession. In a similar vein, Underwood’s casual asides become revelatory for audiences, but what they reveal is political rather than personal. These tiny acts of revelation say a great deal about how House of Cards conceptualizes the modern political landscape. Underwood is able to speak truths to the audience as if he were a kind of omniscient chorus, well versed in the inner workings of Washington politics and able to speak with an authority which other characters lack. As the Machiavellian fox, capable of lying to and manipulating those around him, Underwood seems in his monologues to remove the veil of calculated dissimulation; his words therefore come across as unfiltered truths about the political system, and in a sense they simply affirm what audiences already believe about the operation of power. Even though we may know that these truths are presented through the voice of a liar, framing them as asides directly to the audience grants them a significant measure of authority. In these brief asides, the figure of the liar takes off his mask, but instead of revealing guilt, he reveals how easily he is able to take the reins of the political system to his own advantage.

Similarly, this device places audiences in a privileged position of knowing what other characters do not. In Hamlet, the titular character is never given the clarity of truth concerning his uncle that audiences receive thanks to the decision to stage Claudius’s confessions as spoken upon an empty stage. Likewise, none of Underwood’s victims are given the privileged knowledge that we as spectators enjoy thanks to our frequent glimpses into Underwood’s rationale for his actions. In essence, by revealing his status as a Machiavellian dissimulator, Underwood affirms the value of Machiavellian dissimulation. By announcing himself as Machiavelli’s fox and granting audiences a privileged glimpse into the fox’s rationale, the show affirms the maxim that a man must be like a fox if he is to succeed in the world of politics. House of Cards, like Game of Thrones, utilizes Machiavellian thought to demonstrate the ruthlessness and dissimulation that these programs believe underpin successful politicking. While certainly not an affirmation of the political beliefs of its characters, our introduction to Frank Underwood in House of Cards breaks the fourth wall to convince audiences of what they already believed to be true: Washington politics is a game of deception and ambition where ruthlessness trumps idealism.

[1] It is worth noting that Machiavelli would likely despise men like Frank Underwood.  Much of The Prince is presented as a guidebook for ways in which a ruling prince can avoid being undermined by duplicitous schemers like Underwood.


Evan Hixon is a first year PhD student in the English Department.  His studies focus on Early Modern British theater with an emphasis on Shakespeare, political theory and Anglo-Italian relations.  His current research work examines the rise of English Machiavellian political thought during the reign of Elizabeth I.

Part II: Wicked Women and the Negotiation of Female (Dis)empowerment (1 April 2016)

“Not only did she dupe me into believing she still loved me, she actually forced me to implicate myself. Wicked, wicked girl. I almost laughed. Good Lord, I hated her, but you had to admire the bitch.” – Nick Dunne

Gone Girl (Flynn 345)[1]

The majority of Gone Girl’s masterful storytelling depends on Flynn’s fascinating, journalistic style of characterization and description, a thriller’s requisite plot twists and explosive reveals, and the unreliability of the two narrators, Nick and Amy Elliott Dunne.[2] Throughout the majority of the novel’s first part, “Boy Loses Girl,” while Nick narrates the present-day events concerning the disappearance of his wife, readers learn about Amy through various diary entries, the first of which details the night she and Nick met at a writer’s party – a charming, witty, and thoroughly romantic meet-cute scenario that plays perfectly into the image of a happy couple destined for a wrong turn, somewhere, somehow. After all, no one is perfect, least of all Amy Elliott herself.

The thing is, though, Amy knows this. From the start, she laughs at her own claims of being a writer – even as the author of the diary, Amy undermines her own narrative authority by admitting that she only writes personality quizzes for tween magazines. Such a confession makes Amy likable and relatable, with a sweet girl-next-door kind of charm. She acknowledges her shortcomings as a daughter, and tells the story of how her parents actually created a literary avatar of a perfect child – aptly named Amazing Amy – that represents, in Amy’s words, a plagiarized correction of all her life’s faults, which “was not just fucked up but also stupid and weird and kind of hilarious” (27). In comparison to her husband, Amy is refreshingly honest. She is forthright, self-conscious of her own faults without being too teeth-grittingly self-effacing, and tries so hard to be a decent, good woman – a good wife. She faces the economic downturn, the loss of financial security, and the gradual dissolution of her marriage to Nick with the occasional emotional outburst. These, however, are quickly quelled by confessions of “being a girl,” coupled with declarations to rise above the stereotype of the embittered wife: “I won’t blame Nick. I don’t blame Nick. I refuse – refuse! – to turn into some pert-mouthed, strident, angry-girl” (65).

She is also a skillful liar, a schemer, an angry sociopath, and a very, very vengeful scorned wife.

The title of the novel’s second part is “Boy Meets Girl,” and insinuates a re-discovery, a recovery of alternate meaning. Just as Nick unravels his wife’s treasure hunt of punishment, humiliation, and retribution that frames him for her murder, readers are also made aware of their own identification with Nick[3] – outsmarted, outwitted, and duped by an unreliable narrator and a literary lie. Even if we don’t share in Nick’s philandering ways, repressed misogynistic impulses, or his present role as entrapped husband and suspected killer, we too have been beguiled by Diary Amy and her romantic fiction.

“I’d like you to know me first,” Amy writes. “Not Diary Amy, who is a work of fiction (and Nick said I wasn’t really a writer, and why did I ever listen to him?), but me, Actual Amy. What kind of a woman would do such a thing? Let me tell you a story, a true story, so you can begin to understand” (220).

And yet, from this point on, the narrative spirals into a multiplicity of Amys: Diary Amy finds herself cast off by Actual Amy (220), who merges in and out of Dead Amy (234), Ozark Amy (244), Other Dead Amy (246), and under the pseudonyms of Lydia and Nancy. Besides these alternate versions of her self, Amy has had close to four decades to cycle through a laundry list of “people I’ve already been” (236), which reads like a closet of Barbie-identities, suitable and discarded as soon as the wearer begins to tire of it.

As a first-time reader, I understood some of Nick’s reluctant admiration. Personally, my moral compass didn’t encourage identifying with or cheering on a wicked woman who accused a man of rape just to teach him a lesson, who would gaslight a teenage girl into nearly committing suicide, or who would vindictively wish for her husband to be ass-raped in prison.[4] On the other hand, Amy Elliott had significant truth bombs to drop, and drop them she did. “I hope you liked Diary Amy. She was meant to be likable…She’s easy to like…I wrote her very carefully, Diary Amy. She is designed to appeal to the cops, to appeal to the public should portions be released. They have to read this diary like it’s some sort of Gothic tragedy…They have to like me. Her” (237-8), Actual Amy now confides to the reader, and the shock – dare I say the magic – of the narrative manipulation is no less deft for being revealed.

Ironically, in successfully duping the reader alongside beguiling her cheating husband, the cops, and the entire American public, Amy shows her hand. Actual/Real Amy’s anger lies in the fact that Nick fell in love with one of her personas – Cool Girl Amy, specifically – and then out of love with her unadorned, real self. “Can you imagine,” she seethes, “finally showing your true self to your spouse, your soul mate, and having him not like you?” (225). Add infidelity to the list, and Nick has thoroughly shaken his wife. By his inelegant actions, he has reduced her to “Average Dumb Woman Married to Average Shitty Man. He had single-handedly de-amazed Amazing Amy” (234), and toppled the wicked woman from her throne. Not only does it sting to be thrown over for a younger Cool Girl model, but Amy’s anger mingles with shame – to rekindle the romance, she had actually been willing to retry her hand at being the Cool Girl that she so deplored, and Nick loved.

In the end, while Amy gives in to her misreading of Nick’s rekindled love for her true self, and the marriage continues with both partners acting their part – for the arguable betterment of both – Amy nearly gets the last word on her self-fashioning and the definition of her identity. She is no mere “psycho bitch,” as Nick accuses; she sees through his attempt to label her as a lazy cop-out. “It’d be so easy, for him to write me off that way. He’d love that, to be able to dismiss me so simply” (Flynn 394) – and indeed, Nick takes morbid pleasure in having married “the world’s foremost mindfucker” (271). But despite her success, the thought of waking up every morning and being herself doesn’t thrill like she thought it would.

What then, wicked woman?

“It’s not a particularly flattering portrait of women, which is fine by me. Isn’t it time to acknowledge the ugly side?” Gillian Flynn writes, calling for a triumph of “violent, wicked women” over the watered-down “girl-power” rhetoric of a supposedly post-feminist era. “Dark sides are important. They should be nurtured like nasty black orchids.”[5] If exposing wickedness by showing its construction gives such women a chance to shine, it also weakens the mystification of the wicked woman’s power – dispelling the myth, tarnishing the shine of glorification, and making wickedness just a little bit more human.

[1] Flynn, Gillian. Gone Girl. New York: Broadway Books, Random House. 2012.

[2] The majority of this blog post will examine both Flynn’s novel and David Fincher’s 2014 film adaptation, of which Flynn wrote the screenplay. Given the emphasis on acting, deception, and the unreliability of signs in reading the self, I consider the literary and visual text alongside one another to heighten the instability of self-depiction/description and markers of identity.

[3] In some ways, life imitates art: Ben Affleck’s partial Irish heritage, working-class roots, and troubled relationships fit characterizations of Nick Dunne perfectly. “I have a face you want to punch: I’m a working-class Irish kid trapped in the body of a total trust-fund douchebag” (32), Nick admits soon enough, and most of my students agreed that Affleck had been a rather stellar casting choice for that quality alone.

[4] Gillian Flynn responds to accusations of misogyny and anti-feminist rhetoric in the novel by turning the tables on such a script, and argues for an expansion of feminism to include villainous women. For more, see The Guardian interview: “Gillian Flynn on her bestseller Gone Girl and accusations of misogyny” (May 2013).

[5] “I was not a Nice Little Girl.” For Readers – Gillian Flynn. Web. 20 March 2016.


Vicky Cheng is a third year Ph.D. student and teaching associate in Syracuse’s English Department. She studies Victorian literature and culture, with an emphasis on feminist and queer readings of the body. When not reading for forthcoming qualifying exams, she can be found drinking tea, napping, or having strong feelings about Star Wars, Marvel films, and Hamilton.

Zen and the Art of the Course Description (19 February 2016)

Course descriptions bridge the gap between the university’s corporate model and the classroom’s pedagogical space, aiding in achieving satisfactory enrollment “numbers.” In this way, the description of a class has to do the work of both an advertisement and an infomercial, appealing to students as well as cuing them about the course’s content. Despite our idealistic desires about learning for learning’s sake that might suggest otherwise, it is important, then, that a course seem interesting or “fun” so that students will actually register for it. However, this can be a fine line to walk: if an instructor goes overboard with trying to make the course appealing, students who do take the course can end up with something like academic buyer’s remorse—feeling that the course they signed up for is not represented in the classroom they occupy. Typically, this means that the student expected to have a lot of fun and (surprise!) the course turns out to be a lot of work. A balance must be struck between appealing to students’ interests and hinting at the rigorous intellectual labor required of a college course. The course description can be the first clue (and compelling advertisement) for how students and instructors will achieve these ambitions together.

As I tried to formulate my own course description for a class I plan to teach next Fall semester (ETS 181: Class and the Literary Text PLUG!), I began to consider how the course description is the first glimpse into what the educational future holds for students. For some, this tiny blot of text is the first step into opening their mind (and consciousness) toward the fundamental questions of the humanities: Who are we? What are we doing in the classroom? What are the forces that shape our world? How can we be engaged members of our classroom, society, and world? These are, of course, age-old questions that teachers have asked for hundreds of years. As I meditated on how to describe the content and objectives of my own course, I came to realize that a profound dialectic of instructional philosophy found in Zen Buddhism could also be found in the humanities classroom.

The practice of Zen Buddhism can be conceptually described as having two schools, each of which can represent different pedagogical ideologies that surface in humanities classrooms. Rinzai Zen practice is centered on the use of the koan, an absurd or impossible question engineered to push the mind away from dualistic thinking and toward “enlightenment,” a state of total awareness and detachment. The most well known example of a Rinzai Zen koan is “What is the sound of one hand clapping?” A practitioner may work on the same koan for weeks or even years. Working through the experience of frustration and confusion that results from a koan allows a student of Buddhism to better understand the limits of their own internal logics.

Many humanities classrooms follow a logic strikingly similar to that of Rinzai Zen, asking students to formulate their own answers to questions of aesthetics, ethics, and ideology like “What is beauty?” or “Can society achieve equality?” that are as seemingly absurd or impossible as any koan. For many humanities instructors, the goal of asking such questions is not for the student to answer, once and for all, what beauty or truth is, but to get the student to ask “Why is it important that we ask these questions?” This metacognitive approach can seem like the equivalent of a student’s enlightenment: finding a contemplative state rooted in higher-order concerns that engender critical thinking.

However, this goal-oriented approach is not the only way for students to come to a greater understanding of their role in the classroom. As many instructors have experienced, sometimes it is in the least-planned moments that students learn the most. Soto Zen, Rinzai’s competing school, rejects the centrality and formality of the koan as well as the goal of a particular “enlightened” state. For practitioners of Soto Zen, there is no goal to be achieved beyond the practice itself; the only object is to be awake and aware of the here and now. Translated to the humanities seminar, this practice asks the students to be fully immersed in learning, but also to move outside of ideology into subjective and intuitive experiences of the classroom and the world. In my experience, some of the best discussions come from this place of open awareness and improvisation. By letting strict lesson plans and pre-designed questions take a backseat to the participation and engagement of students in the moment, instructors can encourage students to seize their own agency, develop a community of ideas, and make the classroom their own.

Getting students to that “a-ha” moment of realization can be rewarding for the instructor, but oftentimes students get the most sustained intellectual value from pedagogical experiences that remain open-ended. When pedagogy moves away from structured goals – and, yes, even grade-oriented experiences – students can continue to build their knowledge years later, rather than leaving their experience, and their transcript, at the door of the classroom.


Every student learns differently. For some people, the Rinzai approach to the humanities will best allow them to reach their educational aspirations, and they will emerge from the University system with a degree that is the material evidence of a more “enlightened” state. For these students, a course description should explain exactly what they will learn—the why is less important. For others, education is a lifelong process that doesn’t start and stop on an academic schedule. These students might benefit from a Soto approach that allows them to “sit” with their new knowledge and apply it to their life inside and outside the classroom. For these students, a course description should explain why they want to be in that class now, and why it will still mean something to them in 2, 10, and even 50 years. A well-balanced course description hopefully appeals to both types of students and ideally makes them excited about the possibilities their learning experience holds. But regardless of why the students are there, the course description has facilitated the most important function of a classroom: the students have chosen to find their way to it.


Max Cassity is a 2nd year PhD student in English and Textual Studies. His studies encompass 20th and 21st Century American fiction, poetry, and digital media. He is currently beginning a dissertation that studies fictional representations of epidemic diseases in American and Global modern literature and digital narratives including Ebola, Cancer, and Pandemic Flu.

Don’t Eat The Flatware: Balancing Instruction and Interpretation in the Classroom (5 February 2016)

For this month’s posts, I will focus on how engagement with social media, popular culture, film, and video games can inform the work we do in humanities classrooms. This week, I look at how criticism of humanities instruction on Reddit might help us understand why the practice of interpretation leaves some students with a negative impression of this field.

To do this, I want to examine one particular Reddit thread about the Oscars that quickly segued into a discussion about students’ expectations of interpretive arguments and pedagogical assessment in humanities classrooms. The thread initially comments on a controversy among Jada Pinkett Smith, Will Smith, Spike Lee, and Janet Hubert, Smith’s co-star on the ‘90s television series The Fresh Prince of Bel-Air. This disagreement concerns celebrity reactions to the dispiriting lack of nominations of people of color for marquee positions at the last two Academy Awards, which in turn engendered the resurfacing of the social media hashtag #OscarsSoWhite as an attempt to return public awareness to Hollywood’s historical marginalization of people of color. In their own call to action, Pinkett Smith, Smith, and Lee have advocated boycotting the award ceremony. However, this decision has been met with resistance by actors of color such as Hubert, who claimed that Smith’s boycott was a temper tantrum over not being nominated for 2015’s Concussion rather than an expression of race solidarity (for more on this debate click here).

Will Smith and Aunt Viv (Hubert) in Fresh Prince (Photo by: Chris Haston/NBCU Photo Bank via AP Images)

This celebrity pseudo-family feud has promoted discussion of the institutionalized racism that persists in US culture, but I am particularly interested here in how one Reddit discussion connects the #OscarsSoWhite debate with the institution of the university, a dialogue that I think can offer those of us instructing in humanities classrooms a unique window into students’ experience.

Commenting on Hubert’s response to the Academy Awards boycott, Reddit user “hashbrown” associated her reaction with their own experience of receiving a disappointing grade on an interpretive essay assignment at a community college.

Hashbrown writes:

“I have a story that relates.

Last year I had an English class at the biggest community college in California. My African American teacher made the topic of the entire class revolve around black literature. One of the videos we watched talked about how African Americans need to start “helping” and empowering each other out [sic] by only watching black television, shopping at black stores, and volunteering in black communities.

I wrote my paper on how self segregation was a form of racism itself. Why should black people not shop at stores because of color of the owners skin? Why should people not watch white or asian actors?

In the end my teacher ended up giving me a C and wrote that I wasn’t understanding the material.

A year later and I’m still bitter about that class.”

Hashbrown articulates a common misconception about the value placed on interpretive analysis in the humanities—the notion that any ideological position on a text, regardless of its merit, is valid if properly argued. Missing from this perception is an important aspect of the interpretive process, in which students must take into account the contexts that inform their claims. In this case, hashbrown’s assertion that African American engagement in community activism equates to “self segregation” fails to account for the history of structural racism in Hollywood cinema, and the result of this lack of context was a C grade – hardly a failing score, and a nearly universally accepted marker of “average” work in undergraduate study, indicating a need for improvement. Despite hashbrown’s possible “bitterness” over the grade itself, it seems to me that their frustration might also indicate a miscommunication of the instructor’s expectations.

While one could easily dismiss these kinds of complaints as quests for minor revenge by disengaged students turned internet trolls, the sheer number of responses that echo hashbrown’s frustrations suggests there may be something more here. Coming to hashbrown’s defense, other Reddit users noted how an instructor’s criticism of their subjective interpretations of texts left them with a cynical outlook on the project of humanities instruction at large. Reddit user Rainator writes, “I learned in English that the way to get a good grade was to just parrot whatever nonsense the teacher said.” User OneFatGuy described a similar experience, commenting, “I had a professor that would only agree on arguments based on his ideas, and anything other than his ideas were wrong or weak arguments.” From the perspective of the frustrated student, these users articulate a fundamental miscommunication that can occur between students and teachers concerning the pedagogical interplay between instruction and interpretation.

I believe that effective pedagogy embraces a dialogue between instruction—the teacher’s role of providing proper historical and cultural contexts that inform effective humanities study—and interpretation—the practice of synthesizing information from texts and developing an understanding of their meaning. This allows students to form interpretations that are unique, creative, and grounded in an enriched understanding of the text, rather than construed from initial, unexamined reactions or previously fortified ideologies. However, when the prioritization of one element leads to the neglect of the other, the result can be the regrettable alienation of the student and/or the demonization of the instructor.

For hashbrown and many other students with similar experiences, a pedagogical focus on subjective argumentation is understood as a license to assert any and all possible readings of a text, even those that do not account for the specificity of material histories and social contexts. To be fair, the focus on rhetoric in many humanities classrooms makes this an easy misperception, even for advanced students. It is especially common in lower-level composition and survey courses, where the responsibility for providing such contextualization usually falls solely on the instructor. This problem is magnified in English and Literature Studies, where students are encouraged to form nuanced interpretations of texts that deal in complex and even contradictory aspects of culture and society, such as racism. However, focusing too much on contextualization over interpretation can be a problem as well. As Rainator’s response points out, when teachers over-prioritize instruction, students can feel that they have no agency in the discussion and simply parrot back information rather than engage in a critical practice.

This experience can be as frustrating for instructors as it is for students. One instructor in particular voiced their frustration on this thread at students’ mishandling of the “tools” provided by instruction, claiming, “It’s like I prepared you dinner and you ate the cutlery.” Engaging critically with such issues often involves confronting unsettling aspects of culture, society, and even our own experiences—a prospect that can be difficult for students and instructors alike. However, by providing historical and cultural context for the texts students read, and setting clear expectations about how student interpretation will engage with this context, instructors can prevent turning students off to the valuable practice of critical analysis and perhaps even help our students to have their cake and eat it, too!

Next week I will continue to think about how engagement with the public can inform humanities research and instruction, so grab your knives and forks and let’s eat!


Max Cassity is a 2nd year PhD student in English and Textual Studies. His studies encompass 20th and 21st Century American fiction, poetry, and digital media. He is currently beginning a dissertation that studies fictional representations of epidemic diseases in American and Global modern literature and digital narratives including Ebola, Cancer, and Pandemic Flu.

Coda: The Human in the Humanities (29 Jan. 2016)

My first semester of grad school was kind of a wreck: I was constantly sick, my nerves were bound tight with anxiety, and my back and wrists were in pain from the Soviet-era metal chair-desks in a basement classroom. None of this was helped by the ideological distress I found myself in. Two pieces of scholarly advice that found their way to me that semester still linger with me: one, there’s no such thing as the human condition; and two, your graduate program will tear you apart and remake you in its image.

A photo of a metal classroom chair with tiny desk attached at the armrest.

The chairs were still the worst part, though.

In the classroom, I mentally conceded the probable truth of the first one. My undergrad philosophy classes taught me that we have no good definition of “human.” And the conditions people live in vary so radically that there can’t really be a universal one: the Elizabethans understood the world’s functions quite differently than do the Mosuo or a New Yorker, and attempts to demand that there is one ideal understanding usually end up serving some hegemonic understanding to the exclusion and oppression of other worldviews. That didn’t stop the statement from messing with my heart, though.

You won’t be surprised to learn that I had recently graduated from a Jesuit college, and “the human condition” is a big part of Ignatian philosophy. My best friend and I had lofty aspirations of studying “the human condition” through literature in grad school; I still amuse myself by correctly identifying Jesuit-educated students and priests by their use of the phrase in discussions and homilies, respectively; and Christ’s entering “the human condition” through the Incarnation is the foundation of Ignatian imaginative contemplation, my graduate research, and my personal aesthetic. To be told that “the human condition” is inherently meaningless was like being told that J.K. Rowling’s prose is mediocre, only worse: both statements may be true, but I still love the object that they discredit — and “the human condition” informed my life and work more deeply and for far longer than Harry Potter.

A photo of tree-lined sidewalk leading to a redbrick academic building, which features a statue of a priest over the entry doors and a clocktower topped with a cross. The trees are bare but there is no snow on the grass.

Le Moyne College on a rare snowless day in winter.


As imposter syndrome set in and I attempted to impress my professors and fit in with my classmates through mimicking their interests and ideologies, I began to darkly wonder if there was some degree of truth to the second statement, too. As I’ve gained confidence in my ideas, my professors have all been wonderfully supportive of my research, even at critical moments of doubt, but I still felt strangely disembodied from my ideas. They were necessarily available, even susceptible, to outside influences in the name of getting a job, which could range from something as benign as entering them into a critical discourse I was unenthusiastic about to something as disheartening as avoiding theories that are no longer trendy.

Not until I took a summer creative nonfiction workshop with the magnificent Minnie-Bruce Pratt did I realize that this compulsory refashioning had nothing to do with my program, but with the state of English-language literary studies. I spent two weeks reading first-hand accounts like Toni Morrison’s Playing in the Dark: Whiteness and the Literary Imagination, in which Morrison exposes the subtle racism of the American literary tradition not in the form of a journal article, but of a personal reckoning with that history. I spent three weeks writing in the first person about the body of Christ, the woman’s body, and the queer body, not in the form of a seminar paper but in a series of anecdotes and meditations steeped in medieval and Renaissance mysticism. I found myself applying my research to my life in ways that made the Early Moderns come alive — in our exchange of good-byes, classmates from diverse religious backgrounds, having encountered my research in this genre, told me how fascinating and important it was.

The greyscale cover of Toni Morrison’s book Playing in the Dark. Morrison holds a giant floppy hat. A gold sticker proclaims that Morrison won the 1993 Nobel Prize in Literature.

Fantastic book, by the way: accessible first-person literary criticism. Highly recommend.

Creative nonfiction enabled me to communicate my ideas — shaped by research and critical writing — with a public upon whom they had material impact. My ideas became my own again: I had a personal investment in recovering historically obscured understandings of gender and the body, not only to locate the essential value of the queer and female bodies in Catholicism but also to share old ways of embodying queerness and femininity that are relevant today. In creative nonfiction, my first-person voice had credibility, purpose, and an audience who otherwise wouldn’t or couldn’t access this knowledge.

Radical queer and feminist scholarship is somewhat better at this, leveraging the personal narrative as a source of knowledge and an act of inquiry. To assert a self in English (and, I’d wager, biology, history, math, or information studies) is to assert that you are not the implied raceless, genderless, classless entity interested only in books, but that you instead have an investment in disrupting the status quo. This trickles down into policing how we frame our inquiries: we teach our students not to use the first person because the personal isn’t credible, and we apply the same principle to our critical essays. Consequently, I have no idea why most of my colleagues study what they do: I assume they all love literature, but if that were their only motivation they wouldn’t still be suffering through grad school. If the English scholar speaks, it is only through the voice of their subject of study, and tentatively: papers on nuns I identify with, on devotional poems that resonate with me. Our research overwhelms our selves, and obscures its own real-life applicability. And so we get accused of navel-gazing and being out of touch with reality:

Nothing like some anti-intellectual sentiment to kick-start one’s drive to inform the public.

So maybe there isn’t a single human condition, but that doesn’t mean studying the humanities can’t improve the conditions of some humans. If my experience with creative nonfiction is any indication, one of the most meaningful ways to connect with those outside the academy is to acknowledge our own subject positions, explicitly recognizing the self in order to humanize the humanities. This is what I’ve tried to do here. But now it’s your turn:

Why do you study what you do? Why do you work where you do? Who are you?

A painted full-length portrait of a nun sitting in a library, paging through a book; she wears a large icon of the Annunciation over her breast.

Also, Sor Juana Inés de la Cruz is just objectively rad.


Ashley O’Mara (@ashleymomara | ORCID 0000-0003-0540-5376) is a PhD student and teaching assistant in the Syracuse University English program. She studies how Ignatian imagination and Catholic iconology shape representations of sacred femininity in Early Modern devotional writings. In her down time, she writes creative nonfiction and snuggles her bunny Toffee.


Common Knowledge?: EEBO, #FrEEBO, and Public Domain Information (15 Jan. 2016)

If you work in the humanities and you’ve used a database, a dictionary, or Google Docs in the past ten years, congratulations! — you’re already doing digital humanities. This was a point emphasized by Syracuse University professor Chris Hanson in a panel discussion on the digital humanities that I attended after the Six Degrees of Francis Bacon workshop last fall. Grad students, faculty, and a librarian from a range of disciplines underscored that, according to this definition, anyone can do digital humanities — in fact, many already do — as long as they have access to digital information and the tools to manipulate it.

Not everyone has that kind of access, however, and this became painfully obvious for Renaissance-studies scholars a few weeks later when ProQuest discontinued access to the Early English Books Online (EEBO) database for Renaissance Society of America (RSA) members. Previously, those who didn’t have EEBO access through a university’s library subscription — such as independent scholars or those at smaller schools with smaller budgets — could gain access by joining the RSA, a professional organization rather than a library. After a Twitter uproar, ProQuest quickly restored access without much of an explanation, but not before Renaissance scholars had written about the implications of a private business controlling access to what is ultimately public domain information.

EEBO’s origins lie in World War II, when the London Blitz threatened to destroy English libraries and the thousands of medieval and Early Modern books they contained — a potential massive loss of information. University Microfilms International (UMI) stepped in to microfilm the texts for future generations … and for profit. UMI began to offer microfilmed titles in the English Short Title Catalogue (ESTC) to university libraries through print-on-demand services.[1] For decades, Renaissance scholars outside the UK relied upon libraries’ microfilm reprints to do their research. Seventy years later, UMI is now ProQuest and the microfilmed ESTC is now EEBO, a digitized and expanded collection of scanned texts. Just under half of the (rapidly expanding) current collection was released into the public domain last year. But anyone without library access will have to wait until 2020, when ProQuest’s exclusive rights expire, to access the complete collection.[2]

A library with the ceiling caved in. Beams, rubble, curtains, and ladders are heaped in the center. Three men in hats and wool coats inspect the books that remain on the shelves.

The private library at the seventeenth-century Holland House was bombed in the London Blitz. Books in national libraries were quaking in their dust jackets.

I’m one of the lucky ones: Syracuse University participates in the EEBO Text-Creation Partnership, so I have access even to texts that haven’t been made fully searchable. Without my university library access, I couldn’t possibly be an Early Modernist studying Jesuit literature. Syracuse is a long way from the Huntington and the Folger libraries, let alone Cambridge or Oxford. Not only do I lack a research budget as a PhD student, but some of the most prestigious libraries limit access to students already working on a dissertation. If I hadn’t spent time browsing EEBO’s collections, I wouldn’t even know that I wanted to write about Jesuit literature. I might eventually have read that Richard Crashaw, a seventeenth-century poet and Catholic sympathizer-turned-convert, was raised by a virulently anti-Catholic father who wrote a tract called “The Bespotted Jesuite.” But without EEBO, I would never have had the opportunity to actually read the elder Crashaw’s text for its obsession with the maternal role of the Virgin Mary in Catholic notions of salvation, and then compare its horrified images of breastfeeding with the glorifying images that appear in the younger Crashaw’s baroque — even mystical — poetry. Without EEBO, I couldn’t read about the Maryland colony’s connection to the English Jesuit mission; I couldn’t perform full-text proximity searches comparing discourse on Eucharistic flesh and New-World cannibals; and I couldn’t crosscheck textual references to English Jesuits to add to Six Degrees of Francis Bacon.
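(For the curious: a full-text proximity search is conceptually simple, even though ProQuest’s actual search code is proprietary and unknown to me. Below is a toy sketch in Python, entirely my own illustration; the function name, the sample sentence, and the word window are all invented for the example. It just finds places where two terms occur within a set number of words of each other.)

```python
# A toy sketch of a full-text proximity search -- my own illustration,
# not EEBO's or ProQuest's actual search code.
import re

def proximity_hits(text, term_a, term_b, window=10):
    """Yield (i, j) word positions where term_a and term_b
    occur within `window` words of each other."""
    words = re.findall(r"[a-z]+", text.lower())
    hits_a = [i for i, w in enumerate(words) if w == term_a]
    hits_b = [j for j, w in enumerate(words) if w == term_b]
    for i in hits_a:
        for j in hits_b:
            if abs(i - j) <= window:
                yield (i, j)

# A hypothetical sentence standing in for an EEBO transcription.
sample = "they feared the eucharistic flesh as they feared the cannibals of the new world"
print(list(proximity_hits(sample, "flesh", "cannibals", window=6)))
# [(4, 9)] -- the two terms appear five words apart
```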


A poorly copied black-and-white page of text titled “To OUR LADY OF Hall, and to the Child JESUS”; the rest of the text is half-obscured because text from the opposite side bleeds through.

A page from William Crashaw’s “The Bespotted Jesuite,” aka the “Jesuites Gospell” (1642). Read might be a generous verb.

But not everyone is so fortunate: in the few days when some RSA members believed they would lose their only means of accessing the full EEBO, proposals to make a #FrEEBO circulated on the internet. The conversations reminded me of when I graduated from undergrad and realized, to my horror, that I no longer had access to the Oxford English Dictionary. I found myself keeping younger classmates “on retainer,” pestering them to please, please look up the seventeenth-century definitions of this word so I could revise my writing sample for grad school applications. Imagine being a scholar trying to publish a journal article for tenure and having to do the same thing — but with every single primary text you’re analyzing. Unlike the OED, the texts in EEBO are public domain, after all, even if ProQuest’s digitizations aren’t; there’s no reason scholars couldn’t create a parallel database that’s wholly public domain from inception.[3]

Digital texts have shortcomings of their own, of course, including other forms of inaccessibility. Untranscribed texts are wholly inaccessible to those with visual impairments. Databases like EEBO offer OCR transcriptions of some scanned texts, and while the good ones can be helpful, quality is inconsistent and frequently bad, especially for Early Modern typefaces and spellings. (If anyone has had a good experience using a screen reader with EEBO, let me know in the comments.) Digital texts also necessarily misrepresent the material objects they’re based on by transposing them into a different medium: a scan of a book obscures its size, its texture, its color, its smell, and even, in EEBO’s case, its cover. (More about that next week!)

A black-and-white scan of two pages of text fills the top two-thirds of the image; a transcription fills the bottom third. The transcription is filled with punctuation marks to signal line breaks and diacritical marks, with yellow post-it note icons in the middle of sentences. The text that fills the margins of the scan is not included in the transcription.

A side-by-side comparison between the scan and the transcription of two pages from “True relations of sundry conferences had between certaine Protestant doctours and a Iesuite called M. Fisher” (1626) in EEBO. To read marginal commentary, you have to click the yellow post-it note icons — a very different experience than the Early Moderns had.


But shortcomings shouldn’t stop us from finding new ways to increase access to these texts. One aspect of Jesuit philosophy that has always resonated with me is that education is inseparable from social justice. Extensive higher education is required during Jesuits’ training in part because they are meant to share that knowledge in service to others. Education itself is a common good, and as an aid to education, the cultural heritage contained in databases like EEBO shouldn’t be limited to scholars attached to the wealthiest schools — or even to scholars alone. If public scholars are truly committed to democratizing knowledge, our work shouldn’t end at merely presenting our research to the public, a gesture that only reinforces the ivory tower’s hierarchical relationship to it. Our service should extend to enabling universal access to the primary sources we work with, so that anyone who wants to — no matter their situation — can discover not only our knowledge but also how we arrived at it, and how they could make some new knowledge themselves.

[1] http://folgerpedia.folger.edu/History_of_Early_English_Books_Online

[2] http://www.textcreationpartnership.org/tcp-eebo/

[3] https://medium.com/@john_overholt/together-we-can-freebo-b33d39618f8#.wpxzn95s1


Ashley O’Mara (@ashleymomara | ORCID 0000-0003-0540-5376) is a PhD student and teaching assistant in the Syracuse University English program. She studies how Ignatian imagination and Catholic iconology shape representations of sacred femininity in Early Modern devotional writings. In her down time, she writes creative nonfiction and snuggles her bunny Toffee.

The Dust-Heap of the Database and the Specters of the Spectator

In 2014, networks launched some 1,715 new television series, a staggering number that prompted many articles to declare variations on the theme “there are too many shows to watch.” Same story, different medium, I say. Franco Moretti, a contemporary literary scholar, writes that while twenty-first-century Victorianists may (may) read around two hundred Victorian titles, that barely counts as a drop in the bucket of the 40,000 titles published in the nineteenth century. And the other 39,800 novels? The short version: gone. The longer version: maybe not.

The plethora of “lost” Victorian novels challenges any sweeping claims about Victorian society based on the fourteen or so (depending on how you count) full-length novels of Charles Dickens. The problem becomes even more daunting if one’s studies include explorations of Victorian popular magazines and journals. The Waterloo Directory of English Newspapers and Periodicals 1800-1900 lists 50,000 titles. If each of those titles published a single twenty-page issue—and certainly they published more—that alone would amount to 1,000,000 pages to read.

The imbalance between what we read, what we could read, and what we can’t read makes Victorian studies (and, I suspect, other historical studies) a strange beast. Any decent Victorianist monograph will address the familiar tunes (Dickens, the Brontës, Eliot, etc.), but it will probably do so through ephemera and periodicals that perhaps only the author has read, thanks to hours of archival digging. The internet makes the strange beast even stranger. It changes not only how I do history, since I can now do most of my archival work from the back corner of Mello Velo (the local coffee shop, to which I owe my doctorate, whenever I finally defend); it also changes academic reading practices, the kinds of arguments we can make, and, finally, how we teach historical reading in the classroom. Internet archives make available texts virtually nobody has read, offering the chance to reinvigorate the dust-heap of forgotten novels and to discover a text nobody has read without leaving the comfort of your favorite coffee shop table—although with the change in what we can read comes an inevitable, sometimes ineffable change in how we read.

And yet, when I say a text nobody has read, this isn’t quite true. These texts do not simply appear on one’s screen. These historical documents already bear the marks of their nineteenth-century readers, and they now also bear the marks of my search terms, the database’s algorithms and tags, scanners, computer processing, and, somewhere in a basement, the people who plugged this material into the database. These extra, mostly ineffable hands mark the text like the fingerprints of electronic ghosts—and these spectral hands can sometimes offer us bizarre, fortuitous accidents.

I’m sorry, Peter. I’m afraid you can’t read that.

Here’s an example. My dissertation is in part about Charles Dickens, because of course it is. I’m also heavily invested in Victorian literary criticism; that is, as opposed to the Victorianist literary criticism of the twentieth and twenty-first centuries, I gravitate toward the theories and ideas the Victorians themselves used to analyze their own work. I’m specifically interested in Dickens’s serial publications (stories told in installments, like a modern television show), and I wanted to see what the Victorians thought about serialization.

So, off I went to sundry databases and metadatabases, searching terms like “serial,” “part,” “periodical,” “novel,” and “publication.” As part of my search, I examined the Spectator Archive (1.5 million pages, by the way), where I found this priceless artefact: “Doe’s Oliver Twist.”

Wait, didn’t Dickens write Oliver Twist? you ask. Who on earth is “Doe”?

Welcome, Dear Reader, to the dust-heap of the archival database. Archives like the Spectator Archive use something called Optical Character Recognition (OCR), the process by which a computer converts scanned images of pages from something like an 1838 edition of a magazine into searchable text. Its failures are cleaned up in part by programs like reCAPTCHA (the obnoxious text you have to enter before buying or registering at some websites to prove that you’re a human), which puts words the computer couldn’t decipher in front of people, because only humans scream obscenities at their computers after the thirtieth failed entry. It’s pretty incredible when you think about it.
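To demystify it a little, here’s a minimal sketch of that OCR step using the open-source Tesseract engine through its Python wrapper. This is my own illustration, not the Spectator Archive’s actual pipeline, and the filename is made up:

```python
# A minimal OCR sketch using the open-source Tesseract engine
# (pip install pytesseract pillow); an illustration only, not the
# Spectator Archive's actual pipeline.
from PIL import Image
import pytesseract

# Hypothetical scan of an 1838 magazine page.
page = Image.open("spectator_1838_review.png")

# Convert the scanned image into searchable text.
text = pytesseract.image_to_string(page)

# Period typefaces, tight columns, and bleed-through are exactly
# the conditions under which the engine garbles names and words.
print(text[:200])
```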

And it’s also terrible, as proven by the title: the Spectator Archive’s OCR rendered “Boz” as “Doe.” Wait, didn’t Dickens—

Yes, Dickens wrote Oliver Twist. But before that, he published Sketches by Boz, a series of wonderfully liberal musings on life in London. And so, when Dickens began to serialize Oliver in Bentley’s Miscellany in 1837, the author’s name was “Boz.” But the Spectator Archive doesn’t know that. In fact, it doesn’t know anything. It’s a scanner and a computer that runs OCR software, tags its garbled production, and then throws it into the ether for some random grad student to stumble across. And behind all that, someone (probably another grad student or an intern in the basement of the Spectator building on Old Queen Street) could have read this article, because someone had to put the page on the scanner and press “go.” Behind the Spectator is a series of spectral readers: the Victorians who may have read the article in 1838, the person who scanned the article, the scanner, the computer, the series of algorithms and programs that brought me from Google to the Archive and to that article.

“Doe’s Oliver Twist” is a gold mine for Victorian theories of reading, serial publication, and distinctions between common readers and academic readers. But in order to find it, one has to enter the right search terms, and—here’s the real punchline—those search terms may abound in a document yet never show up in the search results because the OCR is wrong. But there’s one final twist, and it isn’t Oliver.

A still of the “I see dead people” scene from The Sixth Sense.

No, it’s not that, either.

In fact, “Doe’s” showed up in my search results only because something else was OCR’d incorrectly: the algorithm thought it recognized one of my search terms, but that term never actually appears in the document.
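To make the two failure modes concrete, here’s a toy demonstration in Python; both strings are invented for illustration, and no real archive is involved:

```python
# A toy demonstration of OCR-induced search errors; both strings
# are invented for illustration.
true_page = "Boz's Oliver Twist, now appearing in several parts"
ocr_page = "Doe's Oliver Twist, now appearing in serial parts"  # hypothetical garble

# False negative: searching for the real author misses the page.
print("Boz" in ocr_page)      # False, though "Boz" is on the page

# False positive: a search term "matches" only because of an OCR error.
print("serial" in ocr_page)   # True
print("serial" in true_page)  # False: "serial" never appears on the page
```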

Internet archives allow scholars to dive into the dust-heap of history. In their clunky, unintuitive ways, they cough up garbage and leave us to sort the mess. And as I will argue in future posts, they fundamentally alter the ways we perform these readings. Welcome to twenty-first century history: a tangled heap of trashed treasures and treasured trash.



Cover image: Stone, Marcus and Dalziel. The Bibliomania of the Golden Dustman. Scanned by Phillip V. Allingham. Victorian Web.

Peter Katz is a fifth-year Ph.D. student in Victorian Literature and Culture. His dissertation focuses on sensation fiction, the history of science, and the history of the novel.