As Stanley Fish discovered more than a year ago, it’s hard to call a trend based simply on the number of sessions listed in the program of an academic conference. That’s why I’m hesitant to call what I observed at NEMLA 2013 a trend just yet. It is worth noting, however, that a shift seems to be under way among a sizable number of literary scholars, one that could prove comforting to the technophobes among us who shudder every time they hear the phrase “digital humanities.”
What I observed in panels such as “Teaching the History of the Book to Undergraduates” and “Teaching How We Read Now” was the already well-documented movement away from post-structuralism and identity-based theories in favor of textual analysis. Yet this is far from the old-fashioned textual analysis practiced by literary scholars since the days when Greek and Latin authors constituted literary study on United States college campuses.
QR codes embedded in editions of medieval manuscripts now link to recordings of how Chaucer’s Middle English should sound. Hyperlinks allow multiple editions of a text to be read side by side and compared. Computer algorithms analyze an author’s use of language to determine who wrote an anonymous work of fiction. Data-mining techniques help scholars create word clouds and thought maps that dramatically visualize the zeitgeist of an era or chart the evolution of language in graphic terms.
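To make the authorship-attribution idea concrete, here is a minimal sketch of the kind of word-frequency comparison that underlies such algorithms. It is not any particular scholar’s method: real stylometric studies use much larger word lists and more sophisticated measures such as Burrows’ Delta, and the function-word list and Euclidean distance here are simplifying assumptions for illustration.

```python
# A minimal stylometry sketch: attribute an anonymous text by comparing
# relative frequencies of common function words, a stylistic signal that
# individual authors tend to use unconsciously and consistently.
from collections import Counter
import math

# A tiny stand-in for the hundreds of "most frequent words" a real study would use.
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "is",
                  "was", "it", "for", "with", "as", "but", "not"]

def profile(text: str) -> list[float]:
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def distance(p: list[float], q: list[float]) -> float:
    """Euclidean distance between two stylistic profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def attribute(anonymous: str, candidates: dict[str, str]) -> str:
    """Return the candidate author whose known writing is stylistically closest."""
    anon = profile(anonymous)
    return min(candidates, key=lambda name: distance(anon, profile(candidates[name])))

# Usage (hypothetical variables): candidates maps author names to known samples.
# authors = {"Austen": austen_text, "Dickens": dickens_text}
# print(attribute(mystery_text, authors))
```

The same frequency counts, aggregated across a corpus, are what feed the word clouds and language-evolution graphs mentioned above.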
The techniques are new and in some cases require more advanced technical knowledge than the average humanities scholar might possess. But the newness of the techniques, with all their bells and whistles, hides the reality that philologists (in the guise of DH gurus) are cool again.
Where this turn in literary scholarship will eventually lead is anyone’s guess. I for one am glad to be reading something that isn’t Foucault for a change.