Posts Tagged Digital Humanities
Much of the research on the “digital divide” focuses on individual users and demographic groups that have traditionally had limited access to technology. A recent study by the Pew Research Center continues this trend. Its findings indicate that thanks to mobile technology, specifically the smartphone, internet use among all social groups is increasing. Fear of technology is also fading as once-excluded groups learn digital literacy.
Although these studies are heartening to read, indicating gradual progress toward greater access to technology for all citizens, they fail to take into account the digital divide that exists within educational institutions. While television, radio, and internet news providers have been busy bashing the teachers’ unions and tearing apart the educational policies of “No Child Left Behind,” precious little has been said about the uneven technological infrastructure of our nation’s schools.
For every school with access to iPads and state-of-the-art computer labs, there are hundreds with only a handful of aging computers (usually in the library) available on a first-come, first-served basis for internet research and word processing. This problem is endemic throughout the current educational system, reaching all the way into higher education.
Right now I am writing this blog post at home on my personal laptop. The decision was partly voluntary: I wanted to write during the evening in the comfort of my home, and I did not want to use work resources for non-work activities. Even if I had wanted to write this post earlier at work, however, I could not have.
I share an office at my institution with four other non-tenure-track faculty (NTT, as we’re calling them these days). At one point we had a desktop computer that was five years old. Not surprisingly, given the CPU-intensive nature of Web 2.0, this machine died during the summer semester.
In its place, next to the CRT monitor (the kind that looks like an old TV), mouse, and keyboard of the old computer, sits a seven-year-old laptop: a PowerBook G4. This machine was wrangled from the department after more than a month of hectoring our IT guy. I had never even heard of this particular line of Apple laptop, so I took the time to look it up on Wikipedia. It turns out that the “new” computer in my office is the precursor to the now-ubiquitous MacBook.
With its limited CPU power and an outdated browser, the most I can do with this laptop is check my email and read websites that aren’t overly graphics-heavy or interactive. On most days I go upstairs and wait to use one of the three computers in our departmental lab. I also have the option (unlike most of my colleagues) of using the computer in my other office, where I serve as an undergraduate studies program assistant.
Added to these frustrations is the lack of wireless internet access in either of my offices, which keeps me from bringing my personal iPad to work to get around the technological limitations of my workspace. At one point I was able to “hack” my way onto the network by plugging the Ethernet cable in my teaching office into my own laptop, but as of today the internet connection there is down. That also makes the telephone in the room unusable, since my institution switched a few years ago from regular phone service to VoIP (voice over internet protocol).
If we move from my early-twentieth-century office into the classrooms where I teach, the situation is only slightly better. In a course I designed to teach digital literacy and multimodal writing, the most advanced technology in any of my three classrooms is a 25″ flat-screen television with a VGA cable that lets me plug in my own laptop and display its screen. Wireless access is available in all three rooms, but using it assumes that my students can afford to bring their own technology to class, as I have.
“Plug and Play” is better than nothing in a world where technological access is no longer a luxury but a precondition for education to take place. Yet it shifts the cost of technology onto students and educators. Not only is this unfair, it also sends a strange message to our students: “You need to be educated for the jobs of the 21st century, but we will not provide the tools.” No wonder self-directed learning is coming back into fashion. Why pay for school when you can buy a laptop and let the internet teach you the skills needed to survive in a tech-driven world?
Now I should perhaps qualify my rant above by reiterating that I am an NTT faculty member. I’m also an English professor. Perhaps things are different for the TT faculty in my department, or significantly better in other programs at my institution. My suspicion, however, is that even where the technological infrastructure is less antiquated than what I described above, it is still inadequate to meet student needs.
When we talk about the digital divide, we need to remember that surfing the internet is a skill easily learned alone at home. Using the web to your advantage, however, is a skill that should be learned collectively in the classroom. Regrettably, this can’t happen when many educators work in an environment designed to teach Baby Boomers to fight the Red Menace.
Imagine this scenario: after weeks of preparing your talk and struggling to cut it to fit the 20-minute slot of your three-person panel, you arrive in the conference room to find that not only is your session chair missing, but there are only three people in the audience, one of whom is your best friend from grad school.
Think I’m making this up? I’m not. It really happened, and I was one of the three people in the audience at that panel. I felt bad for the presenter and did my best to ask her insightful questions, but I couldn’t help wondering where the other attendees had gone. Where was the loyalty to intellectual inquiry, and, more importantly, where was the common courtesy that should have dictated the panel chair contact his panelists in advance to let them know he would be absent?
Although I have no way of knowing exactly what led to this scenario, two explanations suggest themselves. The first (in the venerable tradition of Stanley Eugene Fish) is based on the convention program, which ran well over 1,000 pages and listed hundreds of events each day, starting at 8 a.m. and ending around 8 p.m. Even the most dedicated audience member couldn’t help but crash after about four panels. I tried to sit in on five or six a day but found myself succumbing to the “museum effect”: all of the talks started to merge into one huge cluster of meta-discourse in my brain.
Some professional organizations, such as the MLA (yes, I am complimenting them; try not to gasp too loudly), have taken positive steps to ameliorate this effect by implementing new conference presentation formats. The prominence of the Digital Humanities at this year’s MLA convention made this change much more visible than it might otherwise have been, as presenters in these fields are, quite frankly, much better at using audio-visual equipment than traditional humanities scholars. They also seem to have learned how to be succinct without omitting essential information from their talks. This leaves more time for discussion and is less overwhelming for the audience.
The second explanation I gleaned from listening to conference attendees talk in the hotel lobby. As I sipped a coffee and prepared for my own presentation, it became clear that cost concerns or job pressures forced many to attend only on the day of their talk. It was also clear that some attendees were more interested in sightseeing than they were in listening to the latest scholarship in the field.
Bearing all of this in mind, it is worth asking: what exactly is the purpose of the large academic conference in 2012? In the age of social media such as Twitter and Google+, why not simply hold a “tweet-up” or create a Google Hangout for scholars in a particular field of study? These virtual arenas would cost participants far less and could be used at any time during the year.
The short answer to these questions seems to be career networking.
Now don’t get me wrong: I understand the value of face-to-face interaction with scholars in my field, and I value it greatly. However, $800, the average amount I’ve spent attending an academic conference, seems a steep price to pay for networking; almost as much, in fact, as my monthly rent. That is why I make a habit of attending conferences only if I’m presenting or chairing a panel.
I wonder how many others make the same choice and are thus shut out of the opportunity to network and exchange ideas in real time. It is yet another way that non-elite faculty are prevented from full participation in the discipline they help sustain.
Among the many changes I hope will take place as the discipline of English is forced to evolve or disappear is a reexamination of the annual convention model. It seems at best bloated (a point made by Fish that most of his readers conveniently ignored) and at worst hopelessly out of date. Fewer panels of shorter duration, new presentation formats, new division structures, less pressure to conduct face-to-face membership business once a year: these changes are all desperately needed. Perhaps regional conferences affiliated with national ones could pick up the slack, or much of the necessary work could be done online.
In any event, if we want all members of the profession to have a say in its future, we need something better than the traditional annual convention. The price of attendance is too steep, even if you might get to shake hands with Michael Bérubé.