Much of the research on the “digital divide” focuses on individual users and demographic groups that have traditionally had limited access to technology. A recent study by the Pew Research Center continues this trend. Its findings indicate that thanks to mobile technology, specifically the smartphone, internet use among all social groups is increasing. Fear of technology is also fading as once-excluded groups learn digital literacy.
Although these studies are heartening to read, indicating gradual progress toward greater access to technology for all citizens, they fail to take into account the digital divide that exists within educational institutions. While television, radio, and internet news providers have been busy bashing the teachers’ unions and tearing apart the educational policies of “No Child Left Behind,” precious little has been said about the uneven technological infrastructure of our nation’s schools.
For every school with access to iPads and state-of-the-art computer labs, there are hundreds with only a handful of aging computers (usually in the library) that are available on a first-come, first-served basis for internet research and word processing. This problem is endemic throughout the current educational system, reaching as far as the ranks of higher education.
Right now I am writing this blog post at home on my personal laptop. This decision was partly voluntary: I wanted to write during the evening in the comfort of my home and not use work resources for non-work-related activities. Even if I had wanted to write this post earlier at work, however, I could not have.
I share an office at my institution with four other non-tenure-track faculty (NTT, as we’re calling them these days). At one point, we had a desktop computer that was five years old. Not surprisingly, given the CPU-intensive nature of Web 2.0, this machine died during the summer semester.
In its place, next to the CRT monitor (i.e., the kind that looks like an old TV), mouse, and keyboard of the old computer, sits a seven-year-old laptop: a PowerBook G4. This machine was wrangled from the department after more than a month of hectoring our IT guy. I had never even heard of this particular model of Apple laptop, so I took the time to look it up on Wikipedia. It turns out that the “new” computer in my office is the precursor to the now-ubiquitous MacBook.
With its limited CPU power and an outdated browser, the most I can do with this laptop is check my email and read websites that aren’t overly graphics-heavy or interactive. On most days I go upstairs and wait to use one of the three computers in our departmental computer lab. I also have the option (unlike most of my colleagues) of using the computer in my other office, where I serve as an undergraduate studies program assistant.
Added to these frustrations is the lack of wireless internet access in either of my offices, which prevents me from bringing my personal iPad to work and getting around the technological limitations of my workspace. At one point, I was able to “hack” my way onto the network by plugging the internet cable in my teaching office into my own laptop, but as of today our internet connection there is down. This also makes it impossible to use the telephone in that room, as my institution switched a few years ago from regular phone service to VoIP (voice over internet protocol).
If we move from my early-twentieth-century office into the classrooms where I teach, the situation is only slightly better. In a course I designed to teach digital literacy and multimodal writing to my students, the most advanced technology in any of my three classrooms is a 25″ flat-screen television with a VGA cable that allows me to plug in my own laptop and display its screen. Wireless access is available in all three rooms, but taking advantage of it assumes that my students can afford to bring their own technology to class, as I have.
“Plug and play” is better than nothing in a world where technological access is no longer a luxury but a precondition for education to take place. Yet it places the burden of technology’s cost on students and educators. Not only is this unfair, but it also sends a strange message to our students: “You need to be educated for the jobs of the 21st century, but we will not provide the tools.” No wonder self-learning is coming back into fashion. Why pay for school when you can buy a laptop and let the internet teach you the skills needed to survive in a tech-driven world?
Now I should perhaps qualify my statement/rant above by reiterating that I am an NTT faculty member. I’m also an English professor. Perhaps things are different for the TT faculty in my department, or significantly better in other programs at my institution. My suspicion, however, is that while the technological infrastructure might be less antiquated than what I described above, it is still inadequate to meet student needs.
When we talk about the digital divide, we need to remember that surfing the internet is a skill easily learned alone at home. Using the web to your advantage, however, is a skill that should be learned collectively in the classroom. Regrettably, that can’t happen when many educators work in an environment designed to teach Baby Boomers to fight the Red Menace.