Beyond Computer Literacy: Defining Levels of Skill in the Use of Computers
In the Spring 1986 issue of the College Board Review, Eugene Arden made an attempt at defining computer literacy. It hasn’t aged well.
Among the original-crew Star Trek movies, Star Trek IV: The Voyage Home is, arguably, the best. It's at least the most fun and irreverent. Released in November 1986 and directed by Leonard Nimoy, the film finds Kirk, Spock, McCoy, and the rest of the Enterprise crew thrown back in time, from 2286 to mid-'80s San Francisco, on a mission to retrieve humpback whales—which are extinct in the 23rd century—to save future-Earth. (Just go with it. It's a fantastic movie. Stream it immediately.) Besides being great science fiction that uses a ridiculous plot to comment pointedly on a present-day issue (in this case, the environment), it's an excellent fish-out-of-water story. Future spacemen representing a united planet and various other worlds, navigating late-20th-century Bay Area culture, Cold War politics, and technology, unsurprisingly make for great comedy.
For instance, there's a moment when Scotty (James Doohan), the Enterprise engineer who's always givin' it everything he's got, finds himself in front of an early Apple Macintosh. In the future, people say "Computer" and a command to interact with their machines. Scotty, naturally, doesn't know this and says, "Computer?" Nothing happens. After being told to use the nearby mouse to do what he needs, he smiles, says thank you, picks it up, and says, "Computer?" into the mouse. In our Alexa/Siri/Google Assistant world, the comedy is, admittedly, dated. Still, the scene's good for a guffaw—or at least a solid meme.
This piece about computer literacy, published in the Spring 1986 issue of the College Board Review, feels like a print version of Scotty's experience with the mouse—except in retrospect and without the knowing wink. Eugene Arden, vice chancellor and dean of academic affairs at the University of Michigan, Dearborn, writes about the concept of "computer literacy" so literally that his essay reads as both painfully earnest and completely out of step with even the computers of the Eighties, to say nothing of their place in the global transformation that was about to happen.
"Much as we may want to bring a complex topic into convenient focus," Arden writes, "we should not burden 'computer literacy' with more meaning than a single phrase can possibly bear." That's a debatable conclusion for 1986. Personal computers were big, bulky, and expensive, and the concept of carrying computers in our pockets and on our wrists was still the fantasy of, well, Star Trek. Still, simpler technologies had by that point saturated everyday life, from phone switchboards to ATMs, and were poised to dig in deeper; Steve Jobs was already likening personal computers to bicycles for the mind. There was a good case 34 years ago for burdening "computer literacy" with as much portent as possible. Read today, with computers ubiquitous and foreign governments and lone-wolf lurkers manipulating social media to influence elections and economies, Arden's analysis feels especially shortsighted. He might as well have picked up a mouse from one of his campus computer labs and asked it to tell him what it means to be computer literate.
So why share this article now, if it feels so dated and, well, wrong? In the broadest sense, it’s important to remember how we got here—the hits, as well as the misses. But even if Arden's analysis is naïve, prioritizing passive experience with computers over active engagement, it’s vital to remind ourselves of how shortsighted we can be when confronted with the new. It happened with the creation of the printing press, the radio, television, computers, and now with technology like artificial intelligence, machine-learning-based algorithms, facial recognition software, and quantum computing. There are always people who embrace cutting-edge ideas, but too often far more who want to write them off as fads or the domain of nerds. And then, in a blink, that tech becomes our everyday experience.
In the third decade of the 21st century, "computer literacy" has evolved into the broader concept of digital literacy. But it isn't, and never was, simply about knowing how a computer works, as Arden describes. Computers might run on coding languages, but understanding how the machine operates—what goes in and, especially, what comes out—isn't the same as picking up Spanish. That was true in 1986, and it's true today.
Hulton Archive/Getty Images
Two men work at different components of a large Univac computer in an office, circa 1960.
"Computer literacy" is one of those marvelously convenient catch phrases that typically appear just when they are needed most—but then fall into profound disfavor not only because they are overused, but because their meaning was never clear in the first place. The disenchantment with this particular phrase has even been expressed in folk humor: "Computer literacy is a disease invented by companies wanting to sell lots of personal computers."
Any review of the current educational literature will show how important it is to reach a common understanding of what that phrase means, provided that we avoid an easy over-simplification. Much as we may want to bring a complex topic into convenient focus, we should not burden "computer literacy" with more meaning than a single phrase can possibly bear.
Literacy in any circumstance, after all, means no more than a fundamental ability to use written symbols as a means of communication. Applied to computers, literacy means having only an elementary understanding of what computers can do, how they function in practice, and what their major characteristics are (extraordinary speed in making calculations, for example). Thus, to be computer literate means that one knows how to turn a machine on and get it ready for action, how to follow instructions about entering data or getting information back from the machine, and how to use it as a calculator and as a word processor, at least to the extent of revising a manuscript without having to retype it.
There might be a few other peripheral skills that are implied by computer literacy, such as being able to play some entertainment games on the machine and perhaps doing a color design or even "composing" a few bars of music, but all of these activities are done at a personal rather than professional level, and within severely limited boundaries of accomplishment.
Beyond mere literacy, however, there are at least three gradations that describe a hierarchy of computer abilities:
To be computer competent implies being able to use the computer as a tool—professionally. To illustrate, for a secretary it means being able to do complex formatting of documents; for an accountant it means knowing COBOL and being able to use such programs as VisiCalc and Lotus 1-2-3; for an engineer it means knowing FORTRAN and being able to solve routine design problems. At this level, in summary, the computer works for the individual, who is not only competent with it, but comfortable as well.
FPG/Archive Photos/Getty Images
An African-American office worker logs addresses into a Data General computer at a direct-mail company in Boston, Massachusetts, circa 1980.
To be computer fluent is to reach a much higher level of skill, for it implies knowing several computer languages, being at ease with a variety of machines and software packages, and being able to create problem-solving programs. The person with such fluency will understand the place of the computer in the whole knowledge realm, and will even be able to teach others to become computer competent in their own fields. The analogy to gaining fluency in a foreign language is helpful, since we are describing a person who can not only communicate by reading and writing a foreign language ("literacy"), but can also think in the foreign language, can savor style in a page of prose or poetry, and in general engage in a full understanding of idiom and nuance ("fluency").
Finally, there is a no-name group of people, very small in number, who are the creators of new computer knowledge or processes, to whom we owe the radical change, for better or worse, that has come into our lives as a result of the computer and the software. Perhaps "computer genius" is the only way to describe them, for they possess a wide range of technical and analytical skills as well as mathematical insights; most critically, they are of an original cast of mind. Such people do not merely work in a field; they change the field, making it richer and deeper than ever before by their efforts. Yes, for this very special group, computer genius is the only right name.