
Critical Thinking for the Google Generation

Ubiquity, Volume 2006, Issue May | By John Stuckey



- Are serious institutions of higher education in danger of being deserted if they don't accommodate every passing technological whim and gadget?
- Are faculty members at such institutions frightened by the prospect that their students are more clever than they are, more adept at using the latest technology, more likely to be bored, and therefore unable to learn?
- Is there a useful role for new technology in the teaching/learning process?

I'd say, "unlikely, definitely, and absolutely." Your answers? I'm more interested in the reasoning behind them than in the multiple-choice options, even if that makes them more challenging to grade. News reports persistently detail the digital skills and expectations of the approaching generation of net-natives (as opposed to us net-immigrants, who were not born into our digital competence but had to learn it; the metaphor is counter-intuitive, since it seems to imply that the technology was here before the users, but it remains conceptually useful).

There are plenty of good reasons to incorporate information technology into teaching and learning, but the fear of being left behind or left out or rejected by demanding techno-proficient applicants is not among them. The prudent incorporation of technology can enrich and strengthen education. Will it make teaching easier or ensure effective learning? Probably not, on both counts. Using technology well is hard work, and the results have to be weighed critically and honestly. But pandering to the whims of gadget-obsessed youths in order to ensure institutional survival (not to mention job security) is destined to fail.

OK, "gadget-obsessed" is a cheap shot. And more than that, it probably describes you and me as accurately as it does the current generation of college-bound students. Yet we do have to be careful in giving students what they fervently believe they want. We don't encourage them to gravitate to snap classes or easy graders; we give at least lip service to resisting grade inflation; most of us were concerned when our institutional networks were in danger of becoming useless for any activity other than trafficking in music or movies of, at best, questionable legality. Sometimes, students don't instinctively know what's best, a fact that should make their tuition payments good investments. If the faculty does not have useful knowledge to impart, in a process that often demands that students do things they'd rather not do, then perhaps we should close our doors. I don't mean we can't learn from them. The best scholar, after all, never stops learning and is always open to new possibilities and paradigms.

We can generate headlines by touting technology's relentless and apparently accelerating advance, and it will likely always be true that entering students have a greater familiarity and facility with the latest (=greatest?) technology. That is an opportunity rather than a threat. I rejoice in not having to introduce today's students to the same elementary and superficial skills that were necessary 20, 10, or even five years ago (it was the early '80s when we introduced the Computing Skills Workshop as a requirement for all freshmen at Carnegie Mellon).

But if today's students have an instinctive capacity to operate the devices and navigate the shoals of the networked world, to find easy answers with astonishing speed, to impress us with the dexterity of their shortest, thickest digits, they still require education in learning how to ask the difficult questions that most likely have no simple answers. That is what critical thinking requires, and that is the essence of true information technology. The answers are not in the computers, no matter how smoothly we operate them. Discard the computers? Don't be silly. They are valuable, even indispensable tools. But they are not sufficient to produce, much less guarantee, knowledge.

As educators, we owe it to our students to ensure that multitasking and non-linearity are different from what used to be called failing to pay attention and being easily distracted. We owe it to them to explain the differences among a Google search, a literature search, and research, and we definitely owe them a serious warning about plagiarism. (Plagiarism is certainly not new and, alas, not limited to students, but it sure is a heck of a lot easier these days. Shoot, even plagiarism is less educational today; earlier generations might at least have gained passing familiarity with plagiarized material by having to re-type it!)

Good education is still hard work, and it is not usually glamorous. This is not the scary stuff of headlines. Improving teaching and learning means re-assessing the educational process thoughtfully and often. We give easy lip service to the importance of problem solving, but don't we still base grades in many fields on memorization and recall? We need to consider innovative, creative ways to integrate technology into teaching and learning, but as a means, not an end.

We will not serve our students well if we entice them with a diet of digital dessert and candy in order to keep our own institution's doors open but fail to prepare them for the ever-more-serious challenges with which their world is sure to confront them.

About the Author
John Stuckey has been associated with Washington & Lee University in Virginia for many years. He is an Associate Editor of Ubiquity.

