Through a glass, darkly

Ubiquity, Volume 2000 Issue March, March 1 - March 31, 2000 | BY Robert C. Heterick, Jr. 




The process of teaching and learning in higher education, which has remained essentially untouched by war, politics or scientific discovery for half a millennium, is about to undergo radical change as a consequence of the computer and communications revolution. For those in the academy it has overtones of a religious war. For the average person on the street (including legislators at the state and Federal level) it is a deeply mysterious subject more governed in discussion by myth than reality.

Just as higher education in this country went through a dramatic conversion in the mid-1800s from religious to public support, the dawn of the 21st century is marking a change from public to private support. That change is realized by, and driven by, incredible advancements in digital technology -- both in capability and cost. At the same time, higher education is bedeviled with two systemic problems -- high costs and, from a corporate standpoint, insufficient quality. It is not yet clear just what final form this transformation will produce, but it is worth speculating about what forms it might assume.

Ralph Waldo Emerson once observed that "a foolish consistency is the hobgoblin of little minds." I've never thought of the academy as a place of little minds, but it is certainly consistent in one thing -- tuition increases. More times than anyone can count in the past quarter century, tuition increases have more than doubled the rise in the Consumer Price Index. Are institutions trying to do anything about these increases in cost? Probably, but the question lingers as to whether it is with much effect.

If the price of a bottle of Coke had risen at twice the rate of inflation every year for the past two decades, we would all be drinking Pepsi. If the price of computers, or long distance phone calls, had been rising at twice the rise in the Consumer Price Index, personal computers would cost $7,000 rather than $700 and phone calls would cost 40 cents rather than 7 cents a minute. Perhaps if student learning had also doubled in that period we could be less troubled by what seems to be a set of costs out of control. But a number of Federal study commissions, along with anecdotal reports from employers, suggest the contrary.

A recent article in the Chronicle of Higher Education contained this innocuous-looking pair of sentences: "Jones employs 56 faculty members, of whom only two work full time. The rest are adjuncts who hold academic posts at other universities and work for Jones, either teaching or working as 'content experts.'"

Now, such freelancing isn't unheard of in higher education. Faculty write textbooks that frequently lead to a similar relationship with a publisher. Many faculty "moonlight" teaching for-profit seminars and workshops and others can be found teaching in the night schools of other local post-secondary institutions. But, is this a harbinger of a new relationship between faculty and post-secondary institutions? Is this the academic equivalent of free agency in sports?

Why, for instance, shouldn't we expect some faculty to enter practices such as those of the professions of law or medicine? And if such practices were formed, why wouldn't we expect that they might bid on doing the complete instructional package for freshman English or math at an institution? Given the current misalignment of institutional goals and faculty rewards in most of higher education, such freelancing arrangements on the part of some faculty may signal a positive trend.

History has it that the band played "The World Turned Upside Down" at the surrender of Cornwallis terminating the War of Revolution. It must be time to strike up the band again, for surely the world is turned upside-down, at least on the Net.

The longer Amazon.com predicts it will go without a profit, the higher its stock seems to rise. On the Net, market share means everything. Get there first, take advantage of lower margins by not having to build or maintain a physical infrastructure, embed as much intelligence as possible in the software thereby reducing your cost of personal services, and leverage your product -- sell ads, arbitrage the float on your accounts receivable and payable, and mine your database for associated services. In the extreme, give away your product and rely upon sales of ancillary services -- consulting, warrantees and franchising.

How long will it be before the first educational offering on the Net reaches a thousand or a million customers? Margins can be paper thin, say one percent, and still make for an attractive venture if your product costs $500 and has a million customers. And, will someone decide to "give away" the educational product and sell the associated services of faculty consulting, testing and credit granting, and portal services to other educational offerings?
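The arithmetic behind that claim is worth spelling out. A minimal sketch, using only the article's illustrative figures (a $500 product, a million customers, a one percent margin -- all hypothetical numbers, not real data):

```python
# Back-of-the-envelope check of the "paper thin margin" arithmetic.
# All figures are the article's illustrative assumptions, not real data.
price_per_customer = 500      # dollars per educational "product"
customers = 1_000_000         # hypothetical reach on the Net
margin = 0.01                 # one percent -- "paper thin"

revenue = price_per_customer * customers
profit = revenue * margin

print(f"Revenue: ${revenue:,}")    # $500,000,000
print(f"Profit:  ${profit:,.0f}")  # $5,000,000
```

Even at a one percent margin, the venture nets $5 million a year -- which is why reach, not markup, is the decisive variable in the Net's economics.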

The lesson of the Net is that centralized planning doesn't produce the results stemming from flattened, seemingly chaotic organizational structures such as the one that built the Net. The Net is complex and complicated but its complexity wasn't designed. It simply grew from the premise that by putting the intelligence out at the ends, thereby encouraging innovation, lots of people could make contributions to the whole product without needing to understand the whole product. The concept of putting the intelligence at the edges also fostered incredible entrepreneurship. People could dream up ideas at the edges and, if they were accepted by enough others on the edge, they could become quite profitable products.

The bundling of all sorts of services around the educational product forces the intelligence to the center and away from the edges, thereby impairing, if not altogether eliminating, innovation. Established institutions of higher education operate on a time frame that is comparatively glacial. New relationships with the faculty are likely to be much more productive in producing quality educational offerings on the Net. They are also much more likely to do it in Net time.

The other day I picked up a local newspaper and was surprised to see a political candidate suggesting that it might be time to consider privatizing some or all of the state's colleges and universities and using tax credits or tuition vouchers as the state vehicle to fund and encourage higher education. His proposal was to have the money follow the student. Subsidize the consumer rather than the provider. Never mind that even Washington has discovered that funding the recipient rather than the provider works a lot better -- our poor political aspirant had done the candidate equivalent of passing gas in church.

The paper's editorial was cutely titled "Let's not give 'privatize' the old college try." The editorial writer said, "There are those ideas that are rejected out of hand because they are so utterly nonsensical that, for good reason, they are not to be taken seriously." I searched in vain for the "good reason" through the rest of the editorial. It never ceases to amaze that those who most strongly profess their openness to new ideas are incapable of imagining any future that is more than just a minor perturbation of the status quo.

What exactly is the role of state government in supporting higher education? Must it "operate" institutions of higher education, or is its imperative to see that post-secondary educational experiences are available and that citizens are encouraged to take advantage of them through governmental support? The latter doesn't presuppose the former, the voice of our editorial writer to the contrary notwithstanding.

Most situations in life are tradeoffs. You can run long or you can run fast, but you can't run both long and fast. Learning, in the context of the modern college and university, is a tradeoff among three parameters -- time, content and mastery. A typical two-semester math sequence could be offered in one semester. The change in the time parameter would be accompanied by a change in the mastery parameter: more students would fail to achieve the desired average level of subject mastery. The syllabus of the first semester could be stretched over two semesters and the mastery parameter would change again.

The twentieth century university model developed into one in which the time parameter was defined and the content adjusted so that some acceptable number of students achieved the desired average level of subject mastery. Content and mastery were not adjusted the same in all content areas. Hence, a math course might have a much higher student failure rate than, say, a history course but still maintain the desired level of student mastery.

Certainly our agrarian roots had much to do with the choice of 30 rather than 40 or 50 weeks as the academic calendar. The average life span of people at the turn of the century likely had much to do with setting the baccalaureate span at four years rather than five or six.

As any faculty member can tell you, the content base of many courses has increased dramatically during the twentieth century. Establishing time as fixed has required constant juggling of just how much content can and should be squeezed into the semester. It is all too typical to approach this compromise about time, content and mastery as fixed and decided -- all too easy to forget that some set of temporal conditions influenced the trade-off that might not exist a century later. Certainly time off for summer work on the farm is a situation that affects no one as we enter the 21st century. The average lifespan of an individual has increased by nearly 50 percent over that same century, calling into question again the correctness of the choice of four years as the baccalaureate span. The nature of modern work and the physical and mental health of current "senior citizens" suggests that the effective and desired work life of individuals may be lengthening as well.

Modern technology, in the form of inexpensive, pervasive access to high bandwidth communications and high performance computing, changes dramatically many of the temporal conditions. It seems likely that mastery is a more important parameter than time. It is now possible, in most instances, for the teacher to take the content to the learner rather than for the learner to come to the teacher for the content. In fact, in many learning situations the economics are beginning to strongly favor the former over the latter.

What if, for example, we set some minimum level of mastery as the constant and let the time for learners to achieve it vary? Rather than have 40 percent of our students fail the first calculus course, we let the time to achieve acceptable mastery of first calculus vary by 40 percent. The problem becomes not to weed out the low achievers over a fixed time frame but rather to help accelerate the slower learner to the fixed minimally acceptable level of mastery.

This would, of course, play havoc with the neat Industrial Age rhythms we have established in our institutions of higher learning. It would make lecture and recitation an ancillary rather than primary tool of the learner. It would move the learner to the center and the teacher to the periphery (not necessarily a less important, but certainly a different, vantage point for the teacher). It would suggest that we "grade" learners on the amount of content they have mastered to some acceptable level rather than on their level of mastery of some fixed content.

It would, in effect, establish a finish line. Some learners would cross it sooner than others -- but nearly all would cross it. Some might choose to run only to the finish line, others to run a little or a lot past it. Our current model has a whistle blow after some period of time, leaving runners all over the track -- some past the finish line and others still well short of it. Unlike track and field games, in our current situation the losers are the ones who have to repeat in the next heat. The question we ought to ask ourselves is, "Why is it necessary for anyone to repeat?"




Bob Heterick is past President of Educom, Vice President Emeritus at Virginia Tech and currently a Fellow of the Center for Organizational and Technological Advancement at Virginia Tech and Visiting Research Professor at the Center for Academic Transformation at Rensselaer Polytechnic Institute.

www.center.rpi.edu/LFHome.html
