Volume 2014, Number June (2014), Pages 1-13
Ubiquity symposium: MOOCs and technology to advance learning and learning research: from MOOCs to SPOCs: curricular technology transfer for the 21st century
Is the MOOC honeymoon winding down? With many university faculty opposing the MOOC movement, the author argues that adopting the best of massive open online courses—access to high-quality materials and rapid feedback to students—in SPOCs (small private online courses) will make more effective use of instructors' time and resources.
After more than a year of a media love affair with MOOCs and a visible rush by university administrators and politicians to embrace them, it appears the honeymoon may be winding down. Faculty at San Jose State University's Department of Philosophy published an open letter to Harvard professor Michael Sandel, chastising his complicity in "perilous" online learning efforts; faculty at Harvard have called for greater oversight of the university's investment in MOOCs; faculty at the University of Massachusetts, Amherst, voted against joining MOOC provider EdX despite lengthy negotiations; and Duke University faculty forced the school to back out of a proposed agreement with education vendor 2U to create for-credit online courses.
The history of educational technology is littered with well-meaning failures. Some disappeared forever; others remain in nontrivial use, but with little fanfare. MOOCs have specific characteristics that allow them to be repurposed more effectively than earlier technologies, both for blended learning and for "technology transfer" of effective teaching materials, and I urge my colleagues to reconsider MOOC technology in this light.
In particular, if MOOCs are used as a supplement to classroom teaching rather than viewed as a replacement for it, they can increase instructor leverage, student throughput, student mastery, and student engagement. To describe this model, we invented the term "SPOC", for small private online course. Given our belief that instructors play a critical role in facilitating learning, we argue for SPOCs because they make more effective use of instructors' time and resources.
If I am arguing for SPOCs, why does it matter that MOOCs are massive to begin with? Isn't a SPOC the same as a pre-MOOC "hybrid" or "blended" course? There are two subtle reasons why I believe the answer is "no." First, designing course materials for scale leads to a different point in the design space. Assignments with automatic grading that must scale to a large worldwide audience with zero instructor intervention require a level of forethought and polish that makes them even more effective for classroom students. Second, the large scale of MOOCs enables the use of inferential statistics techniques to evaluate and improve the course content, which ultimately benefits residential students. Since the infrastructure for MOOC delivery is relatively easy to administer and hosted as a service that doesn't require software configuration for individual instructors, I hypothesize that this makes MOOCs and SPOCs a new instrument for "curricular technology transfer" that is far superior to traditional methods such as standalone textbooks combined with recycled course notes, last semester's PowerPoint decks, and old exam questions.
Our MOOC Journey
At Berkeley, my colleague David Patterson and I undertook an effort that began with a MOOC and led us to the above views about SPOCs. We created a MOOC based on our upper division software engineering course, the first MOOC offered by Berkeley1 and among the first to launch with Coursera. The on-campus course was so popular among Berkeley students that we had already been thinking about automated grading; committing to a MOOC forced us to really explore that possibility.
As recommended by colleagues who had produced MOOCs, we refactored our 80-minute lectures into 6–10 minute "lecturelets" focused on single topics and accompanied by one or two self-check questions with short or multiple-choice answers.
In addition, we created different autograders for different types of software engineering homework assignments, whose schematic view is shown in Figure 1.
To create these autograders, we invested several hundred engineer-hours in repurposing tools used by professional programmers, as Figure 2 describes. Far beyond simply indicating the correctness of a student's answer, the autograders provide finer-grained feedback than students would get from human TAs, who can spend at most a few minutes per assignment. The course staff came up with very creative ways to provide feedback on qualitative coding style, and to test not only students' code but also the quality of their test cases. We even test students' ability to enhance legacy code, a critical skill neglected in most undergraduate software engineering courses but in high demand among employers.
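The core idea of Figure 1—rubric in, feedback and score out—can be sketched in a few lines. This is an illustrative toy in Python, not the actual Ruby `saasbook/rag` autograders; the student submission, rubric format, and `grade` function are all hypothetical, chosen only to show how an instructor-authored rubric can yield both a numerical score and per-case textual feedback with partial credit.

```python
# Minimal sketch of a rubric-driven grading engine (Figure 1's shape).
# All names are illustrative; the real course autograders are Ruby tools.

def student_median(xs):
    """Hypothetical student submission (buggy for even-length lists)."""
    return sorted(xs)[len(xs) // 2]

# Instructor-developed rubric: (arguments, expected answer, points)
rubric = [
    (([1, 2, 3],), 2, 10),
    (([5, 1],), 3, 10),   # even-length case: median of [1, 5] is 3
]

def grade(solution, rubric):
    """Run each rubric case; return (total score, textual feedback)."""
    score, feedback = 0, []
    for args, expected, points in rubric:
        try:
            got = solution(*args)
        except Exception as e:
            feedback.append(f"{args}: raised {e!r} (0/{points})")
            continue
        if got == expected:
            score += points
            feedback.append(f"{args}: correct ({points}/{points})")
        else:
            feedback.append(f"{args}: got {got}, expected {expected} (0/{points})")
    return score, feedback

score, feedback = grade(student_median, rubric)
print(score)              # 10 — partial credit: only the odd-length case passes
for line in feedback:
    print(line)
```

Because every case carries its own feedback string, a student who resubmits sees exactly which behavior to fix, which is the kind of fine-grained, repeatable response a human TA rarely has time to give.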
How the SPOC Changed our On-Campus Course
In typical CS fashion, we "ate our own dogfood" by using the MOOC materials in our campus course to debug and refine them before inflicting them on students in the MOOC. To do this, we made a special arrangement giving our on-campus students access to a "private" copy of the MOOC that was off-limits to everyone else. Within that copy, we could set different assignment deadlines than those in the MOOC—for example, to grant extensions due to bugs in our autograders—and our TAs had access to student assessment results, so that (for example) they could manually adjust scores when a student was unfairly penalized by the autograder for a valid solution not anticipated in the rubric. We invented the term SPOC, for "small private online course," to distinguish this deployment from a regular MOOC using the same materials and technology. We quickly identified three benefits to the SPOC model in our course:
- Shorter, modularized "lecturelets" keep students focused, as self-check questions can be used for peer learning. Where a MOOC student would watch a short video segment and then answer a self-check question, a student in the live lecture would listen to a few minutes of lecture, after which we would pause and use the self-check question in a peer learning exercise. Students vote simultaneously on the answer by holding up a colored index card (cheaper than clickers), discuss their answer with a neighbor for 30–60 seconds, and then vote again. Students in the live lecture know that every 6–10 minutes we will stop the lecture and engage them, so they tend to be more attentive.
- Design projects are better. The on-campus course includes an open-ended design project in which teams of four to six students create software for nonprofits and campus units. The design project is both required by the department and identified by the students as the most valuable aspect of the course. Although the design project is completely absent from the MOOC, the use of the MOOC technologies in the homework assignments leading up to the project allows students to begin the project more confidently and with a stronger set of foundational skills. In the Fall 2012 offering of the course, 92 percent of the external customers for the design projects were either "happy" or "thrilled" with the students' work, and nearly half tried to hire the students to continue developing the prototype for pay. This is remarkable for an eight-week project by an undergraduate team. In the most recent (Fall 2013) offering, customers have not yet been surveyed, but roughly 40 percent of the students report they plan to continue supporting the software for the customer after the class ends.
- Students like both the time-shifting option and the focused short videos. In an anonymous end-of-course survey in Fall 2012, we asked students who had stopped coming to lecture what the main reason was. The majority responded that they preferred to watch videos online, where there is no stigma attached to rewinding the video to improve understanding. Because the individual videos are short and focused on a single topic, it's easy for students to review only troublesome topics. Based on this information, our Fall 2013 offering of the course was scheduled in a lecture hall half the size of the allowed enrollment, to make it clear that we attached no stigma to watching online while still welcoming students who prefer the live lecture. Throughout the semester, live lecture attendance was between 80 and 100 (total enrollment was 237), so students clearly appreciated time shifting. Our end-of-semester survey in 2013 revealed that the students who relied solely on videos rather than coming to lecture preferred the edited "lecturelets" over the raw (80-minute, unedited) lecture videos, even though the edited videos took two to three days to be posted while the unedited videos were available within 24 hours of lecture.
As a result of these benefits, we quickly realized that while we thought we had been creating tools to facilitate a MOOC, what we had actually created was a set of tools that radically changed how we teach the on-campus course. The results have been remarkable; as Figure 3 shows, the SPOC model has allowed us to increase the enrollment of the course nearly fourfold while yielding higher instructor and course ratings, even though the material covered has changed very little. Thus, the latest offering has the rare distinction of having both the largest enrollment and the highest student ratings in the course's 20-year history.
SPOCs Relieve Concerns About MOOCs
SPOCs offer positive counterexamples to concerns about MOOCs that have dominated recent media coverage. The most frequently voiced concerns about potential negative impact of MOOCs are as follows:
- Students will be shortchanged by the loss of important aspects of traditional instruction;
- Faculty will be fired to save money or will be distracted from improving their on-campus teaching;
- Course content and teaching methods will become homogenized, reducing faculty to little more than glorified teaching assistants (TAs).
In fact, our observation is that SPOCs leverage MOOC technologies to help instructors shift effort to higher-value activities, such as small-group discussions and face-to-face time with instructors. That is, they enhance, rather than replace, traditional course elements. For example, rather than debating whether autograders should replace human evaluation, we observe that autograders increase TA leverage, allowing TAs to focus on the interaction-intensive design project reviews and on addressing challenging questions in office hours and discussion section. That is, the scarce resource of instructor time has been shifted from the lower-value activity of grading to the higher-value activity of student interaction.
Similarly, rather than worrying whether MOOC-based social networking will replace face-to-face peer interactions, we can ask and experimentally answer: Under what conditions and with what types of material do online communities help foster learning, and how can social networking technology help foster both online and in-person community building? Learning activities that don't appear to be "MOOCable"—discussion-based learning, open-ended design projects, and so on—can just be omitted from the MOOC but covered in the classroom setting, as we've done.
In light of this claim, could SPOCs be used as the sole vehicle for offering a degree, in the spirit of Georgia Tech's recently announced online master's in computer science program that is 100 percent MOOC-based? There are two major caveats. First, we have personally experienced how SPOCs increase instructor leverage, but our multiplier has been a small integer, not one or more orders of magnitude. That is, while we successfully scaled a 55-student course (historical average) to nearly 250, we don't believe the same techniques would support 2,500 or 25,000 students while maintaining instructor interaction, yet those enrollment numbers are common for MOOCs. Second, while the instant-feedback autograding of SPOCs is particularly useful for "skills-based" course material, much important material can't be learned effectively in this format, including critical small-group peer discussions and (so far) open-ended design projects that require substantial instructor oversight. While we expect to increasingly embrace SPOCs for the benefits they provide to instructors as well as students, we don't claim the SPOC benefits would extend to the orders-of-magnitude scale-up in enrollments that generated early excitement around MOOCs. Nonetheless, the observation raises an interesting opportunity: By separating the scalable and non-scalable aspects of existing courses (or "MOOCable and non-MOOCable aspects," if you wish), those components could be resourced differently because of the MOOC's ability to scale. Under this interpretation, SPOCs can be seen as leveraging important MOOC features, such as access to high-quality materials and rapid feedback to students via autograding, to maximize the leverage of the scarce resource—instructor time. Given these observations, firing instructors is a poor idea, even if saving money is a goal.
Yet SPOCs can increase educational quality and empower instructors, while still saving money. To see this, observe that even though our design project is absent from the MOOC, on-campus student performance on the design project is enhanced by the benefits of MOOC technology. Autograders allow students to improve mastery: Students can resubmit homework assignments to improve their scores, a policy we have always favored but was impractical to implement with human TAs. The increased enrollment facilitated by autograders saves money: instead of turning away graduating seniors, we now admit juniors, who can put these skills to use earlier in their careers, such as during summer internships following their junior year. Others have experienced similar results, including the early pilot with San José State University in California, where 91 percent of students in a circuits SPOC using MIT-authored MOOC materials passed the class, compared to 59 percent without the SPOC. In both scenarios, educational quality arguably increased because instructors shifted their time from the lower-value activity of creating and delivering lectures to the higher-value activity of working directly with students on the material. Money was saved not by firing people, but by helping more students acquire economically valuable skills more quickly.
The critique that MOOCs and SPOCs are a distraction to instructors can be countered by the observation that SPOCs based on high-enrollment MOOCs allow the world to help the instructor improve the course. The large enrollments of MOOCs offer us new and unprecedented opportunities to improve our on-campus courses using inferential statistics techniques that just don't work at smaller scales, and so were previously available only to large-enrollment "high stakes" exams such as the GRE or SAT.3 For example:
- Exploratory factor analysis lets us identify questions that test comparable concepts, giving instructors a way to vary exam content.
- Item response theory allows us to discover which questions are more difficult (in the statistical sense that higher-performing students are more likely to get them right).
- A/B testing gives us a controlled way to evaluate which approaches have better effects on learning outcomes and student learning experience, just as high-volume e-commerce sites evaluate which user experience results in more purchases.
None of these techniques gives usable error bars on classroom-sized cohorts (say, 200 or fewer students), but they do very well with cohorts of several thousand students. We are already using this feedback not only to polish the material for our on-campus students, but also to maintain a "question bank" of difficulty-graded review questions from which future students can benefit. And we have already conducted in situ A/B experiments on Berkeley MOOCs, both to test user-experience features in the EdX platform and to investigate the introduction of proven techniques such as peer learning into MOOCs. Hence, even when using a SPOC in the classroom, faculty can still leverage the scale of an (open) MOOC to enhance their classroom teaching.
But if faculty choose to use SPOCs in their classrooms, won't this result in a "winner take all" scenario in which all offerings of a course become homogenized based on the small number of MOOCs from which they draw? Our experience has not borne out this concern. In fact, far from homogenizing course content, SPOCs empower faculty even more than textbooks do, while retaining each instructor's ownership of the individual course.
Not all faculty are good at writing books, so most of us use others' textbooks, and some courses are indeed dominated by a small number of textbook titles. Yet few of us would concede that in using someone else's textbook we have ceded ownership of our course to the textbook author. In Tools for Teaching, Davis recommends that lecture styles and teaching strategies vary depending on the nature of the material and the target audience of students. That is, course materials, however engaging, do not make a course; instructor interaction does, as does targeted help for students grappling with the material, which is ultimately the only way they learn it. Indeed, recent research on the first EdX MOOC confirms that students who interacted with experts outside the MOOC performed better than those who did not.
SPOCs allow MOOC materials to serve the same role as textbooks, but much more powerfully. The pace and content of the SPOC can be customized to fit the local classroom; material can be skipped, reordered, sped up, or slowed down. Autograders relieve instructors of an unpopular task and improve the leverage of their time. Video lectures allow the flexibility of either "flipping the classroom" or using some material as review or optional advanced topics, and the modular structure (short lectures) makes it relatively easy to selectively mix SPOC material with the local instructor's own materials.
Given that about 8 percent of all the students who have taken our MOOC have self-identified as instructors, MOOCs may be even more effective than traditional textbooks at reaching instructors and getting effective materials and techniques out to a large audience. Happily, because MOOCs are designed for large scale and "low touch," their packaging and delivery—software as a service, delivered to a Web browser by a centralized hosting provider such as EdX or Coursera—makes them particularly easy to deploy to classrooms. We recently reported on the experience of five other instructors at different universities using our SPOC materials. Each combined a subset of our SPOC materials with their own, and no two used the same subset in the same way. All five are continuing to use the SPOC in their subsequent course offerings; two are making substantial new contributions of autograded assignments and other activities that take advantage of the SPOC infrastructure and are thus readily reusable by others; and ten new instructors have joined (as of December 2013). Biweekly conference calls confirm that this is an engaged group of instructors who feel strong ownership of their courses but are happy to use (and contribute to) field-tested high-quality materials in their own way.
MOOCs are exciting because their scale, reach, and delivery method offer instructors both potential new insights on the effectiveness of their course materials and a convenient delivery model. The delivery model—software hosted as a service—allows new research results and best practices to be immediately deployed and passed on to instructors who may be unaware of them. With SPOCs, the MOOC material is deployed in a classroom setting with customizable deadlines, syllabus, and other adaptations to meet the needs of local students and instructors.
In addition, the SPOC model finesses the sticky question of credentialing. With SPOCs, students get credit for enrolling in a course owned by the local instructor, even if the course happens to use MOOC technology, rather than getting credit for the MOOC itself. This important separation allows assessment and credentialing to remain in the instructor's and the institution's hands. Indeed, Berkeley's institutional policy is not to grant academic credit for its MOOCs, even though several instructors are already using them as SPOCs in their for-credit courses.
We have had promising success with the SPOC model at Berkeley, and we are working with EdX to make it easy for other instructors to try it as well. We believe this new opportunity for curricular technology transfer of both materials and best practices can become a positive lasting legacy of the "MOOC moment."
David Patterson was my teaching partner, book co-author, and mentor throughout every aspect of the work described in this article. Thanks to the SPOC instructors who have adopted our material and taught us much about it, especially Sam Joseph, who stewards the SPOC community and now facilitates the MOOC as well. Thanks to EdX for being receptive to the SPOC model and working with us to extend the opportunity to these instructors and others to come.
 The Department of Philosophy. San José State University. An Open Letter to Professor Michael Sandel From the Philosophy Department at San Jose State U. The Chronicle of Higher Education. May 2, 2013.
 Jaschik, S. Harvard Profs Push Back. Inside Higher Ed, May 28, 2013.
 Rivard, R. EdX Rejected. Inside Higher Ed, April 19, 2013.
 Rivard, R. Duke Faculty Say No. Inside Higher Ed, April 30, 2013.
 Ammann, P. and Offutt, J. Introduction to Software Testing. Cambridge University Press, Cambridge, 2008.
 Smith, M. K. et al. Why Peer Discussion Improves Student Performance on In-Class Concept Questions. Science 323, 2 (2009).
 Coetzee, D., Fox, A., Hearst, M. A., and Hartmann, B. Should Your MOOC Forum Use a Reputation System? In Proc. of the 17th ACM Conference on Computer Supported Cooperative Work (CSCW 2014) (Baltimore, MD, Feb. 2014). ACM Press, New York, 2014.
 Lewin, T. and Markoff, J. California to Give Web Courses a Big Trial. The New York Times. January 15, 2013.
 Lawley, D. Estimation of Factor Loadings by the Method of Maximum Likelihood. In Proc. Royal Soc. Edinburgh, 60A, 1940.
 Lord, F. M. Applications of Item Response Theory to Practical Testing Problems. Erlbaum, Mahwah, N.J., 1980.
 Kohavi, R., Henne, R. M., and Sommerfield, D. Practical Guide to Controlled Experiments on the Web: Listen to Your Customers not to the HiPPO. In Proc. of the 2007 ACM SIGKDD Conference on Knowledge Discovery and Data Mining (San Jose, CA, Aug. 2007). ACM Press, New York, 2007.
 Davis, B. G. Tools for Teaching. Jossey-Bass, 2009.
 Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D., and Seaton, D. T. Studying Learning in the Worldwide Classroom: Research into EdX's First MOOC. Research in Practice & Assessment 8 (2013).
 Fox, A., Patterson, D., Ilson, R., Joseph, S., Walcott-Justice, K., and Williams, R. Software Engineering Curriculum Technology Transfer: Lessons learned from MOOCs and SPOCs. UC Berkeley EECS Technical Report
Armando Fox is a professor in Berkeley's Electrical Engineering and Computer Science Department as well as the Faculty Advisor to the UC Berkeley MOOCLab. He co-designed and co-taught Berkeley's first Massive Open Online Course on Engineering Software as a Service, currently offered through EdX, through which more than 10,000 students worldwide have earned certificates of mastery. He also serves on EdX's Technical Advisory Committee, helping to set the technical direction of their open MOOC platform. With colleagues in computer science and in the School of Information, he is doing research in online education including automatic grading of students' computer programs and improving student engagement and learning outcomes in MOOCs. His other computer science research in the Berkeley ASPIRE project focuses on highly productive parallel programming. While at Stanford he received teaching and mentoring awards from the Associated Students of Stanford University, the Society of Women Engineers, and Tau Beta Pi Engineering Honor Society. He has been a "Scientific American Top 50" researcher, an NSF CAREER award recipient, a Gilbreth Lecturer at the National Academy of Engineering, a keynote speaker at the Richard Tapia Celebration of Diversity in Computing, and an ACM Distinguished Scientist. In previous lives he helped design the Intel Pentium Pro microprocessor and founded a successful startup to commercialize his UC Berkeley Ph.D. research on mobile computing. He received his other degrees in electrical engineering and computer science from MIT and the University of Illinois. He is also a classically trained musician and performer, an avid musical theater fan, and a freelance music director.
1The MOOC was first offered on Coursera. Berkeley subsequently joined EdX, so our MOOC has moved there. It has been offered five times to more than 100,000 students, of whom more than 10,000 have earned certificates of mastery.
2RSpec, reek, flay, and Mechanize are widely used tools designed around the Ruby programming language and Rails programming framework for creating Web services.
3The Graduate Record Exam (GRE) and Scholastic Aptitude Test (SAT) are standardized tests that are part of most students' applications to American graduate and undergraduate programs respectively.
Figure 1. Given a student submission and an instructor-developed rubric, the grading engine produces both textual feedback and a numerical score.
Figure 2. Primarily of interest to programmers, this table outlines the strategies used by the four Ruby "AutoGraders" developed for our software engineering course. The open source code is available on GitHub as saasbook/rag.2
Figure 3. Course enrollments, overall instructor rating, and overall course rating based on data collected anonymously by Eta Kappa Nu Engineering Honor Society. The Spring and Fall 2012 offerings used the SPOC; previous offerings did not. We had 237 students in Fall 2013, but the ratings for the Fall 2013 offering are not yet available as this goes to press.
2014 Copyright held by the Owner/Author. Publication rights licensed to ACM.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2014 ACM, Inc.