
Symposia

  • What About an Unintelligent Singularity?: The technological singularity (Ubiquity symposium)

    For years we humans have worried about plagues, asteroids, earthquakes, eruptions, fires, floods, famines, wars, genocides, and other uncontrollable events that could wipe away our civilization. In the modern age, with so much depending on computing and communications, we have added computers to our list of potential threats. Could we perish from the increasing intelligence of computers? Denning thinks that is less of a threat than the apparently mundane march of automated bureaucracies. He also asserts that none of the possible negative outcomes is a foregone conclusion, because humans teaming with machines are far more intelligent than either one alone.

  • Computers versus Humanity: Do we compete?: The technological singularity (Ubiquity symposium)

    Liah Greenfeld and Mark Simes have long worked together, integrating the perspectives of two very different disciplinary traditions: cultural history/historical sociology and human neuroscience. The combination of their areas of expertise in the empirical investigation of mental disorders, which severely affect intelligence---among other things---has led them to certain conclusions that may throw a special light on the question of this symposium: Will computers outcompete us all?

  • Exponential Technology and The Singularity: The technological singularity (Ubiquity symposium)

    The Priesthood of the Singularity posits a fast-approaching prospect of machines overtaking human abilities (Ray Kurzweil's The Singularity Is Near, Viking, 2005) on the basis of the exponential rate of electronic integration---memory and processing power. In fact, they directly correlate the growth of computing technology with that of machine intelligence, as if the two were connected in some simple-to-understand and predictable way. Here we present a different view based upon the fundamentals of intelligence and a more likely relationship. We conclude that machine intelligence is growing in a logarithmic (or at best linear) fashion, rather than at the assumed exponential rate.
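
    The contrast the authors draw can be made concrete with a toy calculation. The following is a minimal sketch with made-up numbers, purely for illustration: if hardware capacity doubles every two years while "intelligence" grows only logarithmically in that capacity, as the authors hypothesize, the gap widens dramatically.

    ```python
    import math

    # Toy model of the authors' claim (all numbers illustrative, not from the
    # article): capacity doubles every two years, Moore's-law style, while
    # machine "intelligence" grows only logarithmically in that capacity.
    for year in range(10, 41, 10):
        capacity = 2 ** (year / 2)          # exponential growth in memory/processing power
        intelligence = math.log2(capacity)  # the logarithmic growth hypothesized above
        print(f"year {year}: capacity x{capacity:,.0f}, 'intelligence' x{intelligence:.0f}")
    ```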

  • The Singularity and the State of the Art in Artificial Intelligence: The technological singularity (Ubiquity symposium)

    The state of the art in automating basic cognitive tasks, including vision and natural language understanding, is far below human abilities. Real-world reasoning, which is an unavoidable part of many advanced forms of computer vision and natural language understanding, is particularly difficult---suggesting the advent of computers with superhuman general intelligence is not imminent. The possibility of attaining a singularity by computers that lack these abilities is discussed briefly.

  • Human Enhancement--The way ahead: The technological singularity (Ubiquity symposium)

    This paper takes a look at artificial intelligence and the ways it can be brought about, either by means of a computer or through biological growth. Ways of linking the two methods are also discussed, particularly the possibilities of linking human and artificial brains together. In this regard, practical experiments are referred to in which human enhancement can be achieved through linking with artificial intelligence.

  • The Future of Synchronization on Multicores: The multicore transformation (Ubiquity symposium)
    Synchronization bugs such as data races and deadlocks make every programmer cringe---traditional locks only provide a partial solution, while locks under high contention can easily degrade performance. Maurice Herlihy proposes replacing locks with transactions. He discusses adapting the well-established concept of database transactions to multicore systems and shared main memory.
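
    Below is a minimal sketch of the optimistic, transaction-style alternative to locking that Herlihy advocates (an illustrative toy, not his actual transactional memory design): read a version number, compute speculatively without holding a lock, then commit only if no other writer intervened, retrying otherwise.

    ```python
    import threading

    class VersionedCell:
        """Toy software-transaction cell: optimistic reads, validated commits."""
        def __init__(self, value=0):
            self.value = value
            self.version = 0
            self._commit_lock = threading.Lock()  # held only for the brief commit step

        def transact(self, update):
            while True:  # retry loop: the hallmark of optimistic transactions
                seen_version = self.version
                new_value = update(self.value)        # speculative work, no lock held
                with self._commit_lock:
                    if self.version == seen_version:  # validate: did anyone commit since we read?
                        self.value = new_value
                        self.version += 1
                        return new_value
                # conflict detected: another transaction committed first; retry

    cell = VersionedCell()
    threads = [threading.Thread(target=lambda: [cell.transact(lambda v: v + 1) for _ in range(1000)])
               for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(cell.value)  # 4000: every increment commits exactly once
    ```

    Unlike a conventional lock held for the entire update, the lock here protects only the validate-and-commit step; conflicting transactions abort and retry rather than block one another.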
  • The Multicore Transformation Closing Statement: The multicore transformation (Ubiquity symposium)
    Multicore CPUs and GPUs have brought parallel computation within reach of any programmer. How can we put the performance potential of these machines to good use? The contributors to the symposium suggest a number of approaches, among them algorithm engineering, parallel programming languages, compilers that target both SIMD and MIMD architectures, automatic detection and repair of data races, transactional memory, automated performance tuning, and automatic parallelizers. The transition from sequential to parallel computing is now perhaps at the halfway point. Parallel programming will eventually become routine, because advances in hardware, software, and programming tools are simplifying the problems of designing and implementing parallel computations.
  • The MOOC and the Genre Moment: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
    In order to determine (and shape) the long-term impact of MOOCs, we must consider not only cognitive and technological factors but also cultural ones, such as the goals of education and the cultural processes that mediate the diffusion of a new teaching modality. This paper examines the implicit cultural assumptions in the "MOOCs and Technology to Advance Learning and Learning Research Symposium" and proposes an alternative theory of diffusion to Clayton Christensen's disruptive innovation model as an illustration of the complexity that these assumptions hide.
  • Making Effective Use of Multicore Systems A software perspective: The multicore transformation (Ubiquity symposium)
    Multicore processors dominate the commercial marketplace, with the consequence that almost all computers are now parallel computers. To take maximum advantage of multicore chips, applications and systems must exploit that parallelism. As of today, only a small fraction of applications do. To improve that situation and to capitalize fully on the power of multicore systems, we need to adopt programming models, parallel algorithms, and programming languages that are appropriate for the multicore world, and to integrate these ideas and tools into the courses that educate the next generation of computer scientists.
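
    One concrete instance of such a programming model is simple data parallelism. Here is a minimal sketch using Python's standard-library process pool (the example problem and numbers are ours, not the article's): split a job into chunks and let one worker process per core handle each chunk.

    ```python
    from multiprocessing import Pool

    def count_primes(bounds):
        # Trial-division prime counting over a half-open range [lo, hi).
        lo, hi = bounds
        return sum(1 for n in range(max(lo, 2), hi)
                   if all(n % d for d in range(2, int(n ** 0.5) + 1)))

    if __name__ == "__main__":
        # Split one big range into chunks and farm them out across the cores.
        chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
        with Pool() as pool:  # one worker process per core by default
            total = sum(pool.map(count_primes, chunks))
        print(total)  # 9592 primes below 100,000
    ```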
  • MOOCs: Symptom, Not Cause of Disruption: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
    Is the MOOCs phenomenon a disruptive innovation or a transient bubble? It may be partly both. Broadcasting lectures and opening up courses via MOOCs by itself poses little change to the academic status quo. But academia is part of a broader academic-bureaucratic complex that provided a core framework for industrial-age institutions. The academic-bureaucratic complex rests on the premise that knowledge and talent must be scarce. Presumed scarcity justifies filtering access to information, to diplomas, and to jobs. But a wave of post-industrial technical, economic, and social innovations is making knowledge and talent rapidly more abundant and access more "open." This mega-trend is driving the academic-bureaucratic complex toward bankruptcy. It is being replaced by new, radically different arrangements of learning and work. The embrace of MOOCs is a symptom, not a cause of academia's obsolescence.
  • GPUs: High-performance Accelerators for Parallel Applications: The multicore transformation (Ubiquity symposium)
    Early graphical processing units (GPUs) were designed as high compute density, fixed-function processors ideally crafted to the needs of computer graphics workloads. Today, GPUs are becoming truly first-class computing elements on par with CPUs. Programming GPUs as self-sufficient general-purpose processors is not only hypothetically desirable, but feasible and efficient in practice, opening new opportunities for integration of GPUs in complex software systems.
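
    As a small illustration of the "GPU as a first-class computing element" idea, the sketch below offloads an element-wise computation and a reduction to the GPU. It assumes the third-party CuPy library and a CUDA-capable device, neither of which the article itself prescribes.

    ```python
    import numpy as np
    import cupy as cp  # assumption: CuPy installed, CUDA-capable GPU available

    x_cpu = np.random.rand(10_000_000).astype(np.float32)

    x_gpu = cp.asarray(x_cpu)            # copy the array into GPU memory
    y_gpu = cp.sqrt(x_gpu) * 2.0 + 1.0   # element-wise kernels run across the GPU's many lanes
    result = float(cp.sum(y_gpu))        # reduction on the GPU, scalar copied back to the host

    print(result)
    ```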
  • Data-driven Learner Modeling to Understand and Improve Online Learning: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
    Advanced educational technologies are developing rapidly and online MOOC courses are becoming more prevalent, creating enthusiasm for the seemingly limitless data-driven possibilities to effect advances in learning and enhance the learning experience. For these possibilities to unfold, the expertise and collaboration of many specialists will be necessary to improve data collection, to foster the development of better predictive models, and to assure models are interpretable and actionable. The big data collected from MOOCs needs to be bigger, not in its height (number of students) but in its width: more meta-data and information on learners' cognitive and self-regulatory states needs to be collected in addition to correctness and completion rates. This more detailed articulation will help open up the black-box approach to machine-learning models, where prediction is the primary goal. Instead, a data-driven learner-model approach uses fine-grained data that is conceived and developed from cognitive principles to build explanatory models with practical implications for improving student learning.
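
    The sketch below illustrates the kind of explanatory, cognitively grounded model the authors favor, in the spirit of logistic learning-curve models such as the Additive Factors Model. All parameter values are invented for illustration.

    ```python
    import math

    # Toy explanatory learner model (parameters made up for illustration): the
    # log-odds of answering correctly grow with the number of prior practice
    # opportunities on a skill, in the spirit of the Additive Factors Model.
    def p_correct(opportunities, difficulty=-1.0, learning_rate=0.35, proficiency=0.2):
        logit = proficiency + difficulty + learning_rate * opportunities
        return 1.0 / (1.0 + math.exp(-logit))

    for k in range(0, 9, 2):
        print(f"after {k} practice opportunities: P(correct) = {p_correct(k):.2f}")
    ```

    Because every parameter has a cognitive interpretation (skill difficulty, learning rate, student proficiency), such a model is actionable in a way a pure black-box predictor is not.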
  • The Multicore Transformation Opening Statement: The multicore transformation (Ubiquity symposium)
    Chips with multiple processors, called multicore chips, have caused a resurgence of interest in parallel computing. Multicores are now available in servers, PCs, laptops, embedded systems, and mobile devices. Because multiprocessors could be mass-produced for the same cost as uniprocessors, parallel programming is no longer reserved for a small elite of programmers such as operating system developers, database system designers, and supercomputer users. Thanks to multicore chips, everyone's computer is a parallel machine. Parallel computing has become ubiquitous. In this symposium, seven authors examine what it means for computing to enter the parallel age.
  • Offering Verified Credentials in Massive Open Online Courses: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
    Massive open online courses (MOOCs) enable the delivery of high-quality educational experiences to large groups of students. Coursera, one of the largest MOOC providers, developed a program to provide students with verified credentials as a record of their MOOC performance. Such credentials help students convey achievements in MOOCs to future employers and academic programs. This article outlines the process and biometrics Coursera uses to establish and verify student identity during a course. We additionally present data that suggest verified certificate programs help increase student success rates in courses.
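
    The article describes Coursera's own process; purely as a generic illustration of one biometric it alludes to, the sketch below checks a typing sample's keystroke rhythm against an enrolled profile. This is not Coursera's actual algorithm, and the threshold is invented.

    ```python
    import math

    # Generic keystroke-dynamics illustration (NOT Coursera's algorithm): a user
    # enrolls by typing a fixed phrase; later samples are accepted if their
    # inter-keystroke timing profile is close enough to the enrolled one.
    def timing_profile(keystroke_times):
        return [t2 - t1 for t1, t2 in zip(keystroke_times, keystroke_times[1:])]

    def matches(enrolled, sample, threshold=0.05):
        gaps_e, gaps_s = timing_profile(enrolled), timing_profile(sample)
        distance = math.sqrt(sum((a - b) ** 2 for a, b in zip(gaps_e, gaps_s)) / len(gaps_e))
        return distance <= threshold  # small RMS timing difference: likely the same typist

    enrolled = [0.00, 0.12, 0.31, 0.45, 0.71]  # seconds at which each key was pressed
    attempt  = [0.00, 0.13, 0.29, 0.47, 0.70]
    print(matches(enrolled, attempt))           # True: the rhythms are nearly identical
    ```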
  • Assessment in Digital At-scale Learning Environments: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
    Assessment in traditional courses has been limited to either instructor grading or problems that lend themselves well to relatively simple automation, such as multiple-choice bubble exams. Progress in educational technology, combined with economies of scale, allows us to radically increase both the depth and the accuracy of our measurements of what students learn. Increasingly, we can give rapid, individualized feedback for a wide range of problems, including engineering design problems and free-form text answers, as well as provide rich analytics that can be used to improve both teaching and learning. Data science and the integration of data from disparate sources allow for increasingly inexpensive and accurate micro-assessments, such as those of open-ended textual responses, as well as estimation of higher-level skills that lead to long-term student success.
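
    As a deliberately crude illustration of automated free-text micro-assessment (far simpler than the NLP pipelines the article envisions), the sketch below scores a student answer by the cosine similarity of its word counts to a reference answer.

    ```python
    from collections import Counter
    import math

    # Toy short-answer scorer: cosine similarity between bag-of-words vectors.
    def bag_of_words(text):
        return Counter(text.lower().split())

    def cosine(a, b):
        dot = sum(a[w] * b[w] for w in a)
        norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    reference = bag_of_words("the force on the beam is proportional to the applied load")
    student   = bag_of_words("the beam force is proportional to the load applied")
    print(f"similarity score: {cosine(reference, student):.2f}")  # high overlap, high score
    ```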
  • Ubiquity symposium: The science in computer science: natural computation
    In this twelfth piece of the Ubiquity symposium discussing science in computer science, Erol Gelenbe reviews computation in natural systems, focusing mainly on biology and citing examples of the computation that is inherent in chemistry, natural selection, gene regulatory networks, and neuronal systems. This article originally appeared as part of the "What is Computation" symposium.
  • Ubiquity symposium: Evolutionary computation and the processes of life: some computational aspects of essential properties of evolution and life

    While evolution has inspired algorithmic methods of heuristic optimization, little has been done in the way of using concepts of computation to advance our understanding of salient aspects of biological phenomena. The authors argue that, under reasonable assumptions, interesting conclusions can be drawn that are of relevance to behavioral evolution. The authors will focus on two important features of life---robustness and fitness---which, they will argue, are related to algorithmic probability and to the thermodynamics of computation, disciplines that may be capable of modeling key features of living organisms, and which can be used in formulating new algorithms of evolutionary computation.

  • Ubiquity symposium: The science in computer science: how to talk about science: five essential insights

    The goal of the LabRats Science Education Program is to inspire secondary school-age students from all backgrounds to love learning about science and technology. Shawn Carlson, the Executive Director of LabRats, presents five key insights that can be integrated into any science and technology program, with the purpose of overhauling students' attitudes and motivation to learn. Carlson also offers detailed suggestions on how educators can use these insights to inspire their students to become lifelong learners of science and technology.

  • Ubiquity symposium: The science in computer science: the sixteen character traits of science

    Phil Yaffe has provided numerous commentaries on various aspects of professional communication, which have helped readers more effectively articulate their own ideas about the future of computing. Here he tells us about how scientists see the world---the "scientific approach," he calls it---because he thinks many non-scientists see the world in a similar way. This realization can lower barriers of communication with scientists.

  • Ubiquity symposium: The science in computer science: broadening CS enrollments: an interview with Jan Cuny

    Until 2000, computer science enrollments were steadily increasing. Then suddenly students started turning to other fields; by 2008, enrollments had dropped by 50 percent. In response, Jan Cuny has been leading a program at the National Science Foundation to increase both the number and diversity of students in computing. In this interview with Ubiquity, she discusses the magnitude of the problem and the initiatives underway to turn it around.

  • Ubiquity symposium: The science in computer science: computer science revisited

    The first article in this symposium, which originally appeared in Communications of the ACM, is courtesy of ACM President Vinton Cerf. Earlier this year, he called on all ACM members to commit to building a stronger science base for computer science. Cerf cites numerous open questions, mostly in software development, that cry out for experimental studies.

  • Ubiquity symposium: The science in computer science: opening statement

    The recent interest in encouraging more middle and high school students to prepare for careers in science, technology, engineering, or mathematics (STEM) has rekindled the old debate about whether computer science is really science. It matters today because computing is such a central field, impacting so many other fields, and yet it is often excluded from high school curricula because it is not seen as a science. In this symposium, fifteen authors examine different aspects from what is science, to natural information processes, to new science-enabled approaches in STEM education.

  • Ubiquity symposium: Evolutionary computation and the processes of life: the emperor is naked: evolutionary algorithms for real-world applications

    During the past 35 years the evolutionary computation research community has been studying properties of evolutionary algorithms. Many claims have been made---these varied from a promise of developing an automatic programming methodology to solving virtually any optimization problem (as some evolutionary algorithms are problem independent). However, the most important claim was related to the applicability of evolutionary algorithms to solving very complex business problems, i.e., problems where other techniques have failed. So it might be worthwhile to revisit this claim and to search for evolutionary algorithm-based software applications that have been accepted by businesses and industries. In this article Zbigniew Michalewicz attempts to identify reasons for the mismatch between the efforts of hundreds of researchers who make substantial contributions to the field of evolutionary computation and the number of real-world applications that are based on concepts of evolutionary algorithms.

  • Ubiquity symposium: Evolutionary computation and the processes of life: the essence of evolutionary computation

    In this third article in the ACM Ubiquity symposium on evolutionary computation, Xin Yao provides a deeper understanding of evolutionary algorithms in the context of classical computational paradigms. This article discusses some of the most important issues in evolutionary computation. Three major areas are identified. The first is the theoretical foundation of evolutionary computation, especially computational time-complexity analysis. The second is algorithm design, especially hybridization, memetic algorithms, algorithm portfolios, and ensembles of algorithms. The third is co-evolution, which seems to be understudied in both theory and practice. The primary aim of this article is to stimulate further discussion, rather than to offer any solutions.

  • Ubiquity symposium: Evolutionary computation and the processes of life: opening statement

    Evolution is one of the indispensable processes of life. After biologists found the basic laws of evolution, computer scientists began simulating evolutionary processes and using operations discovered in nature for solving problems with computers. As a result, they brought forth evolutionary computation, inventing different kinds of operations and procedures, such as genetic algorithms and genetic programming, which imitate natural biological processes. Thus, the main goal of our symposium is to explore the essence and characteristic properties of evolutionary computation in the context of life and computation.
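
    A minimal genetic algorithm of the kind the symposium examines, shown here as an illustrative sketch (the classic "OneMax" toy problem, not an example drawn from the symposium): a population of bit strings evolves toward maximal fitness through selection, crossover, and mutation.

    ```python
    import random

    # Minimal genetic algorithm on the OneMax toy problem: fitness is simply
    # the number of 1-bits, so the optimum is the all-ones string.
    random.seed(0)
    GENES, POP, GENERATIONS = 32, 40, 60
    fitness = lambda bits: sum(bits)

    population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for gen in range(GENERATIONS):
        # Selection: keep the fitter half of the population as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[:POP // 2]
        # Crossover + mutation: refill the population from random parent pairs.
        children = []
        while len(children) < POP - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENES)
            child = a[:cut] + b[cut:]          # one-point crossover
            for i in range(GENES):
                if random.random() < 0.01:     # low-rate point mutation
                    child[i] ^= 1
            children.append(child)
        population = parents + children

    print(fitness(max(population, key=fitness)), "of", GENES)  # approaches 32 over the generations
    ```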

  • Ubiquity symposium: What have we said about computation?: closing statement

    The "computation" symposium presents the reflections of thinkers from many sectors of computing on the fundamental question in the background of everything we do as computing professionals. While many of us have too many immediate tasks to allow us time for our own deep reflection, we do appreciate when others have done this for us. Peter Freeman points out, by analogy, that as citizens of democracies we do not spend a lot of time reflecting on the question, "What is a democracy," but from time to time we find it helpful to see what philosophers and political scientists are saying about the context in which we act as citizens.

  • Ubiquity symposium: What is information?: beyond the jungle of information theories

    Editor's Introduction: This fourteenth piece is inspired by a question left over from the Ubiquity symposium entitled "What is Computation?"
    --Peter J. Denning, Editor

    Computing emerged as a branch of mathematics in the 1940s, and has progressively revealed ever new aspects [gol97]. Nowadays even laymen have become aware of the broad assortment of functions achieved by computing systems, and the prismatic nature of computing challenges thinkers who explore the various topics that substantiate computer science [mul98].

  • Ubiquity symposium: Biological Computation

    In this thirteenth piece of the Ubiquity symposium discussing "What is computation?" Melanie Mitchell discusses the idea that biological computation is a process that occurs in nature, not merely in computer simulations of nature.
    --Editor

  • Ubiquity symposium: Natural Computation

    In this twelfth piece of the Ubiquity symposium discussing "What is computation?" Erol Gelenbe reviews computation in natural systems, focusing mainly on biology and citing examples of the computation that is inherent in chemistry, natural selection, gene regulatory networks, and neuronal systems.
    --Editor

  • Ubiquity symposium: Computation, Uncertainty and Risk

    In this eleventh piece of the Ubiquity symposium discussing "What is computation?" Jeffrey P. Buzen develops a new computational model for representing computations that arise when deterministic algorithms process workloads whose detailed structure is uncertain.
    --Editor
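
    As a generic illustration of the issue Buzen addresses (not a reproduction of his model): a fully deterministic algorithm, fed workloads whose detailed structure is uncertain, yields a distribution of behaviors, and sampling many workloads estimates that distribution.

    ```python
    import random
    import statistics

    def probes_until_found(workload, target):
        # Deterministic linear search: its cost depends only on the workload given.
        for i, item in enumerate(workload, start=1):
            if item == target:
                return i
        return len(workload)

    random.seed(1)
    # The workload's detailed structure is uncertain: the target sits at a random
    # position each time. Sampling many workloads estimates the cost distribution.
    costs = []
    for _ in range(10_000):
        workload = random.sample(range(100), 100)  # a random permutation of 0..99
        costs.append(probes_until_found(workload, target=0))

    print(f"mean probes {statistics.mean(costs):.1f}, "
          f"95th percentile {sorted(costs)[int(0.95 * len(costs))]}")
    ```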

  • Ubiquity symposium 'What is computation?': Computation is process

    Various authors define forms of computation as specialized types of processes. As the scope of computation widens, the range of such specialties increases. Dennis J. Frailey posits that the essence of computation can be found in any form of process, hence the title and the thesis of this paper in the Ubiquity symposium discussing what is computation. --Editor

  • Ubiquity symposium 'What is computation?': Opening statement

    Most people understand a computation as a process evoked when a computational agent acts on its inputs under the control of an algorithm. The classical Turing machine model has long served as the fundamental reference model because an appropriate Turing machine can simulate every other computational model known. The Turing model is a good abstraction for most digital computers because the number of steps to execute a Turing machine algorithm is predictive of the running time of the computation on a digital computer. However, the Turing model is not as well matched for the natural, interactive, and continuous information processes frequently encountered today. Other models whose structures more closely match the information processes involved give better predictions of running time and space. Models based on transforming representations may be useful.
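
    A tiny simulator makes the reference model concrete. The machine below is a toy of our own (it flips every bit on the tape), and its step count is exactly the "running time" the Turing model predicts.

    ```python
    # Minimal Turing machine simulator: a rule table maps (state, read symbol)
    # to (write symbol, head move, next state); execution counts its own steps.
    def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
        tape, head, steps = dict(enumerate(tape)), 0, 0
        while state != "halt" and steps < max_steps:
            symbol = tape.get(head, blank)
            write, move, state = rules[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
            steps += 1
        return "".join(tape[i] for i in sorted(tape)), steps

    flip = {  # (state, read) -> (write, move, next_state): invert every bit
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
        ("start", "_"): ("_", "R", "halt"),
    }
    print(run_turing_machine(flip, "10110"))  # ('01001_', 6): final tape and step count
    ```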

  • Ubiquity symposium 'What is computation?': Computation is symbol manipulation

    In the second article in the series for the Ubiquity symposium "What is Computation?", Prof. John S. Conery of the University of Oregon explains why he believes computation can be seen as symbol manipulation. For more articles in this series, see the table of contents in the Editor's Introduction to the symposium (http://ubiquity.acm.org/article.cfm?id=1870596). --Editor