A Ubiquity symposium is an organized debate around a proposition or point of view. It is a means to explore a complex issue from multiple perspectives. An early example of a symposium on teaching computer science appeared in Communications of the ACM (December 1989).
To organize a symposium, please read our guidelines.
Ubiquity Symposium: Workings of Science
Table of Contents
- Ubiquity: Editors Weigh In On The Workings Of Science by Peter Denning
- The Paradoxical Faces of Science by Peter Denning
- Can Mankind Survive Scientific Illiteracy? by Philip Yaffe
- Character Traits of Science by Peter Denning and Philip Yaffe
- Is Science Limited to Science? by Philip Yaffe
- AI in 2156: The Science of Intelligence by Kemal A. Delic and Jeff A. Riley
- Is Engineering Applied Science? by Sharad Sinha
- Trust in Science and Mathematics by Jeffrey Johnson and Andrew Odlyzko
- Debunked Software Theories by Walter Tichy
- How Software Engineering Research Became Empirical by Walter Tichy
Symposia
-
Closing Statement: Reflections on a singularity symposium: The technological singularity (Ubiquity symposium)
by Espen Andersen
December 2014
The debate about computers and intelligence must go on - we have more to learn, and more people need to convert their strong opinions to measured arguments. There is no reason to panic, however.
-
What About an Unintelligent Singularity?: The technological singularity (Ubiquity symposium)
by Peter J. Denning
December 2014
For years we humans have worried about plagues, asteroids, earthquakes, eruptions, fires, floods, famines, wars, genocides, and other uncontrollable events that could wipe away our civilization. In the modern age, with so much depending on computing and communications, we have added computers to our list of potential threats. Could we perish from the increasing intelligence of computers? Denning thinks that is less of a threat than the apparently mundane march of automated bureaucracies. He also asserts that none of the possible negative outcomes is a foregone conclusion because humans teaming with machines are far more intelligent than either one alone.
-
Computers versus Humanity: Do we compete?: The technological singularity (Ubiquity symposium)
by Liah Greenfeld, Mark Simes
November 2014
Liah Greenfeld and Mark Simes have long worked together, integrating the perspectives of two very different disciplinary traditions: cultural history/historical sociology and human neuroscience. The combination of their areas of expertise in the empirical investigation of mental disorders, which severely affect intelligence---among other things---has led them to certain conclusions that may throw a special light on the question of this symposium: Will computers outcompete us all?
-
Exponential Technology and The Singularity: The technological singularity (Ubiquity symposium)
by Peter Cochrane
November 2014
The Priesthood of the Singularity posits a fast-approaching prospect of machines overtaking human abilities (Ray Kurzweil's The Singularity is Near, Viking Press, 2006) on the basis of the exponential rate of electronic integration---memory and processing power. In fact, they directly correlate the growth of computing technology with that of machine intelligence as if the two were connected in some simple-to-understand and predictable way. Here we present a different view based upon the fundamentals of intelligence and a more likely relationship. We conclude that machine intelligence is growing in a logarithmic (or at best linear) fashion rather than at the assumed exponential rate.
-
Human Enhancement--The way ahead: The technological singularity (Ubiquity symposium)
by Kevin Warwick
October 2014
In this paper a look is taken at artificial intelligence and the ways it can be brought about, either by means of a computer or through biological growth. Ways of linking the two methods are also discussed, particularly the possibilities of linking human and artificial brains together. In this regard practical experiments are referred to in which human enhancement can be achieved through linking with artificial intelligence.
-
The Singularity and the State of the Art in Artificial Intelligence: The technological singularity (Ubiquity symposium)
by Ernest Davis
October 2014
The state of the art in automating basic cognitive tasks, including vision and natural language understanding, is far below human abilities. Real-world reasoning, which is an unavoidable part of many advanced forms of computer vision and natural language understanding, is particularly difficult---suggesting the advent of computers with superhuman general intelligence is not imminent. The possibility of attaining a singularity by computers that lack these abilities is discussed briefly.
-
Opening Statement: Will computers out-compete us all?: The technological singularity (Ubiquity symposium)
by Espen Andersen
October 2014
To jumpstart this symposium, Espen Andersen describes the debate surrounding "technological singularity" and questions whether this is something that will happen---and if so, what the consequences might be.
-
The Future of Synchronization on Multicores: The multicore transformation (Ubiquity symposium)
by Maurice Herlihy
September 2014
Synchronization bugs such as data races and deadlocks make every programmer cringe; traditional locks only provide a partial solution, while high-contention locks can easily degrade performance. Maurice Herlihy proposes replacing locks with transactions. He discusses adapting the well-established concept of database transactions to multicore systems and shared main memory.
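(The sketch below is not from Herlihy's article; it is a minimal plain-Java illustration, with hypothetical class names LockedCounter and OptimisticCounter, of the difference between pessimistic locking and the optimistic read-then-commit style of update that transactional memory generalizes to whole blocks of code.)

```java
import java.util.concurrent.atomic.AtomicLong;

// Illustrative sketch only: contrasts pessimistic locking with the optimistic,
// speculative style of update that transactional memory generalizes.
public class CounterStyles {

    // Pessimistic: every caller serializes on one lock, even when there is no conflict.
    static class LockedCounter {
        private long value = 0;
        synchronized void increment() { value++; }
        synchronized long get() { return value; }
    }

    // Optimistic: read a snapshot, compute, then attempt to commit; retry only if
    // another thread changed the value in the meantime (a conflict), much as a
    // transactional-memory system re-executes an aborted transaction.
    static class OptimisticCounter {
        private final AtomicLong value = new AtomicLong();
        void increment() {
            long seen;
            do {
                seen = value.get();                             // read a snapshot
            } while (!value.compareAndSet(seen, seen + 1));     // commit, or retry on conflict
        }
        long get() { return value.get(); }
    }

    public static void main(String[] args) throws InterruptedException {
        OptimisticCounter c = new OptimisticCounter();
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.length; i++) {
            threads[i] = new Thread(() -> { for (int j = 0; j < 100_000; j++) c.increment(); });
            threads[i].start();
        }
        for (Thread t : threads) t.join();
        System.out.println(c.get()); // prints 400000, with no locks held
    }
}
```

The optimistic version pays a cost only when another thread actually interfered; automating that behavior for arbitrary shared-memory updates is the appeal of the transactional approach discussed in the article.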
-
The MOOC and the Genre Moment: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Michael Feldstein
September 2014
In order to determine (and shape) the long-term impact of MOOCs, we must consider not only cognitive and technological factors but also cultural ones, such as the goals of education and the cultural processes that mediate the diffusion of a new teaching modality. This paper examines the implicit cultural assumptions in the "MOOCs and Technology to Advance Learning and Learning Research Symposium" and proposes an alternative theory of diffusion to Clayton Christensen's disruptive innovation model as an illustration of the complexity that these assumptions hide.
-
The Multicore Transformation Closing Statement: The multicore transformation (Ubiquity symposium)
by Walter Tichy
September 2014
Multicore CPUs and GPUs have brought parallel computation within reach of any programmer. How can we put the performance potential of these machines to good use? The contributors of the symposium suggest a number of approaches, among them algorithm engineering, parallel programming languages, compilers that target both SIMD and MIMD architectures, automatic detection and repair of data races, transactional memory, automated performance tuning, and automatic parallelizers. The transition from sequential to parallel computing is now perhaps at the half-way point. Parallel programming will eventually become routine, because advances in hardware, software, and programming tools are simplifying the problems of designing and implementing parallel computations.
-
Making Effective Use of Multicore Systems: A software perspective: The multicore transformation (Ubiquity symposium)
by Keith D. Cooper
September 2014
Multicore processors dominate the commercial marketplace, with the consequence that almost all computers are now parallel computers. To take maximum advantage of multicore chips, applications and systems must exploit that parallelism. As of today, only a small fraction of applications do. To improve that situation and to capitalize fully on the power of multicore systems, we need to adopt programming models, parallel algorithms, and programming languages that are appropriate for the multicore world, and to integrate these ideas and tools into the courses that educate the next generation of computer scientists.
-
GPUs: High-performance Accelerators for Parallel Applications: The multicore transformation (Ubiquity symposium)
by Mark Silberstein
August 2014
Early graphics processing units (GPUs) were designed as high compute density, fixed-function processors ideally crafted to the needs of computer graphics workloads. Today, GPUs are becoming truly first-class computing elements on par with CPUs. Programming GPUs as self-sufficient general-purpose processors is not only hypothetically desirable, but feasible and efficient in practice, opening new opportunities for integration of GPUs in complex software systems.
-
Multicore Processors and Database Systems: The multicore transformation (Ubiquity symposium)
by Kenneth A. Ross
August 2014
Database management systems are necessary for transaction processing and query processing. Today, parallel database systems can be run on multicore platforms. Presented within is an overview of how multicore machines have impacted the design and implementation of database management systems.
-
The MOOC Spring: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Fred Siff
August 2014
Fred Siff warns us that online learning, and in particular MOOCs, is threatening to overrun not just old models of instruction but the very nature of higher education institutions themselves.
-
MOOCs: Symptom, Not Cause of Disruption: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Lewis J. Perelman
August 2014
Is the MOOC phenomenon a disruptive innovation or a transient bubble? It may be partly both. Broadcasting lectures and opening up courses via MOOCs by itself poses little change to the academic status quo. But academia is part of a broader academic-bureaucratic complex that provided a core framework for industrial-age institutions. The academic-bureaucratic complex rests on the premise that knowledge and talent must be scarce. Presumed scarcity justifies filtering access to information, to diplomas, and to jobs. But a wave of post-industrial technical, economic, and social innovations is making knowledge and talent rapidly more abundant and access more "open." This mega-trend is driving the academic-bureaucratic complex toward bankruptcy. It is being replaced by new, radically different arrangements of learning and work. The embrace of MOOCs is a symptom, not a cause of academia's obsolescence.
-
Engineering Parallel Algorithms: The multicore transformation (Ubiquity symposium)
by Peter Sanders
July 2014
In the past, parallel processing was a specialized approach to high-performance computing. Today, we have to rethink the computational cores of algorithmic and data-structure applications. In this article we discuss how this process of rethinking can be understood using algorithm engineering.
-
Limitations of MOOCs for Computing Education: Addressing our needs: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Mark Guzdial
July 2014
Computing education faces some significant challenges today. We aren't diverse enough, and we need to be able to develop more teachers. Despite popular opinion, the current generations of MOOCs don't meet those needs.
-
Can MOOCs Help Reduce College Tuition?: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Stephen Ruth
July 2014
This article will briefly describe some of the cost issues associated with MOOCs and suggest a perspective through which drastic tuition savings might someday be achieved, possibly through the assistance of MOOCs.
-
MOOCs on and off the Farm: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by John C. Mitchell
June 2014
Whether MOOCs can provide a good education and broaden educational opportunities at lower cost is an ongoing discussion. In this article, Stanford professor John C. Mitchell reflects on Stanford University's pioneering role in the MOOC movement, explains how to harness the power of digital technology, and offers predictions for the academic landscape.
-
The Science of Computer Science: Closing Statement: The Science of Computer Science (Ubiquity Symposium)
by Richard Snodgrass, Peter Denning
June 2014
Where does computer science as an intellectual discipline fit in human discourse? Over a dozen contributors have looked at this question of identity from as many viewpoints. In this closing statement, we emphasize six themes running through these 16 commentaries and draw some conclusions that seem to be supported by the symposium.
-
Auto-tuning parallel software: an interview with Thomas Fahringer: the multicore transformation (Ubiquity symposium)
by Walter Tichy
June 2014
In this interview conducted by Ubiquity editor Walter Tichy, Prof. Thomas Fahringer of the Institute of Computer Science, University of Innsbruck (Austria) discusses the difficulty in predicting the performance of parallel programs, and the subsequent popularity of auto-tuning to automate program optimization.
-
Waiting for Godot? The right language abstractions for parallel programming should be here soon: the multicore transformation (Ubiquity symposium)
by Todd Mytkowicz, Wolfram Schulte
June 2014
As a discipline, we have been discussing parallel programming for years. After all these years, do we know the right language abstractions for parallel programming? Would we recognize the right abstractions if we were to see them? In this article, Todd Mytkowicz and Wolfram Schulte, both from Microsoft Research, ask: Have we been simply biding our time, waiting for our Godot?
-
Curricular Technology Transfer for the 21st Century: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Armando Fox
June 2014
Is the MOOC honeymoon winding down? With many university faculty opposing the MOOC movement, the author argues that taking the best of massive online courses (access to high-quality materials and rapid feedback to students) to implement SPOCs (small private online courses) will provide more effective leverage of instructors' time and resources.
-
Data-driven Learner Modeling to Understand and Improve Online Learning: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Kenneth R. Koedinger, Elizabeth A. McLaughlin, John C. Stamper
May 2014
Advanced educational technologies are developing rapidly and online MOOC courses are becoming more prevalent, creating enthusiasm for the seemingly limitless data-driven possibilities to effect advances in learning and enhance the learning experience. For these possibilities to unfold, the expertise and collaboration of many specialists will be necessary to improve data collection, to foster the development of better predictive models, and to assure models are interpretable and actionable. The big data collected from MOOCs needs to be bigger, not in its height (number of students) but in its width: more meta-data and information on learners' cognitive and self-regulatory states need to be collected in addition to correctness and completion rates. This more detailed articulation will help open up the black-box approach to machine-learning models, where prediction is the primary goal. Instead, a data-driven learner-model approach uses fine-grained data that is conceived and developed from cognitive principles to build explanatory models with practical implications for improving student learning.
-
The Multicore Transformation Opening Statement: The multicore transformation (Ubiquity symposium)
by Walter Tichy
May 2014
Chips with multiple processors, called multicore chips, have caused a resurgence of interest in parallel computing. Multicores are now available in servers, PCs, laptops, embedded systems, and mobile devices. Because multiprocessors could be mass-produced for the same cost as uniprocessors, parallel programming is no longer reserved for a small elite of programmers such as operating system developers, database system designers, and supercomputer users. Thanks to multicore chips, everyone's computer is a parallel machine. Parallel computing has become ubiquitous. In this symposium, seven authors examine what it means for computing to enter the parallel age.
-
Offering Verified Credentials in Massive Open Online Courses: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Andrew Maas, Chris Heather, Chuong (Tom) Do, Relly Brandman, Daphne Koller, Andrew Ng
May 2014
Massive open online courses (MOOCs) enable the delivery of high-quality educational experiences to large groups of students. Coursera, one of the largest MOOC providers, developed a program to provide students with verified credentials as a record of their MOOC performance. Such credentials help students convey achievements in MOOCs to future employers and academic programs. This article outlines the process and biometrics Coursera uses to establish and verify student identity during a course. We additionally present data that suggest verified certificate programs help increase student success rates in courses.
-
MOOCs and Technology to Advance Learning and Learning Research Opening Statement: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Candace Thille
April 2014
MOOCs have fueled both hope and anxiety about the future of higher education. Our objective in this symposium is to surface and explore some of the open questions which have arisen in the MOOC debates. In this symposium, ten authors examine different aspects of MOOCs and technology to advance learning and learning research.
-
Assessment in Digital At-scale Learning Environments: MOOCs and technology to advance learning and learning research (Ubiquity symposium)
by Piotr Mitros, Anant Agarwal, Vik Paruchuri
April 2014
Assessment in traditional courses has been limited to either instructor grading or problems that lend themselves well to relatively simple automation, such as multiple-choice bubble exams. Progress in educational technology, combined with economies of scale, allows us to radically increase both the depth and the accuracy of our measurements of what students learn. Increasingly, we can give rapid, individualized feedback for a wide range of problems, including engineering design problems and free-form text answers, as well as provide rich analytics that can be used to improve both teaching and learning. Data science and the integration of data from disparate sources allow for increasingly inexpensive and accurate micro-assessments, such as those of open-ended textual responses, as well as estimation of higher-level skills that lead to long-term student success.
-
Ubiquity symposium: The science in computer science: the computing sciences and STEM education
by Paul S. Rosenbloom
March 2014
In this latest installment of "The Science in Computer Science," Prof. Paul Rosenbloom continues the discussion on whether or not computer science can be considered a "natural science." He argues that not only is computing the basis for a true science, it is in fact an entire scientific domain.
-
Ubiquity symposium: The science in computer science: unplugging computer science to find the science
by Tim Bell
March 2014
The Computer Science Unplugged project provides activities that enable students to engage with concepts from computer science without having to program. Many of the activities provide the basis of a scientific exploration of computer science, and thus help students to see the relationship of the discipline with science.
-
Where's the science in software engineering?: Ubiquity Symposium: The science in computer science
by Walter F. Tichy
March 2014
This article is a personal account of the methodological evolution of software engineering research from the 1970s to the present.
-
Ubiquity symposium: The science in computer science: natural computation
by Erol Gelenbe
February 2014
In this twelfth piece of the Ubiquity symposium discussing science in computer science, Erol Gelenbe reviews computation in natural systems, focusing mainly on biology and citing examples of the computation that is inherent in chemistry, natural selection, gene regulatory networks, and neuronal systems. This article originally appeared as part of the "What is Computation" symposium.
-
Empirical software research: an interview with Dag Sjøberg, University of Oslo, Norway
by Walter Tichy
December 2013
-
Ubiquity symposium: Evolutionary computation and the processes of life: what the no free lunch theorems really mean: how to improve search algorithms
by David H. Wolpert
December 2013
-
Ubiquity symposium: Evolutionary computation and the processes of life: towards synthesis of computational life-like processes of functional and evolvable proto-systems via extending evolutionary computation
by Darko Roglic
December 2013
-
Ubiquity symposium: Evolutionary computation and the processes of life: perspectives and reality of evolutionary computation: closing statement
by Mark Burgin, Eugene Eberbach
December 2013
-
Ubiquity symposium: The science in computer science: On experimental algorithmics: an interview with Catherine McGeoch and Bernard Moret
by Richard T. Snodgrass
December 2013
-
Ubiquity symposium: The science in computer science: why you should choose math in high school
by Espen Andersen
May 2013
-
Ubiquity symposium: Evolutionary computation and the processes of life: information, biological, and evolutionary computing
by Walter Riofrio
May 2013
-
Ubiquity symposium: Evolutionary computation and the processes of life: some computational aspects of essential properties of evolution and life
by Hector Zenil, James A. R. Marshall
April 2013
While evolution has inspired algorithmic methods of heuristic optimization, little has been done in the way of using concepts of computation to advance our understanding of salient aspects of biological phenomena. The authors argue that, under reasonable assumptions, interesting conclusions can be drawn that are of relevance to behavioral evolution. The authors will focus on two important features of life---robustness and fitness---which, they will argue, are related to algorithmic probability and to the thermodynamics of computation, disciplines that may be capable of modeling key features of living organisms, and which can be used in formulating new algorithms of evolutionary computation.
-
Ubiquity symposium: The science in computer science: how to talk about science: five essential insights
by Shawn Carlson
March 2013
The goal of the LabRats Science Education Program is to inspire secondary school-age students from all backgrounds to love learning about science and technology. Shawn Carlson, the Executive Director of LabRats, presents five key insights that can be integrated into any science and technology program, the purpose of which is to overhaul students' attitudes and motivation to learn. Carlson also offers detailed suggestions on how educators can use these insights to inspire their students to become lifelong learners of science and technology.
-
Ubiquity symposium: The science in computer science: the sixteen character traits of science
by Philip Yaffe
March 2013
Phil Yaffe has provided numerous commentaries on various aspects of professional communication, which have helped readers more effectively articulate their own ideas about the future of computing. Here he tells us about how scientists see the world---the "scientific approach," he calls it---because he thinks many non-scientists see the world in a similar way. This realization can lower barriers of communication with scientists.
-
Ubiquity symposium: The science in computer science: broadening CS enrollments: an interview with Jan Cuny
by Richard Snodgrass
February 2013
Until 2000, computer science enrollments were steadily increasing. Then suddenly students started turning to other fields; by 2008, enrollments had dropped by 50 percent. In response, Jan Cuny has been leading a program at the National Science Foundation to increase both the number and diversity of students in computing. In this interview with Ubiquity, she discusses the magnitude of the problem and the initiatives underway to turn it around.
-
Ubiquity symposium: Evolutionary computation and the processes of life: evolutionary computation in physical world
by Lukáš Sekanina
February 2013
In this ninth symposium article, Lukáš Sekanina addresses evolutionary and evolvable hardware, answering the questions of what it means for a physical system to be designed evolutionarily and what kinds of computations such physical systems perform.
-
Ubiquity symposium: The science in computer science: performance analysis: experimental computer science at its best
by Peter J. Denning
January 2013
In the third installment of this symposium, which originally appeared in Communications of the ACM, we go back to 1981. More than 30 years ago, I called on ACM members to employ more experimental methods and avoid confusing hacking (tinkering) with true science, joining a long tradition of ACM Presidents speaking up about computing as science.
-
Ubiquity symposium: Evolutionary computation and the processes of life: evolutionary computation and evolutionary game theory: expecting the unexpected
by David B. Fogel
January 2013
In this article, David Fogel discusses the relationship between evolutionary computation and evolutionary game theory. The mathematics of evolutionary game theory relies on assumptions that often fail to describe the real-world conditions that the theory is intended to model. This article highlights those assumptions and suggests evolutionary computation may ultimately serve as a more useful approach to understanding complex adaptive systems in nature.
-
Ubiquity symposium: The science in computer science: computer science revisited
by Vinton G. Cerf
December 2012
The first article in this symposium, which originally appeared in Communications of the ACM, is courtesy of ACM President Vinton Cerf. Earlier this year, he called on all ACM members to commit to building a stronger science base for computer science. Cerf cites numerous open questions, mostly in software development, that cry out for experimental studies.
-
Ubiquity symposium: The science in computer science: opening statement
by Peter Denning
December 2012
The recent interest in encouraging more middle and high school students to prepare for careers in science, technology, engineering, or mathematics (STEM) has rekindled the old debate about whether computer science is really science. It matters today because computing is such a central field, impacting so many other fields, and yet it is often excluded from high school curricula because it is not seen as a science. In this symposium, fifteen authors examine different aspects, from what science is, to natural information processes, to new science-enabled approaches in STEM education.
-
Ubiquity symposium: Evolutionary computation and the processes of life: Darwinian software engineering: the short term, the middle ground, and the long haul
by Moshe Sipper
December 2012
In this article, Moshe Sipper discusses a foreseeable future in which an entirely new paradigm of producing software will emerge. Sipper calls this software engineering revolution "Darwinian Software Engineering"---a time when it will be possible to program computers by means of evolution.
-
Ubiquity symposium: Evolutionary computation and the processes of life: evolutionary computation as a direction in nature-inspired computing
by Hongwei Mo
November 2012
In this article, evolutionary computation (EC) is considered as a kind of nature-inspired computing (NIC) paradigm. EC has had a great effect not only on the development of computing methods, from structure to process, but also on many aspects of our society as a ubiquitous or general form of computational thinking. EC remains one of the best choices for problem solving when people face more and more complex problems.
-
Ubiquity symposium: Evolutionary computation and the processes of life: the emperor is naked: evolutionary algorithms for real-world applications
by Zbigniew Michalewicz
November 2012
During the past 35 years the evolutionary computation research community has been studying properties of evolutionary algorithms. Many claims have been made---these varied from a promise of developing an automatic programming methodology to solving virtually any optimization problem (as some evolutionary algorithms are problem independent). However, the most important claim was related to the applicability of evolutionary algorithms to solving very complex business problems, i.e., problems where other techniques failed. So it might be worthwhile to revisit this claim and to search for evolutionary algorithm-based software applications that were accepted by businesses and industries. In this article Zbigniew Michalewicz attempts to identify reasons for the mismatch between the efforts of hundreds of researchers who make substantial contributions to the field of evolutionary computation and the number of real-world applications that are based on concepts of evolutionary algorithms.
-
Ubiquity symposium: Evolutionary computation and the processes of life: on the role of evolutionary models in computing
by Max H. Garzon
November 2012
In this article in the ACM Ubiquity symposium on evolutionary computation Max H. Garzon presents reflections on the connections between evolutionary computation, natural computation, and current definitions of computer science. The primary aim and result is a more coherent, comprehensive and modern definition of computer science.
-
Ubiquity symposium: Evolutionary computation and the processes of life: life lessons taught by simulated evolution
by Hans-Paul Schwefel
September 2012
In this second article in the ACM Ubiquity symposium on evolutionary computation Hans-Paul Schwefel explores the effects of simulating evolutionary mechanisms.
-
Ubiquity symposium: Evolutionary computation and the processes of life: the essence of evolutionary computation
by Xin Yao
September 2012
In this third article in the ACM Ubiquity symposium on evolutionary computation Xin Yao provides a deeper understanding of evolutionary algorithms in the context of classical computational paradigms. This article discusses some of the most important issues in evolutionary computation. Three major areas are identified. The first is the theoretical foundation of evolutionary computation, especially the computational time complexity analysis. The second is on algorithm design, especially on hybridization, memetic algorithms, algorithm portfolios and ensembles of algorithms. The third is co-evolution, which seems to be under studied in both theory and practice. The primary aim of this article is to stimulate further discussions, rather than to offer any solutions.
-
Ubiquity symposium: Evolutionary computation and the processes of life: opening statement
by Mark Burgin, Eugene Eberbach
August 2012
Evolution is one of the indispensable processes of life. After biologists found basic laws of evolution, computer scientists began simulating evolutionary processes and using operations discovered in nature for solving problems with computers. As a result, they brought forth evolutionary computation, inventing different kinds of operations and procedures, such as genetic algorithms or genetic programming, which imitated natural biological processes. Thus, the main goal of our Symposium is the exploration of the essence and characteristic properties of evolutionary computation in the context of life and computation.
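(As a concrete illustration of the mutate-and-select loop these methods share, here is a minimal sketch written for this listing rather than taken from any symposium article: a (1+1) evolutionary algorithm on the toy OneMax problem. The class name OneMaxEA and the problem choice are assumptions made purely for brevity.)

```java
import java.util.Random;

// Illustrative sketch only: a minimal (1+1) evolutionary algorithm on OneMax
// (maximize the number of 1-bits), showing the mutate-and-select loop that
// evolutionary computation borrows from natural evolution.
public class OneMaxEA {
    public static void main(String[] args) {
        Random rng = new Random(42);
        int n = 64;
        boolean[] parent = new boolean[n];          // start from the all-zero string
        int parentFitness = fitness(parent);

        while (parentFitness < n) {
            boolean[] child = parent.clone();
            for (int i = 0; i < n; i++) {           // mutation: flip each bit with probability 1/n
                if (rng.nextInt(n) == 0) child[i] = !child[i];
            }
            int childFitness = fitness(child);
            if (childFitness >= parentFitness) {    // selection: keep the better individual
                parent = child;
                parentFitness = childFitness;
            }
        }
        System.out.println("Optimum reached: fitness " + parentFitness);
    }

    // Fitness of a bitstring is simply its number of 1-bits.
    static int fitness(boolean[] bits) {
        int count = 0;
        for (boolean b : bits) if (b) count++;
        return count;
    }
}
```

Full-fledged evolutionary computation adds populations, crossover, and problem-specific representations, which is where the symposium articles pick up.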
-
Ubiquity symposium: What have we said about computation?: closing statement
by Peter J. Denning
April 2011
The "computation" symposium presents the reflections of thinkers from many sectors of computing on the fundamental question in the background of everything we do as computing professionals. While many of us have too many immediate tasks to allow us time for our own deep reflection, we do appreciate when others have done this for us. Peter Freeman points out, by analogy, that as citizens of democracies we do not spend a lot of time reflecting on the question, "What is a democracy," but from time to time we find it helpful to see what philosophers and political scientists are saying about the context in which we act as citizens.
-
Ubiquity symposium: What is information?: beyond the jungle of information theories
by Paolo Rocchi
March 2011
Editor's Introduction: This fourteenth piece is inspired by a question left over from the Ubiquity Symposium entitled "What is Computation?"
--Peter J. Denning, Editor
Computing saw the light as a branch of mathematics in the '40s, and progressively revealed ever new aspects [gol97]. Nowadays even laymen have become aware of the broad assortment of functions achieved by systems, and the prismatic nature of computing challenges thinkers who explore the various topics that substantiate computer science [mul98].
-
Ubiquity symposium: Biological Computation
by Melanie Mitchell
February 2011
In this thirteenth piece to the Ubiquity symposium discussing What is computation? Melanie Mitchell discusses the idea that biological computation is a process that occurs in nature, not merely in computer simulations of nature.
--Editor
-
Ubiquity symposium: Natural Computation
by Erol Gelenbe
February 2011
In this twelfth piece to the Ubiquity symposium discussing What is computation? Erol Gelenbe reviews computation in natural systems, focusing mainly on biology and citing examples of the computation that is inherent in chemistry, natural selection, gene regulatory networks, and neuronal systems.
--Editor
-
Ubiquity symposium: Computation, Uncertainty and Risk
by Jeffrey P. Buzen
January 2011
In this eleventh piece to the Ubiquity symposium discussing What is computation? Jeffrey P. Buzen develops a new computational model for representing computations that arise when deterministic algorithms process workloads whose detailed structure is uncertain.
--Editor
-
Ubiquity symposium: What is the Right Computational Model for Continuous Scientific Problems?
by Joseph Traub
January 2011
In this tenth piece to the Ubiquity symposium discussing What is computation? Joseph Traub shares his views about using the Turing Machine model and the real number model for solving continuous scientific problems.
--Editor
-
Ubiquity symposium: Computation and Computational Thinking
by Alfred V. Aho
January 2011
In this ninth piece to the Ubiquity symposium discussing What is computation? Alfred V. Aho shares his views about the importance of computational thinking in answering the question.
--Editor
-
Ubiquity symposium 'What is computation?': The enduring legacy of the Turing machine
by Lance Fortnow
December 2010
-
Ubiquity symposium 'What is computation?': Computation and information
by Ruzena Bajcsy
December 2010
In this sixth article in the ACM Ubiquity symposium, What is Computation?, Ruzena Bajcsy of the University of California, Berkeley explains that computation can be seen as a transformation or function of information.
--Editor
-
Ubiquity symposium 'What is computation?': Computing and computation
by Paul S. Rosenbloom
December 2010
In this fifth article in the ACM Ubiquity symposium on What is computation? Paul S. Rosenbloom explains why he believes computing is the fourth great scientific domain, on par with the physical, life, and social sciences.
--Editor
-
Ubiquity symposium 'What is computation?': Computation and Fundamental Physics
by Dave Bacon
December 2010
In this seventh article in the ACM Ubiquity symposium, What is Computation?, Dave Bacon of the University of Washington explains why he thinks discussing the question is as important as thinking about what it means to be self-aware.
--Editor
-
Ubiquity symposium 'What is computation?': Computation is process
by Dennis J. Frailey
November 2010
Various authors define forms of computation as specialized types of processes. As the scope of computation widens, the range of such specialties increases. Dennis J. Frailey posits that the essence of computation can be found in any form of process, hence the title and the thesis of this paper in the Ubiquity symposium discussing what is computation.
--Editor
-
Ubiquity symposium 'What is Computation?': The evolution of computation
by Peter Wegner
November 2010
In this second article in the ACM Ubiquity symposium on 'What is computation?' Peter Wegner provides a history of the evolution of computation.
--Editor
-
Ubiquity symposium 'What is computation?': Computation is symbol manipulation
by John S. Conery
November 2010
In the second in the series of articles in the Ubiquity Symposium What is Computation?, Prof. John S. Conery of the University of Oregon explains why he believes computation can be seen as symbol manipulation. For more articles in this series, see the table of contents in the Editor's Introduction to the symposium (http://ubiquity.acm.org/article.cfm?id=1870596).
--Editor
-
Ubiquity symposium 'What is computation?': Opening statement
by Peter J. Denning
November 2010
Most people understand a computation as a process evoked when a computational agent acts on its inputs under the control of an algorithm. The classical Turing machine model has long served as the fundamental reference model because an appropriate Turing machine can simulate every other computational model known. The Turing model is a good abstraction for most digital computers because the number of steps to execute a Turing machine algorithm is predictive of the running time of the computation on a digital computer. However, the Turing model is not as well matched for the natural, interactive, and continuous information processes frequently encountered today. Other models whose structures more closely match the information processes involved give better predictions of running time and space. Models based on transforming representations may be useful.
-
Ubiquity symposium 'What is computation?': Editor's Introduction
by Peter J. Denning
October 2010
The first Ubiquity symposium seeks to discuss the question, "What is computation?"