P. J. Denning Collection

  • Workings of science: The paradoxical faces of science

    Science has two faces: One sees settled science and all its laws; the other sees only the unsettlement of hypotheses yet to be verified and the uncertainty of whether they can be verified. Both faces are integral to the healthy workings of science.

    ...
  • Workings of science: Ubiquity editors weigh in on the workings of science

    The COVID-19 pandemic has affected millions around the globe. Yet despite countless deaths, global lockdowns, political unrest, and economic uncertainty, there are many who insist the science behind public health policies, viral transmission research, and vaccine development is not only wrong but also part of a global agenda. So how did we get here, how has science become politicized, and what does it mean to "follow the science?" To answer these questions, Ubiquity's editors have come together to present a comprehensive overview of how science works.

    ...
  • COVID-19 and computation for policy

    Governments across the world are formulating and implementing medical, social, economic, and other policies to manage the COVID-19 pandemic and protect their citizens. Many governments claim that their policies follow the best available scientific advice. Much of that advice comes from computational modeling. Two of the main types of model are presented: the SIR (Susceptible, Infected, Recovered) model developed by Kermack and McKendrick in the 1920s and the more recent agent-based models. The SIR model gives a good intuition of how epidemics spread, including how mass vaccination can contain them. It is less useful than agent-based modeling for investigating the effects of policies such as social distancing, self-isolation, wearing facemasks, and test-trace-isolate.

    Politicians and the public have been perplexed to observe the lack of consensus in the scientific community and the absence of a single 'best science' to follow. The outcome of computational models depends on the assumptions made and the data used. Different assumptions will lead to different computational outcomes, especially when the available data are so poor. This leads some commentators to argue that the models are wrong and dangerous. Some may be, but computational modeling is one of the few ways available to explore and try to understand the space of possible futures. This lack of certainty means that computational modeling must be seen as just one of many inputs into the political decision-making process. Politicians must balance all the competing inputs and make timely decisions based on their conclusions---be they right or wrong. In the same way that democracy is the least worst form of government, computational modeling may be the least worst way of trying to understand the future for policy making.
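
    As a quick illustration of the SIR dynamics described above, here is a minimal sketch using simple Euler stepping; the transmission rate (beta), recovery rate (gamma), and population numbers are illustrative assumptions, not figures from the article.

        # Minimal SIR (Susceptible, Infected, Recovered) sketch in the spirit of
        # Kermack and McKendrick, stepped forward with simple Euler integration.
        # beta, gamma, and the initial populations below are illustrative only.
        def sir(beta, gamma, s0, i0, r0, days, dt=0.1):
            """Return one (S, I, R) tuple per day for a closed population."""
            n = s0 + i0 + r0
            s, i, r = float(s0), float(i0), float(r0)
            trajectory = [(s, i, r)]
            steps_per_day = round(1 / dt)
            for _ in range(days):
                for _ in range(steps_per_day):
                    new_infections = beta * s * i / n * dt   # contacts between S and I
                    new_recoveries = gamma * i * dt          # I moving to R
                    s -= new_infections
                    i += new_infections - new_recoveries
                    r += new_recoveries
                trajectory.append((s, i, r))
            return trajectory

        # With beta/gamma = 2.5 the epidemic grows from a handful of cases.
        for s, i, r in sir(beta=0.5, gamma=0.2, s0=9990, i0=10, r0=0, days=100)[::20]:
            print(round(s), round(i), round(r))

    Lowering beta (as distancing or masks aim to do) or starting with more of the population in R (as vaccination does) changes the trajectory, which is precisely the sensitivity to assumptions the abstract describes.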

    ...
  • An interview with Bushra Anjum: learning to be a generalist is valuable to your career

    Dr. Bushra Anjum is a senior editor for ACM's web-based magazine Ubiquity. Her research background is in performance evaluation and queuing theory. She is also a trained data scientist, having worked extensively with predictive analytics. Anjum, a Fulbright Scholar, has previously held academic positions in the U.S. and Pakistan, and is a keen advocate for diversity in the STEM fields. She is a mentor at Rewriting the Code, GlobalTechWomen, ReigningIt, Empowering Leadership Alliance, LeanIn.org, Computing Beyond the Double Bind's mentoring network, and others. Dr. Anjum can be contacted via Twitter @DrBushraAnjum.

    ...
  • Big data: big data or big brother? That is the question now.

    This ACM Ubiquity Symposium presented some of the current thinking about big data developments across four topical dimensions: social, technological, application, and educational. While 10 articles can hardly touch the expanse of the field, we have sought to cover the most important issues and provide useful insights for the curious reader. More than two dozen authors from academia and industry shared their points of view, their current focus of interest, and their outlines of future research. Big digital data has changed and will change the world in many ways. It will bring some big benefits in the future, but, combined with big AI and big IoT devices, it also creates several big challenges. These must be carefully addressed and properly resolved for the future benefit of humanity.

    ...
  • Big Data, Digitization, and Social Change: Big Data (Ubiquity symposium)

    We use the term "big data" with the understanding that the real game changer is the connection and digitization of everything. Every portfolio is affected: finance, transport, housing, food, environment, industry, health, welfare, defense, education, science, and more. The authors in this symposium will focus on a few of these areas to exemplify the main ideas and issues.

    ...
  • Computational design

    Computational thinking refers to a deliberative process that finds a computational solution for a concern. Computational doing refers to the use of computation and computational tools to address concerns. Computational design refers to creating new computational tools and methods that are adopted by the members of a community to address their concerns. Unfortunately, the definitions of both "thinking" and "doing" are fuzzy and have allowed misconceptions about the nature of algorithms. Fortunately, it is possible to eliminate the fuzziness in the definitions by focusing on computational design, which is at the intersection between thinking and doing. Computational design is what we are really after and would be a good substitute for computational thinking and doing.

    ...
  • Rethinking Randomness: An interview with Jeff Buzen, Part II

    In Part I, Jeff Buzen discussed the basic principles of his new approach to randomness, which is the topic of his book Rethinking Randomness. He continues here with a more detailed discussion of models that have been used successfully to predict the performance of systems ranging from early time-sharing computers to modern web servers.

    Peter J. Denning
    Editor in Chief

    ...
  • Rethinking Randomness: An interview with Jeff Buzen, Part I

    For more than 40 years, Jeffrey Buzen has been a leader in performance prediction of computer systems and networks. His first major contribution was an algorithm, now known as Buzen's Algorithm, that calculated the throughput and response time of any practical network of servers in a few seconds. Prior algorithms were useless because they would have taken months or years for the same calculations. Buzen's breakthrough opened a new industry of companies providing performance evaluation services, and laid scientific foundations for designing systems that meet performance objectives. Along the way, he became troubled by the fact that the real systems he was evaluating seriously violated his model's assumptions, and yet the faulty models predicted throughput to within 5 percent of the true value and response time to within 25 percent. He began puzzling over this anomaly and invented a new framework for building computer performance models, which he called operational analysis. Operational analysis produced the same formulas, but with assumptions that hold in most systems. As he continued to probe this puzzle, he formulated a more complete theory of randomness, which he calls observational stochastics, and he wrote a book, Rethinking Randomness, laying out his new theory. We talked with Jeff Buzen about his work.

    Peter J. Denning
    Editor in Chief
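
    For readers curious what the algorithm looks like, here is a minimal sketch of the convolution recurrence generally credited to Buzen for closed product-form queueing networks with single-server stations; the function name, service demands, and population size below are illustrative choices, not figures from the interview.

        # Minimal sketch of Buzen's convolution algorithm for a closed
        # product-form queueing network with single-server stations.
        # demands[m] is the relative service demand of station m; the
        # particular numbers below are illustrative, not from the interview.
        def buzen_normalizing_constants(demands, n_customers):
            """Return [G(0), G(1), ..., G(N)] via g(n, m) = g(n, m-1) + d_m * g(n-1, m)."""
            G = [1.0] + [0.0] * n_customers   # G(0) = 1 with no customers
            for d in demands:                 # fold in one station at a time
                for n in range(1, n_customers + 1):
                    G[n] += d * G[n - 1]
            return G

        demands = [0.20, 0.04, 0.03]   # e.g. a CPU and two disks (illustrative)
        N = 10                         # circulating jobs in the closed network
        G = buzen_normalizing_constants(demands, N)

        throughput = G[N - 1] / G[N]                      # system throughput X(N)
        utilizations = [d * throughput for d in demands]  # per-station utilization
        response_time = N / throughput                    # Little's law, no think time
        print(throughput, utilizations, response_time)

    From the single table of normalizing constants, throughput, utilizations, and response time all follow with a few arithmetic steps, which is why such calculations complete in seconds even for large networks.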

    ...
  • What About an Unintelligent Singularity?: The technological singularity (Ubiquity symposium)

    For years we humans have worried about plagues, asteroids, earthquakes, eruptions, fires, floods, famines, wars, genocides, and other uncontrollable events that could wipe away our civilization. In the modern age, with so much depending on computing and communications, we have added computers to our list of potential threats. Could we perish from the increasing intelligence of computers? Denning thinks that is less of a threat than the apparently mundane march of automated bureaucracies. He also asserts that none of the possible negative outcomes is a foregone conclusion because humans teaming with machines are far more intelligent than either one alone.

    ...
  • An interview with David Alderson: in search of the real network science

    There has been an explosion of interest in mathematical models of large networks, leading to numerous research papers and books. The National Research Council carried out a study evaluating the emergence of a new area called "network science," which could provide the mathematics and experimental methods for characterizing, predicting, and designing networks. David Alderson has become a leading advocate for formulating the foundations of network science so that its predictions can be applied to real networks.

    ...
  • Interview with Mark Guzdial, Georgia Institute of Technology: computing as creation

    Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool to teach everyone about computing.

    ...
  • Science and the spectrum of belief: an interview with Leonard Ornstein

    In 1965 Leonard Ornstein wrote a long and thoughtful essay on information and meaning. Shannon's idea that communication systems could transmit and process information without regard to its meaning just did not seem right to him. He was particularly interested in how scientists use and interpret information as part of science. Forty-eight years later, he shares with Ubiquity Magazine how he sees science, discovery, information, and meaning.

    ...
  • Ubiquity symposium: The science in computer science: opening statement

    The recent interest in encouraging more middle and high school students to prepare for careers in science, technology, engineering, or mathematics (STEM) has rekindled the old debate about whether computer science is really science. It matters today because computing is such a central field, impacting so many other fields, and yet it is often excluded from high school curricula because it is not seen as a science. In this symposium, fifteen authors examine different aspects, from what science is, to natural information processes, to new science-enabled approaches in STEM education.

    ...
  • Writing secure programs: an interview with Steve Lipner

    Protecting computing systems and networks from attackers and data theft is an enormously complicated problem. The individual operating systems are complex (typically more than 40 million lines of code), they are connected to an enormous Internet (on the order of 1 billion hosts), and the whole network is heavily populated (more than 2.3 billion users). Hunting down and patching vulnerabilities is a losing game.

    ...
  • Bringing architecture back to computing: an interview with Daniel A. Menascé

    Over the past 10 or 20 years, the subject of machine organization and system architecture has been deemphasized in favor of the powerful abstractions that support computational thinking. We have grown accustomed to slogans like "computing is bits, not atoms"---suggesting that bits are not physical and the properties of the physical world are less and less important for understanding computation.

    ...
  • Dark innovation: An interview with Jerry Michalski

    As computing technologists, we tend to think of innovations in terms of new products or services supported by, or made of, computing technologies. But there are other types of innovation besides products. There are process innovations, such as McDonald's method of making hamburgers fast; social innovations, such as Mothers Against Drunk Driving; and business model innovations, such as Starbucks replacing a coffee shop with an Internet cafe. In all these categories, we tend to think of innovations as new ways of doing things that positively impact many people.

    ...
  • A 10 Point Checklist for Getting it Off the Shelf: An interview with Dick Urban

    Far too many R&D programs in industry as well as government result in reports or prototypes that represent fundamentally good ideas but end up gathering dust on a shelf. Ellison "Dick" Urban, formerly of DARPA (Defense Advanced Research Projects Agency) and now the Director of Washington Operations at Draper Laboratory, has had considerable experience with technology transition. We talked to him about his guidelines for success.

    ...
  • Steve Jobs and the user psyche
    Much has been said about Steve Jobs's ability to anticipate what users would want. No one quite knows how he did it. Ubiquity's Peter Denning reflects on an interview with the Apple co-founder, which offers a glimpse into Jobs's process for understanding the user. ...
  • Honesty is the best policy---Part 2: an interview with Rick Hayes-Roth

    Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter away untrustworthy information and base decisions on well-grounded claims that can improve outcomes. In last week's installment of this two-part interview, we focused on the problem and the principles that help ameliorate it. In this installment, we focus on the means to implement the principles in our information environments.

    ...
  • Honesty is the best policy---part 1: an interview with Rick Hayes-Roth

    Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter away untrustworthy information and base decisions on well-grounded claims that can improve outcomes. We interviewed him to find out more about this problem and get advice for our readers. Although there are many subtleties in the shades of truth and the intentions of speakers and listeners, Hayes-Roth finds the essential core of what you can do to ward off untrustworthy information.

    ...
  • Ubiquity symposium: What have we said about computation?: closing statement

    The "computation" symposium presents the reflections of thinkers from many sectors of computing on the fundamental question in the background of everything we do as computing professionals. While many of us have too many immediate tasks to allow us time for our own deep reflection, we do appreciate when others have done this for us. Peter Freeman points out, by analogy, that as citizens of democracies we do not spend a lot of time reflecting on the question, "What is a democracy?" but from time to time we find it helpful to see what philosophers and political scientists are saying about the context in which we act as citizens.

    ...
  • An Interview with Mark Guzdial

    Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool to teach everyone about computing.
    --Editor

    ...
  • Ubiquity symposium 'What is computation?': Opening statement

    Most people understand a computation as a process evoked when a computational agent acts on its inputs under the control of an algorithm. The classical Turing machine model has long served as the fundamental reference model because an appropriate Turing machine can simulate every other computational model known. The Turing model is a good abstraction for most digital computers because the number of steps to execute a Turing machine algorithm is predictive of the running time of the computation on a digital computer. However, the Turing model is not as well matched for the natural, interactive, and continuous information processes frequently encountered today. Other models whose structures more closely match the information processes involved give better predictions of running time and space. Models based on transforming representations may be useful.
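
    As a small illustration of the step-counting idea mentioned above, here is a minimal sketch of a Turing machine simulator; the rule encoding and the sample machine (a unary incrementer) are illustrative choices, not constructions from the symposium.

        # Minimal Turing machine simulator that counts steps, illustrating how
        # step counts in the Turing model track running time. The encoding and
        # the sample machine (a unary incrementer) are illustrative only.
        def run_tm(rules, tape_input, state="start", blank="_", max_steps=10_000):
            """rules maps (state, symbol) to (new_state, write_symbol, move in {-1, +1})."""
            tape = dict(enumerate(tape_input))   # sparse tape: position -> symbol
            head, steps = 0, 0
            while state != "halt" and steps < max_steps:
                state, write, move = rules[(state, tape.get(head, blank))]
                tape[head] = write
                head += move
                steps += 1
            cells = "".join(tape.get(i, blank) for i in range(min(tape), max(tape) + 1))
            return cells.strip(blank), steps

        # Unary incrementer: move right past the 1s, write one more 1, halt.
        rules = {
            ("start", "1"): ("start", "1", +1),
            ("start", "_"): ("halt", "1", +1),
        }
        print(run_tm(rules, "111"))   # ('1111', 4): step count grows linearly with input length

    The step count returned alongside the output is the quantity the Turing model uses to predict running time.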

    ...
  • The New Ubiquity
    Ubiquity's new site will launch this month, marking a new editorial direction. Ubiquity is now a peer-reviewed online publication of ACM dedicated to the future of computing and the people who are creating it. ...
  • An Interview with Chris Gunderson: Are Militaries Lagging Their Non-State Enemies in Use of Internet?
    The increasing number of cyber attacks on military networks and servers has raised the question of what the global defense community is doing to safeguard military systems and protect the larger global Internet. Ubiquity's editor interviewed Chris Gunderson, who served in the U.S. Navy from 1973 to 2004 and became an expert in "network centric" warfare, on this question and in particular on how military philosophy must change to adapt to the rise of information networks. ...
  • An Interview with David Alderson: In Search of the Real Network Science
    David Alderson has become a leading advocate for formulating the foundations of network science so that its predictions can be applied to real networks. He is an assistant professor in the Operations Research Department at the Naval Postgraduate School in Monterey, Calif., where he conducts research with military officer-students on the operation, attack, and defense of network infrastructure systems. Ubiquity interviewed him to find out what is going on. ...
  • The somatic engineer
    Engineers trained in value skills will be superior professionals and designers. ...