Ubiquity - an ACM publication

Interviews are organized by the month and year in which they first appeared. To find an interviewee by name, use the search bar (at upper right).

Interviews

  • Artificial intelligence in politics: an interview with Sven Körner and Mathias Landhäußer of thingsTHINKING

    Natural language processing, an area of artificial intelligence (AI), has attained remarkable successes. Digital assistants such as Siri and Alexa respond to spoken commands and understand several languages. Google has demonstrated that a machine can call a restaurant and make a reservation in a manner indistinguishable from a human. Automated translation services are used around the world in over a hundred languages. This interview discusses a new and surprising application of language processing in politics. Though the AI software analyzes texts in German, it could be adapted to any language. The underlying technology has wider applications in text analysis, including legal tech, contracting, and others. Here is a summary.

  • Art Scott and Michael Frank on energy-efficient computing

    Clock speeds of computing chips have leveled off dramatically since 2005, and putting more cores in systems on a chip (SoC) has produced more heat, imposing a new ceiling on further advances. Leading-edge researchers, like Mike Frank, and dedicated technologists with a wealth of experience, like Art Scott, represent a new vanguard leaping beyond Dennard scaling and Landauer's limit. Art looks for ways to reduce energy consumption and Mike looks for ways to "architect" future chips according to principles of reversibility. Is the future in reversible, adiabatic computing and simpler architectures using posit arithmetic? My guests think so.

  • Mixing computation with people: an interview with Marianne Winslett

    In this interview, we learn about five fascinating subjects: security in manufacturing, negotiating trust in the web, updating logical databases, differential privacy, and scientific computing (including its security issues). This is a confluence that has, at its roots, the thorny problems that arise when you mix computation with people. Some beautiful technical results, many originated by Marianne Winslett, now address those challenges, but some surprises crop up along the way.

  • Cybersecurity skeptics now embracing formal methods: an interview with Gernot Heiser and Jim Morris

    There is new hope for those who despair of securing computer systems from external hackers. The recent DARPA HACMS project demonstrated conclusively that "certain pathways for attackers have all been shut down in a way that's mathematically proven to be unhackable for those pathways." Continuing research at DARPA and IARPA will eventually shut down all the pathways, and the external hackers will be out of business permanently.

  • Unums 2.0: An Interview with John L. Gustafson

    In an earlier interview (April 2016), Ubiquity spoke with John Gustafson about the unum, a new format for floating point numbers. The unique property of unums is that they always know how many digits of accuracy they have. Now Gustafson has come up with yet another format that, like the unum 1.0, always knows how accurate it is. But it also allows an almost arbitrary mapping of bit patterns to the reals. In doing so, it paves the way for custom number systems that squeeze the maximum accuracy out of a given number of bits. This new format could have prime applications in deep learning, big data, and exascale computing.

  • Rethinking Randomness: An interview with Jeff Buzen, Part II

    In Part 1, Jeff Buzen discussed the basic principles of his new approach to randomness, which is the topic of his book Rethinking Randomness. He continues here with a more detailed discussion of models that have been used successfully to predict the performance of systems ranging from early time sharing computers to modern web servers.

    Peter J. Denning
    Editor in Chief

  • Rethinking Randomness: An interview with Jeff Buzen, Part I

    For more than 40 years, Jeffrey Buzen has been a leader in performance prediction of computer systems and networks. His first major contribution was an algorithm, known now as Buzen's Algorithm, that calculated the throughput and response time of any practical network of servers in a few seconds. Prior algorithms were useless because they would have taken months or years for the same calculations. Buzen's breakthrough opened a new industry of companies providing performance evaluation services, and laid scientific foundations for designing systems that meet performance objectives. Along the way, he became troubled by the fact that the real systems he was evaluating seriously violated his model's assumptions, and yet the faulty models predicted throughput to within 5 percent of the true value and response time to within 25 percent. He began puzzling over this anomaly and invented a new framework for building computer performance models, which he called operational analysis. Operational analysis produced the same formulas, but with assumptions that hold in most systems. As he continued to understand this puzzle, he formulated a more complete theory of randomness, which he calls observational stochastics, and he wrote a book Rethinking Randomness laying out his new theory. We talked with Jeff Buzen about his work.

    Peter J. Denning
    Editor in Chief

  • Changing the Game: Dr. Dave Schrader on sports analytics

    Dave Schrader, known to his friends as Dr. Dave, worked for 24 years in advanced development and marketing at Teradata, a major data warehouse vendor. He actively gives talks on business analytics, and since retiring has spent time exploring the field of sports analytics. In this interview, Schrader discusses how analytics is playing a significant role in professional sports--from Major League Soccer to the NBA.

  • The Future of Technology and Jobs: An interview with Dr. R.A. Mashelkar

    The following interview with Dr. Raghunath Anant Mashelkar considers how technology will change the face of employment in the future. What will the jobs of the future look like? What skills are needed to prepare students and researchers for employment in the digital age? As our world becomes more digital by the day, technology influences how students communicate, learn, work, and interact with society more than it did for any generation before. Ultimately, today's students have to compete in a more globalized, mobile workforce amid rapid technological advancement.

  • The End of (Numeric) Error: An interview with John L. Gustafson

    Crunching numbers was the prime task of early computers. The common element of these early computers is that they all used integer arithmetic. John Gustafson, one of the foremost experts in scientific computing, has proposed a new number format that provides more accurate answers than standard floats, yet saves space and energy. The new format might well revolutionize the way we do numerical calculations.

  • The Rise of Computational Biology: An interview with Prof. Thomas Lengauer

    In this wide-ranging interview, we will hear from a pioneer in computational biology on where the field stands and on where it is going. The topics stretch from gene sequencing and protein structure prediction, all the way to personalized medicine and cell regulation. We'll find out how bioinformatics uses a data-driven approach and why personal drugs may become affordable. We'll even discuss whether we will be able to download our brains into computers and live forever.

  • On Quantum Computing: An interview with David Penkler
    In recent months, announcements on the progress toward harnessing quantum computing have elicited diverse and sometimes strong reactions and opinions from academia and industry. Some say quantum computing is impossible, while others point to actual machines, raising the question as to whether they really are quantum computers. In this interview, Dave Penkler---an HP fellow whose primary interests are in cloud and data-center scale operating systems and networks---shares his view on the present and future of quantum computing. Penkler has 40 years of experience with computer hardware and software and has always had a keen interest in their evolution as enabled by the advances in science and technology.
  • Interview with Mark Guzdial, Georgia Institute of Technology: computing as creation

    Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview with him, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool to teach everyone about computing.

  • An interview with David Alderson: in search of the real network science

    There has been an explosion of interest in mathematical models of large networks, leading to numerous research papers and books. The National Research Council carried out a study evaluating the emergence of a new area called "network science," which could provide the mathematics and experimental methods for characterizing, predicting, and designing networks. David Alderson has become a leading advocate for formulating the foundations of network science so that its predictions can be applied to real networks.

  • Science and the spectrum of belief: an interview with Leonard Ornstein

    In 1965 Leonard Ornstein wrote a long and thoughtful essay on information and meaning. Shannon's idea that communication systems could transmit and process information without regard to its meaning just did not seem right to him. He was particularly interested in how scientists use and interpret information as part of science. Forty-eight years later, he is sharing how he sees science, discovery, information, and meaning with Ubiquity Magazine.

  • Writing secure programs: an interview with Steve Lipner

    Protecting computing systems and networks from attackers and data theft is an enormously complicated problem. The individual operating systems are complex (typically more than 40 million lines of code), they are connected to an enormous Internet (on order of 1 billion hosts), and the whole network is heavily populated (more than 2.3 billion users). Hunting down and patching vulnerabilities is a losing game.

  • Bringing architecture back to computing: an interview with Daniel A. Menascé

    Over the past 10 or 20 years, the subject of machine organization and system architecture has been deemphasized in favor of the powerful abstractions that support computational thinking. We have grown accustomed to slogans like "computing is bits, not atoms"---suggesting that bits are not physical and the properties of the physical world are less and less important for understanding computation.

  • Dark innovation: An interview with Jerry Michalski

    As computing technologists, we tend to think of innovations in terms of new products or services supported by, or made of, computing technologies. But there are other types of innovation besides products. There are process innovations, such as McDonald's method of making hamburgers fast; social innovations, such as Mothers Against Drunk Driving; and business model innovations, such as Starbucks replacing a coffee shop with an Internet cafe. In all these categories, we tend to think of innovations as new ways of doing things that positively impact many people.

  • A 10 Point Checklist for Getting it Off the Shelf: An interview with Dick Urban

    Far too many R&D programs in industry as well as government result in reports or prototypes that represent fundamentally good ideas but end up gathering dust on a shelf. Ellison "Dick" Urban, formerly of DARPA (Defense Advanced Research Projects Agency) and now the Director of Washington Operations at Draper Laboratory, has had considerable experience with technology transition. We talked to him about his guidelines for success.

  • The Law, the Computer, and the Mind: An interview with Roy Freed

    2011 marked the 50th anniversary of the first educational program on computer law, sponsored by the Joint Committee on Continuing Professional Education of the American Law Institute and the American Bar Association (ALI-ABA). In 1971 at an ACM conference, Roy Freed and six colleagues founded the Computer Law Association (CLA), an international bar association (renamed later as the International Technology Law Association).

  • On experimental algorithmics: an interview with Catherine McGeoch and Bernard Moret

    Computer science is often divided into two camps, systems and theory, but of course the reality is more complicated and more interesting than that. One example is the area of "experimental algorithmics," also termed "empirical algorithmics." This fascinating discipline marries algorithm analysis, which is often done with mathematical proofs, with experimentation with real programs running on real machines.

  • Honesty is the best policy---Part 2: an interview with Rick Hayes-Roth

    Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter away untrustworthy information and base decisions on well-grounded claims that can improve outcomes. In last week's installment of this two-part interview, we focused on the problem and the principles that help ameliorate it. In this installment, we focus on the means to implement the principles in our information environments.

  • Honesty is the best policy---part 1: an interview with Rick Hayes-Roth

    Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter away untrustworthy information and base decisions on well-grounded claims that can improve outcomes. We interviewed him to find out more about this problem and get advice for our readers. Although there are many subtleties in the shades of truth and the intentions of speakers and listeners, Hayes-Roth finds the essential core of what you can do to ward off untrustworthy information.

  • An interview with Richard John: the politics of network evolution

    Richard John is a professor at the Graduate School of Journalism, Columbia University, and a historian of communications networks in the United States. His most recent book, Network Nation, won the inaugural Ralph Gomory prize from the Business History Conference and the AEJMC prize for the best book in the history of journalism and mass communications.

  • Empirical software research: an interview with Dag Sjøberg, University of Oslo, Norway

    Punched cards were already obsolete when I began my studies at the Technical University of Munich in 1971. Instead, we had the luxury of an interactive, line-oriented editor for typing our programs. Doug Engelbart had already invented the mouse, but the device was not yet available. With line editors, users had to identify lines by numbers and type in awkward substitution commands just to add missing semicolons. Though cumbersome by today's standards, it was obvious that line-oriented editors were far better than punched cards. Not long after, screen oriented editors such as Vi and Emacs appeared. Again, these editors were obvious improvements and everybody quickly made the switch. No detailed usability studies were needed. "Try it and you'll like it" was enough. (Brian Reid at CMU likened screen editors to handing out free cocaine in the schoolyard.) Switching from Assembler to Fortran, Algol, or Pascal also was a no-brainer. But in the late '70s, the acceptance of new technologies for building software seemed to slow down, even though more people were building software tools. Debates raged over whether Pascal was superior to C, without a clear winner. Object-oriented programming, invented back in the '60s with Simula, took decades to be widely adopted. Functional programming is languishing to this day. The debate about whether agile methods are better than plan-driven methods has not led to a consensus. Literally hundreds of software development technologies and programming languages have been invented, written about, and demoed over the years, only to be forgotten. What went wrong?

  • An interview with Bob Metcalfe: Bob Metcalfe is going meta on innovation

    Bob Metcalfe thinks we are in a bubble, an innovation bubble, seeing that the word "innovation" is on everybody's lips. To help ensure that this bubble does not burst, he has embarked on a new career path as Professor of Innovation and Murchison Fellow of Free Enterprise at the University of Texas at Austin. This is his fifth career, building on his work as an engineer-scientist leading the invention of Ethernet in the 1970s, entrepreneur-executive and founder of 3Com in the 1980s, publisher-pundit and CEO of InfoWorld in the 1990s, and venture capitalist in the 2000s. As General Partner with Polaris Venture Partners, he has invested primarily in cleantech and currently serves on the boards of five companies: Ember, Sun Catalyx, 1366 Technologies, Infinite Power, and SiOnyx.

  • An Interview with Peter Denning: the end of the future

    Ubiquity is dedicated to the future of computing and the people who are creating it. What exactly does this mean for readers, for contributors, and for editors soliciting and reviewing contributions? We decided to ask the editor in chief, Peter Denning, how he approaches the future, and how his philosophy is reflected in the design and execution of the Ubiquity mission. He had a surprisingly rich set of answers to our questions. We believe his answers may be helpful for all our readers with their own approaches to their own futures.

  • An interview with Melanie Mitchell: On complexity

    Melanie Mitchell, a Professor of Computer Science at Portland State University and an External Professor at the Santa Fe Institute, has written a compelling and engaging book entitled Complexity: A Guided Tour, published just last year by Oxford University Press. This book was named by Amazon.com as one of the 10 best science books of 2009. Her research interests include artificial intelligence, machine learning, biologically inspired computing, cognitive science, and complex systems.

  • An Interview with Joseph F. Traub

    Joseph F. Traub is the Edwin Howard Armstrong Professor of Computer Science at Columbia University and External Professor, Santa Fe Institute. In this wide-ranging interview, he discusses his early research, organizations and other entities he has created, and offers his view on several open-ended topics on the future of computing.
    --Editor

  • An Interview with Mark Guzdial

    Mark Guzdial is a Professor in the School of Interactive Computing at Georgia Institute of Technology (Georgia Tech). His research focuses on the intersection of computing and education, from the role of computing in facilitating education to how we educate about computing. In this interview with him, he discusses how we teach computing and to whom, especially his contention that a contextualized approach is a powerful tool to teach everyone about computing.
    --Editor

  • An Interview with Erol Gelenbe

    This is Part I of an interview with Professor Erol Gelenbe, conducted by Professor Cristian Calude. Gelenbe holds the Dennis Gabor Chair Professorship in the Electrical and Electronic Engineering Department at Imperial College London and is an associate editor for this publication. This interview also appeared in the October 2010 issue of the Bulletin of the European Association for Computer Science and is printed here with permission.
    --Editor

  • An Interview with Prof. Andreas Zeller: Mining your way to software reliability
    In 1976, Les Belady and Manny Lehman published the first empirical growth study of a large software system, IBM's OS/360. At the time, the operating system was twelve years old and the authors were able to study 21 successive releases of the software. By looking at variables such as the number of modules, the time for preparing releases, and the modules handled between releases (a defect indicator), they were able to formulate three laws: the law of continuing change, the law of increasing entropy, and the law of statistically smooth growth. These laws are valid to this day.

    Belady and Lehman were ahead of their time. They understood that empirical studies such as theirs might lead to a deeper understanding of software development processes, which might in turn lead to better control of software cost and quality. However, studying large software systems proved difficult, because complete records were rare and companies were reluctant to open their books to outsiders.

    Three important developments changed this situation for the better. The first was the widespread adoption of configuration management tools, starting in the mid-1980s. Tools such as RCS and CVS recorded complete development histories of software, storing significantly more information, in greater detail, than Belady and Lehman had available. The history allowed the reconstruction of virtually any configuration ever compiled in the life of the system. I worked on the first analysis of such a history, to assess the cost of several different choices for smart recompilation (ACM TOSEM, Jan. 1994). The second important development was the inclusion of bug reports in the histories, linked to the offending software modules. This information proved extremely valuable, as we shall see in this interview. The third important development was the emergence of open source, through which numerous large development histories became available for study. Soon, researchers began to analyze these repositories. Workshops on mining software repositories have been taking place annually since 2004.

    I spoke with Prof. Andreas Zeller about the nuggets of wisdom unearthed by the analysis of software repositories. Andreas works at Saarland University in Saarbrücken, Germany. His research addresses the analysis of large, complex software systems, especially the analysis of why these systems fail to work as they should. He is a leading authority on analyzing software repositories and on testing and debugging. --Walter Tichy, Editor
  • An Interview with Chris Gunderson: Are Militaries Lagging Their Non-State Enemies in Use of Internet?
    The increasing number of cyber attacks on military networks and servers has raised the question of what the global defense community is doing to safeguard military systems and protect the larger global Internet. Ubiquity's editor interviewed Chris Gunderson, who served in the U.S. Navy from 1973 to 2004 and became an expert in "network centric" warfare, on this question and in particular on how military philosophy must change to adapt to the rise of information networks.
  • An Interview with David Alderson: In Search of the Real Network Science
    David Alderson has become a leading advocate for formulating the foundations of network science so that its predictions can be applied to real networks. He is an assistant professor in the Operations Research Department at the Naval Postgraduate School in Monterey, Calif., where he conducts research with military officer-students on the operation, attack, and defense of network infrastructure systems. Ubiquity interviewed him to find out what is going on.
  • An Interview with Peter Huber: Why 99.9 Percent Is Not Good Enough
    In the opening days of 2009, people are looking for the new President Obama to restore domestic and international confidence and help us find our way out of a dark recession. Electric power generation and distribution is a key part of a new direction. Can we produce enough of it to reduce our oil usage? Can electric cars become reliable and cover enough distance on a single charge? Can its availability be increased, especially since critical services in transportation, banking, computing and many other sectors can be shut down by power grid failures? In April 2000, Ubiquity Editor John Gehl spoke with energy expert Peter Huber about these issues. Huber's comments about the needs of the power grid were prophetic. We gladly bring them to you now in the hope that they will help you understand the power challenges ahead. --Peter Denning, Ubiquity Editor
  • An Interview with Frans Johansson: The Medici Effect
    In this time of recession, innovation has jumped to the fore in many people's minds. How can we create new value through innovations and pull our individual companies out of the doldrums? In 2004, Frans Johansson published his book, The Medici Effect, in which he discussed how crossing community boundaries leads to innovations, and he said that the most effective way to create the crossing is to mix people from the communities in a common setting. John Gehl spoke with Johansson shortly after the book was published. Johansson's words are worth thinking about now as we reflect on what we all must do next.
  • An Interview with Randy Pausch: Immersed in the Future: On the Future of Education
    Before he became ill, Randy Pausch spoke with Ubiquity Editor John Gehl in 2005. The declining enrollments in computer science were already very much on his mind. At that time, they were down 23 percent. Pausch called this a "huge problem". He noted that, even for those committed to teaching programming from the outset, kids programming in Alice were far more engaged than those trying to find Fibonacci numbers. The enrollments have since declined another 25 percent and the problem is even "huger" than before. Randy's ideas about what turns kids on are even more important today. --Peter Denning, Editor
  • An Interview with Michael Schrage
    It is November 2008 and much of the globe is in the throes of recession. Innovation is on many minds. We need new products and new services generating new value for our customers and our companies. It is more important than ever to innovate. The problem is that our collective success rate is abysmal -- 4 percent according to Business Week in August 2005. As we set out on new innovation initiatives, it is a good time to reflect on the illusions that drag our success rates so low. One illusion is that innovation is a novel idea or product, another is that those who spend more on R&D get more innovation, and another is that innovation is about great inventions. Michael Schrage of MIT has been challenging these illusions for a long time. He discussed them with Ubiquity editor John Gehl in February 2006. Now is the perfect time to reflect again on what Michael has to say to us about innovation. --Peter Denning, Editor
  • An Interview with Terry Winograd: Convergence, Ambient Technology, and Success in Innovation
    Terry Winograd is Professor of Computer Science at Stanford University, where he directs the program on human-computer interaction. His SHRDLU program, done at the MIT AI Lab, was one of the early explorations in natural language understanding by computers. His book with Fernando Flores, Understanding Computers and Cognition, critiqued the underlying assumptions of AI and much of computer system design, and led to completely new directions in those fields. He was a founder and national president of Computer Professionals for Social Responsibility. His remarks, made in 2002, are as relevant today as they were when first spoken.
  • An Interview with Richard A. DeMillo
    Richard A. DeMillo is the Dean of Georgia Tech's College of Computing. He previously was Hewlett-Packard's chief technology officer and served as director of the Georgia Tech Information Security Center. Under DeMillo's leadership, Georgia Tech's College of Computing has replaced the core curriculum for undergraduates with an ambitious and innovative Threads program, as he explains in this interview with Ubiquity's editor-in-chief John Gehl.
  • An Interview with Wei Zhao
    Wei Zhao is currently the Dean of the School of Science at Rensselaer Polytechnic Institute. Before he joined RPI in 2007, he was a Senior Associate Vice President for Research at Texas A&M University. Between 2005 and 2007, he also served as the Director for the Division of Computer and Network Systems in the National Science Foundation. He completed his undergraduate program in physics at Shaanxi Normal University, Xi'an, China, in 1977. He received his M.Sc. and Ph.D. degrees in Computer and Information Sciences at the University of Massachusetts at Amherst in 1983 and 1986, respectively. During his career, he has also served as a faculty member at Amherst College, the University of Adelaide, and Texas A&M University. This interview was conducted by Ubiquity editor-in-chief John Gehl.
  • An Interview with Vaughan Merlyn on Management
    Vaughan Merlyn, a management consultant, researcher, and author, has for more than three decades focused primarily on the use of information and information technology for business value creation. He was interviewed about software consulting and management.
  • Interview with MIT's Robert Langer
    Dr. Robert Langer's work is at the interface of biotechnology and materials science. A major focus is the study and development of polymers to deliver drugs, particularly genetically engineered proteins, DNA, and RNAi, continuously at controlled rates for prolonged periods of time.
  • An Interview with Dr. Yi Pan of Georgia State University
    Ubiquity is proud to publish this inspirational interview, which starts with a discussion of the creation of the computer science department at Georgia State University, and concludes with the heroic story of how an impoverished student from Tsinghua University in China overcame many obstacles to rise to a significant position at Georgia State. The interviewee is Yi Pan, Chair and Professor of Georgia State University's computer science department, who provided us with these reflections on computer science, academic success, and true success. The interview was conducted by Ubiquity editor-in-chief John Gehl.
  • An Interview with Michael Schrage on Ubiquity
    Author of several acclaimed books and numerous articles in such publications as Fortune and Technology Review, Michael Schrage is also a world-traveling consultant to all businesses great and small. He has been at MIT for many years, and his new academic home will be that institution's Sloan School of Management.
  • Ubiquity interview with Neumont's Graham Doxey
    Neumont University in Salt Lake City was featured in Ubiquity two years ago, with an interview with one of its founders, Scott McKinley. We wanted to go back and see how they're doing at this new and unique institution, about which senior vice president Julie Blake has explained: "The industry has said for years that even our best universities aren't preparing students for the workplace. Neumont was founded to fill that niche." Below is a Ubiquity interview with Neumont cofounder and President Graham Doxey.
  • An Interview with Scott McKinley: Project-Based Learning: The Neumont University story
    Neumont University co-founder and CEO Scott McKinley says the most innovative aspect of the Neumont curriculum is its focus on student projects: "Our freshmen are on project teams from the very beginning. Their first projects are simple, heavily scaffolded, and commensurate with their novice skills. By the time they enter their last three quarters, they're working on real industry projects for serious names that work with us, including IBM and Microsoft."
  • An Interview with William P. Dunk: On Collaboration
    Management consultant and futurist William P. Dunk says, "What collaboration is about is distributed intelligence, and I think that systems and governments and companies are all in such a degree of gridlock now that we desperately need to have broad-based intelligence coming into play everywhere."
  • An Interview with Alan Lenton: On Games
    Noted U.K. game designer Alan Lenton talks about his award-winning multi-player game Federation and discusses the sociology and psychology of gaming.
  • An Interview with John Markoff: What the dormouse said
    John Markoff is author of the new best-seller "What the Dormouse Said: How the 60s Counterculture Shaped the Personal Computer Industry," and is a senior writer for The New York Times. His other books include "Cyberpunk: Outlaws and Hackers on the Computer Frontier" and "Takedown: The Pursuit and Capture of Kevin Mitnick, America's Most Wanted Computer Outlaw."
  • An Interview with F-H Hsu: Chess, China, and Education
    Feng-Hsiung Hsu, whose book "Behind Deep Blue" told the story of how world chess champion Garry Kasparov was defeated by the IBM computer known as Deep Blue, is now a senior manager and researcher at Microsoft Research Asia.
  • An Interview with Leonard Kleinrock on nomadic computing
    Leonard Kleinrock developed the mathematical theory of packet switching, the technology underpinning the Internet, while a graduate student at MIT, a decade before the birth of the Internet, which occurred when his host computer at UCLA became the network's first node in September 1969. He is now Professor of Computer Science at UCLA, and has won numerous awards and prizes for his achievements.
  • You should use both sides of your brain, right?
    Author Dan Pink argues that "nowadays, the fault line between who gets ahead and who doesn't is going to be mastery of these abilities that are more characteristic of the right hemisphere — artistry, empathy, big picture thinking. Those are the sorts of abilities that I think are really going to matter the most, not only in our individual career success, but also in our personal satisfaction."
  • Building smarter: an interview with Jerry Laiserin
    Architect and industry analyst Jerry Laiserin is an advocate for "building smarter" - the application of information technology to transform the way the built environment is designed, constructed and operated. He publishes the technology strategy newsletter the LaiserinLetter.
  • Joseph Konstan on Human-Computer Interaction: Recommender Systems, Collaboration and Social Good
    An interview with Joseph Konstan: Konstan is an associate professor of computer science at the University of Minnesota. His background includes a bachelor's degree from Harvard College and a PhD from the University of California-Berkeley. His principal interests are human-computer interaction, recommender systems, multimedia systems, information visualization, internet applications and interfaces.
  • Leonard and Swap on 'Deep Smarts'
    An interview with Dorothy Leonard and Walter Swap, co-authors of the new book 'Deep Smarts: How to Cultivate and Transfer Enduring Business Wisdom.' The first issue that any organization has to face, they argue, is the identification of its deep smarts. Leonard is a professor emerita at the Harvard Business School and Swap is a professor of psychology emeritus at Tufts, where he was also dean of the college.
  • Ken Sevcik on Performance Evaluation
    Ken Sevcik is Professor of Computer Science at the University of Toronto. He received his B.S. in 1966 from Stanford University and his PhD in 1971 from the University of Chicago. Sevcik joined the faculty at the University of Toronto in 1971, and was Chair of the Department from 1990 to 1992. He also served as Director of the Computer Systems Research Institute (CSRI). His research interests are in the use of analytic models for performance analysis of resource allocation, scheduling and file structures in computer systems, computer networks, and distributed data management systems.
  • Anita McGahan on Industry Evolution
    Anita M. McGahan is author of the new book 'How Industries Evolve: Principles for Achieving and Sustaining Superior Performance' (Harvard Business School Press). She is the Everett V. Lord Distinguished Faculty Scholar and Professor of Strategy & Policy at the Boston University School of Management, as well as a Senior Institute Associate at Harvard's Institute for Strategy and Competitiveness.
  • Ken Robinson on Telecom Policy
    Ken Robinson is a communications attorney in Washington, having worked at the Departments of Justice and Commerce, the FCC, and the Office of Telecommunications Policy during the Nixon Administration. He is editor of the weekly publication 'Telecommunications Policy Review.'
  • Czerwinski on Visualization
    Mary Czerwinski is a Senior Researcher and Manager of the Visualization and Interaction Research Group at Microsoft Research.
  • Mihai Nadin on Anticipatory Systems
    What is the difference between a falling stone and a falling cat? Mihai Nadin, who directs the newly established Institute for Research in Anticipatory Systems at the University of Texas at Dallas, holds a Ph.D. degree in aesthetics from the University of Bucharest and a post-doctoral degree in philosophy, logic and theory of science from Ludwig Maximilian University in Munich, West Germany. He earned an M.S. degree in electronics and computer science from the Polytechnic Institute of Bucharest and an M.A. degree in philosophy from the University of Bucharest. He has authored 23 books, including "The Civilization of Illiteracy," "Mind: Anticipation and Chaos," and "Anticipation: The End is Where We Start From."
  • Michael Schrage on Innovation
    Looking for the great clients who are the true innovators? Co-director of the MIT Media Lab's eMarkets Initiative, a senior advisor to MIT's Security Studies Program, and a consultant to MIT's Langer Labs on technology transfer issues, Michael Schrage conducts research on the economics of innovation. His particular focus is on the role of models, prototypes and simulations in managing interactive iterative design, an area in which he works with a number of companies.
  • Checking in with Ben Bederson
    By focusing on the user experience, the University of Maryland's Human-Computer Interaction Lab aims to improve lives through projects such as the International Children's Digital Library. Benjamin B. Bederson, interviewed here, is an Associate Professor of Computer Science and director of the Human-Computer Interaction Lab at the Institute for Advanced Computer Studies at the University of Maryland, College Park. His work is on information visualization, interaction strategies, and digital libraries.
  • Patterns for Success
    Scott D. Anthony speaks about using innovation theory to transform organizations and create the next wave of growth. Anthony is a partner at Innosight, a management, consulting and education company located in Watertown, Massachusetts, and is co-author with Clayton M. Christensen and Erik A. Roth of the new book, "Seeing What's Next: Using the Theories of Innovation to Predict Industry Change."
  • An Interview with Joichi Ito: The world wide blog
    Joichi Ito, founder of Neoteny and other Internet companies, finds that cyberspace is embracing its roots — collaboration, community, and personal communications — with bloggers leading the way.
  • S. Joy Mountford on interface design
    The ultimate technology world will be soft, flexible and addressable. But the issues will remain the same, according to interface designer S. Joy Mountford: What do people like and what do people want?
  • Ann Kirschner on marketing and distribution of online learning
    Outside of business schools, the very word "marketing" makes most universities uncomfortable, as does the idea of students as customers. But the world of higher education is becoming increasingly competitive. Fathom, which Kirschner led, was named for the double idea of comprehension and depth; it was a milestone in the evolution of online learning and a prototype of where things are headed.
  • An Interview with Steven Weber: Why open source works
    Author Steven Weber looks beyond the hype on Open Source. More than a self-governing utopia, it's a practical, sustainable way of organizing and innovating. Its method may soon be applied successfully in other sectors. Plus, a "crazy" idea for Microsoft.
  • An Interview with Jesse Poore: Correct by design
    Jesse Poore suggests a revolution in programming - holding software developers to the same level of rigor of training and workmanship as other professionals, developing software that's correct by design, and constraining the release of software-intensive products until they are scientifically certified as fit for use.
  • Roger Brent and the alpha project
    The work of a multidisciplinary genomic research lab in Berkeley may yield big changes in drug therapy and medicine. Roger Brent is President and Research Director of the Molecular Sciences Institute, an independent nonprofit research laboratory in Berkeley, CA, that combines genomic experimentation with computer modeling. The mission of the MSI is to predict the behavior of cells and organisms in response to defined genetic and environmental changes.
  • An Interview with Peter Denning: The great principles of computing
    Peter Denning teaches students at the Naval Postgraduate School how to develop strategic, big-picture thinking about the field of computing. Denning, a past president of ACM (1980-82), has been involved with communicating our discipline, computing, to outsiders since 1970. Along the way he invented the working set model for memory management, developed the theory of virtual memory, promulgated operating systems theory, co-invented operational analysis of system performance, co-founded CSNET, and led the ACM Publications Board while it developed the Digital Library. He is an ACM Fellow and holds five major ACM awards. He just completed a five-year term as chair of the ACM Education Board.
  • Esther Dyson ... In focus
    Venture capitalist Esther Dyson is the chairman of EDventure Holdings, which publishes the influential monthly computer-industry newsletter Release 1.0 as well as the blog Release 4.0. The company also organizes the high-profile technology conference PC (Platforms for Communications) Forum, March 21-23, 2004. In this interview, she discusses her current interests, many to be covered at PC Forum. They include her investments, how to stop spam, outsourcing, and the overall high-tech industry environment.
  • An Interview with Thomas Kalil: Where politics, policy, technology and science converge
    From the White House to Berkeley, Thomas Kalil has worked on shaping the national agenda for science and technology research initiatives. Kalil, President Clinton's former science and technology advisor, now holds a similar post at the University of California, Berkeley, where he helps develop new research initiatives and increase UC Berkeley's role in shaping the national agenda.
  • An Interview with David Rejeski: Making policy in a Moore's Law world
    The accelerated rate of scientific discovery and technological innovation makes it difficult to keep up with the pace of change. What do policymakers know of nanotechnology and genetic modification? David Rejeski helps government agencies anticipate emerging technological issues.
  • Talking with security expert M. E. Kabay
    Adaptive attackers, novice computer users, indifferent management - it's no wonder our defensive mechanisms need continuous refinement.
  • Talking with Ben Chi of NYSERNet
    How the Internet began in New York State, the current state of Internet2, and the remote possibility of Internet3
  • A whole new worldview
    Anthropologist Christopher Kelty on programmers, networks and information technology
  • A designing life: Blade Kotelly
    A speech-recognition software expert explains the difference between good design and ambiguity, how good designs go bad, and why everyone is a designer.
  • Building an inventive organization
    A creativity expert distinguishes the concept of creativity from that of innovation and discusses how to create a corporate culture that really fosters creativity
  • The Virtues of Virtual
    Abbe Mowshowitz talks about virtual organization as a way of managing activities and describes the rise of virtual feudalism.
  • A model of democracy
    When can you have freedom, equality, moral reciprocity and a paycheck? Brook Manville on the surprising blueprint for organizational management.
  • Information access on the wide open web
    RLG's James Michalko discusses the issues surrounding the access and retrieval of scholarly information in today's environment of choice.
  • Do you know what's in your project portfolio?
    Cathleen Benko and Warren McFarlan, authors of "Connecting the Dots: Aligning Projects with Objectives in Unpredictable Times" discuss the dangers of ignoring your IT portfolio.
  • Putting it all together with Robert Kahn
    A co-founder of the Internet recalls the non-commercial early days and looks at today's issues of fair use, privacy, and the need for security.
  • Talking with John Stuckey
    A conversation with the Director of University Computing at Washington and Lee University
  • Robert Aiken on the future of learning
    In the hands of skilled teachers, technology will provide students with the best possible education -- face-to-face and at a distance, collaborative and individualized, entertaining and instructional.
  • Inside PARC
    Johan de Kleer talks about knowledge tracking, smart matter and other new developments in AI.
  • The new computing
    Ben Shneiderman on how designers can help people succeed.
  • Mastering leadership
    Richard Strozzi-Heckler on moving to the next level.
  • Sold!
    Ajit Kambil on the inevitable, strategic use of electronic markets and auctions.
  • Quantum leaps in computing
    John P. Hayes on the next killer app, entangled states, and the end of Moore's Law.
  • A conversation with Ruby Lee
    Innovative computer scientist Ruby Lee talks about secure information processing, efficient permutations, fair use in the digital age, and more.
  • Computer science meets economics
    Yale's Joan Feigenbaum talks about the possibilities for interdisciplinary research, the new field of algorithmic mechanism design, and her radical views on security.
  • Talking with Erol Gelenbe
    An international perspective on ubiquitous computing and university education.
  • Bringing resources to innovation
    A ten-year study follows the venture capital business from relative obscurity to boom to retrenchment
  • Complexity in the Interface Age: An Interview with Jeremy J. Shapiro
    Do you control technology or does it control you? Jeremy J. Shapiro talks about the power struggle in machine/human relationships and what it means today to be information-technology literate. Shapiro is a faculty member in the Human and Organization Development Program at the Fielding Graduate Institute.
  • What's in a name? Ask Yahoo!
    A company's brand is one of its most valuable assets, one that few high tech companies -- most recently HP and Compaq -- understand how to leverage, according to Sam Hill. Hill is co-author (with Chris Lederer) of the new book, The Infinite Asset: Managing Brands to Build New Value. He is the former chief marketing officer at Booz Allen & Hamilton, currently a partner at Helios Consulting Group, and also co-author of Radical Marketing, now in its fourth printing.
  • Think globally, act strategically
    John Parkinson relays the challenges facing a global financial services firm, including anticipating technologies, winning the war for talent, and finding innovative ways to maintain a corporate presence in a worldwide market.
  • Richard Leifer on Radical Innovation
    A group of six faculty members of Rensselaer Polytechnic Institute's Lally School of Management and Technology began work on something they called the Radical Innovation Research Project in 1994, focused on finding out how game-changing innovation occurs in established, mature organizations. The results of the project are reported in a new book from Harvard Business School Press entitled "Radical Innovation: How Mature Companies Can Outsmart Upstarts" and written by the six faculty members: Richard Leifer, Christopher M. McDermott, Gina Colarelli O'Connor, Lois S. Peters, Mark P. Rice, and Robert W. Veryzer. To find out more about the project, we talked with Richard Leifer, who has been at RPI since 1983 and whose academic specialties include organizational behavior, high-performance management, and leadership.