Ubiquity

Volume 2023, Number March (2023), Pages 1-6

Ubiquity Symposium: Digital Economy: The Evolution of the Digital Economy
Irving Wladawsky-Berger
DOI: 10.1145/3587258

We are in the early stages of a historical transition from the industrial economy of the past two centuries to a new kind of 21st-century digital economy. This transition is being driven by dramatic advances in digital technologies and the internet over the past two decades, as well as by more recent advances in big data and artificial intelligence. This paper discusses some of the key technologies driving this long-term transformation, including artificial intelligence, blockchain, cloud computing, and next-generation user interfaces, as well as their potential impact on the economy, society, and our personal lives.

The digital age was born in the early 1990s with the opening of the internet to commercial users and internet service providers, the development and public release of the Netscape browser, and the growing number of dot-coms and new business models—some of which turned out to be very innovative, and some rather silly. The advent of the internet marked the beginning of the historical transition from the industrial economy of the previous two centuries to a new kind of "digital economy."

Anyone with a personal computer, a browser, and an internet connection could now communicate via email with family, friends, colleagues, customers, and business partners anywhere in the world. They could access huge amounts of information on the fast-growing World Wide Web. And they could conduct many of their everyday transactions, from shopping to banking, online.

The digital economy continued to advance over the past two decades, driven forward by powerful technology and market forces. And then came COVID-19. The rapid adoption of digital behaviors in response to the pandemic, from videoconferencing to online shopping, pushed digitalization to levels that had not been expected for many more years. Just about everyone predicted an increasingly digital post-pandemic new normal. "The COVID-19 crisis seemingly provides a sudden glimpse into a future world, one in which digital has become central to every interaction," said McKinsey in an April 2020 article.

I was fortunate to have seen firsthand the rise of the digital economy thanks to my position as general manager of IBM's newly formed Internet Division in 1995. Ever since, I've been closely following the evolution of internet technologies, wondering what technology trends are likely to have the greatest impact on the digital economy.

So, based on a number of research papers and industry reports that I've written about in my weekly blog, let me offer my opinion of the key technology trends that are likely to have a major impact on the digital economy in the next 10 years or so. Four major technologies stand out: artificial intelligence, blockchain, cloud computing, and next generation user interfaces. Let me briefly discuss each of these trends.

Artificial Intelligence

After decades of promise and hype, the necessary ingredients have finally come together to propel AI from early adopters to the general marketplace: powerful inexpensive computers, advanced algorithms, and huge amounts of data on almost any subject. And, as shown in Stanford University's 2021 AI Index Report, AI advances are continuing to accelerate.

These advances are expected to have profound economic implications. Two 2018 reports, one by PwC and the other by McKinsey, concluded that, over the next few decades, AI will be the biggest commercial opportunity for companies and nations. Using different methodologies, they each estimated that AI will increase global GDP by roughly 15% by 2030, adding around $12 trillion to the global economy. "Throughout the pandemic, we've seen organizations across sectors adopting and scaling AI and analytics much more rapidly than they previously thought possible," noted a McKinsey survey on The State of AI in 2020.
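
As a rough sanity check, those two figures are consistent with each other: taking global GDP of about $80 trillion as an illustrative baseline (my assumption, not a number from either report),

    0.15 × $80 trillion ≈ $12 trillion.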

"Just as earlier general-purpose technologies like the steam engine and electricity catalyzed a restructuring of the economy, our own economy is increasingly transformed by AI," wrote Stanford professor Erik Brynjolfsson in a recent paper. "A good case can be made that AI is the most general of all general-purpose technologies: after all, if we can solve the puzzle of intelligence, it would help solve many of the other problems in the world. And we are making remarkable progress."

Blockchain Technologies

I regard blockchains and distributed ledger technologies (DLT) as major next steps in the evolution of the internet, based on two key promises: enhancing the security of the internet, and reducing the inefficiencies and overheads of applications involving multiple institutions.

The design decisions that shaped the internet back in the 1980s didn't optimize for security and privacy, or for the ability to authenticate transactions between two or more parties. All of that is left to the applications running on top of the TCP/IP layer, and each generally does it its own way, sometimes not at all. Not surprisingly, the lack of standards for security, privacy, and transactional integrity has been among the biggest challenges facing the internet in the digital economy.

Blockchain promises to help us enhance the security and integrity of the internet by developing a middleware fabric on top of the TCP/IP layer that provides the required security, privacy, and transaction services, and by offering open-source software implementations of these standard services that all blockchain platforms and applications would support.

The second major promise of blockchain technologies is to help lower the costs and improve the efficiency of applications that involve multiple institutions—e.g., global supply chains, financial services, health care—by providing an immutable, non-revocable record of all the transactions among these institutions, which, for example, can help in the timely resolution of errors and disputes. Moreover, blockchain-based smart contracts can bypass traditional intermediaries and automate many of the simpler interactions between these institutions, thus reducing transaction costs.
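
To make the shared-record idea concrete, here is a minimal sketch in Python (my own illustration; the Ledger class and its methods are hypothetical, and real DLT platforms add consensus protocols, digital signatures, and smart-contract runtimes on top) of how hash-chaining makes a multi-institution record tamper-evident:

    import hashlib
    import json
    import time

    def hash_block(block):
        # Deterministic SHA-256 digest of a block's JSON representation.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    class Ledger:
        """A toy append-only, hash-chained record of transactions."""

        def __init__(self):
            # The genesis block anchors the chain.
            self.chain = [{"index": 0, "timestamp": time.time(),
                           "transactions": [], "prev_hash": "0" * 64}]

        def append(self, transactions):
            # Each new block commits to the hash of the previous block.
            self.chain.append({"index": len(self.chain),
                               "timestamp": time.time(),
                               "transactions": transactions,
                               "prev_hash": hash_block(self.chain[-1])})

        def verify(self):
            # Recompute every link; a change to any earlier block
            # breaks all later links.
            return all(block["prev_hash"] == hash_block(prev)
                       for prev, block in zip(self.chain, self.chain[1:]))

    ledger = Ledger()
    ledger.append([{"from": "supplier", "to": "manufacturer", "item": "parts"}])
    ledger.append([{"from": "manufacturer", "to": "retailer", "item": "goods"}])
    assert ledger.verify()

    ledger.chain[1]["transactions"][0]["item"] = "altered"  # tamper with history
    assert not ledger.verify()                              # the change is exposed

Because each block embeds the hash of its predecessor, altering any past transaction invalidates every subsequent link, which is what makes the record effectively immutable and non-revocable for all participating institutions.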

Cloud Computing

The impact of the global pandemic reinforced the tremendous value and necessity of cloud computing to the world's economy. Without the cloud, businesses could not have sent millions of workers home, maintained global supply chains, or shifted entire industry business models in a matter of weeks. For example, Moderna's CEO has said that once the company received the genetic sequence of the COVID-19 virus in early 2020, the cloud enabled it to significantly accelerate the development of its mRNA vaccine, as well as the planning of the manufacturing and supply chain capabilities needed for massive distribution.

IT analysts predict that most companies will accelerate their shift to cloud-centric infrastructures, applications, and data services over the next few years. Beyond rapid scaling and enhanced resilience, cloud service providers enable companies to adopt agile product development practices, such as deploying emerging technologies and applications early to get feedback from their customers. In addition, cloud service providers can offer access to leading-edge capabilities beyond what most companies could afford to develop on their own in rapidly advancing areas, including AI, blockchain, cybersecurity, IoT, and augmented and virtual reality.

Next Generation User Interfaces

For years, companies found all kinds of reasons not to embrace work from home, virtual meetings, online learning, telemedicine, and other online applications. But these digital applications have not only worked remarkably well over the past two years; they also offer a number of important benefits, like not having to travel for hours to attend a 45-minute meeting, or wait in a room full of sick people for a simple medical consultation. In the coming years, we can expect major innovations to improve the user experience of such online applications.

PCs brought us graphical user interfaces (GUIs) in the 1980s, and browsers played a major role in enhancing our Web interactions. But given how much time we now spend online, it's time for the next major advances in user interfaces and the overall user experience. The video game industry offers the most advanced user interfaces and platforms, but beyond video games, such platforms and interfaces have had very limited success.

Whether we call them 3D virtual interfaces, immersive platforms, or the metaverse, this is likely to be one of the most exciting areas of innovation in the coming decade, promising to bring highly visual, interactive, appealing interfaces to all sorts of applications in business, research, education, and healthcare.

Author

Dr. Irving Wladawsky-Berger is a research affiliate at MIT's Sloan School of Management. He is a fellow of the Initiative on the Digital Economy, of Cybersecurity at MIT Sloan, of MIT Connection Science, and of the Stanford Digital Economy Lab. He retired from IBM in May of 2007 after a 37-year career with the company, where his primary focus was on innovation and technical strategy. He led a number of IBM's company-wide initiatives, including the internet, supercomputing, and Linux. He has been an adviser on digital strategy and innovation at Citigroup, HBO, and MasterCard; an adjunct professor at the Imperial College Business School; and a guest columnist at the Wall Street Journal's CIO Journal. Since 2005 he has been writing a weekly blog, irvingwb.com. Dr. Wladawsky-Berger is a member of the Committee on Science, Engineering and Public Policy of the American Association for the Advancement of Science, and of the Linux Foundation Research Advisory Board. He was co-chair of the President's Information Technology Advisory Committee, and a founding member of the Computer Sciences and Telecommunications Board of the National Research Council. He is a fellow of the American Academy of Arts and Sciences. A native of Cuba, he was named the 2001 Hispanic Engineer of the Year. Dr. Wladawsky-Berger received an M.S. and a Ph.D. in physics from the University of Chicago.

2023 Copyright held by the Owner/Author.

