Putting Net ownership in the hands (and homes) of the people -- a future scenario
At long last, there's lots of work going on in the area of microcellular packet networks. This is about one of the possible outcomes, and it's based on the old saying "Possession is nine points of the law." Here the point is that we could possess the future Internet, and if we do so, we can also be in possession of our data, and we can have more freedom, not less. My vote says that would be a good thing.
We could own the whole Internet, operating as a kind of mixed barter economy. Everyone would buy their own pieces of the network -- basically a computer with a local radio hub that would receive services in exchange for services provided. Take a simple example of a movie you wanted to watch. Your home machine would broadcast a routing request that would spread through the network to open sufficient virtual channels to a source of the movie, and neighboring machines would relay the packets you had requested -- and in exchange, your machine would provide similar services for them. If you were constantly making more demands on the network than your machine was able to reciprocate, your service requests would start getting rejected, and you'd have to make some choices. You might watch fewer movies, or upgrade your equipment, or even make up the difference in cash, though the goal would be for the network to operate mostly on a service-swapping basis.
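The reciprocity rule described above can be sketched in a few lines. This is a minimal, hypothetical model (the class and its names are mine, not part of any real protocol): each node keeps a running tally of relay service provided versus consumed, and starts rejecting new requests once its deficit exceeds a credit limit.

```python
class BarterNode:
    """Hypothetical sketch of the service-swapping rule: a node earns
    barter credit by relaying for neighbors and spends it by making
    its own requests."""

    def __init__(self, credit_limit=10):
        self.provided = 0            # units of relay service given to neighbors
        self.consumed = 0            # units of relay service received
        self.credit_limit = credit_limit

    def balance(self):
        return self.provided - self.consumed

    def relay_for_neighbor(self, units):
        """Carry a neighbor's packets, earning credit."""
        self.provided += units

    def request_service(self, units):
        """Ask the network to carry our packets; refused if it would
        push us too far into debt."""
        if self.balance() - units < -self.credit_limit:
            return False             # requests start getting rejected
        self.consumed += units
        return True
```

Under this rule, a node that watches more movies than it relays eventually hits the limit and must relay more, upgrade, or (in the cash-supplemented variant) pay the difference.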
A system like this would really be driven by the input of creative content and services that people were willing to pay for, but intermediated by all the machines in the middle. Using a movie example again, your machine might help set up a virtual circuit, but at the same time notice that the nearest cached copy was many hops away. If your machine had free storage, it could put in a bid for local redistribution rights, and if the bid was approved, it would save a copy of the movie locally -- and a later request from your neighborhood would be handled locally, with your machine receiving a cut of the royalties. An intelligent network system would do most of this automatically and transparently, though there should be room for you, the owner, to intervene. When I was shopping for my piece of the Internet, I'd be looking for one with intelligent guidance dialogs for maximizing its own profitable participation in the network -- but in a free market with everyone else doing the same thing, Adam Smith's old invisible hand would keep things working smoothly. In fact, this particular movie-related loop might be closed if the creator of the movie was actually producing his movies as distributed-computer-generated animations, buying the cycles from the Net with royalties received from the viewers of his earlier work.
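The cache-bid decision above reduces to a simple heuristic. The sketch below is purely illustrative (the function, its parameters, and the thresholds are all my own assumptions, not a specification): bid only when the nearest copy is far away, the node has room to store it, and the expected royalty income is positive.

```python
def should_bid_for_cache(hops_to_nearest_copy, free_storage_mb, movie_size_mb,
                         expected_local_requests, royalty_per_request,
                         min_hops=5):
    """Hypothetical heuristic: decide whether to bid for local
    redistribution rights on a movie."""
    if hops_to_nearest_copy < min_hops:
        return False                 # a nearby copy already serves the area
    if free_storage_mb < movie_size_mb:
        return False                 # no room to hold the local copy
    # Bid only if we expect some royalty income from neighborhood requests.
    return expected_local_requests * royalty_per_request > 0
```

A real version would weigh storage opportunity cost and competing bids, but even this toy form shows how the decision could run automatically and transparently on each node.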
Another major feature I'd be looking for would be privacy protection -- I'd very much want my personal information to be secured properly in my possession in my own machine. Sure, the neighboring machines could ask nosy personal questions, but -- privacy-loving guy that I am -- my machine would be set not to answer. Other folks might want to trade their own personal information or agree to receive ads in exchange for more services, but that would be their free choice, too.
The manufacturers would have a clearly defined mission in such a system. By providing base stations with more capabilities, they would compete on delivering the best performance for the price, though the tradeoffs and capabilities would keep it a complex market. However, since the demand for services is supposed to be unlimited, the manufacturers ought to be pretty happy with that open-ended situation. As long as no manufacturer got too strong, it would be in the best interests of all of them to comply with the standards -- which is of course where the ACM fits in. The standards would be the basis of the exchanges, and consumers would steer clear of proprietary wrinkles that interfered with the barter basis of the network. The government's role would be to referee and to deal with cheaters, though proper security standards should preempt most of that.
There are two major technical factors that will make this system possible. Perhaps most important is that such a system is completely scalable with respect to local bandwidth. By reducing the transmission power, the size of the cells would decrease, but every local node would remain fully connected. Consider a tessellation of hexagons, and then shrink the hexagons -- every cell still has exactly six neighbors. In fact, as density increases, such a network would tend to become more regular and efficient. The other factor is that the required capabilities are increasingly within the reach of the hardware. Yes, the routing and caching protocols to do this efficiently are complicated, but I already own more cycles than a supercomputer of a few years back. Local storage density is also exploding.
One developmental path is as an evolutionary extension from home networks based on systems like Bluetooth and Jini. It would be natural for these home networks to start talking to each other, for example to provide neighborhood security and backup services, and this could easily evolve to large-scale cooperation.
An opposing vision is that we just go the way we've been going, with bigger hubs and bigger servers run by companies controlling more and more of our lives. This is the vision of a few big companies just naturally getting bigger and bigger, with large chunks of their business consisting of buying and selling what they've managed to learn about us -- and too bad if we don't like it. Our data will be in their possession -- nine points in their favor.
Once the future has arrived, it usually looks obvious and inevitable, but predicting it is intrinsically very speculative. I hope these speculations have at least managed to amuse. And we should be optimistic, because I believe the historical trend is clearly towards higher degrees of freedom -- and this future would be based on important choices freely made.
Shannon Jacobs is a technical editor for a subsidiary of IBM, with concentration on software-related research reports.