Volume 2011, Number July (2011), Pages 1-9
Untrustworthy information is an increasing threat to decision making in information environments. Rick Hayes-Roth has been studying how to detect and filter out untrustworthy information and base decisions on well-grounded claims that can improve outcomes. In last week's installment of this two-part interview, we focused on the problem and the principles that help ameliorate it. In this installment, we focus on the means to implement the principles in our information environments.
Peter J. Denning
Ubiquity: In many of your circles you have become known as "Mr. VIRT," for valued information at the right time. What is VIRT about? Why is it important in the Internet?
Rick Hayes-Roth: VIRT asks the question, "If people are overloaded, what information would they value and how can the low-value information be automatically filtered out?" Leaving aside for the moment the "pleasure" value of information, let's focus on the "pragmatic" value of information. Information has pragmatic value when it improves expected outcomes by enabling some adaptive advantage. For example, if you know in advance that a company will shortly report profits down, you may be motivated to sell in advance and increase your gains relative to those who sell after the news becomes public.
We spend a large part of our lives on "automatic," following plans in pursuit of certain outcomes. Information that causes us to change course is valuable because it enables us to find an alternative way to achieve the outcome, rather than being stopped. For example, we drive to work on the same road every day, but occasionally we take a different route when we hear from the traffic report that the road is blocked by an accident. Our minds filter out most of what the radio says except for the traffic report. VIRT describes a general service whereby everyone can delegate to intelligent sensors the job of monitoring for conditions that would cause them to change their plans. This can be a very rewarding way to use the Internet.
By filtering out information, VIRT can also lead to unwanted outcomes. Someone who likes a particular kind of pleasure-producing video might ask his VIRT service to filter out all other content, leading that person to become addicted to the video screen. If 24-hour news could be tailored exclusively to individual tastes, I would expect many susceptible people to be glued to their devices.
Ubiquity: Can VIRT networks help with the information glut problem?
Hayes-Roth: Yes, a credible approach to helping people cope with information glut depends in a fundamental way on VIRT networks. Let's separate human information-seeking behaviors into three categories: (1) goal-oriented pragmatic; (2) knowledge-oriented browsing; and (3) pleasure-seeking entertainment. When we seek the best route or lowest fare to visit our family, we are exhibiting the goal-oriented pragmatic behavior. When we examine various websites for points of interest and historical facts about the place we'll be visiting, we are exhibiting the second type; in this case we aren't sure what we're looking for, and we're open to useful tidbits. When we are looking for a movie with good car chases to occupy the children in the back seat, we are exhibiting the third type.
In all three cases, glut refers to the volume of information overwhelming the capacity of our brains to process it. Our brains can actually be slowed down by our emotional reactions to the oversupply.
VIRT is tailor-made to help people transduce an intractable amount of data into a small, manageable set. It does this relative to user-specific profile and plan information. VIRT networks look for information that differs significantly from the user's beliefs and is most likely to materially affect the user's future outcomes. In the ranges of tasks I've studied, where users adapt plans continually to changing situations, VIRT can reduce the volume of information reaching the user by a factor of 10,000 or more. This significantly reduces the user's burden of constantly monitoring, evaluating, and considering actions.
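The core idea of VIRT filtering can be illustrated with a minimal sketch: users delegate "conditions that would change my plan" to the network, and everything else is dropped. The names, the event format, and the traffic-report scenario below are hypothetical illustrations, not part of any actual VIRT implementation.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Subscription:
    """A user-registered condition of interest: pass only events
    that could cause the user to change the current plan."""
    name: str
    matters: Callable[[dict], bool]  # predicate over an incoming event

def virt_filter(events, subscriptions):
    """Forward only events that at least one subscription deems plan-relevant."""
    return [e for e in events if any(s.matters(e) for s in subscriptions)]

# A commuter's plan: take Route 9 to work. Only a blockage on that
# route should interrupt them; everything else is filtered out.
commute = Subscription(
    name="route-9-blocked",
    matters=lambda e: e.get("type") == "traffic"
                      and e.get("road") == "Route 9"
                      and e.get("status") == "blocked",
)

feed = [
    {"type": "traffic", "road": "Route 9", "status": "clear"},
    {"type": "celebrity", "story": "new haircut"},
    {"type": "traffic", "road": "Route 9", "status": "blocked"},
]
print(virt_filter(feed, [commute]))  # only the single plan-relevant event survives
```

The filtering ratio here is 3:1 only because the feed is tiny; the point of the sketch is that the user's plan, not the information producer, determines what gets through.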
This description of VIRT applies to the first of the three categories, where there is a plan and an interest in knowing when to deviate from it. In the other two categories, there is no plan. However, the user has preferences and interests, which can often be deduced by simple heuristics applied to observations of the user's past choices. A VIRT system based on preferences can present information in the order most likely to appeal to the user. Netflix does something like this with its suggestions about movies you can rent next, based on your evaluations of movies you've already seen. It is not hard to envision a future version of Netflix with VIRT-like qualities that understands that I'm about to board a four-hour flight, I'm carrying an iPad, and I've watched two "Dirty Harry" classics on the last flights, and can suggest, and even preload, movies for the current flight. The current interest in advertising and marketing that tailors information, product suggestions, and coupons around user preferences is very VIRT-like.
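One of the "simple heuristics" mentioned above can be sketched as tag-overlap scoring: rank candidate items by how much their tags overlap the tags of items the user chose before. The titles, tags, and scoring rule are hypothetical; real recommenders use far richer models.

```python
from collections import Counter

def preference_rank(catalog, history):
    """Rank catalog items by overlap between their tags and the tags
    of items the user has already chosen -- a deliberately simple heuristic."""
    taste = Counter(tag for item in history for tag in item["tags"])
    def score(item):
        return sum(taste[t] for t in item["tags"])
    return sorted(catalog, key=score, reverse=True)

history = [{"title": "Dirty Harry", "tags": {"crime", "classic"}},
           {"title": "Magnum Force", "tags": {"crime", "action"}}]
catalog = [{"title": "Rom-com", "tags": {"romance"}},
           {"title": "Bullitt", "tags": {"crime", "action", "classic"}}]
ranked = preference_rank(catalog, history)
print([m["title"] for m in ranked])  # crime classics come first
```

As the interview notes, such ranking serves browsing and entertainment; unlike plan-based VIRT filtering, it reinforces existing tastes rather than flagging surprises.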
Ubiquity: VIRT is a technology that manages the information reaching you. This could obviously affect your ability to ground your claims. What's the relation between VIRT and truthfulness?
Hayes-Roth: If you're a person trying to execute a mission, VIRT gets the most significant information to you. It enables you to ground your decisions to shift the plan on solid evidence. VIRT supports truthfulness and adaptive behavior in this case.
The military is very interested in this technology because it can save lives. Soldiers on the ground or in the air thank the VIRT network for interrupting them and helping them avoid hazards and protect their buddies.
In the other cases, where VIRT-like technologies support preferences and reinforce desires, the user might not see information of potential value because it did not match prior preferences. In this case the technology supports the status quo and does not alert users to possible surprises.
My conclusion is that we have to be careful about technologies for supporting information flow, as they can be used for good or for ill.
Ubiquity: The Internet is touted as a way to level the truth playing field. You can go check information you've received to see what is out there about the topic. Is that what is happening?
Hayes-Roth: The Internet is a big space. Early optimism that the Internet would level the field by enabling more connections is not holding up. People tend to associate with others who have similar beliefs and preferences, and avoid people they find disagreeable. I think this tendency is dangerous because it does not help people learn to respect opposing views and inquire into whether claimed truths are actually true.
If we take a quick survey, we can see various distinct organic forms in the Internet. Social media and social networks dominate the activity. Most of that reflects cliques of compatible people sharing information they find mutually interesting. Very few of these cliques are scientists, journalists, or fact-checking organizations. Many of them are social, political, or entertainment-centered groups, who affiliate around activities and interests that have little dependency on facts. Questions such as whether Beyoncé is cooler than JLo, whether Justin Bieber got a new haircut, whether Barack Obama is a Muslim, or whether Sarah Palin is a quitter generate a lot of enthusiasm among partisans. Publicity consumes a lot of the Internet, and nobody seems to care whether it's fact-based.
Another organic form in the Internet is the various groups involved in science. They are generally skeptics who want to see evidence that supports claims. While they are not perfect at this, a high percentage of their claims are well grounded. They are joined by educators, journalists, publishers, and other professional skeptics.
Wikipedia and like services are another organic form. Various studies of Wikipedia indicate that topics of great interest bring many informed people into the editing process, so that articles become more factual over time. However, it's also clear that people and companies who find descriptions of themselves unappealing routinely delete that material, even when it's factual. This is a major contributor to instability and incompleteness in Wikipedia.
Although the Internet supports collaboration and information sharing, it also hosts forces that are not so friendly to truth. Broadcasting, gaming, and pornography are sometimes praised for generating the funds to pay for wide-scale adoption of the Internet and excellent broadband infrastructure, which helps everyone. Yet the percentage of the world's resources devoted to finding, vetting, and assuring truth is trivial in comparison to the amount spent on entertaining the populace. I am troubled by this. I see it as a threat to our civilization.
Ubiquity: Rene Descartes, circa 1630, dreamt of replacing emotion with rational logic in political discourse. He had grown up in the awful times of the Thirty Years' War. He started a new branch of philosophy, which became very popular and influential. We call it Cartesian philosophy, in which the rational mind and emotional body were separated. That philosophy obviously didn't stop people from making emotional decisions and claims. What is different about your approach?
Hayes-Roth: Well, I'm not opposed to emotions and greed; they are facts of life. They also help each of us predict and understand those around us. But progress mostly depends on people becoming better informed over time and making better decisions. As the world population continues to grow and non-sustainable resources grow scarce, it seems likely that human happiness will increasingly depend on humanity making better, smarter decisions. Descartes probably wanted that too. I have an approach that avoids having to make distinctions between rationality and emotions.
Ubiquity: How can that be done, and what would it look like?
Hayes-Roth: I envision a service where people can register claims and receive seals guaranteeing their veracity. I call this service "TruthSeal." The TruthSeal database can help other services filter out untruthful claims. The peer review process for scientific papers does this for individual papers; it certifies that author claims are well supported by facts and logic. The challenge is to extend this idea to other fields and make the results available throughout the Internet.
I have cofounded TruthSeal.org to do this. TruthSeal is an experiment that will demand many people, much time and effort, and financial resources. We are aiming to create market incentives to support the work. If people or organizations want to be trusted as truth tellers, TruthSeal can help them. They would pay the cost of having their grounded claims vetted and sealed. The vetting organization would have to be completely neutral and fastidious about avoiding conflicts of interest between evaluators and claimers. Who can be trusted to do that work? Only a non-governmental, non-profit, unbiased third party.
TruthSeal also builds in a check-and-balance system so that those who don't believe the seal is warranted can challenge sealed claims. A successful challenge results in a penalty against the claimer, revocation of the seal, and a bounty to the challenger. The bounty creates an incentive for informed people to challenge falsehoods with credible evidence.
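The check-and-balance mechanism just described can be sketched as a small state machine: a sealed claim carries a stake, and an upheld challenge revokes the seal and pays that stake out as the bounty. The class names, the deposit mechanism, and the adjudication interface are my hypothetical illustration; the interview notes that TruthSeal's actual details are still being worked out.

```python
from dataclasses import dataclass

@dataclass
class SealedClaim:
    """A claim that has been vetted and carries a TruthSeal-style seal."""
    text: str
    claimer: str
    deposit: float       # stake posted by the claimer when the seal is issued
    sealed: bool = True

def adjudicate_challenge(claim, challenger, upheld):
    """Resolve a challenge against a sealed claim.

    If the challenge is upheld: the seal is revoked and the claimer's
    deposit becomes the challenger's bounty. Otherwise nothing changes.
    """
    if upheld:
        claim.sealed = False
        return {"seal_revoked": True, "bounty_to": challenger, "bounty": claim.deposit}
    return {"seal_revoked": False, "bounty_to": None, "bounty": 0.0}

claim = SealedClaim(text="Our product cuts costs 20%", claimer="Acme", deposit=100.0)
result = adjudicate_challenge(claim, challenger="an informed skeptic", upheld=True)
print(result["bounty"], claim.sealed)  # the stake pays the bounty; the seal is gone
```

The design point is the incentive structure: the deposit makes false sealing costly to the claimer, and the bounty makes debunking profitable for anyone with credible evidence.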
Many details are yet to be worked out. We believe there is a market for validated claims bearing a licensed TruthSeal mark of truth. We would love to see political campaigns make good use of TruthSeal for their messages about other candidates.
So, with TruthSeal.org, I am not trying to eliminate emotions and truthiness, and I am not trying to outlaw lying and deception. Instead, I'm trying to amplify the volume and salience of truth, using market mechanics, in the hope that this will provide a foundation for improved information services and decision making.
Ubiquity: How can you make it worthwhile for people, at least in large scale political and corporate arenas, to be more rigorous about truth?
Hayes-Roth: We are trying to make it possible for information consumers to make choices among competing alternatives based on the credibility of the information. Consumers already use the Internet to find customer ratings of sellers and their products. Customer ratings are not entirely credible or trustworthy, because they can be manipulated and are not vetted for hidden agendas. Negative feedback about any seller can be extremely harmful in such an environment, because it can be turned up by a search engine and affect many people's decisions even if it is ungrounded.
Our world today is overloaded with information. In making buying decisions or casting votes in elections, citizens have no easy way to identify grounded claims. We aim to give them an easy way to do that. In this way, we hope to instill some of the essence of practical science into the everyday lives of consumers and citizens.
We believe that the offer for a company to associate itself with truthfulness, certified by a neutral third party, will have business value and will attract customers. We hope that political candidates will also want to associate themselves with truthfulness and will seek seals for their main claims.
Ubiquity: How does TruthSeal.org relate to reputation.com and to snopes.com?
Hayes-Roth: All are interested in truthfulness, but with different purposes, incentives, and governance. Reputation.com is a for-profit company that serves individuals and businesses, seeking out information that harms their reputations and removing it from the infosphere. Truthfulness per se is not a principal focus. Many companies use public relations efforts to burnish their images, and their legal offices to eliminate the sources of negative information. Reputation.com invites companies to outsource these functions to them.
Snopes.com is managed by a husband and wife team who evaluate urban myths and other hot memes in the Internet. They choose the topics based on popularity, interest, and tractability. They act as secondary investigators, looking for primary sources that confirm or refute the associated claim. They win by getting lots of traffic to their website and selling advertising and licensing their content. They are like a publishing house for the popular material, specializing in limited efforts to ground popular claims.
TruthSeal.org is a non-profit organization that serves the public interest through an open, independent, trustworthy marketplace where sponsors can guarantee vetted claims and challengers can win bounties by falsifying erroneous claims. TruthSeal supports rapid filtering of misinformation, potentially reducing much glut by eliminating untrue claims. It gives high-integrity companies, politicians, and agencies a competitive advantage through an amplification of their truthful claims. It should help consumers and citizens seek truth and reward truth tellers. TruthSeal conforms to the high standards required for IRS 501(c)(3) charitable organizations and the additional standards imposed on non-profits chartered in California. Its officers, directors, and employees work on the public-service mission, are free of conflicts of interest, and use clear procedures for licensing TruthSeal marks and adjudicating challenges. TruthSeal has no stake in any dispute between guarantors and challengers.
Ubiquity: Is truthfulness good business?
Hayes-Roth: Absolutely. Professor Mike Jensen of Harvard Business School has shown that business groups incur significant costs when they don't maintain high "integrity," that is, when they don't "honor their word." To honor your word you need to be honest about your intentions, base your actions on grounded claims, and own up to your mistakes.
Honesty within a group rests on your personal practice of honesty. There are numerous messages for young people that seem to punish honesty and reward lying. It costs less to claim a result than to produce it, and it is easier to fabricate a situation than to bring it to reality. There may be some short-term payoffs in truthiness and lying, but the long-term effects are simply not worth it. Just as college professors use software to detect plagiarists, employers are turning to software and services that search for disqualifying information about you. Dishonest statements and other embarrassing events are likely to be captured and saved in the infosphere; evading them is getting harder and harder. We are entering a world in which it will be easy to know who lies and who tells the truth, who can't be counted on and who keeps their word. Your history affects who will hire you, who will trust you, and who will buy from you. If you are a provider, customers will shun you if they think you are a "rip-off" artist. If you are a political candidate, voters will know whether you lied or misrepresented the truth last time around.
Ubiquity: How would a practice of grounding claims help everyone, especially the readers of this journal?
Hayes-Roth: Claude Shannon showed us that information is what enables us to reduce our uncertainty. If we are confident we know something or are confident we don't care about something, data or bits in those categories afford us no information. Information improves our odds of understanding something correctly or identifying our situational state more accurately. It reduces our probability of erroneous judgments. The "value of information" reflects what difference such improved accuracy would make for us.
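The Shannon point above can be made concrete with a few lines of arithmetic: information is measured as the reduction in entropy (uncertainty) between what we believed before and after receiving it. The profits-report numbers below are hypothetical, chosen to echo the earlier example of selling before news becomes public.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before the profits report, we judge "profits up" vs. "profits down"
# to be a coin flip. After an advance leak, we are nearly certain
# profits are down.
prior = [0.5, 0.5]       # one full bit of uncertainty
posterior = [0.05, 0.95]

bits_gained = entropy(prior) - entropy(posterior)
print(f"uncertainty removed: {bits_gained:.2f} bits")  # prints: uncertainty removed: 0.71 bits
```

Data that leaves the prior unchanged removes zero bits, matching the interview's observation that bits about things we already know, or don't care about, afford us no information; the "value of information" then depends on what decisions that removed uncertainty improves.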
So, when information comes to us in the form of claims that appear to be of high value, it is also of value to know whether the claims are grounded. I would expect most of us would be delighted if all falsehoods, lies, and misrepresentations were automatically filtered out unless specifically requested. Going further, when we are presented with information that challenges one of our beliefs, we want a quick answer to our first question, "What is the evidence for that claim?" Why can't we have a "truth hyperlink" that would allow us to click on a claim and immediately learn the source, the supporting evidence, and what guarantees exist about the evidence? TruthSeal marks do that: they make just that metadata visible when a reader mouses over the marked text.
In the end, people want information for diverse reasons, but they value information that improves their outcomes. Who wants to waste time and money on false claims? Let's make it easy for people to learn the pedigree of claims as a routine property of information display.
Ubiquity: Could you summarize your advice for our readers? How can they make their lives and workplaces more truth-friendly?
Hayes-Roth: First, learn the difference between truthfulness and truthiness of claims. Truthful claims are based solidly on facts and are open to being falsified; truthy claims are based on opinions, feelings, and emotions, and often present no obvious way to falsify them.
Second, practice sincerity. You are sincere if your expressed intentions are the same as the intentions you acknowledge privately to yourself.
Third, be a discerning listener. Can you tell the difference between a truthful claim and a truthy claim? Do you know when to trust a claim and when to ask for the evidence supporting it?
Honesty is always the best policy.
Ubiquity: Thank you.
Hayes-Roth: You're welcome.
©2011 ACM $10.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2011 ACM, Inc.