
Ubiquity, Volume 2022 Issue March, March 2022 | BY Phil Yaffe 



Volume 2022, Number March (2022), Pages 1-24

Ubiquity Symposium: Workings of science: Character traits of science
Phil Yaffe
DOI: 10.1145/3512333

People who properly understand and appreciate science seem also to have an unlimited capacity to understand and appreciate most other things in life, such as art, music, philosophy, poetry, sports, etc. By contrast, many people who don't properly understand and appreciate science seem to truly hate it, even to the point of saying that science is "dehumanizing" and therefore they want nothing to do with it. This essay proposes a possible means of overcoming this unfortunate (and dangerous) misconception by positioning science as if it were an actual human being and then defining its many admirable qualities. It further suggests how the concept of science as a human being might be introduced into the K–12 educational system, not as a subject of study in itself, but rather as the indispensable, rock-solid foundation on which the teaching of all other subjects in the curriculum would depend.

I had one of the most shocking and saddening experiences of my life when I became a university student. For the first time, I met someone who seemed to be proud of not understanding science.

This experience was so shocking because the idea of taking pride in ignorance seemed to be inherent nonsense. The experience was so saddening because proclaiming ignorance of science is equivalent to proclaiming ignorance of life. Whether we like it or not, we live in a world which, if not dominated by science, is massively influenced by it. Science illiteracy is not just abandonment of a part of life, but of life itself.

Ever since that time, I have been promoting the idea that not only must science be better taught in schools K–12, but that science underlies all other subjects students are exposed to. In other words, the fundamentals of science should suffuse the entire curriculum, no matter what the subject matter.

A few years ago, I developed an idea which could help achieve this vital objective. It is somewhat superficial in the sense that it uses terms such as "real," "knowable," "predictable," "testable," etc. as the lay public would probably understand them. It does not take a deep dive into philosophical lucubrations about what these terms really mean, whether everything is eventually knowable or some things are inherently unknowable, etc.

In 2013 I published an essay on the subject, presenting it as a possible means of suffusing science into every aspect of the K–12 school curriculum. The idea was to instill ideas and attitudes about science that would stay with all students through graduation from high school and beyond. Since it would start with children as young as 5–6 years old, its concepts had to be simple and easily digestible. A more profound look at the nature and application of scientific thinking could come later, perhaps starting from about the age of 10 or 11.

I am not a pedagogue and make no pretension to know how this could be done. I leave this for the professionals. However, I would now like to update my original essay to bring some of these deeper and perhaps less tangible ideas to the fore.

In the following, you will see the original text of the 2013 essay and, where appropriate, discussion of some of the deeper issues that students nearing the end of their K–12 career would probably be better equipped to handle.

Many people believe understanding science requires a special kind of thinking, i.e. that your brain must be wired differently from other people. According to Albert Einstein, who knew a thing or two about it, "The whole of science is nothing more than a refinement of everyday thinking." In other words, scientific thinking is just an extension of the way everyone already thinks. This is excellent news, because it means that people who say they are incapable of understanding science are probably wrong. They do understand science but were never aware of it.

Realizing this is extremely important because most people who think they don't understand science are constantly being called upon to vote on issues about science. To cite just a few examples:

  • Should we be so concerned about global warming as to spend billions and billions of dollars to fight it?
  • Should we ban nuclear energy as being too dangerous and too untrustworthy?
  • Are genetically modified foods potentially so damaging that they should be prohibited even if they may be the best hope for feeding the world's burgeoning populations?
  • Should homeopathy and other alternative medical treatments be recognized as legitimate, and paid for by government and private insurance plans?
  • Which computational model of the spread of COVID-19 should we believe?

What distinguishes scientific thinking from ordinary thinking is only that scientists do it in a more rigorous way. Scientists are practiced in constructing packages of evidence to support their claims. Other people admire that.

Almost everyone has heard of the scientific method and understands it has something to do with setting up hypotheses, then running experiments to test those hypotheses. This is essentially correct, and not difficult to understand. What seems to cause most of the problems is lack of understanding of what I call the "scientific approach," without which the scientific method would be useless.

The scientific approach is a set of fundamental principles to which all scientists should adhere. These fundamental principles are the intellectual raw materials from which the scientific method was fashioned.

There are 16 of these principles or "character traits" of the scientific approach. Don't be put off by this apparently unwieldy number. Many of the traits are closely interrelated and all of them are just common sense. So remembering and using them should present little difficulty.


Science is based on faith.

Don't be shocked by this statement; it has no religious content. It simply means that science relies on assumptions. If you remember your high school geometry, you will recall everything depended on a handful of axioms, i.e. accepted but unproved assumptions on which geometers rely to build their proofs. But geometry is not the only branch of mathematics that depends on unproved assumptions. All mathematics does, as does all science.

The fact that science is based on faith (axioms) seems to be a mystery to people who should know better. A few years ago, a leading international newspaper ran an editorial article by a "thinker" denouncing science for not admitting that, like religion, it is based on faith, which evoked the following response.

Re: Having faith in science

Every scientist worthy of the name knows that science is based on faith. Even mathematics, the purest of the sciences, is based on axioms, i.e. unproved but necessary laws. The difference between science and religion is that scientific faith is constantly being tested and, when found wanting, constantly being adjusted (sometimes acrimoniously) to fit the facts. Non-Euclidean geometries, quantum mechanics, matter-antimatter, and relativity are telling examples.

Paul Davies asserts: "(Science's) claim to be free of faith is manifestly bogus." Hogwash! It is the claim that science makes such a claim that is bogus. Mr. Davies, of all people, should know this.

In religion, faith is unshakable. In science, faith can be shattered by new evidence that casts doubt on the assumptions.


Science is ready to change its faith.

Again don't be shocked by this statement; it has no religious content. It simply means that whenever one of its axioms (assumptions) comes into doubt, science is ready to investigate it and, if found wanting, to throw it out.

For example, one of the axioms you learned in high school geometry was that if you have a straight line and a point located off the line, only one new line can be drawn through that point parallel to the first line. In other words, given a situation like this (/ .), only one line can be drawn through the point to give a second line parallel to the first (/ /).

In the 19th century, some mathematicians seriously began questioning whether this apparently "self-evident truth" was really necessary. So they did some investigating and concluded that it wasn't: perfectly consistent geometries can be constructed in which it fails. These investigations were the source of what became known as "non-Euclidean geometry."

What are the fundamental assumptions of science?

This is not an easy question to answer because different people come up with different lists. Here are four assumptions on which most people seem to agree.

  1. The world is real. In other words, we are not lying in a coma, imagining that things are happening. They really are.
  2. There is much about the real (physical) world that is knowable and comprehensible. If we take the time and trouble to do so, we can progressively come to understand more of the world.
  3. There are laws that govern the real world. The world operates as a system with internal logic.
  4. Laws that govern the real world are discoverable. We can use these laws to understand what has already happened, what is currently happening; and most importantly, what is going to happen. This is called "prediction."

For many people, these four assumptions are enough. However, in this age of space exploration, a fifth assumption is becoming increasingly important.

  5. The laws that govern the real world are essentially the same everywhere in the universe. In other words, the laws of gravity we know on earth are the same as those that operate on other planets around other stars; the chemical reactions we know on earth are the same as those that operate on other planets around other stars, etc.

Therefore, when we finally travel to other planets and other stars, everything we have learned about how things work here on earth will still be valid way out there. However, if we discover things work differently out there, we shouldn't be surprised. After all, it is an assumption.

• Digging Deeper

Some observers fret about the "reality" assumption (no. 1) because different people give different interpretations to the same phenomena. The kind of reality referred to in this assumption is the reality we cannot escape regardless of our individual interpretation. For example, the phenomenon we call a wall is quite real whether we think of it as something physical or a mysterious force field. We simply cannot pass through it.

Some observers take issue with the idea that everything in the world is ultimately knowable. A common example of such an unknown is a black hole. They argue that what is going on inside a black hole is unknowable because all the particles that could bring information out simply cannot escape. Likewise, they argue that the laws that govern what happens inside a black hole are also unknowable and may not be the laws seen elsewhere in the universe.

These conjectures run afoul of a couple of facts.

  • Recent discoveries about black holes show that some particles do escape. Maybe some of that "unknowable" information can escape after all.
  • The laws that govern what goes on in a black hole are not necessarily different. They may be only an extreme case of the laws seen elsewhere in the universe, about which we currently have an imperfect understanding.

Isaac Newton once characterized his amazing body of work as follows: "I do not know what I may appear to the world, but to myself I seem to have been only like a boy playing on the seashore and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me."


Science is often counterintuitive.

It is common sense to believe the sun moves while the earth is stationary; we see this happening every day. But we know it isn't true. It is also common sense to believe that no small lump of uranium could release as much energy as 20,000 tons of dynamite, but we know that it can.
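The uranium claim can be made quantitative with Einstein's E = mc². A minimal sketch (the 20,000-ton figure is the essay's; the energy content of TNT, about 4.184 × 10^12 joules per kiloton, is a standard conversion factor):

```python
# How much mass must be converted to energy to match 20,000 tons of TNT?
# E = m * c^2, so m = E / c^2.

JOULES_PER_KILOTON_TNT = 4.184e12   # standard conversion factor
c = 2.998e8                         # speed of light, meters per second

energy = 20 * JOULES_PER_KILOTON_TNT   # 20 kilotons of TNT, in joules
mass = energy / c**2                   # kilograms of matter converted

print(f"{mass * 1000:.2f} grams")      # roughly 0.93 grams
```

In other words, converting less than a gram of matter releases the energy of 20,000 tons of dynamite, which is exactly why the claim defies everyday intuition.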

Scientists sometimes make discoveries so surprising that they totally transform our perception of the world and fundamentally challenge our most deep-seated ideas about how the world works.

Imagine the shock when Antonie van Leeuwenhoek (1632–1723), a pioneer of microscopy, turned his crude instrument on what he thought was just a bowl of pond water and found it teeming with thousands of tiny, previously invisible plants and animals. The microscope had revealed a whole new plane of life no one had ever suspected existed. His discoveries eventually led to major advances in physiology and Pasteur's germ theory of disease, on which much of modern medicine is based.

Imagine the shock of many philosophers, who had believed for a hundred years that there was an algorithm for determining the truth of any proposition, when in the 1930s Alan Turing, Kurt Gödel, and Emil Post proved that no such algorithm could possibly exist.

In short, the raison d'être of science is to go beyond the obvious to determine what is real in the physical (not the metaphysical) sense, often with surprising, counterintuitive results. Counter-intuitivity seems to be why many people are suspicious of science. While they accept that the earth orbits around the sun, some do not accept there were ever astronauts on the Moon because intuitively they believe it to be impossible.

Common sense (intuitivity) and science should never be put into opposition, for the very good reason that they have little or nothing to do with each other.

As noted by biochemist (and nonpareil science and science-fiction writer) Isaac Asimov: "The most exciting phrase to hear in science, the one that heralds new discoveries, is not 'Eureka!' (I found it!), but 'That's funny.'…"

• Digging Deeper

Much to my surprise, one of the statements in this section generated a lively discussion between two of the Ubiquity editors. I wrote:

"Counter-intuitivity seems to be why many people are suspicious of science. While they accept the earth orbits the sun, they do not accept there were ever astronauts on the Moon because intuitively they believe it to be impossible.

"Common sense (intuitivity) and science should never be put into opposition, for the very good reason that they have little or nothing to do with each other."

The discussion revolved largely around what is meant by "common sense." This is important, because the term common sense is so commonly used; however, if different people have different (and possibly diametrically opposed) ideas about what it means, scientific progress must inevitably be hampered. Here is the gist of the discussion.

"It is common sense to believe the sun moves while the earth is stationary; we see this happening every day. But we know it isn't true. It is also common sense to believe that no small lump of uranium could release as much energy as 20,000 tons of dynamite, but we know that it can."

Editor A: My experience from my middle school teaching was that the rising sun was shown to be an illusion. I was seeing the effect of the earth's rotation, not the sun's movement. After that, I always interpreted the "rising sun" this way. The rising sun is not now part of my "common sense."

Editor B: This is something you had to learn because what your senses daily seem to be telling you bears witness to the opposite.

Editor A: I realize there is a definition of "common sense" that restricts it to what our five senses tell us. However, the signals from our senses are being interpreted by our brain relative to assumptions and beliefs we hold. So "rising sun" is an interpretation of a visual experience, but so also is "falling horizon."

Editor B: Imagine telling members of an isolated Amazonian tribe that the sun is not actually rising and setting, which is counter to what they see every day of their lives. To them it is counter-intuitive that the horizon falls.

Editor A: OK, but I have no way of knowing how an Amazonian tribe interprets the daily sunrise. Has someone researched this? Do they also believe the sun circles earth?

Editor B: Counter-intuitivity seems to be why many people are suspicious of science. While they accept that the earth orbits the sun, they do not accept there were ever astronauts on the moon because intuitively they believe it to be impossible.

Editor A: That is an interesting claim. Most people have no direct experience of planetary motion. They go by what astronomers tell them. Yet today they believe it, whereas several centuries ago they believed that the sun revolves around the earth. When NASA streamed live images from the moon, most people believed it was a real transmission; however, a few believed it was a hoax and the astronauts were broadcasting from a hidden TV studio. Why does virtually no one believe the sun revolves around earth, yet a minority continues to believe no astronaut was ever on the moon? This seems to indicate that we should be skeptical of our intuition (or counter-intuition) and look to science and other sources for reliable evidence to support the interpretation we are asked to believe.

Editor B: Common sense (intuitivity) and science should never be put into opposition, for the very good reason that they have little or nothing to do with each other.

Editor A: While we might like common sense and science not to be in opposition, it looks to me that there are real cases where they are. A recent example is the U.S. president disagreeing with the CDC director because the president has a different intuition about the seriousness of COVID-19 from the director. A famous example in science was the intuitive belief that light is transmitted through space via an unseen fluid called the "luminiferous ether." This belief was brought into serious doubt when the Michelson-Morley experiment (1887), despite the extremely high precision of its instruments and techniques, could not detect an ether. In a thought experiment, Albert Einstein (who was famous for thought experiments) assumed there is no ether and light always moves at the same speed. He then let his mind (and mathematics) see where these audacious assumptions would lead. The result was his special theory of relativity (published in 1905), replacing the old common sense, which was no longer tenable.

Editor B: For me, common sense about the physical world is not a learned response but what is intuitively evident, like the rising and setting of the sun. If it takes education to convince you otherwise, then it is not common sense about the physical world, but learned sense, which of course is what science is all about.

Editor A: What about the examples of science making hypotheses that were eventually accepted (learned) by everyone but were not part of the common sense at the time? We know that people thought the earth was flat because that it how it appeared to them. Over the centuries, science has proved beyond doubt that the earth is round. Today it is most people's common sense that the world is round, although I'm sure there are few flat-earth holdouts.

Editor B: Again I would proffer that this is learned sense because it must be taught and people have to be convinced that it is true. However, once this has been achieved, it becomes common sense, but only to those people who have been convinced.

What should we make of this?

There is a difference between common sense of everyday experience and learned sense. Science has frequently exposed gaps in our common sense, but not because we were looking for them. You don't examine things that are obvious to everyone in the community and seem to need no explanation or justification. Thus, common sense can be an impediment to science, which is exactly why I say common sense and science should have little or nothing (preferably nothing) to do with each other.


Science is open-minded.

In discussing the work of the eminent astronomer and science popularizer Carl Sagan, Joshua Gough wrote the following:

"The dogmatist knows all the answers. He or she accepts no criticism and opens no ears. Merely questioning the dogmatist amounts to overt and intolerable criticism ipso facto in his or her mind. The dogmatist listens to no questions and throws all criticism onto the scrap heap.

"The true scientist has more questions than answers. He or she explicitly seeks criticism and opens ears to others. Strong questions about the scientist's ideas afford opportunities to confirm or deny their validity. The scientist must accept questions and readily understands that he or she may be forced to throw cherished ideas onto the scrap heap."

Even the most distinguished of scientists sometimes tend to put up bulwarks against new ideas when they differ too radically from those they already hold. Albert Einstein, whose thoughts on time, space, matter, and energy so surprised his professional colleagues (let alone the general public) in the early 20th century, was himself so astonished by some aspects of the then-emerging science of quantum physics that he simply refused to believe them. He was particularly vexed by the contention of quantum physics that at the subatomic level, cause and effect break down, i.e. you can never tell with certainty which of several possible effects will be produced by a given cause. To which he retorted, "God does not play dice with the universe."

Practically and psychologically, it is never easy to jettison what one already knows and has confidence in, in order to replace it with something radically new whose bona fides have yet to be fully established. The continuing battle about global warming between the many scientists who, after years of discussion, now accept the idea and the minority who are still skeptical about it is only the latest case in point. However, the number of recruits to the cause has now become so overwhelming that the naysayers are seen less and less as legitimate critics and more and more as irrational holdouts.

This is frequently how science proceeds:

  • First, someone advances a hypothesis.
  • Second, many others subject it to severe scrutiny.
  • Third, only when it accumulates enough allies does it become scientific "truth."

It is of course always possible for a hypothesis later to be discredited by new evidence. However, until that time (if ever), the hypothesis takes its place as another sturdy plank in science's ever more solid foundation of reliable ideas.

• Digging Deeper

Some time ago I heard the story of an eminent scientist (I don't remember the discipline) who had developed a controversial ground-breaking idea which had been under attack for several years. Each time it was attacked, he had been able to successfully repel the criticisms.

One day he was attending a symposium. One of the speakers had found a fatal flaw and spent about 20 minutes completely destroying his idea. The scientist walked onto the stage, shook the speaker's hand, and said, "Thanks for revealing this fundamental error. Now I can stop defending it and turn my attention to doing something really useful." The story is probably apocryphal, but how I wish it were true.

On the other hand, here is a story I know to be true. But oh how I wish it weren't, because it demonstrates that even someone with incredible native intelligence and sterling academic credentials can ignore everything they know when it conflicts with what they prefer to believe.

I am referring to Kurt Patrick Wise, director of the Creation Research Center at Truett McConnell University. Wise has a Ph.D. in paleontology from Harvard, an M.A. in geology from Harvard, and a B.A. in geology from the University of Chicago. At Harvard, he studied under the renowned paleontologist and evolutionary biologist Stephen Jay Gould.

Wise is a confirmed young-age creationist. Although he believes that there are scientific reasons to accept a young earth and a young universe (less than 10,000 years old), these are secondary. "As I shared with my professors years ago when I was in college, if all the evidence in the universe turns against creationism, I would be the first to admit it. But I would still be a creationist because that is what the Word of God seems to indicate."

Science is open-minded. Alas, not all scientists are.


Science is never certain.

No scientific "truth" is ever fully safe because it is never possible to prove a positive, only a negative. For example, if you do something 1,000 times in a row and it always works, you cannot be absolutely certain that it won't fail on the 1,001st try. If it again succeeds on try 1,001, you have the same problem with tries 1,002, 1,003, 1,004, etc.

To put it into less abstract terms, suppose a racehorse wins 10 times in a row. You cannot be absolutely certain that it will win the 11th time; however, this would probably be the way to bet.

Here is another example of this idea, which exposes a common error known as the gambler's fallacy.

Suppose you flip a coin, and it comes up heads nine times in a row. Someone bets you $50 against $5 that on the next flip it will come up tails. Should you take the bet? Absolutely! The coin has no memory. It doesn't know the flip has already come up heads nine times in a row, so the chances that it will come up heads again are just the same as if the previous nine flips had never happened, i.e. 50-50 (one out of two).

However, if someone bets you $50 against $5 that you couldn't flip and get 10 heads in a row before you start, you would be very wise to refuse. The odds against flipping 10 heads in a row from the first flip are more than a thousand to one; you would almost certainly lose.
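Both bets can be checked with a few lines of Python (a sketch; the fair-coin, 50-50 assumption is the one the essay makes):

```python
from fractions import Fraction

# The coin has no memory: the probability of heads on the next flip
# is 1/2, no matter how many heads have already appeared.
p_next_heads = Fraction(1, 2)

# Starting from scratch, the probability of 10 heads in a row is (1/2)^10.
p_ten_heads = Fraction(1, 2) ** 10

print(p_next_heads)         # 1/2
print(p_ten_heads)          # 1/1024
print(1 / p_ten_heads - 1)  # odds against 10 straight heads: 1023 to 1
```

So taking $50 against $5 on the next single flip is a bargain (a 10-to-1 payout on an even-odds event), while taking the same money on 10 heads in a row from scratch pits that 10-to-1 payout against odds of 1,023 to 1.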

On the other hand, a theory that consistently predicts the results of an experiment or application (theories are supposed to be able to predict things) is not a 100 percent guarantee that the theory is correct. It is always possible that you have not yet tried an experiment or application that would show up a flaw. As Einstein said about relativity, "No amount of experimentation can ever prove me right; a single experiment can prove me wrong."


Science is precise in its definitions.

Many scientific terms are arbitrary; someone or some committee has decided that this is how we should talk about these things, so that is what we do. This is not unreasonable. Unless we agree on what we are talking about, it is unlikely that we will have a useful discussion.

For example, you may have heard of the controversy about whether Pluto, discovered in 1930, should still be considered a planet or classified as something else.

Closer to home, you may have heard that the tomato is really a fruit, not a vegetable. But why? Ask yourself what you mean by the words "fruit" and "vegetable", and you will probably discover that you really don't know. It's just that as you were growing up, you always heard people referring to certain plants as fruits and others as vegetables, so you learned to do the same.

Scientists can't be so casual. Botanists have very carefully defined what they mean by "fruit" and "vegetable." Study of the tomato has shown that it meets the definition of a fruit. You might be surprised to learn that eggplants, cucumbers, and squashes such as zucchini and pumpkins also meet the definition of a fruit. In common conversation, if you prefer, you may still refer to these plants as vegetables, but you should be aware that this is custom, not science.

On a more serious note, recently in medicine there has been a change in the definition of "death." At one time death was considered to have occurred when someone's heart stopped beating; however, modern medical equipment can keep the heart beating long after the body wouldn't be able to do so itself. This led to revising the definition of death to be when the brain is no longer capable of functioning, so-called "brain death."


Science is cumulative.

Some people harbor the idea that many scientific laws and theories are eventually overthrown, so nothing in science can ever be trusted.

This is an overstatement. People who say Einstein proved Newton was wrong, and that someday someone will prove Einstein was wrong, are making a serious error. Einstein extended Newton by considering cases Newton did not think of.

By definition, science seeks to provide the best possible description or explanation of what is currently known. It is normal that as new things are discovered, descriptions and explanations need to be modified. This does not mean that the previous descriptions and explanations were wrong, only inadequate. Since we never know everything about anything, this is only normal.

Or to quote Isaac Newton, who had reason to know what he was talking about, "If I have seen far, it is because I have stood on the shoulders of giants." Newton is still very much a giant, as is Einstein.

• Digging Deeper

When a new scientific theory is put forward, it may be rejected by many skeptics until a surprisingly unlikely discovery or event it predicts is verified. This happened with Albert Einstein. There were many skeptics of his general relativity prediction (1915) that light would be bent as it passed near a star. In the famous solar eclipse experiments of 1919 it was verified that a beam of light passing near the sun did in fact bend. What a shock for the doubters! And a triumph for Einstein's science.

A similar thing happened for Louis Pasteur in 1881 when skeptics doubted his claim that he had found an anthrax vaccine. In a famous demonstration, 50 sheep, 25 of which had been inoculated, were all exposed to anthrax. All 25 vaccinated sheep survived, and all 25 unvaccinated sheep perished. What a shock for the doubters! And a triumph for Pasteur's germ theory of disease.


Science seeks simplicity.

A fundamental quest among scientists is to uncover commonality among things that on the surface seem to be dramatically different. For example, it is not at all obvious that the reason objects fall to the ground is the same reason that the earth and other planets orbit the sun. Isaac Newton's elaboration of this idea greatly simplified things by showing that one and the same force, gravity, is responsible for both phenomena.

Another way science seeks simplicity is in how it defines units of measure. Whenever possible, units are defined in such a way that the definition itself dictates the mathematics needed to use them. For example, a calorie is defined as the amount of heat energy required to raise the temperature of 1 gram (about 1 milliliter) of water by 1° Celsius.

Suppose, as an example, that we want to calculate how much heat energy is needed to raise the temperature of 5 milliliters of water by 8°C. The definition talks about 1 ml, but we have 5, so we must multiply by 5. It also talks about 1 degree, but we have 8 degrees, so we must also multiply by 8: 5 x 8 = 40. The answer: it requires 40 calories of heat energy to raise the temperature of 5 ml of water by 8°C.

If you are not used to calories, which is a metric unit, try the same thing in British thermal units. A BTU is defined as the amount of heat energy required to raise the temperature of 1 pound of water by 1° Fahrenheit. The same logic, working with this unit, tells us the number of BTUs needed to raise the temperature of any given amount of water.

Scientists frequently "follow the units" to calculate answers. Suppose, for example, that a scientist runs field experiments from a generator. At the end of the month, the lab records show 2,000 kilowatt-hours of electric usage. If the price for the generator to produce 1 kilowatt-hour is $2, the monthly cost of the generator is 2,000 kilowatt-hours x $2/kilowatt-hour = $4,000. The average power output of the generator that month is 2,000 kilowatt-hours / 24 hours per day / 30 days per month = 2000/(30 x 24) = 2.8 kilowatts.
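"Following the units" is ordinary dimensional analysis, and both calculations above can be written out mechanically. A short sketch using the figures given in the text:

```python
# Calorie definition: 1 calorie raises 1 ml of water by 1 degree Celsius,
# so calories scale linearly with both the volume and the temperature rise.
milliliters = 5
degrees = 8
calories = milliliters * degrees              # 5 x 8 = 40 calories

# Generator: multiply and divide until only the wanted unit remains.
kwh_used = 2000                               # kilowatt-hours for the month
dollars_per_kwh = 2
monthly_cost = kwh_used * dollars_per_kwh     # kWh x $/kWh -> dollars

hours_per_month = 24 * 30                     # h/day x days/month
average_power = kwh_used / hours_per_month    # kWh / h -> kW

print(calories)                 # 40
print(monthly_cost)             # 4000
print(round(average_power, 1))  # 2.8
```

In each line the units cancel exactly as the comments indicate; if they didn't, the arithmetic would be wrong no matter what numbers were used.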


Science is precise.

Scientists love to measure things, mainly to see whether their measurements confirm their theories, but also to see whether they reveal unexpected patterns that can be converted into new theories.

The problem is, no matter how accurately things are measured, we can never be certain that they are accurate down to the last decimal place. In most cases, the level of accuracy we can reach is sufficient, but occasionally it isn't. Discovering an extremely small variation in an extremely accurate measurement sometimes reveals new and unexpected scientific knowledge.

• Digging Deeper

And sometimes the opposite. The lack of an expected small variation under extremely precise measurement can reveal a problem.

For example, in the late 1800s, physicists thought light traveled through "luminiferous ether" in the same way as sound travels through air. Luminiferous ether (or just ether for short) was the hypothesized medium in space in which light waves propagated. It was unthinkable that the waves could travel through empty space. They set up numerous experiments to detect and measure the ether without success. They explained the many failures by their instruments not being sufficiently precise. In 1887, Albert Michelson and Edward Morley built an instrument that was generally accepted to have the required precision. However, when they put it to the test, to everyone's surprise they still couldn't detect and measure the effects of the ether. The speed of light through space seemed to be a constant in all directions and at all times, unaffected by movements of the ether. What if there were no such thing as the luminiferous ether?

Given these virtually unbelievable results, Albert Einstein threw out the assumption that light moved through a medium, i.e. light traveled through space as through a complete vacuum, with no physical means of propagation. This was the basis of his epoch-making theory of relativity, the first part of which, "The Special Theory of Relativity," was published in 1905 and the second part, "The General Theory of Relativity," in 1915.

Another way scientists are precise is in the way they speak.

Suppose you enter a room where there are two other people and say, "It's very hot today." One of those people comes from Alaska; in his mind "hot" means 25°C (77°F). The other one comes from Texas; to him "hot" means 45°C (113°F).

You are off to a rather poor start because each person has a totally different idea of what you want them to know. But suppose you say, "It's very hot today; the temperature is 30°C (86°F)." Now there is no room for confusion. They both know quite clearly that it is 30°C outside and that you consider this to be very hot.

When scientists write or speak, they need for the audience's mind to go only where they direct it and nowhere else. This is also what the audience wants. After all, they are paying attention to find out what the scientist thinks; they already know what they think. Because they can be interpreted in unknown ways, ambiguous terms (so-called "weasel words") such as "hot," "cold," "big," "small," "good," "bad," etc. allow the audience's mind to wander off into totally unpredictable directions, which defeats the purpose of the communication. This is why scientists are usually very precise in what they say.

Avoiding weasel words is good practice not only for scientists, but also for almost everyone, whatever their walk of life.


An absolute, fundamental, indispensable sine qua non of science is predictability, i.e. the ability of science to predict what is going to happen long before it does.

For our early ancestors, the most obvious things to be predicted were the changes of the seasons, the rising and setting of the sun, the phases of the Moon, and the movement of heavenly bodies across the sky. The ability to predict lunar eclipses, and especially solar eclipses, was close to wizardry. Even today, we are still obsessed with predicting the weather.

Scientific predictions don't always have to go forward in time. Based on what we have already found out and analyzed, sciences such as anthropology, geology, and paleontology predict what still may be hidden, then set about to find it. Physics does the same thing. The Large Hadron Collider, housed in a 27-kilometer (17-mile) tunnel straddling the border between Switzerland and France, was built in part to look for certain subatomic particles predicted by theory but not yet actually observed.

This does not mean that predictions must be 100 percent accurate, but they must be sufficiently close to reality to inspire confidence. Any disparities from predictions should be treated as invitations to investigate further in order to refine the theory or eventually replace it with a better one.

• Digging Deeper

It is interesting that some scientific theories predict that some things cannot be predicted by any theory. For example, chaos science (complexity science) says it cannot predict with any confidence the timing or extent of future chaotic events. Computability theory (in computer science) says there are well-defined problems for which it is impossible to develop an algorithmic machine to find solutions. Of course, these conclusions are part of the existing theories. Perhaps some future chaos theory will predict some chaotic events, or some future quantum machine will find solutions to non-algorithmic problems.


No scientific theory can be proved, only disproved; yet scientists must live in the day-to-day world. So what do they do? They rely on probability. Scientists select the theories where the overwhelming preponderance of evidence suggests they are correct and assume they are correct until such time as they discover a negative example.

F = ma (force = mass x acceleration), Newton's second law of motion, is a pertinent example. We assume it is correct because we have yet to discover a case where it fails; however, as noted in character trait 2, we assume physical laws are the same everywhere in the universe. Newton's second law may be deterministically true in our little sliver of the universe, but it could fail somewhere else. Based on long experience, the assumption that the law is universal has a probability of being correct, but it is still an assumption, not a certainty.
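The law itself is as simple to apply as it is to state. A trivial sketch, with values of my own choosing:

```python
def force_newtons(mass_kg, accel_m_per_s2):
    # Newton's second law: F = m * a (SI units give the force in newtons).
    return mass_kg * accel_m_per_s2

# A 1,200 kg car accelerating at 3 m/s^2 requires:
print(force_newtons(1200, 3))  # 3600 newtons
```

That such a one-line formula has held in every measurement we have made is exactly the "overwhelming preponderance of evidence" on which scientists rely.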

Medicine provides a more down-to-earth example. Before a new pharmaceutical product can be released for use, approval must first be obtained from national health authorities. In general, this means submitting a registration file containing massive amounts of data, most importantly regarding efficacy and safety. Clinical trials (tests on volunteers) are organized under very strict rules to obtain this information.

Not all patients will respond to the drug or respond in the same way. So there are two crucial questions:

  • Does the drug actually do anything, i.e. does it produce results significantly better than a placebo (sugar pill)?
  • Will the new drug do enough good for enough patients that it deserves to be put into the hands of doctors?

Clinical trials are designed to answer these and other questions.

The results are usually expressed in terms of probability, i.e. among all persons suffering the malady the drug is destined to treat, what percentage are likely to substantially benefit from it? If the probability of benefit is high enough and the drug meets other criteria, it is released to the medical community. Otherwise, it must go back for additional trials or further development, or be abandoned.
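How such a probability might be read off trial data can be sketched with hypothetical numbers (all figures below are invented for illustration, not taken from any real trial):

```python
# Hypothetical trial results -- invented numbers, purely illustrative.
drug_responders, drug_total = 140, 200
placebo_responders, placebo_total = 60, 200

# The probability of benefit is estimated as the observed response rate.
drug_rate = drug_responders / drug_total           # 0.70
placebo_rate = placebo_responders / placebo_total  # 0.30

# The drug's apparent added benefit over placebo:
added_benefit = drug_rate - placebo_rate

print(f"{drug_rate:.0%} responded to the drug vs {placebo_rate:.0%} to placebo")
print(f"Apparent added benefit: {added_benefit:.0%}")
```

Real trials apply statistical tests to decide whether such a difference could plausibly be due to chance, but the underlying quantity the regulators weigh is this kind of probability of benefit.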


Many of us have engraved on our minds the picture of Archimedes (287–212 BCE) sitting in a bath, contemplating. Suddenly he jumps up, shouts "Eureka!" (I have found it), then runs out into the street wearing a great big smile on his face, and nothing else. He had just discovered the principle of water displacement, which provided a means of measuring the volume of irregular objects. This represented a major gain in knowledge and was probably accepted almost immediately.

Today, scientific breakthroughs almost never take place overnight, or at least they do not become accepted overnight.

Science is all about trying to discover things that are universally true, that do not depend on place and time. Therefore, it is not enough for one scientist to announce he has done something in his laboratory and it worked. He must give a detailed description of what he did so other scientists can try the same thing in their laboratories to see if it still works.

This is why most scientific research papers include a lengthy section describing exactly what the author did, with what equipment, and following what procedure. This is known as "reproducibility." Only when several scientists, independent of each other, have reproduced the experiment and come to the same conclusion can a discovery be integrated into the general body of science.

Reproducibility is a cornerstone of modern scientific knowledge because it is essential to acquiring enough allies to a hypothesis for it to become scientific "truth."

The 2012 announcement from CERN that the Higgs boson had been found shows this principle at work. The initial press release said they were "sure" they had found the boson but admitted some slight uncertainty. Another team performed its own analysis of the data and compared it with the first team's. The second team said they were "really really sure" and put the possibility of error at one chance in 550 million.

Reproducibility does not require that trials be run in separate locations, only that they be run totally independently of each other. Additionally, the raw data will probably be submitted to and analyzed by different experts around the world.


In searching for new ideas to explain how the world works, scientists follow the guiding principle of seeking simplicity to reduce complexity.

This principle is generally known as Occam's razor. It is attributed to the 14th-century English logician and Franciscan friar William of Occam. Although the idea was known and accepted long before, Occam's formulation is the one most widely quoted: "Entities must not be multiplied beyond necessity." Another popular formulation is: "Plurality should not be posited without necessity." In more up-to-date language, Albert Einstein said, "Everything should be as simple as possible, but not simpler."

Adhering to Occam's razor does not mean among competing scientific theories the simplest one should always be preferred, because the simplest theory may be wrong. What scientists are looking for is the simplest possible theory that actually works, i.e. can be tested and shown to be accurate.


The way scientists talk about things often depends on what was said about them when they were invented or discovered.

For example, why is the power of engines generally rated in horsepower and the power of light bulbs in watts? Quite simply, because when engines were first developed, their power was compared to horses, which were doing much of the work engines were designed to replace. The power of light bulbs is expressed in watts in honor of James Watt (1736–1819), whose pioneering work on steam engines established how we measure power today.

You may have noticed that history is about to change. Introduction of the new low-energy light bulbs means the watt is no longer useful for expressing their lighting performance; in fact, it never was. The watt is a unit of electrical power, i.e. how fast an instrument is using electrical energy. For light bulbs, it is therefore now being replaced by the "lumen," which is a measure of lighting intensity. Given their exceptionally long life, meaning very infrequent changes, it is likely that within a generation hardly anyone buying a new light bulb will even remember that watts and lighting ever had anything to do with each other.
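The difference the lumen captures can be made concrete with a back-of-the-envelope sketch. The efficacy figures below are rough typical values I am assuming for illustration, not exact specifications:

```python
# Approximate luminous efficacies in lumens per watt -- assumed typical
# values for illustration, not exact figures for any particular bulb.
INCANDESCENT_LM_PER_W = 14
LED_LM_PER_W = 100

def lumens(watts, lm_per_watt):
    # Light output = electrical power consumed x efficacy.
    return watts * lm_per_watt

# A classic 60 W incandescent bulb produces roughly:
print(lumens(60, INCANDESCENT_LM_PER_W))  # 840 lumens
# An LED producing those same 840 lumens draws only about:
print(840 / LED_LM_PER_W)                 # 8.4 watts
```

This is exactly why wattage stopped being a useful proxy for brightness: two bulbs drawing very different amounts of power can produce the same light.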

• Digging Deeper

The term "historicity" is used to refer to present effects that are the product of past events. In other words, what went before still manifests in the present. The term "history" usually refers to narratives about what happened in the past, but not about what is going on in the present.

When Newton found his three laws, he did so by looking at all that had been learned before him about the movement of objects in the heavens and on earth. His laws revealed a pattern that explained these motions in simple terms. The context in which he did his work was his history, and his laws, which are very much with us today, are part of our historicity.


Preconceived notions (preconceptions) are ideas that are believed to be true without any firm reason for the belief. Certain groups reject results of science because they do not conform to their preconceived notions of what they believe must be true rather than what can be demonstrated to be true.

Even scientists fall into this trap. For example, from the time of the ancient Greeks up until about the 16th century, astronomers believed the circle was geometrically "perfect" while other curves were somehow less noble. Therefore, the orbits of the sun and planets around the earth (when it was still believed that the earth was the center of the universe) had to be circles.

This caused numerous problems because the observed movements of the sun and planets in the sky did not seem to be circular. Astronomers therefore had to invent complex combinations of circles within circles in order to make the "anomalies" disappear.

The idea of the perfection of the circle persisted even when it had been generally agreed that the sun is the center of the solar system and that the earth and other planets orbit around it. Again, astronomers had to invent complex designs of circles within circles to explain the anomalies.

Finally, when Johannes Kepler (1571–1630) suggested these orbits might be ellipses, everything fell into place. Although "imperfect" compared to the circle, considering orbits to be elliptical gave accurate calculations and predictions. Suddenly everything became clear.

A similar thing happened in the history of chemistry. Organic chemistry (carbon-based molecules) was making great progress in the early and mid-1800s. However, there was a fundamental problem. Through experimentation, chemists were learning more and more about what reactions would take place by putting specific molecules together under specific conditions, but they didn't really know why.

At the time they believed that organic molecules were strings of atoms forming chains of different lengths. However, in many cases this idea did not allow them to look at the structure of different chains and predict how they would react with each other.

Then in 1865 Friedrich Kekulé proposed that benzene, which did not behave as if it were a chain, might in fact be a ring. In other words, instead of the ends moving about freely, they joined up to form a kind of circle.

This idea revolutionized organic chemistry.


True scientists are humble about their achievements because they know that whatever they have accomplished, there is still so much more to be done—and anything they have done could be invalidated in the future. Each new level of understanding opens new mysteries to be explored. This aspect of science is well expressed in this quotation from Isaac Newton:

"I was like a boy playing on the seashore and diverting myself now and then finding a smoother pebble or a prettier shell than ordinary, while the great ocean of truth lay all undiscovered before me."

And this from Richard Feynman:

"I think it's much more interesting to live not knowing than to have answers which might be wrong. I have approximate answers, and possible beliefs, and different degrees of uncertainty about different things, but I am not absolutely sure of anything.

"There are many things I don't know anything about, such as whether it means anything to ask Why are we here? I think about it a little bit, and if I can't figure it out, then I go on to something else. But I don't have to know an answer. I don't feel frightened by not knowing things, by being lost in the mysterious universe without having any purpose—which is the way it really is as far as I can tell."


The opening lines of "Star Trek," the iconic science-fiction television series, are:

Space: The Final Frontier
These are the voyages of the Starship Enterprise.
Its five-year mission,
To explore strange new worlds,
To seek out new life and new civilizations,
To boldly go where no man has gone before.

These few words dramatically set the scene for "Star Trek," one of the most successful TV programs of all time. However, they contain a serious misstatement. If you are talking about physical exploration, space may very well be the final frontier. If you are talking about intellectual exploration, space is only a small part of science, whose explorations are never-ending.

It probably wouldn't have been so dramatically effective, but a more accurate rendition of the Star Trek introduction would have been:

Science: The Eternal Frontier
These are the voyages of the human spirit.
Their never-ending mission,
To explore strange new phenomena,
To seek out new ideas and new insights,
To boldly go where no human mind has gone before.

And to keep going forever and ever. The universe may have had a beginning (Big Bang) and may have an end (Big Crunch). But between these two endpoints, humankind will always seek to discover and understand what is going on around it.

Science is a great game, and the universe is its playground.

This essay is adapted from Philip Yaffe's book Science for the Concerned Citizen: What You Don't Know CAN Hurt You (2011).

The author would like to acknowledge Peter Denning, Ubiquity's editor-in-chief, for his time, energy, patience, and keen insights that contributed so much to this essay.


Philip Yaffe was born in Boston, Massachusetts, in 1942 and grew up in Los Angeles, where he graduated from the University of California with a degree in mathematics and physics. In his senior year, he was also editor-in-chief of the Daily Bruin, UCLA's daily student newspaper. He has more than 40 years of experience in journalism and international marketing communication. At various points in his career, he has been a teacher of journalism, a reporter/feature writer with The Wall Street Journal, an account executive with a major international press relations agency, European marketing communication director with two major international companies, and a founding partner of a specialized marketing communication agency in Brussels, Belgium, where he has lived since 1974. He is the author of more than 30 books, which can be found easily on Amazon Kindle.

2022 Copyright held by the Owner/Author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.

