

Workings of science
Is science limited to the sciences?

Ubiquity, Volume 2022 Issue March, March 2022 | BY Philip Yaffe 



Volume 2022, Number March (2022), Pages 1-11

DOI: 10.1145/3512334

Albert Einstein once said, "The whole of science is nothing more than a refinement of everyday thinking." This thought was echoed by Carl Sagan, who said, "Every kid starts out as a natural-born scientist, and then we beat it out of them." These observations, and those of numerous other intellectual luminaries, strongly suggest that the common distinction made between science and non-science, say between physics and history, is more apparent than real. These, of course, are personal opinions based on personal observations. This essay explores the intriguing idea that virtually everything is science. It also presents some recent scientific evidence that trying to distinguish between science and non-science is not only fruitless but can do real harm to individuals and society as a whole.

People have the habit of dividing different areas of human activity according to whether each is a science or not. Even leaders in disciplines that bear the name "science" sometimes raise the question. A few years ago a leading light in computer science wrote a paper asking, "Is computer science science?"

In the popular mind, the border between science and non-science is often blurred. However, I wonder if everyone is asking the wrong question. To me, it is not a matter of whether this or that activity is a science. The real question is: Is everything science?

My tentative answer to this unusual question is "yes." Why do I say this? Because virtually any activity (history, music, dance, painting, poetry, basketball, basket weaving, etc.) uses some or all of the tools of scientific inquiry, whether or not the general public, or even the practitioners themselves, recognize it.

Even great luminaries in the battle to bring science to the masses appear to have overlooked this, as for example C.P. Snow in his celebrated 1959 lecture and book The Two Cultures and the Scientific Revolution.

Snow's position can be summed up by this often-quoted passage.

A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is the scientific equivalent of: Have you read a work of Shakespeare?

I now believe that if I had asked an even simpler question—such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, Can you read?—not more than one in ten of the highly educated would have felt that I was speaking the same language.

So the great edifice of modern physics goes up, and the majority of the cleverest people in the Western World have about as much insight into it as their Neolithic ancestors would have had.

Although Snow's two cultures analysis was widely applauded by the scientific community, my guess is that our Neolithic ancestors are still very much with us.

The problem is not that there are two cultures, but that there is only one culture—except that most people don't know it.

Do an internet search for "the science of" for just about any topic you could imagine and you will almost certainly get results. For example, the science of acting, cooking, history, humor, idleness, poetry, storytelling, time-wasting, well-being, etc. I even did a search for the science of ditch-digging and got a raft of responses. In all cases, the articles demonstrated that underlying these apparently non-scientific fields of activity were many of the principles and practices commonly associated with "true" science.

Legendary journalist H.L. Mencken (1880–1956) once famously said, "There are no dull stories, only dull writers." Why? Because no matter the subject, even ditch-digging, there is always a backstory beyond most people's imagination. Virtually all of these backstories reflect application of some or all of the principles and practices of science.

Having been a journalist, I do not have to take Mr. Mencken's word for it; I have been accused of dull writing myself.

The astute reader will probably have noticed that to this point I have failed to pay due attention to a sine qua non of logical thinking: define your terms. The key term in this essay is of course the word "science" itself.

In agreement with the Science Council, I would argue that there are two fundamental definitions for science, or more precisely a two-part definition of science.

  1. General definition. Science is the pursuit and application of knowledge and understanding of the natural and social world following a systematic methodology based on evidence. Another general definition might be: Science is the investigation of any and all observable phenomena in order to discover the laws of their operation.
  2. Functional definition. Science has adopted a set of standard practices to ensure that investigations are repeatable and independently validated. The "scientific method" is built on the following practices: objective observation (measurement and data); evidence; experiment and/or observation as benchmarks for testing hypotheses; induction (reasoning to establish general rules or conclusions drawn from facts or examples); repetition; critical analysis; and verification and testing (critical exposure to scrutiny, peer review, and assessment).


Philosopher George Santayana (1863–1952) admonished, "Those who cannot remember the past are condemned to repeat it." Sage advice. If we don't know what has been done before, how can we judge the probable outcome, positive or negative, if we do it again? In essence, this is very similar to one of Albert Einstein's most celebrated quips: "Insanity is doing the same thing over and over again and expecting different results."

Avoiding this trap across the broad sweep of history presupposes that we have a reasonably accurate picture of what has already been done in the past, without which Santayana's advice is useless. But how can we know we have a reasonably accurate picture of the past? Again this is where the practices of science come in.

True, much of what we think of as history deals with confirming the authenticity of names, dates, places, events, etc. However, it also deals with the probable causes of such events, i.e., with whether it is possible to divine patterns across the ages that seem generally to lead to the same outcome.

For example, historians may conclude that wars are likely when certain conditions exist in the world. But the best they can do is say so as a probability, not as a reliable certainty. If an historian today were to say that these war conditions exist and we need to back away, that might be considered a prediction. However, few governments would back away, believing that: 1) the conditions for war have not been validly established, and 2) today we are more advanced so this time it will be different.

In the more prosaic aspects of their discipline, historians make hypotheses about whether events actually happened as claimed in someone's record. To confirm that an event might have happened, they look in alternative sources for mentions of the claimed event. If they find them, their confidence in the claim goes up. Conversely, if they find a contradictory report, their confidence in the claim goes down, although they do not necessarily dismiss the claim, because the contradictory report might itself be false. However, if they find many contrary reports, they are left in a quandary: Which reports are true, or are any of them true?
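This weighing of corroborating and contradictory sources resembles Bayesian updating. As a minimal sketch, with a prior, likelihoods, and source reports that are entirely invented for illustration:

```python
# Minimal Bayesian sketch of a historian updating confidence in a claim.
# All probabilities here are invented for illustration.

def update(prior, corroborates, p_report_if_true=0.8, p_report_if_false=0.3):
    """Update P(claim is true) after reading one independent source.

    corroborates: True if the source mentions the event, False if it
    contradicts it. The likelihoods allow for false corroboration and
    false contradiction, so no single report settles the question.
    """
    if corroborates:
        like_true, like_false = p_report_if_true, p_report_if_false
    else:
        like_true, like_false = 1 - p_report_if_true, 1 - p_report_if_false
    numerator = like_true * prior
    return numerator / (numerator + like_false * (1 - prior))

confidence = 0.5                     # start undecided
for source in [True, True, False]:   # two corroborations, one contradiction
    confidence = update(confidence, source)
    label = "corroborates" if source else "contradicts"
    print(f"after source ({label}): {confidence:.2f}")
```

Note that a single contradictory report lowers the confidence without driving it to zero, mirroring the historian's reluctance to dismiss a claim outright.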

Another thing that unites history with science is the principle of falsifiability. This principle states that any claim which by its very nature cannot be tested must be granted little credence, no matter how intriguing and potentially useful it might seem.

For example, in their discussions and debates, historians may conclude that one or more of their hypotheses could be falsified by uncovering pertinent new information, and then go looking for it. If they find it, the hypothesis is falsified and must be tweaked, revised, or rejected.

The principle of falsification was articulated by philosopher Karl Popper (1902–1994), who argued it is an essential feature of science. This would seem to be a self-evident truth. However, it is well to remember that until the advent of the scientific revolution beginning in the 17th century, experimentation was not held in high esteem. For countless centuries before, in some quarters science was believed to be largely a combination of observation and innate intelligence: you saw something, then you thought about why it was happening. Experimentation was not considered necessary, and was even frowned upon. (What about Francis Bacon, who spoke about a scientific method and the need for experiments? Bacon (1561–1626) was one of the founding fathers of the scientific revolution; my comment refers to the centuries before it.)

Another apparently self-evident truth about science is that it is the study of nature, which implies whatever is manmade is not natural and therefore outside the realm of science. The counter-argument, which I favor, is that human beings are part of nature, so whatever is produced by humans is totally within the realm of science.

Computer science nearly fell victim to this artificial and, to my mind, inimical distinction. In the early days (1950s), computer scientists said their job was to study and advance information processes. Physicists and others said information processes are effects of manmade machines and therefore do not qualify as natural processes. That belief persisted for nearly 50 years, changing only in the 2000s when biologists and others started claiming that biological processes are information processes.

The application of the principle of falsifiability is well exemplified by the astronomical confirmation in the early 20th century of Albert Einstein's theory of relativity.

From the mid-1800s, physicists had been diligently trying to detect the "luminiferous ether," the medium through which light was assumed to propagate: analogous to sound, which propagates through something material (notably air), light, it was thought, also had to propagate through something material. When no such material could be detected, Einstein decided to assume that it did not exist, a postulate at the heart of his theory of relativity. The theory predicted, among other things, that light traveling through space should bend when passing near a massive gravitational object such as the sun. If such bending could be detected, it would confirm the theory; if not, it would falsify it. As it turned out, during a solar eclipse in 1919, astronomers confirmed that light actually does bend, and by as much as Einstein had predicted.

This of course did not totally validate this aspect of the theory of relativity, because all scientific theories are tentative. As Einstein himself said, "No amount of experimentation can ever prove me right; a single experiment can prove me wrong."


One of the most contentious aspects of the "science of history" is the attempt to prove the historicity of key events in religious holy books, notably the Torah (were Hebrews actually slaves in Egypt?), the New Testament (was Jesus really crucified and did he then rise from the dead?), the Quran (did Mohammed really ascend to heaven on a winged horse?), etc.

Such claims carry with them deep social and psychological importance. Thus, while historians may never be able to truly verify any of them, the fact that such claims are seriously being investigated can cloud the minds of some people who ought to know better. Or shut down their minds completely.

Recently I was watching a public access TV program. A caller enthusiastically proclaimed scientists had just discovered and authenticated Jesus's tomb. He also proclaimed the self-same scientists had also just found and authenticated the cross on which Jesus was crucified.

The show's host, a devoted skeptic, immediately began asking for details about how the scientists had confirmed the tomb was really that of Jesus. The caller had no details; neither could he supply references for where such details could be found. "I just heard about it. I haven't actually checked it out for myself," he said. This being an ungrounded claim (a claim made without evidence), it fails the Popper test of being falsifiable.

However, the pièce de résistance concerned the cross on which Jesus was crucified.

Host: "What do you mean, that they found the actual cross?"
Caller: "There it was, exactly where it happened!"
Host: "You mean on Calvary, the hill just outside of Jerusalem where the crucifixion is said to have taken place? And it was perfectly intact?"
Caller: "Yes, yes!"
Host: "If it were still there atop the hill and still perfectly intact, you would have thought someone over the past 2,000 years would have noticed it long before now. How did this just become news?"

At this point, the caller hung up. The claim failed the Popper test because an obvious consequence was falsified.


To conclude on a less contentious (but perhaps more pertinent) note, let's take a look at perhaps the most salient passage of an intriguing essay published in 1865 by T.H. Huxley. Thomas Henry Huxley (1825–1895) was an English biologist and educator. Today he is best remembered for his steadfast defense of Charles Darwin's then recently published book (1859) on the theory of evolution by natural selection, which earned Huxley the sobriquet "Darwin's bulldog."

The essay is titled "We Are All Scientists." Near the end, Huxley tries to demonstrate that even the apparently inconsequential act of going into a shop to buy an apple is based on science, whether the purchaser knows it or not. While the language may now seem somewhat archaic, Huxley's message is smack up to date.

Suppose you go into a fruiterer's shop, wanting an apple—you take up one, and, on biting it, you find it sour; you look at it, and see that it is hard and green. You take up another one, and that too is hard, green, and sour. The shopman offers you a third, but before biting it, you examine it, and find that it is hard and green, and you immediately say that you will not have it, as it must be sour, like those you have already tried.

Nothing can be more simple than that, you think; but if you will take the trouble to analyze and trace out into its logical elements what has been done by the mind, you will be greatly surprised.

In the first place, you have performed the operation of induction. You found that, in two experiences, hardness and greenness in apples went together with sourness. It was so in the first case, and it was confirmed by the second. True, it is a very small basis, but still it is enough to make an induction from; you generalize the facts, and you expect to find sourness in apples where you get hardness and greenness. You found upon that a general law, that all hard and green apples are sour, and that, so far as it goes, is a perfect induction.

Well, having got your natural law in this way, when you are offered another apple which you find is hard and green, you say, "All hard and green apples are sour; this apple is hard and green, therefore this apple is sour." That train of reasoning is what logicians call a syllogism, and has all its various parts and terms—its major premise, its minor premise, and its conclusion. And, by the help of further reasoning, which, if drawn out, would have to be exhibited in two or three other syllogisms, you arrive at your final determination, "I will not have that apple." So that, you see, you have, in the first place, established a law by induction, and upon that you have founded a deduction, and reasoned out the special conclusion of the particular case.
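Huxley's two steps, inducing a rule from observed cases and then deducing a prediction about a new case, can be sketched mechanically. The apple observations below are, of course, made up to match his story:

```python
# Sketch of Huxley's apple reasoning: induction from observed cases,
# then deduction (the syllogism) about a new case.

observed = [
    {"hard": True, "green": True, "sour": True},   # first apple tried
    {"hard": True, "green": True, "sour": True},   # second apple tried
]

# Induction: in every observed hard, green apple, was sourness also present?
rule_holds = all(a["sour"] for a in observed if a["hard"] and a["green"])

# Deduction: apply the induced rule ("all hard and green apples are sour")
# to the third apple, which has not been tasted.
third_apple = {"hard": True, "green": True}
if rule_holds and third_apple["hard"] and third_apple["green"]:
    prediction = "sour"          # "therefore this apple is sour"
else:
    prediction = "unknown"

print(prediction)
```

As Huxley stresses, two cases are a very small basis for the induction; the sketch would gain confidence, not certainty, with every additional hard, green, sour apple added to `observed`.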

Well now, suppose, having got your law, that at some time afterward, you are discussing the qualities of apples with a friend. You will say to him, "It is a very curious thing but I find that all hard and green apples are sour!" Your friend says to you, "But how do you know that?" You at once reply, "Oh, because I have tried them over and over again, and have always found them to be so." Well, if we were talking science instead of common sense, we should call that an experimental verification. And, if still opposed, you go further, and say, "I have heard from the people in Somersetshire and Devonshire, where a large number of apples are grown, that they have observed the same thing. It is also found to be the case in Normandy, and in North America. In short, I find it to be the universal experience of mankind wherever attention has been directed to the subject."

Whereupon, your friend, unless he is a very unreasonable man, agrees with you, and is convinced that you are quite right in the conclusion you have drawn. He believes, although perhaps he does not know he believes it, that the more extensive verifications are—that the more frequently experiments have been made, and results of the same kind arrived at—that the more varied the conditions under which the same results are attained, the more certain is the ultimate conclusion, and he disputes the question no further. He sees that the experiment has been tried under all sorts of conditions, as to time, place, and people, with the same result; and he says with you, therefore, that the law you have laid down must be a good one, and he must believe it.

In science we do the same thing. The philosopher (scientist) exercises precisely the same faculties, though in a much more delicate manner. In scientific inquiry it becomes a matter of duty to expose a supposed law to every possible kind of verification, and to take care, moreover, that this is done intentionally and not left to a mere accident, as in the case of the apples. And in science, as in common life, our confidence in a law is in exact proportion to the absence of variation in the result of our experimental verifications. For instance, if you let go of your grasp of an article you may have in your hand, it will immediately fall to the ground. That is a very common verification of one of the best established laws of nature—that of gravitation.

The method by which men of science establish the existence of that law is exactly the same as that by which we have established the trivial proposition about the sourness of hard and green apples. But we believe it in such an extensive, thorough, and unhesitating manner because the universal experience of mankind verifies it, and we can verify it ourselves at any time; and that is the strongest possible foundation on which any natural law can rest.

If Mr. Huxley's analysis is correct, then C.P. Snow's notion that there are two cultures would seem to be somewhat overstated. We are a single culture. However, for some reason best known to (or best researched by) psychologists, psychiatrists, neurologists, sociologists, and the like, at some point, about some things, we all tend to jettison our innate science for imbued non-science, often with disastrous results for individuals, their cohort, and occasionally the entire world.

Finding ways and means of preventing such mishaps (at least warning us that we are about to subvert our innate scientific sense) should be at the top of the list of things mankind must do to ensure a better future. Or indeed any future at all.

Albert Einstein once famously said, "The whole of science is nothing more than a refinement of everyday thinking." Carl Sagan observed, "Every kid starts out as a natural-born scientist, and then we beat it out of them. A few trickle through the system with their wonder and enthusiasm for science intact."

These, of course, were opinions based on personal observation and experience. However, early in the 21st century, their opinions were given credible scientific support.

In 2012, the National Science Foundation published an article based on work being done at the University of California at Berkeley titled "Babies Are Born Scientists." As described in the opening paragraphs:

Very young children's learning and thinking are strikingly similar to much learning and thinking in science, according to Alison Gopnik, professor of psychology and affiliate professor of philosophy at the University of California, Berkeley.

New research methods and mathematical models provide a more precise and formal way to characterize children's learning mechanisms than in the past. Gopnik and her colleagues found that young children, in their play and interactions with their surroundings, learn from statistics, experiments, and from the actions of others in much the same way that scientists do.

"The way we determine how they're learning is that we give them, say, a pattern of data, a pattern of probabilities or statistics about the world and then we see what they do," said Gopnik.

Although the U.C. Berkeley research is still open to criticism, the reported results seem to provide a clue as to how to reunite the so-called two cultures, and better yet how to prevent them from being split in the first place. It's called education.

The way "born scientists" are introduced to and encouraged to explicitly employ their natural scientific bent in grades K-12 could make all the difference. The key thing is not to let children grow up believing that a few people (very few) have a scientific sense while the majority of us do not. To paraphrase Shakespeare (King Lear), "That way lies madness."

For some provocative thoughts on how this might be achieved, click through to "Character Traits of Science" also published in the Ubiquity Workings of Science Symposium.


Philip Yaffe was born in Boston, Massachusetts, in 1942 and grew up in Los Angeles, where he graduated from the University of California with a degree in mathematics and physics. In his senior year, he was also editor-in-chief of the Daily Bruin, UCLA's daily student newspaper. He has more than 40 years of experience in journalism and international marketing communication. At various points in his career, he has been a teacher of journalism, a reporter/feature writer with The Wall Street Journal, an account executive with a major international press relations agency, European marketing communication director with two major international companies, and a founding partner of a specialized marketing communication agency in Brussels, Belgium, where he has lived since 1974. He is the author of more than 30 books, which can be found easily on Amazon Kindle.

2022 Copyright held by the Owner/Author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2022 ACM, Inc.

