Notes on Engineering Health, May 2022

Life, the Universe, and Everything

In Douglas Adams’ The Hitchhiker’s Guide to the Galaxy, a supercomputer known as Deep Thought has calculated the answer to the “Ultimate Question of Life, the Universe, and Everything.” Unfortunately, no one knows what the question actually is, and so Deep Thought designs Earth, an even greater supercomputer, to work out the question itself.

As creation myths go, this is perhaps no more unlikely than those propounded by many societies. And, in fact, the idea of Earth, or at least life on Earth, as a form of computation has attracted a number of adherents:

Once we regard living things as agents performing a computation — collecting and storing information about an unpredictable environment — capacities and considerations such as replication, adaptation, agency, purpose and meaning can be understood as arising not from evolutionary improvisation, but as inevitable corollaries of physical laws. In other words, there appears to be a kind of physics of things doing stuff, and evolving to do stuff. Meaning and intention — thought to be the defining characteristics of living systems — may then emerge naturally through the laws of thermodynamics and statistical mechanics. (Quanta Magazine)

Konrad Zuse is credited with building the first working programmable computer in 1941. In 1967, he also seems to have been the first to propose that physics (and thus biology) is just computation. The idea was extended by, among others, Edward Fredkin in the 1980s, Jürgen Schmidhuber in the 1990s, and Stephen Wolfram in the 2000s.

Another thread of this type, focusing more specifically on life on Earth, stretches back to the mid-1800s and the invention of statistical mechanics by scientists such as Ludwig Boltzmann and James Clerk Maxwell. Maxwell in particular struggled with the second law of thermodynamics, which states that as energy is transferred or transformed, more and more of it is wasted; the ability to extract useful work from energy resources is always diminishing. Taken to its limit, this law implies that the universe will ultimately be reduced to a state of equilibrium in which entropy has been maximized and nothing of meaning will occur again.

In order to avoid this dreary fate, living organisms use energy from their surroundings to sustain a non-equilibrium state. The physicist Erwin Schrödinger wrote in his 1944 book What is Life? The Physical Aspect of the Living Cell that living organisms survive on “negative entropy” by capturing and storing information. This information allows living things to stay out of equilibrium by behaving in ways that let them extract energy from changes in their surroundings. Failure to act on this information would lead to a gradual reversion to equilibrium, and thus death.

In this conception of life, biology is thus a computation that seeks to optimize the storage and exploitation of information. And this gives us a toehold to think about health as a process of computation as well. The thermodynamics of copying information dictates a trade-off between the accuracy of the copying and the energy the copying requires. Since any organism has only a finite supply of energy, errors build up over time. An increasing share of the energy captured by the organism must go to error correction, until there are too many flaws to overcome and death occurs. This view is captured in ideas such as the Hayflick Limit, the observation that cultured human cells can replicate and divide only a finite number of times before becoming senescent. It may also help explain why the natural lifetime of humans seems to cap out at around a century.
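The trade-off above can be made concrete with a small illustrative sketch (a toy calculation, not a biological model). Landauer’s principle puts a physical floor of kT ln 2 joules on erasing one bit of information, and if each copy of a bit fails with some fixed probability, per-bit fidelity decays geometrically across generations of copying — so the correction burden grows with every replication. The error rate used here is arbitrary, chosen only to show the shape of the decay:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (CODATA exact value)

def landauer_limit(temp_kelvin: float) -> float:
    """Minimum energy in joules to erase one bit at temperature T
    (Landauer's principle: E = kT ln 2)."""
    return K_B * temp_kelvin * math.log(2)

def copy_fidelity(per_bit_error: float, generations: int) -> float:
    """Probability a single bit is still correct after n imperfect
    copies, assuming independent errors at a fixed rate per copy."""
    return (1.0 - per_bit_error) ** generations

# Floor on energy per bit erased at human body temperature (~310 K):
e_bit = landauer_limit(310.0)  # roughly 3e-21 joules
print(f"Landauer limit at 310 K: {e_bit:.2e} J per bit")

# With an (arbitrary) 1% per-copy error rate, fidelity erodes steadily:
for n in (1, 10, 50):
    print(f"after {n:3d} copies, per-bit fidelity = {copy_fidelity(1e-2, n):.3f}")
```

Real cells copy far more accurately than this toy rate, precisely because they spend energy on proofreading and repair — which is the trade-off the paragraph describes.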

Over the past few years, we have been living through the coming together of information technologies and the life sciences, as machine learning and other techniques have transformed our ability to explore the mysteries of biology. But the ideas summarized above point to an even more fundamental intersection of life and information. One researcher has gone so far as to argue that all the data humans produce during the regular course of their activities has created a new life form of its own, “the dataome”:

The dataome is the sum total of our nongenetic information, whether in books, electronic bits, paintings, temporary neural impulses, or even structures like libraries and machines, which support and encode information in themselves. It's the informational “ome” that coexists with us here on earth. It's a new lens to examine our world through because it seems to have its own evolutionary imperatives.

In fact, I propose that the dataome is an alternate living system, in a deeply symbiotic relationship with us. That may sound outrageous, but it seems to fit with many ideas about the nature of information as a thing—akin to energy or entropy—and what we think life is. In a sense, life is what happens to matter when information takes control. (Caleb Scharf)

These ideas of a symbiosis between biological life and information offer a provocative new lens on not just the intersection of the life sciences and data science, but also our understanding of the social, economic, and cultural determinants of health. These are themes we expect to continue to explore in our work for years to come.

Jonathan Friedlander, PhD & Geoffrey W. Smith