Comments on the Digital Universe Idea
To begin -- Kunio has asked me to think hard about the ideas of a Prof. Nakagomi,
who has a mathematical framework for trying to understand physics. That framework
is part of the general tradition which tries to imagine the universe as a kind of digital system
or "its from bits." It is related to the tradition of people who have tried to understand the universe more as a "great mind, not
a great machine." Greg Bear's novel Moving Mars is one especially fun product of that tradition.
The history and psychology of these ideas is fascinating, and similar in many ways to the history
of religion, but I will try to resist discussing those topics here. Instead, I will try to focus
on the question in the subject line of this email: i.e., what can we really learn here from the
limited empirical data we have to work with? And I will try to write in plain language as much as I can.
Also, I will carry this story further than it has been carried before anywhere.
Leaving aside imagination, leaving aside the data of first person "mystical" experience, and even leaving
aside totally unexplained anomalies in physics --
What can we actually learn about the underlying dynamical laws which govern our universe, FROM
the empirical data which underlie the most well-verified theories of physics today?
The three such theories which really confront empirical data, which are not a matter of sheer invention, are:
(1) Einstein's general relativity (GR), a theory of gravity; (2) quantum chromodynamics (QCD), sometimes called
the "quark model," our best unified understanding today about strong nuclear interactions; and
(3) electroweak theory (EWT), a relatively recent unification of Maxwell's laws for electric forces and magnetism, together with
the behavior of the electron and the weak nuclear forces. The latter two theories, taken together,
are called "the standard model" of physics. The theories about gravity, electrons, electric force and magnetism
have been tested thousands of times over; they are the foundation of a lot of our technology, and we use these theories
every day in practical work funded by my Division at NSF. But it is very difficult to calculate what QCD actually
predicts for most experiments, and some aspects of EWT are also hard to test; thus we really can't be sure about
some of those details. People who work with practical nuclear physics often use "phenomenological models"
which have large prediction errors and may not always be related to what QCD would predict.
Even in the realm of electromagnetic systems -- the elevated theorists of physics have not always
assimilated what the practical, empirical people have learned about quantum measurement and related effects.
That's all just a starting point.
Now.... So far, I have yet to see ANY of the "its from bits" or "digital universe" models
ACTUALLY displaying, or usefully proposing, a path to being able to match this basic empirical data.
Without that, it's a kind of philosophy or hope, not a theory. Hopes are fine... but without
at least a STRATEGY to achieve a theory... it's not so real. What's more, we would need
a theory with some hope of explaining the data better or on simpler assumptions than the
alternative "analog" theories (like GR itself).
So far as I know, the CLOSEST that anyone has actually come to achieving this goal
is Wolfram's "New Kind of Science," chapter 9. The whole book is "freely" available on the web --
with registration, which requires cookies. So I borrowed it from the library yesterday,
in order to study more closely what it promises -- and what the hopes really are of
creating a useful "digital" model.
Wolfram actually visited NSF for a day awhile back. The book is better in a way, and worse in a way,
than I would have guessed. Wolfram has also discussed his ideas with a more mainstream reviewer:
The book also reminds me of some of the reviews I have seen of a recent
popular book by Laughlin of Stanford. Laughlin is an extremely serious physicist as
well as a Nobel Prize winner, with a strong sense of empirical reality. He has been
a great help to NSF at times. According to the reviewers, Laughlin's new book
is a manifesto for "emergentism" against "reductionism." In effect, he says: Nothing we humans have
ever studied has ever turned out to be the deepest level of nature, yet.
Maybe everything we see at the deepest level of physics today is nothing but a collection
of EMERGENT phenomena -- no more fundamental than the waves of the ocean are fundamental.
We should remember that the same kind of emergent phenomena COULD BE the result
of any one of millions of possible models of what lies underneath. We really have very little knowledge at all
of what the deepest model could be, because there are so many millions of possibilities.
And so -- Wolfram argues that his kind of "causal network automaton" (CNA?) can reproduce the
predictions of GR, as a kind of emergent statistical result. He claims he can get the same predictions
as GR, from a simpler and purely digital model. That sounds very exciting at first, but...
At NSF, Wolfram said "I can reproduce the flat-space special case of GR..." That I found very puzzling.
The flat-space version would be basically the same as special relativity -- so why even
CALL it GR?
But in fact ... the book is far more interesting and promising than that.
To explain this further... to nonphysicists... I have to make an analogy and even explain the analogy a bit.
The GR model belongs to a class of nonlinear dynamical systems called partial differential equations (PDE).
PDE are extremely common in science and engineering. When we visited a
museum a week ago, Christopher was very excited by the displays on how to design an airplane... but
then he noticed the Navier-Stokes equations posted above one of the displays. "What are THEY, daddy?
I don't understand THAT kind of equation. Do you? Can you explain them?" It was an interesting challenge.
They are a system of PDE made up of... five equations or so. They are used to describe and predict all
kinds of normal fluids... the air which gives the airplane lift and drag, the hot flame which gives it thrust, the flows
around the body of the airplane and in the engine... To explain the idea, someday I will show him a very SIMPLE PDE system,
the heat equation (a single equation), which is relatively easy to understand.
If you try to PREDICT the future temperature, in each point on some kind of a disk or plate... you can use the heat equation
to make accurate predictions, if you know three things: (1) the PRESENT temperature at each point, to start;
(2) the FLOWS of heat (or laws governing that flow) at the BOUNDARY or EDGE of the plate; and (3) the
one unspecified "parameter" of the heat equation -- the CONDUCTIVITY of the material that the plate is made of.
(If the plate is not made of one material, or has variations in shape, a more complicated form of the heat equation is needed,
and one has to know all that extra information.) It is well-known that the heat equation is an EMERGENT law,
which results from microscopic collisions of molecules and the like.
In fact... even if there is no source of heat WITHIN the plate, only on the edges... there is a lot of physics involved
in figuring out how heat flows FROM the external sources, and interacts, within the plate.
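To make those three ingredients concrete, here is a minimal numerical sketch (my own toy illustration, not taken from any of the works discussed): a one-dimensional "plate" of eleven points, where we know (1) the starting temperatures, (2) the fixed temperatures held at the two edges, and (3) the conductivity of the material.

```python
# Toy 1-D heat equation solver, by explicit finite differences.
# Hypothetical illustration: a thin rod with its two ends held at fixed temperatures.
import numpy as np

def evolve_heat(u, conductivity, dt=0.1, dx=1.0, steps=100):
    """Advance u_t = k * u_xx, holding the boundary values fixed."""
    u = u.copy()
    for _ in range(steps):
        # discrete second derivative at the interior points
        lap = u[:-2] - 2.0 * u[1:-1] + u[2:]
        # (3) the conductivity is the one free parameter of the law
        u[1:-1] += conductivity * dt / dx**2 * lap
        # (2) the edges u[0] and u[-1] are never updated: fixed boundary
    return u

u0 = np.zeros(11)                   # (1) the PRESENT temperature, everywhere
u0[0], u0[-1] = 100.0, 0.0          # hot left edge, cold right edge
u = evolve_heat(u0, conductivity=1.0, steps=2000)
# the temperatures settle toward a straight-line profile between the edges
```

Run long enough, the interior "forgets" its starting state entirely; only the boundary data and the conductivity determine the final profile, which is exactly the point about needing those three pieces of information.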
WOLFRAM's simplified version of GR basically gets all the complex, nonlinear dynamics of GR right
(he claims)... BUT it does not account for SOURCES (or boundaries). It is NOT so simple
as the heat equation (or a linear wave equation, which is similar). It does include those special
features which make GR what it is. But it does not have anything at all to explain sources.
Now let me try to be more precise. GR, like the heat equation, is actually
a SINGLE-equation model. I believe that it can be written as:
    R = c T
where c is a parameter, and where R and T are 4-by-4 matrices that vary as a function of space
and time. (There are several objects called "R" -- a scalar, a matrix, and a 4-index tensor, and
stuff like that -- but you don't need THAT much precision here and now!).
R represents the CURVATURE of the fabric of space-time. T represents the density of energy and momentum at
the same point in space-time. In English, one might describe Einstein's equation by saying
"Energy/momentum bends the fabric of space and time. The amount of bending at any point
in space-time is exactly proportional to the amount of energy and momentum at that point."
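For reference, the standard textbook tensor form of that single equation (a well-known result, not something argued for in this note) is:

```latex
% Einstein's field equation: curvature on the left, energy-momentum on
% the right, with a proportionality constant built from G and c.
\[
  R_{\mu\nu} \;-\; \tfrac{1}{2}\, R\, g_{\mu\nu}
  \;=\; \frac{8\pi G}{c^4}\, T_{\mu\nu}
\]
```

In the loose "4-by-4 matrix" language used here, the whole left-hand side plays the role of the curvature matrix, and the constant 8*pi*G/c^4 plays the role of the single parameter.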
Wolfram claims that he gets GR exactly for the case where T=0. And he says
that the CNAs he uses to get the same results are much simpler than the whole
apparatus of GR -- the ideas used to DEFINE what "R" is and the ideas used to actually
solve the Einstein equation. But is this really simpler? It is a matter of taste.
Since all we observe IS a bending of space-time, near massive sources of gravity,
we can say that GR is the most direct statement of what we SEE empirically,
without throwing in extra details. But yet the CNA models really
are simpler in some way, and they are promising as the start of a whole new way to understand nature.
The proper scientific method demands that we consider multiple models IN PARALLEL,
if they all fit empirical data as well as each other.
But... the Kurzweil review does point out some legitimate problems.
The reviewer states that Wolfram CLAIMS he can more or less PROVE that his
results about the Ricci scalar imply that he can do what I just described. Wolfram
told the reviewer that he HAS the calculations somewhere in a notebook. But they
aren't in the book.
The reviewer also says that Wolfram totally screws up the need to reconcile the model
with quantum mechanics. In fact, GR itself is not a quantum field theory. Quantum field theories (QFT)
are a family of dynamical systems closely related to PDE, but very distinct in many ways.
In summary -- Wolfram's version of GR **COULD** be important, as an alternative to GR, if two problems
were fixed: (1) the "unpublished calculations" need to be published (or discovered!);
and (2) the T sources need to be understood and modelled. Even then, that leaves us
totally high and dry about how to unify GR OR CNA with EWT and QCD, both of which ARE
quantum field theories. **IF** Kunio is truly sincere in his interest
in the digital universe, or has friends who are, they might consider beginning with the first of these tasks.
Now -- I believe I see a way to solve (2) and the unification problem both. It is
even more speculative, in its way, than Wolfram's GR/CNA, but I think it should
be viable -- the best hope at present for a digital universe model to actually
match known empirical physics. I do not **BELIEVE** in a digital
universe; physicists should not be BELIEVERS. But we should do justice to the
possibility! And we should learn what we can from the exercise.
In essence, what Wolfram is REALLY telling us can be interpreted through the lens of Laughlin's
vision. He is showing us how all kinds of LOCAL dynamics of CONNECTIVITY networks
tend to level out curvature, in much the same way as heat flows level out temperature.
Laughlin discussed a kind of GENERAL principle. Here, we have a more specific case...
a KIND of emergent behavior (levelling out of curvature) which results from ANY model,
digital or analog, which generates space-time as an emergent result of space-time
connections which are actively rewired, and which obey certain very broad conditions for
locality and balance. A more general theorem should be possible to that effect, a theorem about
emergent properties of a large class of systems.
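A toy computation makes the claimed emergent behavior vivid (this is my own sketch of the general idea, NOT Wolfram's actual causal network rules): put an arbitrary "curvature-like" number on each node of a network, and apply any purely LOCAL averaging rule. The differences level out, just as heat flow levels out temperature.

```python
# Toy illustration of emergent leveling on a network (hypothetical rule,
# not Wolfram's CNA): repeated LOCAL averaging flattens a node quantity.
import random

random.seed(0)
N = 50
# a ring network plus a few random extra links ("fluctuating connections")
neighbors = {i: {(i - 1) % N, (i + 1) % N} for i in range(N)}
for _ in range(20):
    a, b = random.sample(range(N), 2)
    neighbors[a].add(b)
    neighbors[b].add(a)

value = [random.uniform(0.0, 1.0) for _ in range(N)]  # "curvature-like" quantity
for _ in range(2000):
    # purely local rule: move each node halfway toward its neighborhood average
    value = [0.5 * value[i]
             + 0.5 * sum(value[j] for j in neighbors[i]) / len(neighbors[i])
             for i in range(N)]

spread = max(value) - min(value)
# after many local updates, the quantity has leveled out: spread is tiny
```

Notice that almost ANY local, balanced rule of this general type would give the same flattening; that is exactly the Laughlin-style point about millions of underlying models sharing one emergent behavior.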
BUT.. what about the SOURCES of curvature? What about the standard model?
For GR, I have just tried to describe general lessons from Wolfram. For the standard
model (EWT+QCD), I would draw similar lessons from the work I just posted at:
The key property of EWT and QCD is that they are both "gauge models."
(By the way, I just bumped into an old book on my shelves called
Classical Fields: General Relativity and Gauge Theory, by Moshe Carmeli.
I wonder whether I should read it now?)
It seems that the "gauge fields" (like electricity and magnetism!) EMERGE FROM
TOPOLOGY. And really, the classic paper by Wilczek and Zee that I discuss
was the first to spell out equations that lead to this understanding.
In fact, PARTICLES like electrons and protons, I claim, can be explained most easily,
in an objectively conservative (even mainstream?) fashion, as "topological solitons."
So that is where the mass, the source term for gravity and the foundation of the standard model,
come from. SO LONG AS OUR UNDERLYING MODEL
has a concept of "topological charge" (a TWISTING of fields into a kind of knot
in space-time!), gauge fields and the main characteristics of the standard model fall
out as a consequence.
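The simplest version of "topological charge" can even be computed in a few lines (a minimal sketch of the standard winding-number idea, not of any specific soliton model): take a phase field defined around a closed loop, and count how many net times it winds around the circle. A field twisted n times cannot be smoothly untwisted; the integer n is the conserved charge.

```python
# Minimal sketch of topological charge: the WINDING NUMBER of a phase
# field sampled around a closed loop.
import math

def winding_number(phases):
    """Count the net number of full turns of a list of angles around a loop."""
    total = 0.0
    for i in range(len(phases)):
        d = phases[(i + 1) % len(phases)] - phases[i]
        # take the principal branch: the short way around the circle
        d = (d + math.pi) % (2.0 * math.pi) - math.pi
        total += d
    return round(total / (2.0 * math.pi))

M = 100
untwisted = [0.1 * math.sin(2 * math.pi * i / M) for i in range(M)]  # charge 0
twisted   = [2 * (2 * math.pi * i / M) for i in range(M)]            # charge 2
# winding_number(untwisted) -> 0, winding_number(twisted) -> 2
```

No smooth local deformation of the field can change that integer, which is why such "twists" behave like conserved, particle-like charges.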
Wolfram halfway realized this. For TWO-DIMENSIONAL "causal networks," he proposed crossover
lines as a way to introduce a kind of topology. But that only works in 2D. And he didn't get very far with it.
To be "opportunistic" -- our best chance of getting to a KIND of digital model of the universe
as soon as possible (and our best hope of getting to a PURE digital model someday) may be as follows.
Relax the absolute purism of Wolfram's model. Still treat space-time as an emergent property...
but ADD a "wheel" to every node or arc in the network. The "wheel" would just be a set of
vectors or tensors taken from something like the unit sphere. (If the sphere happened to be
very simple -- just a circle, the sphere in two dimensions -- then the new state variables we are adding to the system would
really look like the setting of a little circular "clock" floating over each node.) The update
laws would have to be a kind of hybrid of analog and digital; they would depend on the clock settings.
But -- the combination of TOPOLOGICAL CONSTRAINT (fields defined over circles or spheres instead
of infinite lines) WITH fluctuating connections...
THAT combination is what defines what we ACTUALLY KNOW from the empirical data!
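Here is a toy version of that hybrid "clock network" (my own hypothetical illustration, not a proposal from Wolfram or anyone else): each node on a ring carries an analog clock ANGLE, updated by a local rule that turns it toward its neighbors. A field twisted once around the ring relaxes smoothly, yet its integer winding, the topological charge, survives every local update.

```python
# Toy "clock network" (hypothetical): analog angles on digital nodes,
# local updates, with a conserved integer twist around the ring.
import math

N = 60
# one full twist around the ring, plus a smooth perturbation to relax away
theta = [2 * math.pi * i / N + 0.3 * math.sin(5 * 2 * math.pi * i / N)
         for i in range(N)]

def step(theta, rate=0.2):
    """Each clock turns toward its two neighbors, the short way around the circle."""
    new = []
    for i in range(len(theta)):
        pull = 0.0
        for j in ((i - 1) % N, (i + 1) % N):
            d = theta[j] - theta[i]
            d = (d + math.pi) % (2 * math.pi) - math.pi  # principal branch
            pull += d
        new.append(theta[i] + rate * pull)
    return new

for _ in range(2000):
    theta = step(theta)

turns = round(sum(((theta[(i + 1) % N] - theta[i] + math.pi) % (2 * math.pi)) - math.pi
                  for i in range(N)) / (2 * math.pi))
# the perturbation smooths away, but the twist cannot: turns is still 1
```

The two ingredients of the sketch mirror the two ingredients claimed above: the update rule is local (the GR-like, leveling part), while the circle-valued state carries a topological invariant (the standard-model-like part).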
Indeed, as Laughlin says, millions upon millions of theories may result in predicting what
we have seen as an emergent behavior... but WHICH millions of theories, out of the universe
of trillions of possible theories, most of which would NOT reproduce what we have seen in physics?
For the most part... theories which adapt connectivity in ways which reproduce GR (LIKE Wolfram's
or like many others), AND WHICH embody topological constraints, are what we need.
There are a few other "filters" here, constraints on the structure of possible underlying theories,
but not so many.
Also -- by starting from millions of possible theories, we may be in better shape to use
NEW empirical data -- beyond or inconsistent with the standard model or GR -- to
develop and test new alternative theories based on actual data. That goes very far!!!
Actually, in hep-th 0505023, I am very proud to have sketched out what I view
as something that CAN become the FIRST mathematically well-defined model
of physics that actually unifies GR and the standard model -- and at
the same time fulfills Einstein's vision of a PDE model that reflects
OBJECTIVE REALITY. But... that's one example. It is essential
to prove it out... and then to widen it to many alternatives, as per the above.
The Kurzweil reviewer of Wolfram argues that Wolfram totally futzes
on the treatment of quantum mechanics.
That is only half-true. Wolfram's version of CNA relies heavily on an idea
I called "hypertime" in my 1973 paper (which Penrose's Road to Reality cites).
That structure is quite enough to fit the backwards time interpretation of quantum theory,
elaborated on in my various arxiv.org papers, which is enough to take care of all
those paradoxes. (Though some of my discussions with Yanhua Shih
I never did publish yet; they are in my electronic files... the PRINCIPLES
are in the open papers, but some of the concrete details for
how they work out in specific experiments need to be written down someday.)
All for now. Maybe all for awhile. These issues might be life-and-death issues
in the long-term, but the world is drowning right now in so MANY life-and-death issues,
no exaggeration... and others also need my attention. Who knows?
Best of luck...