Publications New and Old
My views of reality have evolved a lot, as I have probed the limits of our knowledge further and further. For convenience, I am using this older version of my web page (reflecting older understanding) for now as an annotated list of publications, with discussion of some issues I will not have time to update today. I hope to update it before too long – though there is also some new work I may do first.
Because this is a very complex subject, because you all start out in different places, and because I see an unconventional way to put together the very complicated mathematical concepts of mainstream physics, this web page will start out by describing the general picture as I see it, in informal language.
Those who prefer more precision may want to start out with: New in April 2008: Paul J. Werbos, Bell’s Theorem, Many Worlds and Backwards-Time Physics: Not Just a Matter of Interpretation, International Journal of Theoretical Physics, online publication 2 April 2008. (Hardcopy in press; click on “pdf” to see the full article.) That article begins by explaining how the usual “Copenhagen” version of quantum mechanics, taught as truth to most physicists today, is grossly inconsistent with what we now know from experiments. It also serves as a kind of introduction to new possibilities for a more concrete unified model of physics, grounded in experiment and without requiring imaginary additional dimensions of space. Those of you who prefer to start out at the axiomatic level might be interested in a more formal, concise summary of what lies behind the proposed class of models, updated to July 2006.
At the United Nations University conference on consciousness in 1999, the organizer, a Japanese physicist, asked us all to start out with a very brief haiku-like summary of what is different about our viewpoint. I said:
“Reality is strange but real.
We need more color in our lives and less in our quarks.”
Today I would add:
“Quantum field theory (QFT) says that objective reality does not exist.
But objective reality says that QFT does not exist (mathematically).
Only if both ghosts reconcile and rise above the flat earth can either one become solid.”
This path has taken me to visit and study very closely much of the work of the great experimentalist, Prof. Yanhua Shih. Shih’s experiments with quantum optics – “Bell’s Theorem” experiments, quantum teleportation, quantum lithography and so on – are perhaps the most complete and accurate in the world. I have often wondered: how can people talk about quantum measurement for years and years, and profess to be experts, without taking a very long and hard look at the best empirical data – in its full depth and richness – about how quantum measurement actually works in the experiments which tell us the most about it?
As I look ahead… I see a possible path which starts out in the lowlands of much more rigorous mathematical methods than those now used in high-energy physics, but leads up ahead to future peaks – obscured somewhat by clouds and uncertainties – of serious possibilities for enormous technological breakthroughs (and hazards) after we develop our fundamental understanding further. It also allows for the possibility that the greater universe – full of force fields we have yet to fully understand – may be far stranger, in qualitative terms, than everyday common sense seems to suggest. Just how strange could it be? Just to begin… go to What is Life?… or consider the backwards time interpretation of quantum mechanics which I first published in 1973, and have refined a great deal since then (and even since 2000).
Circa 1900, the conventional wisdom in physics said: “We already know it all, with only a few very small holes to fill in. These holes in our understanding are so small that they could not possibly have technological implications…” We have less reason to believe that today than we did then, and look what happened in the twentieth century! If we can avoid the historic tendency of aging civilizations to become jaded and slowly repress creativity and diversity in myriad ways, the objective opportunities look very exciting.
Today’s high-energy physics has tended to become very myopic in its focus on large accelerators and collisions between individual particles. Certainly these accelerators are an important tool in research, but we should never forget the larger question: what are we missing by not paying more attention to large-scale many-body effects for anything but electricity and magnetism? Is it possible that the elevated realms of grand unified field theory have something to learn from the humble literature of people who build lasers and chips? Is it even possible that new technologies could emerge from that way of thinking?
First, some basic facts of life for the nonphysicist.
Today’s best empirical understanding of how the universe works comes from two very different pieces: (1) the “standard model of physics” (EWT+QCD), which is a specific model within the large class of models called “quantum field theory” (QFT); and (2) Einstein’s general relativity (GR), which is an example of what people now call “classical field theory.” GR predicts gravity, while the standard model predicts almost everything else now known, with two main exceptions: (1) there are some strange phenomena called “superweak interactions,” known for decades, and recently studied closely by SLAC’s “BaBar” project, but still not understood; and (2) the predictions of QCD are hard to calculate for most of empirical nuclear physics, forcing the use of “phenomenological models” for most empirical work.
There are several truly beautiful quantum theories – so beautiful that many theorists believe “they have to be true” – about how to reconcile the standard model with GR. The most popular are superstring theory (and its n-brane variations) and loop quantum gravity. They are as elegant, as popular, as authoritative and as beautifully argued as Aquinas’s “proofs” of the existence of God. But there is no empirical data at all yet to support these theories (though people have made efforts to try to find such data, and are still looking).
Even before the first successful QFT was developed, Einstein, Schrödinger and de Broglie argued that quantum mechanics had taken a wrong path. Einstein argued that the intricate complexities of quantum mechanics could perhaps be explained as an emergent statistical outcome of a more fundamental “classical field theory” operating over space-time. In such a theory, the entire state of reality can be specified in principle by specifying the state of a finite number of “fields” (force fields) over space-time; each field is basically just a nice, continuous, differentiable function, defined over Minkowski space. The “law of evolution” of these fields is a set of local relationships, called “partial differential equations” (PDE) or “wave equations.” My first heresy is that I believe that Einstein was right after all – and, furthermore, that I can see where to find a set of PDE which can do the job. (A minimal illustrative example of such a field theory is sketched just after the list below.) The key papers which explain this are, in chronological order:
The Backwards-Time Interpretation of Quantum Mechanics - Revisited With Experiment
Realistic Derivation of Heisenberg Dynamics
Equivalence of Classical Statistics and Quantum Dynamics of Well-Posed Bosonic Field Theories
Proof of Partial Equivalence of Classical and Quantum Dynamics in Bosonic Systems
A Conjecture About Fermi-Bose Equivalence
August 2006: Back to Reality: A New Synthesis
(A more intuitive, “magazine style” version of a more concise paper.)
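To make the “classical field theory” picture concrete before going further: here is a minimal sketch – standard textbook material, my illustration rather than the specific PDE proposed in the papers above. Take a single real scalar field φ(x,t) in one space dimension, with Lagrangian density

\[ \mathcal{L} = \tfrac{1}{2}\,\partial_\mu \varphi\, \partial^\mu \varphi \;-\; \tfrac{\lambda}{4}\,\bigl(\varphi^2 - v^2\bigr)^2 . \]

The resulting nonlinear wave equation,

\[ \partial_t^2 \varphi - \partial_x^2 \varphi = -\lambda\, \varphi\, \bigl(\varphi^2 - v^2\bigr), \]

admits the finite-energy, stable “kink” solution

\[ \varphi(x) = v \tanh\!\bigl( v \sqrt{\lambda/2}\;\, x \bigr) \]

– a localized, particle-like lump built entirely out of a continuous field. Soliton-bearing PDE of roughly this character are the kind of objects the papers above put to work.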
Is all of this just a matter of philosophy? Does it make any difference to empirical reality? Consider, for example, the following question: what will happen if we find really new experimental setups, different from what has happened by accident already in the atmosphere, which can produce small black holes? (Several major labs are spending money on major efforts to do just that.) According to Hawking, and according to superstring theories which rely on Hawking-style approximations, the black holes will simply evaporate away very quickly. But a unification based on Einstein’s approach would probably support the original prediction from the Einstein school – the prediction that a small black hole would gradually grow over a few thousand years, hidden away inside the earth, and then suddenly gobble up the earth in an unforeseen catastrophe rather similar to the comic book story about the planet Krypton. Does it really matter whether the entire planet earth might be gobbled up? To some of us it would. This is only a hypothetical example, but it would be nice to know what the true story is.
More seriously – years ago I started to prove and publish theorems showing that the statistics which emerge from “classical” PDE are more or less equivalent to those implied by the dynamic laws of quantum mechanics. (For a more precise statement, click on the papers above.) When I did so, there were two groups of people who reacted very differently. One group said that this was utterly impossible and utterly crazy. (No, they didn’t suggest any flaw in the logic; it was essentially a group loyalty kind of thing.) The other big and established group in physics said “Oh, we already knew that.” The second group were the people who knew how to build real technologies with lasers, with quantum optics. Classical-quantum equivalence turns out to be absolutely essential as a practical mathematical tool in building a host of modern devices like lasers – devices which exploit many-body effects. What I had discovered was really a kind of easy generalization and (more important) a new way of using that kind of mathematics – but the same principles were in regular use every day, an essential tool in the exploitation of many-body effects to generate large physical effects from things which might seem very small. The easiest standard introduction to classical-quantum equivalence can be found in chapter 4 of Quantum Optics, by Walls and Milburn, but a more rigorous version appears in Chapter 3 of Howard J. Carmichael’s book on statistical physics; the material is also discussed in chapter 11 of Optical Coherence and Quantum Optics by Mandel and Wolf, after a long discussion of practical semiclassical methods.

If people can now produce X-ray lasers and atom lasers… who knows what else we can do? Could we do things that do not happen by chance in nature with any significant probability… like breaking down the “mass gap” that prevents us from converting protons and neutrons directly into energy? Or like refining nuclear physics in the way we refined quantum optics years ago, based on empirical data and an open mind? But I do hope that anyone who explores this will keep in mind the paragraph above, and be careful. Becoming famous won’t help you much if we all end up dead.
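For the curious, the simplest concrete form of this classical-quantum equivalence – the one presented in the Walls and Milburn chapter just cited – is the Glauber-Sudarshan P representation. The density matrix ρ of a field mode is expanded over coherent states |α⟩:

\[ \hat{\rho} \;=\; \int P(\alpha)\, |\alpha\rangle\langle\alpha|\; d^2\alpha , \]

so that every normally ordered quantum expectation value reduces to a classical-looking statistical average over the complex field amplitude α:

\[ \bigl\langle (\hat{a}^\dagger)^m \hat{a}^n \bigr\rangle \;=\; \int P(\alpha)\, (\alpha^*)^m \alpha^n \; d^2\alpha . \]

Whenever P(α) is a legitimate probability density, the quantum statistics of the mode are literally the statistics of an ensemble of classical field amplitudes.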
Many physics courses today begin with a long list of reasons why Einstein could not possibly have been right. In many cases, however, the list is intended to motivate introductory study of quantum mechanics, rather than to explain what is really going on. For example, many would say “classical physics is the limiting case where Planck’s constant (h) goes to zero; however, since h is not zero, we know CFT must be wrong.” But in fact, when h goes to zero, we end up with a physics in which the entire universe is made up of exact point particles – not at all the same as the continuous universe Einstein was proposing! (A standard sketch of this limit appears just after the link below.) Likewise, many rely heavily on intuitive understanding of Heisenberg’s uncertainty principle, which isn’t really part of the underlying mathematics of QFT; an experiment which brings that out is posted at:
Experimental realization of Popper's Experiment: Violation of the Uncertainty Principle?
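A standard way to see the h-goes-to-zero point made above (my sketch, not taken from the paper linked here): in Feynman’s path-integral formulation of ordinary quantum mechanics, the amplitude for a particle to travel from x_i to x_f is

\[ \langle x_f, t_f \,|\, x_i, t_i \rangle \;=\; \int \mathcal{D}[x(t)]\; e^{\,i S[x]/\hbar} , \]

and as ħ → 0 the rapidly oscillating phases cancel everywhere except where the action is stationary, δS = 0 – that is, on the classical trajectory of a point particle. The limit recovers point-particle mechanics, not the continuous-field universe Einstein proposed.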
Some physicists now argue: “My brain cannot imagine a universe in which causality does not always flow forwards in time. That is hardwired into all of our brains.” But that is exactly the same as what the Cardinals were saying back at the time of Copernicus. People said: “The direction down is hardwired into our very bodies and minds. Therefore, it must be the same direction in all places in the entire universe. Therefore the world must be flat. In any case, our minds are designed to assume this, and cannot possibly learn to live with any other point of view.” Wrong, wrong, wrong. There is no more reason for the arrow of time to be a universal invariant at all levels of nature than there is for the direction “down” to be. And the human brain is perfectly capable of learning to make predictions based on a more powerful model. (Still, I highly recommend Huw Price’s work on “overcoming the old double standard about time,” cited in the links above – though Price himself recommends a more recent URL.) The true underlying story here is as follows: many, many humans would rather insist that the larger universe of objective reality does not exist at all than admit that it does not slavishly and universally follow an anthropocentric coordinate system (for “down” or for “future”). These points are described in more than enough detail, for now, in the links above.
During 2002, I had extensive discussions with Yanhua Shih and with Jon Dowling, and later with Huw Price, about the concrete explanation of these experiments.
The Most Urgent Next Task in Basic (Mathematical) Research
As of 2005, I saw the most important work needed in basic physics to be basic mathematical work. A very quick summary:
1. When high-energy physics demands that a theory of the universe be renormalizable, it is demanding, in effect, that the theory make sense as a perturbative expansion about the free-field case.
2. The price of this unreasonable demand is that large coupling constants are ruled out – and hence so are all those bosonic field theories which generate solitons, the very theories which are capable of being mathematically meaningful without renormalization! In order to avoid the weird ad hoc assumption that God somehow intervenes to gobble up the infinite energy of self-repulsion that comes with point particle models (the classic computation behind this is sketched after this list), and in order to have a truly well-posed mathematical model, we need a new approach to axiomatic theory and a nonperturbative foundation. (Others like Arai have said this much, but more people need to listen.)
3. Many physicists would prefer to find an axiomatic formulation of QFT which starts from the nice clean elegant picture of Streater and Wightman. But this has not really worked, for a variety of reasons. Perhaps it would be more realistic to start from the original, canonical version of QFT (as in Mandl and Shaw), and show how that can constitute a well-posed nontrivial dynamical system. Some ideas on those lines are given in A Conjecture About Fermi-Bose Equivalence. To get back to elegance, we can then exploit the quantum-classical equivalence relations, which turn out to be mirror images and generalizations of mappings previously discussed by Glauber and Wigner, widely used in quantum optics (the Wigner mapping is recalled after this list).
4. Even at the classical level, the most well-studied wave equations tend either to generate singularities (blow-up and ill-posedness) or else to be “trivial,” in the sense that energy dissipates away into infinitely weak radiation. The only real exceptions are models which generate something like solitons – more precisely, models which generate true or metastable “chaoitons,” as discussed in New Approaches to Soliton Quantization and Existence for Particle Physics. Thus, for a nontrivial well-defined QFT, this classically-motivated approach is the most promising available.
5. A careful study of Walter Strauss’s monograph suggests that a proper starting point is to prove similar inequalities – like the Hölder and Sobolev inequalities (recalled after this list) – for quantum mechanical expressions, using normal products, starting from an initial state defined by some smooth initial density matrix with compact support.
6. In New Approaches to Soliton Quantization and Existence for Particle Physics, Ludmilla and I had some difficulty in finding a counterexample to the conjecture by Makhankov, Rybakov and Sanyuk that stable “solitons” can only exist in classical field theories which are “topologically nontrivial” (a notion made concrete after this list). I now suspect that such an example could be constructed simply by considering a bound state of a pair of ’t Hooft or ’t Hooft/Hasenfratz patterns, which are acceptable in CFT only when bound together (because of boundary conditions). Particles constructed in this manner should be able to model fermions, in a very general way, and thus the entire standard model.
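For point 2, the “infinite energy of self-repulsion” is the classic self-energy of a classical point charge – standard textbook material, sketched here for reference. Giving the charge q a small radius a, the energy stored in its own Coulomb field is

\[ E_{\text{self}} \;=\; \frac{\epsilon_0}{2} \int |\mathbf{E}|^2 \, d^3x \;=\; \frac{q^2}{8\pi\epsilon_0} \int_a^{\infty} \frac{dr}{r^2} \;=\; \frac{q^2}{8\pi\epsilon_0\, a} \;\longrightarrow\; \infty \quad \text{as } a \to 0 . \]

A true point particle carries infinite electrostatic energy; renormalization subtracts this infinity rather than explaining it, while a soliton model spreads the charge over a finite region and keeps the energy finite.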
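For point 3, the Wigner mapping referred to has, in its simplest single-particle form, the standard definition

\[ W(x,p) \;=\; \frac{1}{\pi\hbar} \int_{-\infty}^{\infty} \langle x+y|\, \hat{\rho}\, |x-y\rangle\; e^{-2ipy/\hbar}\, dy , \]

a quasi-probability density over classical phase space whose marginals reproduce the exact quantum position and momentum distributions. The Glauber P representation sketched earlier plays the analogous role for field modes; the equivalence relations in the papers above generalize mappings of this family.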
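For point 5, the classical inequalities in question are, for reference (standard statements; the proposal is to prove analogues for quantum expectation values built from normal products):

\[ \|fg\|_{L^1} \;\le\; \|f\|_{L^p}\, \|g\|_{L^q}, \qquad \frac{1}{p} + \frac{1}{q} = 1 \qquad \text{(Hölder)}, \]

\[ \|u\|_{L^{p^*}(\mathbb{R}^n)} \;\le\; C(n,p)\, \|\nabla u\|_{L^p(\mathbb{R}^n)}, \qquad p^* = \frac{np}{n-p}, \quad 1 \le p < n \qquad \text{(Sobolev)}. \]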
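For point 6, “topologically nontrivial” can be made concrete with the kink of the example field theory sketched earlier (my illustration). Its topological charge

\[ Q \;=\; \frac{1}{2v}\,\bigl[\varphi(+\infty) - \varphi(-\infty)\bigr] \;\in\; \{-1, 0, +1\} \]

is conserved because a configuration with Q ≠ 0 cannot be deformed continuously into the vacuum without passing through states of infinite energy; that is what stabilizes the soliton. The Makhankov-Rybakov-Sanyuk conjecture says stability requires such a nonzero charge; the proposed bound-state counterexample would be stable while carrying none.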
Of course, the issues and experiments on quantum measurement discussed in the previous section are also important. As of July 2006, I see some exciting opportunities to actually use this kind of mathematics more directly in empirically-oriented nuclear physics, as noted in a recent concise summary, and in a more “magazine style” paper in press.