This is the 12th and final Dennis Sciama lecture in the series, and first let me thank the
Sciama family, All Souls College, and the Physics Department for having sponsored it.
I thought I'd just read to you the names of the amazingly illustrious speakers we have had in
the past: Roger Penrose, George Ellis, Stephen Hawking, Julian Barbour, John Barrow,
Marek Abramowicz, Kip Thorne, Martin Rees, Tim Palmer, James Binney, Philip Candelas. And today
it's a great pleasure to have David Deutsch, and I'm going to invite Philip Candelas to introduce him.
So, David Deutsch did his degrees at Cambridge, where he read natural sciences and then
did Part III, and then he came to Oxford, where he came to work with Dennis Sciama.
He wrote a thesis on quantum field theory in curved spacetime, but I remember it was mostly
to do with acceleration in flat spacetime and also with Casimir energy.
And even before finishing his thesis, David went to Austin, to the University of Texas
at Austin, on what can best be described as a year's pre-doc, and there he talked with
John Wheeler, and this led to a series of visits over a number of years.
And during this time, David became interested in foundational issues of quantum mechanics,
and I must tell you what I must count as probably my greatest mistake, which is that one
day, riding the elevator with Steve Weinberg during one of the terms when David wasn't there,
Steve turned to me and said, "Where's this fellow David Deutsch?" And I said, "I think
he's gone off trying to understand the foundations of quantum mechanics."
So Steve looked at me, and we both sighed, and we thought that David had gone down a dark
road that others had trodden before and been lost to science.
How wrong we were. David has since been elected to the Royal Society, and I won't try to improve
on the citation. The citation says, David Deutsch laid the foundations of the quantum theory
of computation and subsequently made or participated in many of the most important advances
in this field, including the discovery of the first quantum algorithms, the theory of quantum
logic gates and quantum computational networks, the first
quantum error-correction scheme, and several quantum universality results. He has set the
agenda for worldwide research efforts in this new interdisciplinary field, made progress
in understanding the philosophical implications via a variant of the many-universes interpretation,
and made it comprehensible to the general public, notably in his book The Fabric of Reality.
David has since published a second book, The Beginning of Infinity: Explanations That Transform
the World. He was awarded the Dirac Medal of the Institute of Physics in 1998, the
International Quantum Communication Award in 2002, and the Edge of Computation Science Prize in 2005.
David is currently a visiting professor here in Oxford, and he is also working on constructor
theory. This is an attempt to generalize the theory of quantum computation to cover not
just computation but all physical processes. This will be part of the subject of his
lecture today. Thank you.

I am also grateful to the organizers and
to the Sciama family for giving me this opportunity to honor Dennis Sciama's memory. He was
my boss when I was a graduate student, and later as a postdoc here in Oxford. He was in charge
of theoretical astrophysics and astronomy, which is a branch of physics that is unusually
close to all the fundamental laws of physics: general relativity, nuclear and elementary
particle physics, thermodynamics, the foundations of quantum theory. Of course, fundamental
physics in any branch is about universal laws, but in astrophysics
and cosmology, explaining a single phenomenon can involve several, or all, of the fundamental laws.
Perhaps the first problem in physics that human beings ever tried to solve out of sheer curiosity
was the appearance of the night sky: why does it look as it does? It is remarkable
that even the crudest true explanation of that already requires all the fundamental laws
we know today. To explain the basic fact that it is black and not white requires general
relativity; the colours of the stars, thermodynamics; why they don't go out, nuclear physics;
and the aurora, and thunder and lightning, and many other phenomena, electricity and magnetism.
This confluence of fundamental laws in astrophysics and cosmology is a hint that there might
be a type of unity in nature that is deeper than the mere fact that there are universal laws,
namely that there might be a level of explanation of those laws. I first encountered that
idea from Dennis, but long before I even met him, because when I was at school I'd read
his popular book The Unity of the Universe. Here it is; this isn't the original copy
I read then, which was a library copy. So I borrowed its title for this talk.
Now, the subtitle is Man's Evolving View of the Cosmos from Ancient Greece to Mount Palomar.
I wasn't really interested in the history of cosmology, but I did love the book.
And a few years later, still before I'd met Dennis, I read it again and I was amazed
to find that it contained, among other things, a powerful advocacy of a false theory, the
steady state theory. That's the cosmological theory under which the universe is eternal:
it has always existed and will always exist in its present state on the cosmological scale.
That theory, I knew, had been comprehensively refuted by observations long before I read
the book, but I hadn't even noticed that the steady state theory was in the book. In other
words, what was in a sense the main thesis of this book had entirely passed me by when
I read it. And that was because it wasn't the main thesis. The real theme of this book was
what the title says, the unity of the universe. And that unity, as I said, wasn't just
the mere existence of universal laws. It was a unifying principle that would explain something,
not everything, but something, about why the laws of nature are as they are. The
principle in question in the book was a very natural guess for a cosmologist to make. They
called it the perfect cosmological principle. It said simply that the universe on cosmological
scales is homogeneous in time. That sounds good, because there was already an ordinary cosmological
principle that said it's homogeneous in space, and that one is true, as far as we still
know. To say a word about the term "principle": physics terminology isn't standardized in regard
to which laws of nature we call principles and which we just call laws. I'm using the term
"principle" specifically to mean a universal law about universal laws. So the perfect cosmological
principle placed a constraint on the other laws of physics. It didn't fully explain
them, but it would have constrained all the other laws of physics as they were
then known. So, for example, there had to be creation of matter out of nothing,
so that the density of the universe could remain constant even though it is expanding,
and the total entropy could remain constant even though stars were burning their fuel, and
so on. Now, despite being totally false, this principle has some desirable features that
make the steady state theory a good theory intrinsically. And the first of these, of course,
most famously, is that this principle made the theory highly falsifiable. To conform to it,
the laws of motion and the various parameters, the constants of nature, had to be just so.
All the constants of nature were going to have to be exactly right for things like galaxies
swirling through the intergalactic gas to cause density variations, which would result
in the formation of fresh galaxies of just the right size and type, containing stars of
just the right composition and so on, to reproduce, maybe a billion years later, all the
statistics that had existed forever. That makes the
principle itself hard to vary, which makes it an intrinsically good explanation. And one
consequence of that was that it was strongly falsifiable by observation. And indeed, it was
duly falsified. For instance, by cosmological standards light actually travels very slowly,
so that when we look out at something very far away, we're actually seeing how it was in the distant
past. So if the universe is homogeneous in space and time, then very distant vistas of the universe
should look very much like the universe looks here, or near here. And so astronomers looked,
and it didn't look the same. And then there was the famous discovery of the microwaves
pervading space. Microwaves don't last in an expanding universe; they get redshifted,
rather like the Doppler effect. And so, to maintain a steady state, they'd have to be replenished,
and to fix that up required ad hoc modifications so nasty that it made the theory a bad explanation
after all, especially as its rival, the Big Bang Theory, did have an elegant and hard to vary
explanation for the microwaves. Now, I should say I'm drawing a distinction here between
a true explanation, which means objectively corresponding to reality, and being an intrinsically
good explanation, which is a transient property depending on the state of other knowledge at the
time, it's the property, as I said, of being hard to vary while still accounting for the
things it purports to explain. The Big Bang theory and the steady state theory were both very good
explanations because they were both severely constrained by other good explanations and by evidence.
One of the two was false, which is why Dennis went to work on the other. But that idea,
that there are universal principles which at least partially explain
the universal laws, which in turn explain the phenomena, was not overturned by the observations.
Only the particular principle that had, as it were, auditioned for that role, the perfect cosmological
principle, had been overturned. Now, the second nice thing about the steady state theory
was the way in which it dealt with the initial conditions of the universe. Now, this may seem
like a technicality, but it isn't. It's quite fundamental conceptually. You see,
ever since the time of Galileo and Newton, the prevailing conception of how theories are supposed
to explain the world is that they provide laws of motion which, given the state of the world
at any one time, predict or retrodict it at any other time, or its probability at any other time,
but never mind that. The awkward fact is that while we have superb theories about what the laws
of motion are, we have never had a successful theory specifying the initial conditions.
And it's awkward because in the prevailing conception, the state of the universe,
what actually happens at all times and at any time, is the very thing that science sets out
to explain. So it's at least as fundamental as the laws of motion. We would like to explain it. In
the classic Big Bang theory, the initial conditions were that the state of the universe was spatially
homogeneous across an initial singularity that was causally extended, even though it was zero
in size. But that couldn't be exactly right, because if that were the exact initial state,
then nothing would ever happen. What starts exactly homogeneous stays exactly homogeneous under
the laws of motion. So there were various ideas: maybe quantum fluctuations spontaneously break
this symmetry. But there was never actually a viable theory that predicted the details of the
inhomogeneity, such as would lead to galaxy formation and the other things we observe. Roger Penrose had
the elegant idea that the Weyl curvature is zero at the beginning of the universe and maximum at
the end. But that doesn't seem to have been fruitful either. And today's prevailing theory,
which is called inflationary cosmology, is actually worse in that respect because it doesn't even
address initial conditions. That doesn't mean that inflation didn't happen. It just means that
it doesn't by itself solve the explanatory problem about the initial conditions that it was intended,
I think, to solve. You could always fudge this type of initial-condition question by resorting
to the anthropic principle, namely that there are lots of universes with all possible initial
conditions and that we're in one of the few in which astrophysicists exist to ask what the initial
conditions were. But if that were the only thing explaining the initial conditions, it would predict
that it's overwhelmingly likely that we're living in a bubble of order,
which is going to be snuffed out nanoseconds from now. And so it's refuted.
But there's another niggling problem, or I could say an extremely fundamental problem,
depending on your outlook, with the initial state being one of the basic explanatory ideas
from which other explanations ought to be derived. There's nothing in anything else that we
know about physics that singles out the initial conditions as being preferred;
all other instants are treated exactly equally. In fact, the idea that the initial conditions
are special in the scheme of things has uncomfortable echoes of a pre-scientific
conception of what the physical world even is. See, there's a moment of creation
before which the physical universe didn't exist. Then the initial conditions are set by something.
And then, beautiful laws come into operation from which everything that subsequently happens
emerges. No wonder some people took the big bang theory as vindicating creationism
while other people, for the same reason, didn't want the big bang theory to be true.
Well, the steady state theory would have elegantly solved all those problems at once.
Though it doesn't contradict the prevailing conception, it does radically augment it.
The state of the universe would now be deduced, at least in principle,
not from conditions at any preferred time, but from the perfect cosmological principle itself.
Since those would also be the conditions at every other time, no instant would be preferred,
and symmetry would not be broken; and even the size and character of the deviations from
homogeneity would have been determined by the principle. Nice, isn't it? But not true, as it turned out,
which brings me to the third inherently nice thing about the steady state theory.
The perfect cosmological principle introduces a new mode of explanation into physics,
which supplements the prevailing conception. It doesn't only relegate the initial conditions
to being a mere consequence rather than a fundamental principle. It also requires
that the laws of physics be fine-tuned to make a particular thing happen,
namely the steadiness of the steady state. In fact, that makes it much more fine-tuned than you might
think, because the steady state people were aware that their theory wouldn't work
if the process that reproduced the state over time were not also stable. If it were unstable,
say if a small deviation from ideal steadiness produced a larger deviation, let's say a million
billion years later, or a trillion years later, or 10 to the 100 years later, then after a certain
number of cycles the state would no longer have the steadiness property. So it had to be
that a small deviation would be reversed in due course; that's stability. They worked hard to construct
their cosmological model to have that property, but stability is not enough if you want to make
the universe eternal. However stable the state is to small changes, a large nudge will eventually
happen given the normal assumptions of statistical mechanics, and even quantum mechanical tunneling
would eventually have the same effect, and so the steady state would degenerate
into the far larger realm of states that evolve with time into other states, thus violating the
principle. So the perfect cosmological principle would have required the quantum state of the
universe to be exquisitely pruned to eliminate what Bryce DeWitt called the maverick universes
that would evolve to violate the principle at any time in the future. The specialness of that state
in the view of the opponents of the steady state theory, was just creationism: the entire
eternal universe created all at once in a state finely crafted to give the appearance of an
evolving, textured mixture of structure and randomness, but actually all along rigged to conform
to a certain ideal throughout its infinite extent. So here we had the proponents of two rival
theories, each accusing the other of, in effect, creationism, while other people were delighted
that their favorite theory brought meaning as they saw it back into physics, into the universe.
But this was all misconceived. Everyone was simply assuming that all fundamental explanations had
to be in the form prescribed by the prevailing conception, possibly with the addendum of the
perfect cosmological principle. Now, it is possible to object to the whole idea of principles in
nature, in that sense, laws about laws. Can't we confine ourselves to laws about phenomena?
Is it possible to restrict science to those and reject laws about laws?
Well, here's an object lesson. The principle of the conservation of energy started out as a
mere law about phenomena. In fact, less than a law: it was merely a mathematical theorem
of Newtonian mechanics. Initially it was just that, for a system of particles moving in space without
friction and with elastic collisions, the quantity half m v squared, summed over all the particles,
is a constant. We now call that the kinetic energy of the system. But the theorem
was known for centuries before the concept of energy was even conceived and it wasn't necessary
at that time. In the meantime, people realized that if you add that quantity to what we now call
the gravitational potential energy, which is minus G m1 m2 over r, then the result is a
constant even if the particles are gravitating. But it still won't be true if there's friction,
for example. But now that's a theorem of Newton's laws of motion and gravity.
It still has strictly no more content than those laws themselves. In fact, less.
And it applies only when those laws are the full explanation of what is happening.
Every prediction of Newton's theories can be made without any reference to energy,
without even knowing of the existence of energy or of its conservation. And in particular,
those theorems predict nothing about the content of undiscovered laws of physics.
And they didn't call it energy yet, but if they had, their energy would not have been
the energy that we know, nor is its constancy under those theorems, the conservation law that we know.
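As an aside of my own, not the lecture's: that Newtonian theorem, that the sum of half m v squared and minus G m1 m2 over r is constant for frictionless gravitating particles, is easy to check numerically. Here is a minimal sketch, with arbitrary illustrative masses and units:

```python
# A numerical check (a sketch, not part of the lecture) of the Newtonian theorem
# above: for two gravitating particles with no friction, the quantity
# sum(1/2 m v^2) + (-G m1 m2 / r) stays constant. All values are arbitrary.
import math

G = 1.0
m1, m2 = 1.0, 1.0
p1, p2 = [1.0, 0.0], [-1.0, 0.0]      # positions in the plane
v1, v2 = [0.0, 0.3], [0.0, -0.3]      # velocities chosen to give a bound orbit

def total_energy():
    r = math.dist(p1, p2)
    kinetic = 0.5 * m1 * (v1[0]**2 + v1[1]**2) + 0.5 * m2 * (v2[0]**2 + v2[1]**2)
    potential = -G * m1 * m2 / r       # note 1/r, not 1/r**2 (that's the force)
    return kinetic + potential

def accelerations():
    r = math.dist(p1, p2)
    # G*(p2-p1)/r^3 is the direction times G/r^2; scale by the other body's mass.
    gx = G * (p2[0] - p1[0]) / r**3
    gy = G * (p2[1] - p1[1]) / r**3
    return (m2 * gx, m2 * gy), (-m1 * gx, -m1 * gy)

e0 = total_energy()
dt = 1e-4
for _ in range(100_000):               # symplectic Euler steps: kick, then drift
    (a1x, a1y), (a2x, a2y) = accelerations()
    v1[0] += a1x * dt; v1[1] += a1y * dt
    v2[0] += a2x * dt; v2[1] += a2y * dt
    p1[0] += v1[0] * dt; p1[1] += v1[1] * dt
    p2[0] += v2[0] * dt; p2[1] += v2[1] * dt

drift = abs(total_energy() - e0)
print(drift)                           # small: the sum is numerically conserved
```

Adding a friction term to the update would break this conservation, which is exactly the gap the 19th-century heat term, described next, was invented to fill.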
But then, in the 19th century, after Count Rumford's experiments on cannon
that got hot while they were being drilled out, people guessed that if you take
that Newtonian scheme of summing half m v squared and minus G m1 m2 over r,
add to it the Newtonian work done, and then add an expression for the heat,
normalized with suitable units, then the total will now be conserved even if there is friction.
And now, you have something that isn't a theorem. It's not deduced from laws of motion or
indeed anything. It's a law of physics in its own right: the law of the conservation of energy.
And indeed, it couldn't have been deduced because the laws of motion that underlie frictional
processes were still unknown at the time. In fact, I'm not sure they're known today,
but if they are, they're quantum mechanical. And at that point, people twigged:
they realized that this new law, to make sense, had to be respected even by
as-yet-unknown forces and unknown substances. It was a law about laws, the principle of the
conservation of energy. And that is exactly when the term energy was invented.
So that's the law of the conservation of energy expressed with this in mind.
And the key word there is not energy. It's every. And when the theory of electromagnetism
was later invented, it did indeed conform to this principle, even though the
theory of electromagnetism was not yet known at the time when the principle was invented.
Furthermore, thermodynamics was born with several further principles about heat and work
and temperature and a new quantity entropy. And none of those principles were deduced from
laws of motion. In fact, many attempts have been made in the century and a half or so
since the inauguration of thermodynamics to establish a connection between
those laws and the laws of motion, or somehow to express thermodynamics within the prevailing
conception. And none have been satisfactory. They all involve fudges such as coarse graining
and infinite ensembles. And even the exact distinction between work and heat remains elusive to
this day. Then, of course, still later, in the 20th century, in the early study of radioactive beta decay,
when physicists added up the kinetic energies and the mc-squared energies in radioactive decays
and found that they didn't add up to a constant, Pauli and Fermi could guess that there was
just a hitherto unknown particle, the neutrino, for whose existence at first the only evidence was
that principle. And again, that version of the conservation of energy couldn't possibly be
regarded as a theorem whose premises were the neutrino's laws of motion and interaction,
because those laws were not yet known. It was just predicted that once they were known
they would be found to obey the principles of quantum mechanics which by now included the principle
of the conservation of energy. And so they did. And that prediction was an indispensable guide
to discovering those laws at all. So the rule of restricting science to laws about phenomena
and rejecting laws about laws is untenable. Note that that rule is itself a principle,
the anti-principled principle. And as we've just seen, it's false.
More generally, I think the whole purpose of theoretical science is to explain the world,
the physical world, and therefore the sole criterion by which theories ought to be judged
is their explanatory power. This rules out having preferences between modes of explanation,
if those preferences are independent of how good the explanations are;
how good they are should be what counts, the only thing that counts. So just as a scientific theory about
phenomena is much more than just an instrument of prediction of those phenomena,
much more than just a compressed summary of them, but is an explanation of them.
So a principle of nature is not just a statement of shared properties among theories,
it is an explanation of those properties. Now the prevailing conception is a principle too, isn't it?
And I believe it's just as false, as I'll explain in a moment.
So what other principles of nature might be true aside from those of thermodynamics that I've
mentioned? Well, both quantum theory and relativity are partly principles, in that in addition to
making direct predictions about phenomena, they also assert that all other laws of nature,
including each other, conform to certain principles, such as the principle that laws are formulated
in terms of geometrical objects, in the case of general relativity, and the principle of unitarity,
in the case of quantum theory. And as those two examples illustrate, we shouldn't expect there to be
a rigid hierarchy of principles with ordinary laws subordinate to principles.
But we should expect that the immense explanatory power of some of our best theories
implies that if they are true of some physical systems, they must be true of all of them.
I think it was Feynman who called this, in the case of quantum theory, the totalitarian property
of quantum theory. Bryce DeWitt proved the same thing, I think independently:
that if any system in the universe is governed by quantum theory,
then no system that could interact with that one could obey classical laws of motion.
Relativity isn't quite as totalitarian as quantum theory, but its principles do seem to be
inconsistent with those of quantum theory, so presumably one or both of them must be superseded,
something that we couldn't know unless we regarded those two theories as principles.
Otherwise, for example, they might just apply to different phenomena.
Now, the way in which DeWitt, in particular, proved the totalitarian property,
I'm not sure how Feynman did it, is quite significant from my present perspective.
He used the so-called uncertainty principle of quantum mechanics, horribly misnamed,
and he assumed for the sake of argument that it did only apply to quantum systems,
and that classical objects could exist in nature too, and could interact with some quantum system.
And then he showed by an ingenious set of arguments that by making certain measurements,
one could violate the uncertainty principle, not just for the combined system
but for the quantum subsystem itself. So, loosely speaking, quantum theory either applies to everything or to nothing.
And the reason that that mode of proof is significant to me now
is that it uses the uncertainty principle in the form such and such a class of tasks is impossible.
And if a certain physical object is possible, the classical object,
then a further process would be possible that would lead to a contradiction.
And therefore, if the principle is true, the classical object isn't possible.
He expressed the proof in the prevailing conception, but it's barely used as such.
You see, saying that a given task is impossible, in the sense that the uncertainty principle does,
means not just that it doesn't happen, but that it can't be caused to happen by anything,
can't be caused even with the help of anything else:
even things not explicitly referred to, even things not yet known.
So a while ago I proposed a new mode of explanation, or rather a theory,
the one that Philip referred to, called constructor theory, which I hope will incorporate this new mode of
explanation, and which is intended eventually to supersede the prevailing conception,
though the two are inter-translatable in many cases.
The first principle of constructor theory is this:
the laws of physics are expressible entirely in terms of statements about which
physical transformations it's possible to cause to happen, which are impossible to cause,
and why. So this is about transformations that are caused by something,
some agent which is itself not specified, any agent,
except that this agent must itself be possible.
And there's a condition that the agent retain its ability to cause the transformation again;
otherwise it's only partly an agent and partly a patient.
A chemical catalyst is an example of such an agent:
it causes chemical reactions but does not participate in them.
By the way, we're told in elementary chemistry classes that it doesn't cause
chemical reactions, it just changes their speed, but that is not the case.
It converts things to have different temperatures and so on, but it itself stays the same.
So is a computer. We call these agents, generically, constructors.
By "possible to cause" we mean possible with arbitrary accuracy.
That is, if a task is possible, then you give me an epsilon and someone
could design a constructor which causes that task to happen with accuracy epsilon or better.
And "impossible" means that the laws of physics exclude the possibility that anyone could ever produce
such a design, or that the laws of physics rule out the existence of such an agent, such a constructor.
So there are no probabilities in the constructor-theoretic conception of the world.
Tasks that look probabilistic, like building a fair roulette wheel,
are expressed in terms of preparing it in a specified quantum mechanical state with a given
accuracy.
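To illustrate, here is a toy sketch of my own, not constructor theory's actual formalism: it models only the catalyst-like condition just described, an agent that causes a transformation of a substrate while itself remaining unchanged and able to act again.

```python
# Toy sketch (my own illustration, not constructor theory's formalism): an agent
# that causes a transformation on a substrate while itself staying unchanged,
# like a chemical catalyst, so it can cause the same transformation again.
class Constructor:
    """Causes a transformation of a substrate; is not consumed by doing so."""
    def __init__(self, rule):
        self._rule = rule          # the transformation this agent can cause
        self.uses = 0
    def perform(self, substrate):
        self.uses += 1             # the agent keeps working, use after use
        return self._rule(substrate)

# A constructor for the task {unbaked -> baked}:
oven = Constructor(lambda s: "baked" if s == "unbaked" else s)
results = [oven.perform("unbaked") for _ in range(3)]
print(results, oven.uses)          # the same agent causes the task repeatedly
```

The only point being modelled is the reusability condition: a transformation counts as caused by a constructor only if the agent retains its ability to cause it again; otherwise, as the lecture says, it is partly a patient rather than an agent.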
So while the prevailing conception seeks to distinguish, at a fundamental level, what happens
from what doesn't happen, in which case possible and impossible are just a manner of speaking
about certain approximations or about our ignorance, in constructor theory it is the other
way around. The laws of nature are about what's possible and impossible, in the sense I've
just described, and what actually happens is in general an emergent consequence of that. Sometimes
it can be calculated, in which case constructor theory and the prevailing conception are
equivalent, but sometimes it can't be calculated, either because it's intractable or for some more
profound reason. And in those cases, constructor theory can express exact laws that are inaccessible
to the prevailing conception. One important case of the latter is the initial conditions of the universe.
In constructor theory, they are supposed to be incalculable consequences of laws about
what is possible and impossible.
As I said, you wouldn't expect there to be a fundamental law specifying the state at any time
other than the initial time, such as today, including all the locations of all the cows in
Oxfordshire that were auctioned today; you wouldn't expect the state of those cows to have
fundamental significance. So why expect it of the initial state, especially as
this violates symmetries that exist everywhere else in physics?
And with that constructor-theoretic perspective, we can begin to notice that there
are already other principles of nature that are known, but are not usually acknowledged
as such, nor even acknowledged as being part of physics at all, simply because they don't
fit the prevailing conception. There are the principles of the theory of computation, for example.
The distinction between computable and non-computable functions doesn't refer to what the
computer is made of. We expect it to be the same for any make or model or technology of general-purpose
computer, even ones using laws of physics or materials not yet discovered.
So it's a principle, difficult or impossible to express in the prevailing conception.
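As an illustration of my own, not from the lecture: the substrate-independence of that distinction rests on arguments like the classic diagonal proof that no program, running on any hardware whatsoever, can decide halting. A sketch of the diagonal construction:

```python
# Sketch (my illustration): the diagonal argument behind uncomputability.
# Assume `halts(f, x)` decides whether program f halts on input x; then the
# program built below defeats it, so no such decider exists on any substrate.
def make_trouble(halts):
    def trouble(f):
        if halts(f, f):
            while True:            # decider said "halts", so loop forever
                pass
        return "halted"            # decider said "loops", so halt at once
    return trouble

# Refuting a candidate decider that always answers "never halts":
always_no = lambda f, x: False
t = make_trouble(always_no)
print(t(t))                        # prints "halted", contradicting always_no(t, t)
```

The argument never mentions what the computer is made of, which is exactly why the computable/non-computable boundary behaves like a principle rather than an ordinary law about particular substrates.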
But in some work that my colleague Chiara Marletto and I have done,
we have shown that there is a beautiful expression of this principle in constructor theory.
And this is in the context of a full constructor theoretic information theory
in which processes like computation and quantities like information are characterized
in elegant, exact terms, that is, in constructor-theoretic terms: in terms of what
classes of physical transformation it's possible or impossible to cause.
This new theory of information, which I commend to you all, unlike Shannon's existing theory,
naturally includes quantum information and predicts all its strange and distinctive properties,
such as the impossibility of cloning a quantum state,
and the famous unpredictability of quantum measurement despite its deterministic law of motion.
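On the impossibility of cloning, here is a small numerical sketch of my own, not from the lecture, of the standard linearity argument: a linear map that copies both basis states of a qubit necessarily fails to copy their superposition.

```python
# Sketch (my illustration) of the linearity argument against cloning.
# One-qubit states are pairs (a, b) meaning a|0> + b|1>; two-qubit states are
# length-4 amplitude lists over the basis |00>, |01>, |10>, |11>.
from math import isclose, sqrt

def kron(u, v):
    # Tensor product of two one-qubit states -> two-qubit amplitudes.
    return [u[0]*v[0], u[0]*v[1], u[1]*v[0], u[1]*v[1]]

def clone_basis(state):
    # The unique LINEAR map sending |0>|blank> -> |00> and |1>|blank> -> |11>:
    # by linearity, a|0> + b|1>  ->  a|00> + b|11>.
    a, b = state
    return [a, 0.0, 0.0, b]

plus = (1/sqrt(2), 1/sqrt(2))
linear_result = clone_basis(plus)     # (|00> + |11>)/sqrt(2): an entangled state
true_clone = kron(plus, plus)         # (|00>+|01>+|10>+|11>)/2: an actual clone
same = all(isclose(x, y) for x, y in zip(linear_result, true_clone))
print(same)                           # False: linearity forbids universal cloning
```

Since quantum dynamics is linear, no physically possible process can clone an arbitrary unknown state, which is the impossibility statement that the constructor-theoretic information theory expresses directly.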
And Marletto has also used constructor theory in a biological application,
to characterize what precisely it is about the laws of physics that permits the origin and
evolution of life. Among other things, the apparently non-physics concept of the appearance of
design, which was coined, I think, by Richard Dawkins, has an exact definition in constructor-theoretic
physics. The result is that, regardless of the so-called fine-tuning coincidences
in the constants of physics, the laws of physics do not in fact have the appearance of design.
The laws of thermodynamics, which I've mentioned already, have some
existing constructor-theoretic formulations, like "you can't build a perpetual motion machine
of the first kind or of the second kind" and so on. But these are considered vague and hand-waving
in the prevailing conception. Another example: you can't convert heat entirely into work without
side effects. But if this further work by Marletto pans out, it would revolutionize the
foundations of thermodynamics, because, with slightly different versions of the first and second
laws from the ones we know, it would express those known hand-waving formulations exactly, and would
provide an exact characterization of the distinction between work and heat, and hence of entropy,
without coarse graining, without a distinction between macrostates and microstates, without
ensembles: just constructor theory. The basic reason that constructor theory can work
that sort of magic is that it abstracts away the constructor, like the theory of catalysis
in chemistry, which, as I said, is another example of an existing constructor-theoretic
theory. It's not the whole process, constructor and all, that is declared to be possible or impossible in constructor
theory; it's just the transformation itself. So the constructor, which is usually the macroscopic
part of the process, is abstracted away, and this is what makes constructor theory a natural
vehicle for expressing scale-invariant laws and substrate-invariant laws about quantities like
information and heat and work, exactly. Like the perfect cosmological principle,
which had to be developed into the sophisticated steady state theory, constructor theory will
have to be developed quite a bit more before we can derive testable predictions from it.
But unlike the steady state theory, constructor theory has already provided a significant
unification and illumination of fundamental matters in diverse areas of physics and beyond, as I
said. And I think this already makes it a substantial step towards the unity that Dennis was looking for.
And so to mark the closing of this wonderful series of lectures, I've been asked to present