In 2018, doctors at the New York University School of Medicine announced that they had discovered a whole new organ in the human body, naming it the interstitium. The interstitium is a dense network of interconnected tissues and fluid-filled sacs found throughout the human body. It had gone undiscovered because the science of anatomy was based on the dissection of cadavers and, when a person dies and is dissected, the fluid empties out of the sacs; previously medical researchers had misidentified the remnants of these sacs as tears resulting from biopsy methods. It was only because of new technologies enabling researchers to peer minutely into the interiors of human bodies while the humans were still alive that researchers came to the extraordinary conclusion that they had discovered a whole new human organ. As I understand it, scientists are still unsure what the function of the interstitium is, although it has been proposed that it partly acts like a system of shock-absorbers preventing damage to other parts of the body when people are moving around.
I first learnt about the interstitium through the British TV show QI. If I have readers who are unfamiliar with QI, I advise them to watch it on Youtube if they can't get it on ordinary TV. It used to be hosted by the inimitable genius Stephen Fry and is now hosted by Sandi Toksvig, who is also Quite Interesting. QI specialises in reporting incredibly surprising facts and speculations. The main point of the show is to reveal comedically how wrong everyone often is about things they think they know. For instance, did you know that the Immaculate Conception doesn't refer to Mary's visitation by the Holy Ghost but rather, according to official Catholic doctrine, to Mary herself having been conceived without Original Sin? I didn't until earlier this week. It was an episode of QI hosted by Stephen Fry that first introduced me to Rupert Sheldrake. The other thing QI often does is show that, contrary to the idea that science has explained everything, there are mysteries from the trivial to the profound that scientists still haven't solved. For some reason, I find this very heartening, perhaps because it helps fortify me in pursuing one of my life goals, my project of showing that the whole of psychiatry is bullshit. The only small drawback of QI is that because it often reports on the latest scientific findings, it may be promulgating errors that will not stand up to the test of time.
The main purpose of this essay is to discuss something QI gives little time to, perhaps because it seems too complicated: quantum mechanics. I want to try to explain a part of quantum mechanics in such a way that it is clear to people with only a slight understanding of physics, and at the same time to present, in rudimentary form, a whole new way of approaching it. I first wrote about quantum physics many years ago in two posts called "Probability and Schrodinger's Cat" and "Probability and Schrodinger's Cat Part 2" and have since written a number of posts either directly or indirectly concerning probability, a key aspect of quantum theory. If I have a radical proposal to make, it may be precisely because I am not an expert: I have thought about this issue philosophically rather than mathematically. It may be that the problem with becoming an expert is that doing so involves accepting the orthodox position and so discourages independent thinking.
In the previous essay I discussed the current war in Gaza and before I discuss quantum physics I need to make a couple of corrections. In the previous post I said that prior to October 7, Israel was pursuing a normalisation agreement with Egypt. In fact it was close to making a deal with Saudi Arabia, a deal in which the Saudis would have officially recognised Israel as a state. I made this stupid mistake because I didn't factcheck my memory. I also said that 1400 Israelis lost their lives on October 7 because that was the number reported by all the news networks at the time; we now know it was closer to 1200. My previous essay was a little stylistically unclear at one point because I had difficulty articulating exactly what I meant, but I hope my readers understood what I was trying to say. Since writing the previous essay I have continued to research the Israeli-Palestinian conflict and considered writing a full follow-up to it, replying to a podcast by Sam Harris and an interview I saw between Piers Morgan and Douglas Murray. I decided that rather than write something about it again myself it would be better to simply direct readers to two podcasts by Ezra Klein, both available on Youtube, "An Intense, Searching Conversation with Amjad Iraqi" and "What Israelis Fear the World Does Not Understand". The first is an interview with an Arab Israeli who would like to imagine a world without 'ethno-nationalism' and the second with a Jewish Israeli who is a self-described Zionist. Both are reasonable left-leaning people who just happen to have quite different perspectives. I will say one brief thing, not about the present war in Gaza but about the Israeli-Palestinian conflict more generally. The crux of the matter seems to me to be this. Palestinians want the Right of Return – that is, Palestinians in Gaza and those who have been (and are currently being) displaced in the West Bank, as well as the millions of refugees living in neighbouring Arab states, want to return to the homes their forebears fled or were forced from in 1948 and, I think, 1967, homes that are now inside Israel. Israeli Jews want a Jewish state and in order for the state to be Jewish, they believe it needs to have a Jewish majority. In the previous essay I was a little flippant about this fundamental issue. What needs to happen is some kind of compromise. Of course there are horrible right-wing people on both sides who hate and want to kill their opponents but these people should be excluded from the conversation.
In the previous essay I also discussed the Second Law of Thermodynamics and I want to say something more about it before diving into quantum physics, something I implied but didn't say directly. The Second Law states that entropy always increases or, to put it helpfully but perhaps a little misleadingly, that isolated systems tend to become increasingly disordered as time goes on. If we consider very, very small systems, however, such as a system consisting of a couple of protons and electrons, the Second Law of Thermodynamics doesn't apply. This is because we can't meaningfully say that one state of a microscopic system is more or less ordered than another. In fact even the term 'heat', a term that is very important to thermodynamics, doesn't really make sense when considering such microscopic systems. If we imagine a slightly bigger system, the Second Law applies weakly; it becomes stronger and stronger as we consider larger and larger systems. The Second Law applies almost absolutely when we consider macroscopic systems, such as a box full of air, a coal-driven locomotive, or even perhaps a person. It is easy to forget the difference in scale – something like a pen contains an unimaginably large number of atoms. At the human scale the Second Law does seem like an absolute law because macroscopic systems are so vastly larger than microscopic ones. This raises an interesting philosophical question. In physics (although not in other sciences like biology) it is common to talk about Laws of Nature, where a law is a rule that the universe follows without exception. If we use this rough definition, the Second Law of Thermodynamics is not really a law at all because exceptions occur, perhaps even very occasionally at the macroscopic scale. Something else comes to mind. One of the most important laws of physics is that energy is always conserved, but it has long been recognised that, as quantum and particle physics seems to imply, this law is violated in particle interactions, although these 'perturbations' are so small and brief that we don't need to worry about them unduly. The law is obeyed in the long run. When we consider all this, Rupert Sheldrake's proposal that the Laws of Nature are not really laws at all but actually habits seems far less outrageous.
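For readers who want the formula behind those brief 'perturbations', the standard way of expressing the loophole is the energy-time uncertainty relation:

```latex
\Delta E \, \Delta t \gtrsim \frac{\hbar}{2}
```

Roughly speaking, an energy fluctuation of size ΔE is permitted provided it lasts no longer than about ħ/(2ΔE), which is why the violations are both small and brief.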
There is a second issue with thermodynamics that I implied but didn't explore. According to Wikipedia, the entropy of a macrostate is defined as the Boltzmann constant multiplied by the logarithm of the number of possible microstates associated with that macrostate (assuming that each microstate is equally probable). The problem here is that neither the term macrostate nor the term microstate has a clear definition. When thinking about this over the last few days, I became worried that the Wikipedia entry was simply wrong. In 2005 I did a fairly comprehensive course in first-year physics (not second-year as I said in the previous post). We covered thermodynamics, special relativity and a reasonable amount of quantum physics but not general relativity or, in any depth, the Standard Model. Although I still possess the textbook dealing with quantum physics, I lost the textbook containing the discussion of thermodynamics a long time ago and so have been unable to check whether Wikipedia is peddling disinformation. From memory, I seem to recall the textbook saying that we cannot meaningfully talk about the absolute value of a system's entropy but only about changes in entropy.
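For the record, the Wikipedia definition in symbols is Boltzmann's famous entropy formula (the one engraved on his tombstone):

```latex
S = k_B \ln \Omega
```

where Ω is the number of microstates compatible with the macrostate and k_B is about 1.38 × 10⁻²³ joules per kelvin. Note that, taken at face value, this formula does assign an absolute entropy to a system, which is exactly why the definition of 'microstate' matters so much.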
In the nineteenth century, when the concept of entropy was first formulated, the change in entropy of a system was defined as the heat entering or leaving that system divided by the absolute temperature of that system. If we combine this definition with the Second Law, we can explain why, if a hot object is placed in contact with a cooler object, the first will cool down and the second warm up and they will together tend towards a common equilibrium temperature. The nineteenth-century understanding of thermodynamics was replaced by a statistical theory, formulated I think by Boltzmann. In 1900, in fact, physicists were divided as to whether the Second Law was a hard law or a statistical law – Max Planck himself, I believe, thought it was a hard law. The statistical view won out (even though Lawrence Krauss gives the impression of still believing it to be a hard law). According to the statistical theory, it is actually possible for the hot object to get hotter and the cooler object to get cooler but this is very, very unlikely. My understanding is that we can derive the nineteenth-century theory of thermodynamics from the statistical theory: the older theory 'emerges' from the statistical theory when we consider systems of larger and larger sizes. However, as I suggested above, I wonder if the statistical theory is itself adequately defined.
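Here is the hot-and-cold-objects argument as a quick worked example (the temperatures are illustrative). Suppose a small amount of heat Q flows from a hot object at 400 K to a cooler one at 300 K. The total change in entropy is

```latex
\Delta S_{\text{total}} = \frac{Q}{300\,\text{K}} - \frac{Q}{400\,\text{K}} > 0
```

so heat flowing from hot to cold increases total entropy and is permitted by the Second Law; the reverse flow would make the total negative, which the Second Law forbids – or, on the statistical view, makes overwhelmingly improbable.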
And now at last I turn to quantum physics. The first idea in quantum physics I need to discuss, because the following discussion won't make sense without it, is 'wave-particle duality'. Hopefully my readers have at least heard of this idea. At the human level we tend to think in particle terms: we think of a rock as composed of particles and, when we throw it, imagine it to behave like a really big particle that obeys Newton's laws. We also know about waves such as waves on the sea or sound waves. At high school we learn that light is made up of waves, specifically electromagnetic waves. Waves and particles seem like two entirely different types of thing. However at the quantum level we can't easily tell the two apart. When we consider something like an electron or a photon, sometimes it is better to describe it as a wave and sometimes it is better to describe it as a particle. It depends on the situation. Everything actually possesses this wave-particle duality, not just electrons and photons, but for some reason this 'wave-particle duality' isn't apparent in the macroscopic world. I won't attempt to fully explain why it is not apparent here, though the little calculation below at least indicates the scale involved. What is important is that in the following discussion I will talk about particles having associated waves, even though this is not exactly right, because it will make the discussion easier.
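Every object has an associated De Broglie wavelength, λ = h/p, where h is Planck's constant and p the object's momentum. Plugging in some illustrative numbers for a thrown rock (1 kg moving at 10 m/s):

```latex
\lambda = \frac{h}{p} = \frac{6.6 \times 10^{-34}\,\text{J s}}{(1\,\text{kg})(10\,\text{m/s})} \approx 6.6 \times 10^{-35}\,\text{m}
```

This is absurdly small – many orders of magnitude smaller than an atomic nucleus – whereas an electron in a typical laboratory setup has a wavelength comparable to the size of an atom. That is at least a hint as to why wave behaviour shows up for electrons but not for rocks.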
The waves associated with particular particles, such as electrons, generally obey the Schrodinger equation. (If the particles are moving very fast, relativistic effects come into play and we need to use the Dirac equation instead, an equation more general than Schrodinger's and formulated later.) Interestingly you can't derive the Schrodinger equation from prior assumptions or axioms. When Schrodinger came up with it in 1925 it seems to have just popped into his head. (Of course he had been thinking intensely about the new science of quantum mechanics for a while.) Schrodinger's equation is a wave equation. Schrodinger himself didn't know how to interpret it – because waves are spread out in space he thought that an electron must also be spread out in space. In 1926 Max Born offered the interpretation that has been a core part of quantum physics ever since: the wave function should be interpreted in terms of probability. The greater the amplitude of the wave at a particular point in space and time, the greater the probability of finding the associated particle there. (Mathematically, it is a bit more complex. The probability that a particle will be found in a particular region is the absolute square of the wave function integrated over that region.)
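In symbols, for a single non-relativistic particle of mass m in a potential V, the Schrodinger equation and Born's rule read:

```latex
i\hbar \frac{\partial \psi}{\partial t} = -\frac{\hbar^2}{2m}\nabla^2 \psi + V\psi,
\qquad
P(\text{particle found in region } R) = \int_R |\psi|^2 \, dV
```

The first equation governs how the wave function ψ evolves; the second is Born's probabilistic interpretation of it.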
There is no way I can in this essay discuss all of quantum physics. One important, in fact foundational, idea in quantum physics, for instance, is 'quantisation'. If a particle like an electron is subject to boundary conditions, if it is for instance stuck in a box from which it cannot tunnel out, it takes on one of a discrete set of energy levels (the standard formula is given below). This contrasts with classical physics, in which an electron in a box can have any energy at all and energy levels are continuous. I sense that there is a conceptual puzzle here but again I will not dive into it. My purpose here is simply to describe one particular example of quantum mechanics and show that it leads to an extraordinary conclusion.
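For the idealised case of a particle of mass m confined to a one-dimensional box of width L with impenetrable walls, solving the Schrodinger equation gives the textbook result:

```latex
E_n = \frac{n^2 \pi^2 \hbar^2}{2mL^2}, \qquad n = 1, 2, 3, \ldots
```

Only these discrete energies are allowed, whereas classically the particle could have any energy whatsoever.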
The phenomenon I wish to talk about is diffraction. Basically, diffraction is the term we use for the fact that light, like other waves, bends around corners. This bending, in the case of light, is generally so slight from a human-level perspective that ordinarily we just assume that light travels in straight lines, but diffraction has been known about since at least the eighteenth century. If we shine a beam of light, from a reasonable distance away, on a very small aperture (an aperture comparable to or smaller than the wavelength of the light) and then examine where it lands on a screen placed some distance away on the other side, we find that the beam spreads out. It forms a band known as the central maximum and, on either side, a number of smaller bands known as secondary maxima. Until the idea of photons was proposed (in 1905 by a then little known German Jew working in a patent office in Bern), almost all the evidence, including the phenomenon of diffraction, suggested light was a wave; with quantum physics it became possible to describe the beam of light as, rather, an absolutely enormous number of photons. This is now where we start to get quantum. It is possible today to shoot a single electron or a single photon at an aperture and then observe where it lands on the screen. Even though we are, in a sense, dealing with a single particle, there is still a wave associated with that particle, a wave which we can call a probability wave. This wave, theoretically, diffracts and forms a central maximum and secondary maxima on the screen just as a beam would.
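The situation can be played with numerically. The sketch below is my own toy illustration (the slit width and wavelength are made-up values, and I use the standard Fraunhofer single-slit intensity formula as the probability distribution). It fires simulated photons one at a time: each lands at a single definite point, yet the histogram of many arrivals reproduces the central maximum and the secondary maxima:

```python
import numpy as np

a = 2e-6             # slit width in metres (assumed value)
wavelength = 500e-9  # 500 nm, green light (assumed value)

def intensity(theta):
    """Relative probability density for a photon arriving at angle theta,
    from the single-slit Fraunhofer formula (sin(beta)/beta)^2."""
    beta = np.pi * a * np.sin(theta) / wavelength
    # np.sinc(x) = sin(pi*x)/(pi*x), so sinc(beta/pi) = sin(beta)/beta
    return np.sinc(beta / np.pi) ** 2

# Fire photons one at a time, using rejection sampling: each photon
# arrives at one definite angle, drawn from the diffraction distribution.
rng = np.random.default_rng(0)
arrivals = []
while len(arrivals) < 10_000:
    theta = rng.uniform(-0.5, 0.5)        # candidate arrival angle (radians)
    if rng.uniform(0.0, 1.0) < intensity(theta):
        arrivals.append(theta)

# Crude text histogram: the pattern emerges from individual, definite hits.
counts, edges = np.histogram(arrivals, bins=40)
for count, left in zip(counts, edges):
    print(f"{left:+.3f} rad | {'#' * (count // 25)}")
```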
But now, finally, we arrive at the Measurement Problem. When this experiment is actually, rather than theoretically, performed, the very human scientist in the real world doesn't observe a wave forming on the screen; he or she observes the electron or photon arriving at a specific point on the screen. The wave function, as I hope I explained above, describes the probability of finding the particle at a particular point – before the measurement is carried out. When the scientist actually performs the measurement and observes the particle arriving at a particular point, it seems he or she must come up with a new estimate of the probability of the particle being found there. In fact, because it is actually found there, the scientist must say that the probability of it being found there is 100 percent. Any other conclusion seems to me to be incoherent. (I will briefly mention one such incoherent conclusion, the Many Worlds interpretation of quantum physics, later in the essay.)
In the previous paragraph, in attempting to clearly describe the Measurement Problem, a problem that has bedevilled physicists for a hundred years, I may have done something extraordinary – I may have strongly gestured towards a (partial) solution. The rest of this essay will involve fleshing out this hint. The solution involves thinking about the nature of probability in a different way than people often do. This is something I have written about a number of times and I'll give some examples which might make my still very tentative theory a little clearer.
Suppose we have a generic shuffled pack of cards without jokers and we ask, "What is the probability of Bob drawing the Queen of Hearts from the top?" Because we lack any information about the order of the cards, we should (rationally) assume that the probability is 1/52. We should assume this because every possible card order is, presumably, equally probable. Suppose Bob draws the Queen of Hearts and we see this. We now have new information. If we now ask, "What is the probability that Bob drew the Queen of Hearts?" we can say, "100 percent". However it seems that we can also ask the question, "What was the probability of Bob drawing the Queen of Hearts?" People, I believe, will usually say, "1/52", because they will consider what they knew about the pack prior to Bob drawing the card and will exclude from the model or picture in their minds of the situation the fact that Bob actually drew the Queen of Hearts. If Jane wins Lotto, people often say, "The chance of Jane winning was one in a million." However in a sense the probability that Jane won is 100 percent because it actually happened. The reason people say her chance of winning was one in a million is that they consider very general ideas about how lotteries work and ignore the fact that Jane actually did win. One last example. Suppose you have a friend who says, "The probability that Donald Trump won the 2016 election is 67 percent." You're entitled to say, "You're insane. Donald Trump actually won the election. The probability that he won is 100 percent." Your friend's claim only makes sense if she has reasons for thinking Donald Trump might not have won and can mathematically quantify her uncertainty. If, however, your friend says, "The probability that Donald Trump won the 2016 election was 67 percent," it seems she is saying something different. She is saying that, based on the information she had back in 2016 and excluding information she found out later (in particular the fact that Trump actually won), this is her rational estimate of the probability of his winning. Any estimate of probability is subjective in the sense that it is made by a person based on information he or she possesses, information that is always incomplete except when we are dealing with certainties. (In talking about probability in this manner I am, by the way, implicitly assuming determinism.)
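The card example can be made mechanical with a trivial sketch (entirely my own illustration); the point is just that the 'is' and 'was' questions condition on different bodies of information:

```python
from fractions import Fraction

# Probability relative to our information BEFORE the draw: we know only
# that the pack is a shuffled 52-card deck, so every card is equally likely.
p_before = Fraction(1, 52)

# Bob draws and we SEE the Queen of Hearts. Relative to our information
# AFTER the draw, the question "is it the Queen of Hearts?" has one answer.
observed_queen_of_hearts = True
p_after = Fraction(1) if observed_queen_of_hearts else Fraction(0)

# "What WAS the probability?" asks us to evaluate p_before, i.e. to
# deliberately exclude the observation from what we condition on.
print(p_before, p_after)  # 1/52 1
```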
The purpose of this essay is not to fully spell out a theory of what probability is but to talk about quantum physics, although I will implicitly be invoking the ideas I have thought about and suggested in the previous paragraph. I'll go back to the diffraction experiment again. When many of us think about experiments like the diffraction experiment, we imagine that what happens is that the electron or photon is a wave until it is observed on a screen, at which point it somehow suddenly turns into a particle. This makes it seem that the wave function disappears. I used to believe something like this myself. Physicists and others, myself included, have talked about 'wave function collapse'. This term seems to make sense because, before the measurement, the wave function was spread out on the screen but when we observe it, we seem to find a particle arriving at a single point. It seems to have collapsed. However we need to bring in another idea from quantum physics, the Heisenberg uncertainty principle, which I hope readers have heard of. This principle says that if you have very great certainty about the position of a particle, you lose information about its momentum. This means that if the scientist measures with great accuracy the arrival point of the particle, she necessarily loses information about its momentum; its possible momenta are smeared out in a wave-like fashion. To put it simply, even after the measurement is performed, there is still a wave function associated with the particle. The wave function does not disappear when the measurement is made; rather it changes. I need to take this one step further. Before the measurement was made, there was one wave function associated with the particle and after the measurement is made there is another. But when we perform the measurement, the wave function the particle must have had before the measurement – when it left the emitter, passed through the aperture and sped towards the screen – must also change. When one performs the measurement, one must retroactively change the particle's past wave function. If the wave function is a real thing then when we perform the measurement, the past changes. If we want to assume that the past is fixed, then we must suppose that the wave function is not really a real thing. What I am arguing is that the wave function describes what we can know about a particle. When we perform the measurement we are not changing the past but rather changing our knowledge not only about the present and future of the particle but also about its past history. And yet, by gaining some new information about the particle, we have lost other information.
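The uncertainty principle in its standard form:

```latex
\Delta x \, \Delta p \geq \frac{\hbar}{2}
```

Pinning down the arrival point means a very small Δx, so the spread Δp in momentum must be large: the post-measurement wave function is narrow in position but wide in momentum, which is why a wave function survives the measurement rather than disappearing.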
Last year I wrote an essay, "Chance and Necessity", for a course in classical and medieval philosophy, an essay I published on this blog. In it, I drew on an essay by J. M. E. McTaggart and talked about the A Theory and the B Theory of time. We can consider the idea I presented above in similar terms. Suppose tomorrow someone proves that Francis Bacon was the real author of all the plays hitherto attributed to Shakespeare and everyone comes to believe this. We could make a bizarre argument that this person has changed the past. Alternatively we could take the common sense view that it is not the past that has changed but our understanding of the past. Similarly the common sense view should be that when we make a measurement we do not change the past but rather our knowledge of it. However it is when we combine this common sense view with quantum physics that we arrive at some seriously radical conclusions.
This notion, that the Schrodinger equation does not describe something in the real world but rather what we can know about it, may not seem very controversial to laypeople but, trust me, to physicists it is indeed off the wall. If quantum physics does not describe reality but rather our understanding of reality, it seems there must be some intimate relation between the laws of physics and the consciousnesses of sentient beings. This is because quantum physics is founded on ideas of probability and, as I have argued above and in other essays, any estimate of probability is subjective, in the sense that it is made by a person based on incomplete information. There is a second argument that supports this radical notion. Suppose the diffraction experiment is carried out by several people. Before the measurement they all agree on the wave function the particle must have, based on observed or predicted data such as the width of the aperture and the momentum of the particle. The experiment is carried out by one physicist, who measures the arrival point of the particle and 'updates' the wave function in her own mind. However she does not immediately tell her colleagues. For a period, the first physicist has one idea of the wave function and her colleagues have another. It is when she tells them, when the new information gained by the measurement is communicated to them, that they themselves 'update' the wave function in their own minds in a kind of Bayesian way. My interpretation of the wave function is that it is a kind of model in a person's mind concerning the outer world, based on the information she has, and that it can vary from person to person. A critic may mount the following objection to this proposal. She may say that the particle, surely, must indeed have a real objective wave function associated with it but that the physicists performing the experiment got it wrong before the measurement and even after. This critic may even suggest that this real objective wave function can never be precisely known. I am unsure how to clearly answer this objection except to say that I suspect it relies on a misunderstanding of what a wave function is. The wave function is based on the observable information associated with the particle; furthermore it involves probability (even though Lawrence Krauss pretends this is not the case) and probability is, I believe, a subjective assessment made by a person based on incomplete information. Therefore the wave function itself simply cannot be described as objective.
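Here is a toy numerical rendering of the several-physicists scenario (entirely my own sketch: the Gaussian below merely stands in for |ψ|² on the screen, and the bins for detector pixels). The point is that, on the view I am proposing, the distribution each person assigns after the measurement depends on what that person knows:

```python
import numpy as np

# Shared prior: both physicists assign the same probability distribution
# over 100 detector bins, standing in for |psi|^2 on the screen.
bins = 100
x = np.linspace(-3.0, 3.0, bins)
prior = np.exp(-x**2)
prior /= prior.sum()

# The measurement happens: one bin fires.
rng = np.random.default_rng(1)
hit = rng.choice(bins, p=prior)

# Alice saw the result: her distribution is now concentrated on one bin.
alice = np.zeros(bins)
alice[hit] = 1.0

# Bob hasn't been told yet: his distribution is still the shared prior.
bob = prior.copy()

print("Alice's probability for the bin that fired:", alice[hit])  # 1.0
print("Bob's probability for the bin that fired:  ", bob[hit])    # still small
```

Once Alice tells Bob the result, Bob's distribution becomes the same as hers: on this view, the 'update' happens at different times for different people.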
This interpretation of quantum physics is only a partial solution. In the previous post I recommended that readers watch "Are Many Worlds and Pilot Wave the SAME thing?" by PBS Spacetime on Youtube. I said it was an accessible introduction to the three main interpretations of quantum physics – but perhaps it is not as accessible as I suggested. Therefore I will attempt to help out. The three main interpretations I shall discuss are the Copenhagen interpretation, the Many Worlds interpretation and the De Broglie-Bohm Pilot Wave theory. My theory is not compatible with the first two but may be compatible with the third. I shall briefly attempt to describe each of them.
The film Oppenheimer includes a very brief scene of a number of prominent physicists sitting around on chairs in Copenhagen. It is a nod to audience members who know a little about the history of quantum physics; it gives the impression of the physics community getting together to reach a consensus about how to interpret it. In talking about the Copenhagen interpretation here I shall not rely on memory but refer to the Wikipedia article on it. The Copenhagen interpretation has its roots in talks given by Max Born and Heisenberg in 1927 but the term 'Copenhagen interpretation' didn't catch on until the 1950s; it has long been the default interpretation taught to students. The Copenhagen interpretation can't be attributed to any single physicist and there may be variations on it; nevertheless I will try to sketch some general features. In 1927, at the Solvay Conference, Max Born and Heisenberg declared, "we consider quantum mechanics to be a closed theory, whose fundamental physical and mathematical assumptions are no longer susceptible of any modification." According to the Copenhagen interpretation, all the real information about a particle concerning its position, momentum and energy can be derived from a knowledge of its wave function; there are no 'hidden variables'. The wave function is objective. However reality is fundamentally indeterministic, random, a randomness that becomes apparent when one makes a measurement. When a measurement is made it seems we find out something new about the particle but this new information comes from nowhere, is causeless. The idea that I advanced above, that the measurement actually changes the wave function, does not seem to have occurred to the people who formulated and subsequently subscribed to the Copenhagen interpretation; they seemed to regard the wave function and measurement as two quite separate types of thing. The problem that particularly vexed the early proponents of the Copenhagen interpretation was not so much the measurement problem but rather where to draw the line between the classical deterministic macroscopic world of scientists and measuring instruments and the peculiar quantum indeterminate microscopic world of wave functions and particles – I feel physicists worry less today about where to make this 'cut'. Adherents to the Copenhagen interpretation are encouraged to just ignore the measurement problem, to treat it as not a problem at all. They are advised to just do the maths and not worry about why a particle ends up at a particular point on a screen. Famously, N. David Mermin summed up the Copenhagen interpretation as "Shut up and calculate!"
In the previous post I suggested that Lawrence Krauss gives the impression that he wants to ignore the measurement problem and reported how he considers quantum physics to be deterministic even though he concedes that a measurement produces a random change. Having thought about it, I have concluded that this must be because he subscribes to the Copenhagen interpretation. Krauss is also a determinist and so I wonder how he reconciles these two incompatible belief systems.
The Many Worlds interpretation seems to me a kind of development of the Copenhagen interpretation. As in the Copenhagen interpretation, all the information about a particle is derivable from its wave function. The difference between the two is that the Many Worlds interpretation holds that what people have termed 'wave function collapse' never really occurs; even when a measurement seems to be made, the wave function continues as if it hasn't. The universe according to this interpretation is constantly splitting into alternative universes, perhaps all the time or perhaps only when measurements are made, I am unclear which. Consider the diffraction experiment again. Suppose we imagine a line down the middle of the central maximum; the Many Worlds interpretation holds that a number of universes split off and, of the universes that branch off, half have the particle arriving on the left side and half have the particle arriving on the right side. I am unsure whether Many Worlds proposes that the number of universes that branch off when a measurement is performed is finite or infinite. It is this interpretation of quantum physics that inspired Everything Everywhere All at Once and all the recent Marvel films.
At this point I shall digress a little. Another interest of mine is literary interpretation. I believe that successful stories, especially films, are arguments in favour of some core proposition. Everything Everywhere All at Once is an argument in favour of the idea that free will exists. Its central message or moral is, "No matter how shitty your life has become, you can improve it by making the right decision." Everything Everywhere All at Once won the Best Picture Oscar and this fills me with hope because it suggests that the Academy of Motion Picture Arts and Sciences has some understanding of quantum physics and its philosophical implications.
Even though Everything Everywhere All at Once is a good film, I must reject the Many Worlds interpretation of quantum physics. My reasons are philosophical rather than mathematical. When the scientist performs the measurement, she does not observe and participate in all the possibly infinite universes that branch off. As far as she is concerned, the particle arrives at a definite location; she only observes and is aware of one of the possible universes. The Many Worlds interpretation does not really solve the measurement problem, it just kicks the can down the road. The problem becomes a problem of consciousness instead, a problem that is the province less of physicists than of philosophers of mind. We need to explain why the scientist observes only one universe, not some smeared out collection of all the possible universes that can arise from the wave function. Physics, in the end, must depend on empirical evidence and in the real world, as you and I know, a person only observes a single universe. The problem with Many Worlds can be illustrated with an example. Consider the friend who says, "There is a 67 percent probability that Donald Trump won the 2016 election." According to Many Worlds this is a reasonable statement to make. Your friend is saying that of all the universes that split off just before the election, 67 percent of them contained Trump winning and 33 percent of them didn't. If you think your friend's statement unreasonable, I contend you should also reject Many Worlds.
I turn now to the third interpretation. The De Broglie-Bohm theory is the most famous of the 'hidden variable' theories. The nice thing about this theory is that it is conceptually simple. In this theory, waves and particles are two distinct types of thing. Particles are real things that always exist; their associated waves are 'pilot waves' that guide the particles through space. Alongside the Schrodinger equation the theory has a second equation, the guiding equation. Particles are influenced by both equations; they don't travel in straight lines but sort of wobble about. The theory is deterministic. However, the guiding equation is explicitly non-local. What this means is that the position and velocity of a particle depend on all the other particles in the 'universe', where what we mean by 'universe' is everything involved in the experiment, perhaps even the scientist performing the experiment herself. De Broglie proposed the first version of this theory in 1927 but was bullied into accepting the Copenhagen interpretation instead; it was revived and properly developed by David Bohm in the 1950s. It was considered a fringe theory for a very long time – for two reasons, I believe. First, physicists have tended to dislike non-locality. (Einstein famously used the term 'spooky action at a distance' when criticising quantum theory.) Second, even though the particles move deterministically under the influence of the Schrodinger equation and the guiding equation, we can never know precisely where they are and how fast they are moving unless at some point (in the past, present or future) we learnt where all the particles in the universe are located through some sort of magic (it can't be empirically worked out). Physicists don't like this because they like doing experiments to prove things and Bohmian mechanics is experimentally unprovable. This is one reason the Copenhagen interpretation was so popular – it only dealt with observables.
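For the mathematically curious, the guiding equation in its usual modern form says that the velocity of a particle is fixed by the local phase of the wave function, evaluated at the particle's actual position Q(t):

```latex
\frac{d\mathbf{Q}}{dt} = \frac{\hbar}{m}\, \mathrm{Im}\!\left(\frac{\nabla \psi}{\psi}\right)\Bigg|_{\mathbf{x}=\mathbf{Q}(t)}
```

For a many-particle system, ψ depends on the positions of all the particles at once, so each particle's velocity depends instantaneously on where all the others are – the non-locality just mentioned is right there in the formula.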
Recently however the pilot wave theory has become more fashionable. Non-locality is today far more accepted than it was in the past. Physicists now mostly believe in something called 'entanglement', the idea being that if two particles become 'entangled' and we later perform a measurement on one of them, it instantaneously affects the other even if the two are very far apart when the measurement is performed. More importantly, the De Broglie-Bohm theory seems to solve the measurement problem. If the particle is always a particle, it should surprise no-one that when it arrives on a screen it seems to be a particle rather than a wave. In fact, both Wikipedia and the Stanford Encyclopedia of Philosophy say that the De Broglie-Bohm theory solves the measurement problem.
This is not a consensus opinion however. In the Youtube video by PBS Spacetime that I mentioned earlier, Matt O'Dowd presents an argument, an argument proposed by fans of the Many Worlds theory, that seems to imply that pilot wave theory doesn't solve the measurement problem. The argument is based on the fact that according to pilot wave theory the particle has no effect on the wave function. If this is true, the fact that there is a particle that actually passes through the aperture and arrives on the screen should have no effect on the probability distribution we theoretically find on the screen and so doesn't explain why we find the particle at a particular point. I have been thinking about this and have decided that these Many Worlds enthusiasts are making an error; they are continuing to believe that measurements never occur. I argued above that when we perform a measurement the wave function changes. If any physicists read this blog, I would put it to them like this: when we perform a measurement we are effectively introducing a new boundary condition that affects the wave function. Even if the particle does not affect the wave function, the measurement does.
The reason pilot wave theory has become more popular is largely because of the influence of a more recent physicist, John Stewart Bell. It is Bell's work that turned entanglement from a philosophical curiosity into something experimentally testable. Bell proposed an experiment that could show whether non-locality is a real thing or not; when such experiments were eventually performed they did indeed show that non-locality is a real thing (assuming something known as Superdeterminism is false). The experimenters – Alain Aspect, John Clauser and Anton Zeilinger – won the 2022 Nobel Prize in Physics for this work. Bell was very influenced by Bohm. In fact, he objected to the term 'hidden variable theory' on the grounds that such supposedly hidden variables are the very things revealed by measurements. Despite the current popularity of Bohm's theory, there are significant difficulties with it. It is very hard to reconcile pilot wave theory with Einstein's Theory of Relativity because Relativity says nothing can travel faster than light. Bohm's theory depends on the Schrodinger equation and, as I pointed out above, the Schrodinger equation (unlike the Dirac equation) is non-relativistic. The challenge for physicists who find some value in pilot wave theory today is how to reconcile it with Relativity.
In trying to clearly understand and explain the Measurement Problem, and in thinking about it over the last few days, I believe I have realised the mistake proponents of both the Copenhagen interpretation and the Many Worlds interpretation have made. Consider the diffraction experiment again. The wave function we assign to the photon or electron is determined by some of the available information – specifically the width of the aperture, the momentum of the particle and the distance to the screen – but not all of it. When someone makes a new measurement, she acquires new information and so must update the wave function; at the same time she loses some information. My contribution, if it is wholly mine, is to add to quantum physics a new way of looking at probability. Any estimate of probability is subjective in the sense that it is made by a person based on the information that the person possesses, information that is incomplete. The wave function is probabilistic and must therefore be subjective. This implies that different people can come up with different values for the wave function of a system if they have different information about the system. If there is an objective world that we all share, these different values can only incompletely describe the system, and the differing wave functions should rather be understood as describing what particular people know about the system. This is what I was trying to say in the posts about Schrodinger's cat years ago and I still may not be expressing myself clearly. I feel that this proposal supports some 'hidden variable' theory but is not conclusive. If we decide for other reasons however that the world is deterministic, this way of looking at probability may supplement such deterministic theories.
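Since I keep appealing to Bayesian updating, I should state Bayes' Theorem itself. In this notation H is a hypothesis (say, that the particle will arrive in a given region) and E is new evidence (say, a measurement outcome):

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}
```

The updated ('posterior') probability of H is the old ('prior') probability reweighted by how well H predicts the evidence. Replacing one wave function with another after a measurement is, on the view I am proposing, an update of just this general kind.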
I want to say something about how I came up with this idea. It may seem at first as if I am being arrogant but if you read on, you'll find I'm not. Although my background is in English literature, I have always had an interest in physics. In 2005, I did the paper on physics, although I failed to understand the Schrodinger equation at the time. The paper got me thinking about it though. In 2007, almost immediately after I became 'sick', my father gave me a long article about Bell's Inequality. I wasn't in a state to make sense of it then and I still don't know why he gave it to me. My father is a smart man but he isn't a physicist. In 2008, I think to distract myself from ongoing emotional distress, I set myself the task of deriving E=mc² from a small set of assumptions and Maxwell's equations. I deliberately chose not to look up any proofs on the internet. There were pieces of paper covered in algebra all over the house. At one point during the year, I accidentally derived the Heisenberg uncertainty principle. This principle is not, as some people think, a law arrived at empirically, through experimentation – you can derive it from the De Broglie wavelength and the mathematics of waves. I derived it by doing calculus on the normal (Gaussian) function. I finally derived E=mc² at the end of the year, just before I became 'ill' again.
In subsequent years I continued to think about relativity and worked out a much simpler proof of E=mc². When I wrote The Hounds of Heaven in 2012, I included a scene very near the beginning where Jess talks in voice-over about wave-particle duality and wave function collapse. Of course it wouldn't have rung true for Jess to deliver to the audience a rigorous explanation of quantum physics – her brief soliloquy is more consistent with the understanding of an amateur who has an interest in a wide variety of topics. A small regret I have about the film is that I gave Jess my own obsession. The real girl that Jess was based on is very clever but she is more interested in poetry and neuroscience than quantum physics. In 2013, I became 'ill' again and at the beginning of 2015 began writing this blog. For a long time this blog has been the most important thing in my life. I think it was in 2018 or 2019 that I wrote the posts about Schrodinger's cat. I actually gave a copy of these posts to my psychiatrist, partly to show I wasn't dumb. I suspect that he may have thought them crazy New Age ramblings because, for one thing, in them I said that any probability estimate is 'subjective' without defining what I meant by the term. The argument I made in those posts is different to the one in this essay – it was based around the Schrodinger's cat paradox rather than a discussion of diffraction – but it was on the right track. The error I made in them is that I used the term 'wave function collapse', which I now realise was misleading. In those essays I also used the term 'Bayesian'. I thought 'Bayesian' denoted a way of viewing probability close to the one I was advancing. Not long after, I looked up Bayes' Theorem and decided that I must have made a mistake, that I had used the wrong word. I considered going back and rewriting the posts. Although I do not know exactly how to apply Bayes' Theorem to quantum physics, I now realise that 'Bayesian' is roughly the right word because, roughly, it denotes the idea that one changes one's probability estimates based on new evidence.
It was only very recently that I came back to the issue of quantum physics, not long before I wrote the previous post. The stimulus was a Youtube clip by Sabine Hossenfelder. In this essay I am not plagiarising her: rather, she dropped a hint that sparked its central idea. When briefly mentioning quantum physics she chose to use the term "update" rather than "wave function collapse"; she also used the word "Bayesian". It was these hints that led me to the idea that a measurement actually changes the wave function, and thinking this through prompted me to write this essay. In fact, I believe Sabine's view and mine must be different because she thinks measurements are performed by measuring equipment, as many physicists do, whereas I think they are performed by people. I also suspect she believes in Superdeterminism. If so, it is of course possible she could be right.
In 2013, I told my friend Jess about the rather spectacular marks I had received for the GRE exams I took in 2004. I was trying to impress her. But I also told her that I could only explain my results by supposing that I had read the minds of all the other people in the world who were taking the exams at the same time. In 2009 I heard a voice talking about Plotinus, the ancient Greek philosopher, and so looked him up. Plotinus believed in a "world soul". What I am attempting to suggest here is that the idea I have proposed in this essay may not be wholly attributable to me but may have emerged from the collective mind of all the people thinking about this issue recently, in the same way that quantum mechanics rapidly developed in the 1920s from the work of many physicists thinking about the same issues at the same time. My idea depends on a theory of what probability is and, if the world is deterministic, I don't see how anyone can say this theory is wrong.
I have just watched Sabine's most recent video, "The End of Science", in which she suggests that science may never again be able to make radical new discoveries, one example she gives being the apparent fact that no-one has discovered a new organ in the human body for a very long time. Obviously she has never heard of the interstitium.