Introduction

Within contemporary scientific and science‐adjacent communities (like philosophy and/or history of science), it is generally accepted that quantum physics is our best theory. By “best” I mean something like robustly empirically verified and predictively fecund for a wide range of both energy and mass scales, and for an impressively diverse class of target systems. For this reason, it is understandable—and laudable—that scholars interested in questions at the intersection of science and theology wish to meaningfully engage with this physics. Recent work in foundations of physics has, however, importantly altered the landscape of quantum theory; in this article, my goal is to introduce these advances, then make an argument within this new landscape that I hope will be useful for certain theological inquiries. Specifically, I shall argue from grounds of the physics itself that one may, with clear philosophical conscience, access the majority of quantum theory's tools, models and explanations while maintaining an interpretation‐neutral yet realist stance toward this physics.

In what follows, I will take for granted the universal applicability of quantum theory. Of course, this disregards the infamous conflict between quantum theory and general relativity at the Planck scale. In addition to our missing a successful theory of quantum gravity (that is, at least at the time of my writing; one can always hope!), within both the physics and philosophy of physics communities, there are notable dissenters to the universality claim. These facts will curtail the scope of my argument vis‐à‐vis theology, but only very minimally. Regarding the Planck scale, since this regime is many (many, many) orders of magnitude smaller than the domain of currently observable systems, I think we may safely remain mystics on this unresolved point and still make good progress. Regarding those who reject the universality claim (along with the hope that we will eventually figure out how to incorporate gravity): my glib response is to say the onus is not on me to prove universality but rather on dissenters to provide an alternate explanation for quantum theory's thus‐far exceptionless confirmation, and even Atlas would baulk at such a burden. Thus, I consider this assumption relatively innocuous.

Here is the plan: the second section introduces quantum decoherence, a dynamical process resulting directly from the standard formalism of quantum mechanics (that is to say, sans philosophical interpretation and sans additional axioms). This physics will help us with the project of the third section, which carefully teases apart related yet importantly distinct questions commonly referred to as “the measurement problem.” This problem is the usual point of entry for philosophical interpretations of quantum mechanics; being clear about which measurement‐related issues can be explained using textbook quantum mechanics (with decoherence included, as it ought to be)—and which issues still require interpretative supplementation—constitutes support for my argument regarding the explanatory capabilities of interpretation‐neutral, realist quantum mechanics. Indeed, I shall argue that interpretation‐neutral quantum mechanics can explain or resolve all of the puzzles associated with quantum measurement except one. This deflating of the measurement problem arguably leaves very little work for interpretations of quantum mechanics to do, and the fourth section will suggest that certain substantial theological questions can and ought to be addressed using interpretation‐neutral quantum mechanics rather than relying on any specific philosophical approach. The central lesson of these arguments, taken together, is an approach to realist quantum mechanics—that is, an account going beyond instrumentalist FAPP (“for all practical purposes”) physics by providing both insights and constraints on possible ontologies—that raises novel questions about the nature of God's interaction with the world. The fifth section concludes.

Introducing Quantum Decoherence

Physical systems are inevitably interacting with some environment at the quantum level. Quantum‐mechanical interactions are importantly distinct from classical processes like thermal interactions (e.g., heat exchange) or mechanical interactions (e.g., billiard balls colliding) because they generally give rise to entanglement, the signature of which is precisely the sort of “spooky” action‐at‐a‐distance that concerned Einstein. For example, air molecules in a room at constant temperature are not on average thermally coupled. They are, however, interacting—even nonlocally!—in a way that will generally lead to entanglement. So while the state of the air with respect to energy can be described completely classically according to thermodynamics, the individual air molecules are (generally) entangled, meaning the energy state of an individual molecule cannot be separated out and described apart from the energy state of other molecules with which it is interacting quantum‐mechanically (and I emphasize, this interaction need not be local—i.e., molecules at great distances may nevertheless become entangled). The impossibility of this sort of interaction from a classical perspective is precisely why Schrödinger considered entanglement the signature characteristic of quantum mechanics.
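To make the entanglement relation concrete, here is a minimal schematic of a non‐separable state (my own generic illustration, not a model of any particular pair of air molecules):

```latex
% A generic entangled (non-separable) state of two systems A and B:
\lvert \Psi \rangle_{AB} \;=\; c_1 \lvert a_1 \rangle_A \lvert b_1 \rangle_B
  \;+\; c_2 \lvert a_2 \rangle_A \lvert b_2 \rangle_B ,
  \qquad c_1, c_2 \neq 0 .
% No single-system states |psi>_A and |phi>_B satisfy
% |Psi>_{AB} = |psi>_A (x) |phi>_B, which is the formal sense in which
% neither subsystem's state can be specified apart from the other's.
```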

In Schrödinger's wave mechanics, a quantum system (any system whatsoever, in principle) is modeled as a wave packet made up of superpositions of individual wavefunctions, where the relations among individual wavefunctions encode phase relations just as in classical wave mechanics. Think of the double‐slit experiment: when we shine light at a barricade containing two very narrow slits near each other, a screen downstream of the slits will show an interference pattern: alternating fringes of brightness (constructive interference) and darkness (destructive interference). This makes sense on a classical wave picture of light: light is like water streaming through two side‐by‐side sluices whose ripples collide to form taller peaks and deeper troughs. As light waves pass through the double‐slit apparatus, they ripple and eventually collide, forming an observable fringe pattern on the screen. If no energy were ever lost from these ripples (in other words, if the rippled region of water constituted a perfectly isolated system), the interference pattern would be stable, which is to say the phase relations between peaks and troughs in the pattern would remain constant. But because energy is lost due to environmental interaction, the ripples dissipate and calm reappears; the peaks and troughs decohere. In the double‐slit experiment, we know that if we place a detector of some kind immediately downstream of one slit but not the other, then even if that detector is turned off (and so does not perform any measurements), the screen will not show a fringe pattern, revealing instead an apparently classical statistical distribution of hits—a Gaussian peak—on the screen centered behind the slit without a detector. The presence of the measuring device in this case, even if it is not actively measuring, still creates a new environment for the light, decohering its position and thereby destroying our ability to observe coherent fringe patterns.
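In wave‐mechanical notation (the standard textbook rendering, not anything specific to this article), the fringe pattern lives in a cross term that depends on the relative phase of the two slit amplitudes; decoherence is the washing out of precisely that term:

```latex
% Amplitudes psi_1 and psi_2 for passage through slit 1 and slit 2:
\psi \;=\; \tfrac{1}{\sqrt{2}} \left( \psi_1 + \psi_2 \right), \qquad
\lvert \psi \rvert^2 \;=\; \tfrac{1}{2} \left( \lvert \psi_1 \rvert^2
  + \lvert \psi_2 \rvert^2 + 2\,\mathrm{Re}\, \psi_1^{*} \psi_2 \right).
% The final (interference) term encodes the phase relations. When the light
% becomes entangled with anything carrying which-slit information, that term
% is suppressed, leaving only the two non-interfering contributions.
```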

Because quantum‐mechanical interactions can be nonlocal, it is impossible to perfectly (or for any reasonable amount of time) shield a given system from environmental entanglement—even in pristine, highly controlled laboratory settings. The entanglement relation enables the environment to behave like a measuring device on a system by decohering that system's phase relations/interference terms. Coherent, or constant, phase relations among superposed states of a macroscopic system, say, would make it possible to measure very strange quantum states: a cat that is alive‐and‐dead or a coffee cup that is here‐and‐there. But decoherence of a system's phase relations—that is, the loss of coherence among interference terms due to environmental entanglement—makes observing such states in practice impossible. These strange states still somehow exist (in that they have nonzero, albeit incredibly minuscule, probability amplitudes according to the system's wavefunction), but we tend to measure or observe only the most probable states of the system. Such states retain their high probabilities even under environmental interaction, while interference terms between them are rapidly suppressed to near‐zero probability.

As it happens, the states of macroscopic systems left relatively stable during and after decoherence appear to be classical states: a cat that is definitely “alive” or definitely “dead,” or a coffee cup that is definitely “here” or definitely “there.” I emphasize appear because unless we invoke an interpretation of quantum mechanics at this juncture, the interference terms have not collapsed or disappeared entirely—they are still part of the mathematical description of the system. As a result of entanglement with the environment, the system's initial coherence spreads throughout the system‐environment composite such that very little coherence remains “in” the system alone. Thus, when we focus on the system by itself and ignore the environment, the information we obtain from the now‐decohered system appears to be classical, or definite.
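The formal statement behind this “spreading of coherence” is the standard decoherence calculation (summarized, e.g., in Schlosshauer 2005): tracing out the environment leaves a reduced state whose interference terms are multiplied by the overlap of the environmental records, an overlap that rapidly approaches, but never exactly reaches, zero.

```latex
% The environment E becomes correlated with the system's components:
\left( c_1 \lvert s_1 \rangle + c_2 \lvert s_2 \rangle \right) \lvert E_0 \rangle
  \;\longrightarrow\;
  c_1 \lvert s_1 \rangle \lvert E_1 \rangle + c_2 \lvert s_2 \rangle \lvert E_2 \rangle .
% Ignoring (tracing out) the environment gives the system's reduced state:
\rho_S \;=\; \mathrm{Tr}_E \lvert \Psi \rangle \langle \Psi \rvert
  \;=\; \lvert c_1 \rvert^2 \lvert s_1 \rangle \langle s_1 \rvert
      + \lvert c_2 \rvert^2 \lvert s_2 \rangle \langle s_2 \rvert
      + \left( c_1 c_2^{*} \langle E_2 \vert E_1 \rangle
        \lvert s_1 \rangle \langle s_2 \rvert + \mathrm{h.c.} \right).
% As the environmental records become effectively distinguishable,
% <E_2|E_1> -> 0: the interference (off-diagonal) terms are damped toward
% zero without ever being set exactly to zero.
```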

A few important notes. First, decoherence is just a consequence of applying axiomatic quantum mechanics when the idealization of a truly closed system is dropped and the ubiquity of entanglement is appropriately taken into account. As such, decoherence ought to be part of standard, textbook quantum theory; physics educators have been slow to amend this situation, and as a result harmful confusions about decoherence persist. In particular, decoherence is often wrongly considered an interpretation itself, confusing what is standard physics with a fully worked‐out interpretation called the Consistent Histories approach—which, granted, has decoherence processes at its heart, but adds much more besides (see Gell‐Mann and Hartle 2014). Or it is wrongly considered a piece of physics belonging only to the Consistent Histories approach and Everettian interpretations. The foundations of physics literature has been quicker to correct these misunderstandings. The very fact that all three primary realist interpretations now take decoherence to form an essential part of their respective explanatory packages underscores its neutrality in this regard (cf. Schlosshauer 2005 for an overview of the role of decoherence in standard realist interpretations; for more on decoherence in de Broglie‐Bohm, see Rosaler 2015; in Everettian approaches, see Saunders 2022; in collapse approaches, see Fortin and Lombardi 2014).

Second, it is important for present purposes to emphasize that in discussions of decoherence, what is considered the system of interest and what the environment can be arbitrarily defined. From a formal perspective, the division into “system” and “environment” is done merely by choosing a convenient partitioning of the joint Hilbert space. In the physical realm certain systems come to us prepackaged (so to speak) more naturally than others: think of everyday objects as apparently isolated/isolatable from their surroundings. We tend to make use of these predetermined divisions in conducting our scientific inquiries. The crucial thing to remember, however, is that these labeling choices are dictated by convenience or convention, and not by anything deeper or more principled. It may help here to remember that we have presumed all systems can in principle be described quantum‐mechanically.
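As a toy illustration of this arbitrariness (my own sketch in code, not a model drawn from the decoherence literature), one can build a two‐qubit entangled state and simply choose which factor to trace over; either factor can play the role of “environment” to the other, and nothing in the formalism prefers one choice:

```python
import numpy as np

# Hypothetical two-qubit illustration: the entangled state (|01> + |10>)/sqrt(2).
ket = (np.kron([1, 0], [0, 1]) + np.kron([0, 1], [1, 0])) / np.sqrt(2)
rho = np.outer(ket, ket.conj()).reshape(2, 2, 2, 2)  # indices: (A, B, A', B')

# Calling B the "environment" means tracing over B; calling A the
# "environment" means tracing over A. The split is purely a labeling choice.
rho_A = np.einsum('ajbj->ab', rho)   # reduced state of A (B traced out)
rho_B = np.einsum('iaib->ab', rho)   # reduced state of B (A traced out)

print(rho_A)  # [[0.5, 0], [0, 0.5]]: no coherence left in A considered alone
print(rho_B)  # likewise for B
```

However one slices the joint Hilbert space, the ignored factor carries away the coherence, and that is the only sense of “environment” the formalism needs.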

Third, as we have seen, phase relations and superpositions play an important role in decoherence processes. Though these concepts were adopted from classical theories, they function differently in quantum theory. Heisenberg from the beginning frequently admonished physicists on this point—that classical terms could be applied only analogously to quantum contexts. In classical wave mechanics, a wave packet is just a superposition of individual waves (e.g., electromagnetic field strength is just the sum of wave amplitudes at a spacetime point). In quantum mechanics, although mathematically the superposed state is still described as a sum of the individual component states, the system so characterized may admit of properties considered aberrations of nature (like the alive‐and‐dead cat or the here‐and‐there cup). More shall be said on this point about quantum superpositions—more properly called coherent superpositions—in the following section.

Fourth, when decoherence is taken as part of the standard quantum theory package, this massively deflates the roles traditionally assigned to those perennially troubling terms, “measurement” and “observation.” If external degrees of freedom may be defined as the “environment” for any system of one's choosing and it is these environmental degrees of freedom which become entangled with and so measure/observe/monitor the system, there is no longer anything particularly worrisome about these verbs. No human observer, no apparatus of a special kind, no highly engineered interaction is necessary to initiate the dynamical processes that result in apparently well‐defined, stable outcomes for a given system. Explaining how this works from the vantage point of decoherence is the focus of the next section, wherein we see just how far this physics can go toward resolving a number of issues associated with the measurement problem.

Defining and Deflating the Measurement Problem

There are a number of interpretive problems arising from quantum theory, and although foundations of physics in recent years has expanded its scope of inquiry considerably beyond this, it is still fair to say that the measurement problem occupies a central role. I suspect this is due in part to the elision of several related yet distinct puzzles that nevertheless get labeled “the measurement problem,” making this problem seem more unwieldy than it is.

Regarding one such puzzle, I draw your attention to the curious incident of superpositions in the night‐time: not only are superposed states allowed in quantum mechanics, they are ceteris paribus the most likely states for a system to occupy. To find a system in a single state (like one well‐defined position in space, or following a well‐defined trajectory through one slit in the double‐slit apparatus, or having a well‐defined spin orientation) should be not only abnormal but exceptionally rare, as such states represent a very small subset of the large ensemble of possible states (this ensemble comprising individual states—eigenstates—along with all superpositions thereof).
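Stated formally (a textbook point, in my own notation), a generic allowed state is a superposition; the non‐superposed states are the special cases in which all but one coefficient vanishes:

```latex
% Relative to eigenstates |a_1>, |a_2>, ..., a generic allowed state is
\lvert \psi \rangle \;=\; \sum_i c_i \lvert a_i \rangle ,
  \qquad \sum_i \lvert c_i \rvert^2 = 1 .
% "Finding the system in a single state" corresponds to the special case in
% which exactly one c_i is nonzero: a vanishingly small corner of the full
% space of allowed states.
```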

The statistical predominance of superpositions among observed states is trivial for continuous systems like liquids or vibrating strings—systems we are used to understanding as waves, and where the fact that superpositions generate wholly new objects is well understood (for example, superposing the waveforms of two musical notes generates a third, distinct note). But the consequences of the superposition principle when applied to apparently individual, everyday objects like coffee cups and cats are highly nontrivial. Under our assumption of quantum fundamentality, the same laws that describe sound waves ought to describe cups and cats, too, so why are superpositions considered natural for the former but decidedly unnatural for the latter? In short: why don't we observe superposed states nearly everywhere and nearly everywhen, for all systems?

This puzzle can be explained by appeal to decoherence. As any experimentalist working in the quantum regime will readily attest, keeping a system that is initially in a superposition from decohering due to environmental entanglement is the cardinal difficulty in studying such states. At the microscopic level (e.g., dealing with photons, electrons and the like) superpositions are easier to preserve, and this explains our ability to see manifest interference phenomena and other signatures of superposed states when dealing with light beams or highly attenuated streams of atoms. Nevertheless, the prodigious speed and efficacy of decoherence processes makes maintaining coherent superpositions extraordinarily difficult even at quantum scales. Indeed: qubits (quantum computer information bits) initially in superpositions of “0” and “1”—forbidden states for classical bits and thus the entire reason for building quantum computers in the first place—are only effective if these superposed states remain coherent throughout various computational operations. Decoherence due to entanglement with other qubits and with the computer circuitry itself makes maintaining qubit fidelity a monumental challenge.
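The qubit case can be sketched with a toy pure‐dephasing model (my own illustrative code; the exponential decay law and the value of T2 are assumptions for the sketch, not data from any experiment): environmental entanglement damps the off‐diagonal coherence terms of the qubit's reduced density matrix while leaving the populations untouched.

```python
import numpy as np

# Toy pure-dephasing sketch: a qubit prepared in the superposition (|0> + |1>)/sqrt(2).
# T2 is an arbitrary illustrative coherence time, not a measured value.
T2 = 1.0
rho0 = 0.5 * np.array([[1, 1],
                       [1, 1]], dtype=complex)

def dephase(rho, t, t2=T2):
    """Damp the off-diagonal (coherence) terms by exp(-t/T2); diagonals are untouched."""
    out = rho.copy()
    decay = np.exp(-t / t2)
    out[0, 1] *= decay
    out[1, 0] *= decay
    return out

for t in (0.0, 1.0, 5.0, 20.0):
    print(f"t = {t:5.1f}")
    print(np.round(dephase(rho0, t), 6))
# The coherences shrink rapidly toward zero yet never equal zero exactly,
# which is why holding a qubit's superposition together through an entire
# computation is such a demanding engineering problem.
```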

Given the extreme challenge of shielding superposed states from decoherence even at the quantum scale, it is no surprise that as systems get larger (and so involve more internal degrees of freedom interacting with more environmental ones), the instability of superpositions only increases. Though brilliant experimental advances in the last 20 years or so have made it possible to maintain superposed states long enough to measure them well into mesoscopic regimes, 1 the sheer difficulty of doing so once again underscores the effectiveness with which decoherence suppresses those states beyond observability. In summary, the scarcity of observable superpositions across scales is directly related to a system's entanglement with, and subsequent decoherence by, environmental degrees of freedom.

A different but kindred puzzle is this: The Hilbert space formalism within which quantum mechanics is most often done makes it trivially easy to write down a system's state in terms of any variable/degree of freedom we choose. Thus, from a purely formal perspective, we could just as well describe the state in terms of usual variables like position, momentum, energy, or spin, as in terms of strange superpositions thereof. Our choice of which basis to use—choosing to write the equations in terms of a specific degree of freedom—is as mathematically arbitrary as choosing whether to use Cartesian or spherical coordinates. For example, the Hilbert space formalism puts no preference on the choice to describe an electron's spin along an axis where the eigenstates are either “spin up” or “spin down” rather than on the choice to describe the electron's spin along a separate axis whose eigenstates are superpositions of spin up and spin down. Given this lack of formal preference among the vast number of basis choices available for carrying out measurements, it is puzzling that we nevertheless only ever observe systems in a small subset of these bases.
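For the spin example, the point can be written out explicitly (standard notation, not tied to any particular experiment): the eigenstates along one axis are equal‐weight superpositions of the eigenstates along another, and the formalism ranks neither basis above the other.

```latex
% Spin-1/2 along x expressed in the z basis:
\lvert \uparrow_x \rangle \;=\; \tfrac{1}{\sqrt{2}}
  \left( \lvert \uparrow_z \rangle + \lvert \downarrow_z \rangle \right),
\qquad
\lvert \downarrow_x \rangle \;=\; \tfrac{1}{\sqrt{2}}
  \left( \lvert \uparrow_z \rangle - \lvert \downarrow_z \rangle \right).
% A state that is an eigenstate in one basis is an equal-weight superposition
% in the other; the Hilbert space formalism treats the two descriptions as
% exactly on a par.
```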

For the medium‐sized dry goods of everyday experience, the subset of bases “preferred” by nature (but not by theory) became, quite understandably, the obvious candidates for observables. Hence, classical physical theories were developed on the basis of these bases: position, momentum, and energy are familiar classical variables, whereas superpositions of these are not. For systems at the atomic scale, however, the usual classical variables were not a natural fit. As alluded to above, in the article where Heisenberg derives the uncertainty relations, he is careful to emphasize that the classical terms “momentum,” “position,” “trajectory” and the like can only be applied analogously within the new mechanics. Even the term “superposition” borrowed by Schrödinger from classical wave mechanics for his wave‐mechanical quantum formalism cannot be understood exactly as in classical theories. Different degrees of freedom appear to be more natural means of observing some systems than others; another way of saying this is that some bases within which to make measurements are more stable than others, depending on size/energy scale. What explains this theory‐world mismatch? Why do certain bases of measurement seem to be preferred by nature over others and presented to us as especially amenable to physics when no such preference appears in the theory? And why are different bases preferred at different scales?

Nature's apparent and consistent preference for certain bases of measurement is not due to some as‐yet undiscovered set of selection rules, but rather due to the varying susceptibility of bases to environmental decoherence. The rate and efficiency with which phase relations for some system variable decohere depends on the strength of system‐environment entanglement with respect to that degree of freedom. Consider decoherence modeling of a large system like a pollen grain interacting with a common environment like air (as opposed to specifically engineered, highly selective environments found in the lab, like a supercooled liquid helium bath). When this model (that is, the quantum Brownian motion model of decoherence 2 ) is evolved, the position and momentum of the pollen grain are decohered, leaving the system in apparently definite states of both (that is, definite in phase space). This is because the air becomes quickly entangled with both the pollen's position and momentum, so that by the time we interact with the pollen grain it is behaving like a macroscopic system and tracing out a Newtonian trajectory. In sum: the problem of preferred bases is not a matter of nature obeying secret selection rules, nor need we appeal to some interpretation of quantum mechanics to explain it. A perfectly robust explanation—moreover, one that is able to cover the more subtle puzzle of why different bases are “preferred” at different scales—is available from decoherence.
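The scale sensitivity can be made quantitative. In standard scattering and quantum Brownian motion treatments (see the model cited in note 2), the spatial interference terms of the reduced density matrix decay at a rate that grows with the square of the separation; the expression below gives the generic form of that result, with the environmental details packed into a single coefficient.

```latex
% Generic form of spatial decoherence in scattering / quantum Brownian motion
% models: interference between locations x and x' is damped as
\rho_S(x, x', t) \;\approx\; \rho_S(x, x', 0)\, e^{-\Lambda\, t\, (x - x')^{2}} ,
% where the scattering constant \Lambda depends on the environment (density of
% air molecules or photons, temperature, coupling strength). For an object the
% size of a pollen grain in air, \Lambda is enormous, so superpositions over
% macroscopically distinct positions are suppressed essentially instantaneously,
% while microscopic separations survive far longer.
```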

These considerations closely relate to yet another long‐standing issue: the pointer basis problem. In predigital days of yore, one would carry out laboratory measurements using an oscilloscope or voltmeter or other delightful analogue device with a “pointer”—a physical needle, say—which would literally swing around and finally settle on a specific value: the measurement outcome. If in addition to the thing being measured, the needle itself is also describable as a quantum system, then why does the needle sweep out an apparently classical, Newtonian arc in space‐time, and then—additional mystery!—stabilize on a single, definite value, when the theory gives only probable values? An answer to the first question should be readily apparent given the previous paragraph: the needle, being a macroscopic object, has interacted with the air and gravity and other environments, and so become decohered in phase‐space just as the pollen grain did. Hence, it appears to follow a quasi‐classical path through space‐time and appears to settle in a well‐defined direction, pointing to a single value.

An answer to the second question regarding the needle's settling on a single point (corresponding to the measurement of a well‐defined value for the system) requires more care, and hopefully illustrates the fecundity of this exercise of disentangling various measurement puzzles. The claim that is (or ought to be) uncontentious, and whose explanatory package is free to all, is that decoherence processes underlie both the apparent classicality of the pointer's trajectory and the apparent definiteness of the value it ends up pointing to. To assert that the trajectory is in fact ontologically Newtonian (and not just apparently or effectively so)—likewise to assert that the final pointer position is in fact ontologically definite (and not just apparently or effectively so)—is a highly nontrivial step beyond what the theory stipulates, and so constitutes philosophical emendation. For example, one way to ensure this extra step is to postulate a physical collapse mechanism that ontologically reduces the interference terms among the pointer's superposed position states to exactly zero; this is the move made in spontaneous collapse theories (but note well: not in the Copenhagen interpretation, more on which anon).

Decoherence alone does not deliver an explanation for ontological definiteness (if indeed there is such a thing), but it does provide an explanation for why the interference terms whose presence would give rise to indeterminate or fuzzy or superposed measurement values have become, for all intents and purposes, suppressed beyond recall, resulting in apparent definiteness. If your ontology requires actually definite values (where one state from among the ensemble of possible states ends up with probability exactly one and all other states have probability exactly zero), you must supplement the physics.

Recall our pollen grain. Decoherence models show its microdynamics, as it floats through the air, evolving in accordance with apparently continuous, classical motion; then, when the pollen comes to rest on a blade of grass, it occupies an apparently well‐defined region of spacetime there. That at all times the pollen grain, in interaction with its environment, can be assigned apparently definite values for each of its variables (position, momentum, etc.)—this puzzle is resolved by decoherence. Let us call this the Variables Problem (elsewhere I have referred to this as the problem of general outcomes, following Schlosshauer 2007).

However, decoherence does not indicate whether the pollen grain's trajectory was ontologically classical, nor whether the space‐time point whereupon it rests is ontologically well defined. Indeed, to disconfirm either of these outcomes is beyond our current technical capabilities—but the absence of disconfirmation does not imply confirmation! If the decoherence rates generated by our best models are correct even to within a few orders of magnitude, it would indeed take billions of years of measuring the position of our pollen speck before it might be observed in an indefinite state like a superposition of positions. This is because indefinite states are rendered practically impossible to observe by decoherence. But the probability that our system truly occupies an indefinite state nevertheless remains nonzero until or unless one applies an additional step—an interpretation of quantum mechanics—that does away with these states.

While decoherence resolves the Variables Problem by showing why environmental entanglement results in both a quasi‐classical trajectory and an apparently well‐defined rest position for our pollen grain, it cannot explain the further question of why (and whether!) the pollen grain moved along this classical trajectory instead of that one, or why it landed at this place rather than that one. Or using the cat and coffee cup examples from earlier: decoherence explains why the possible outcomes are apparently definite states (like alive or dead, here or there). This is the Variables Problem. Decoherence does not explain why we measured a particular apparently definite state (say, “alive”/“here” rather than “dead”/“there”). Let us call this latter issue the Values Problem (elsewhere I have referred to this as the problem of specific outcomes).

Interpretation‐Neutral Theology

If decoherence is rightly understood as part of standard quantum mechanics, and if the above analysis of the measurement problem in light of decoherence is correct (and I am confident it is; at minimum it aligns with the accepted view in physics), then it provides explanations for all but one small puzzle. It does not answer the Values Problem—if such an answer exists in the first place, which should not be assumed on interpretation‐neutral grounds!

What theological work can be done when we lay aside the Values Problem and so stay within the bounds of interpretation‐free quantum theory? As theology is not my background, I leave much of the answer to this question as an exercise for theologians. What I will do is focus on a particular subject where “quantum theology” frequently shows up, to wit—divine action—and begin to sketch the new logical space that might be available in virtue of interpretation neutrality. I want to focus on four aspects of quantum physics that strike me as most pertinent and/or problematic for accounts of divine action grounded in quantum theory. I will dedicate a subsection to each in what follows, demonstrating how standard quantum mechanics plus decoherence might helpfully reframe the matter. These four aspects are (A) the interpretation question: must one adopt a particular approach first in order to address theological questions, and if so, which of the empirically adequate realist interpretations is best suited to such endeavors?; (B) the issue of indeterminism: whether it is ontic or merely epistemic, and how precisely it comes into play in the various interpretations; (C) how to handle the problematic terms “measurement” and “observer,” and last (D) the nature of quantum events.

Divine Action and Interpretations of QM

In his Oxford Handbook entry on Non‐Interventionist Objective Divine Action (NIODA), Russell argues that inasmuch as all physical theories are multiply interpretable, the need for NIODA to ground itself in a given interpretation of quantum mechanics is “not particularly surprising or unavoidable” (Russell 2008, p. 585). While I wholly endorse Russell's call to epistemic humility in light of the multiple interpretability of all physical theories, there is clearly something different about the quantum case. The fact that entire subfields of both philosophy and physics are dedicated to investigating various quantum interpretations and their consequences is itself evidence of the heightened role the question plays here. One might also mention the commonly held view that the main realist interpretations are more appropriately understood as competing theories, in that they are empirically equivalent yet introduce distinct ontologies, and different explanations undergird their shared empirical content. Thus, I would push back on Russell and say one's choice of interpretation is of special valence regarding quantum‐mechanics‐informed theology, but at the same time argue that if one is convinced about the explanatory power of decoherence processes arising from standard quantum mechanics, then one may maintain interpretation neutrality while still providing explanations of the sort required for noninterventionist accounts of divine action. Should someone wish to go further and resolve the niggling remaining issue of the Values Problem, then decoherence is still relevant, as the chosen realist interpretation is in fact doing far less explanatory work (for better or worse) than previously assumed. 3

A comment about instrumentalism is in order here. One might wonder whether standard quantum mechanics plus decoherence is just an instrumentalist approach to quantum mechanics, in which case what I am recommending boils down to the claim that theologians should maintain instrumentalist attitudes toward this physics rather than adopt any full‐blown interpretation. The view of quantum mechanics I have been advocating is far richer than instrumentalism in a number of ways. Here are three. First—and most importantly—I have clearly been making ontological claims in my discussion of decoherence which the instrumentalist will not have access to. Granted, the form of these claims has often been negative (“no system is truly closed – or remains isolated for long – from quantum interactions”) or deflating (no special way of carving out systems and environments exists), but they nevertheless extend significantly beyond instrumentalism.

Second, decoherence processes were discovered as a direct result of jettisoning the idealization that one is working with truly closed, isolated (even from nonlocal quantum interactions) systems; this is done by explicitly feeding environmental dynamics into the Hamiltonian. Nevertheless, because such idealizations about systems remain incredibly useful—and, arguably, appropriate—in many experimental settings, the instrumentalist (perhaps rightly) will be unmoved to incorporate the complexities of decoherence into her account. But in philosophical or theological discussions, the epistemic aim is rather different, and FAPP physics of the sort practiced by operationalists or instrumentalists will not cut this mustard.

Third, I suspect that the objection to interpretation neutrality as merely instrumentalism in another guise is largely due to the fact that both views have the virtue of recovering the world of everyday acquaintance without appeal to a specific interpretation. While this is certainly true, decoherence does a lot more work besides. More will be said about this below in subsection C, but for now simply note that it is particularly pernicious for philosophical and theological purposes to subsume the robust suite of nuanced dynamical explanations for a vast array of system and environment interactions available from decoherence under a vague, overly coarse instrumentalist line about the “cancelling‐out of quantum effects at larger scales.” 4

In sum: engagement with decoherence will result in significant theological dividends specifically regarding divine action, as this physics provides deeper scientific explanations than those available via instrumentalism—viz., telling detailed, context‐dependent stories about real dynamical processes underlying observed interactions—and yet does so without committing one to any particular philosophical interpretation of quantum mechanics.

Divine Action and Indeterminism

In a recent paper in this journal by Vanney (2015), the author describes where and how indeterminism enters into each of the usual realist interpretations of quantum mechanics: hidden‐variables theories like Bohmian Mechanics, spontaneous collapse theories like the Ghirardi‐Rimini‐Weber (GRW) theory, the “Copenhagen” interpretation (the reason for scare quotes will become apparent below), and Everettian interpretations (including many‐worlds, many‐minds, and relative states).

Although I am sympathetic to Vanney's project, the crucial dynamics of decoherence are missing from her analysis, and the question of indeterminism requires reexamination in this light. I should also note in passing that what counts as an instance of genuine metaphysical indeterminacy—and whether quantum physics in fact instantiates it—is currently the subject of intense debate among philosophers of physics. Let us lay aside these worries for the present, however, and consider where indeterminism (commonly construed) has entered into the interpretation‐neutral physics described in previous sections.

There are two prominent ways in which indeterminism may be said to arise in quantum mechanics. First, indeterminism is sometimes associated with quantum nonlocality as demonstrated by experiments testing Bell's inequalities, and more generally as codified by the Kochen‐Specker theorem. The latter proves that one cannot assume that preexisting definite values exist simultaneously for a complete set of variables without contradiction. Some subset of a system's variables may be so defined, but a complete description of a given state in terms of each of its degrees of freedom is impossible without violating the axioms of quantum mechanics. The theory itself stipulates that a complete, context‐independent determination of all of a system's values is forbidden, even in principle. This point is crucial for arguments not just in divine action but elsewhere when one poses questions about God's knowledge of physical systems: the nonlocality inherent to any interpretation of quantum theory—even collapse ones—prima facie implies that even God cannot have complete knowledge of a given system. However, given the interpretation‐neutral account above, we may instead understand such questions to be ill‐posed: that nonlocality shows up even in textbook quantum mechanics is cold, hard fact. But where nonlocality shows up, and how it alters the broader explanatory package, will change importantly depending on (i) whether one thinks the Values Problem requires a solution, and if so, (ii) how one solves it—that is, which interpretation one adopts (see the final section of Crull (2022) for a brief discussion aimed at those who would answer these questions in the affirmative).

Another way indeterminism links up with quantum mechanics is via the measurement problem writ large: the puzzle of obtaining one definite outcome from a theoretical ensemble of possible outcomes described by the univocal and deterministic Schrödinger equation. This sense of indeterminism is sometimes introduced along with the Born rule, which assigns probabilities to specific outcomes of measurements. In the usual discussion, if the Born rule is considered axiomatic, then by definition there is no further story about why certain outcomes receive the probabilities they do, and why upon measurement one of these should obtain definitely; this is brute indeterminism. If the Born rule is not considered axiomatic (a view motivated by aversion to brute indeterminism) then it must be derived, and attempts to do so remain controversial. 5 But not to worry! The question of whether the Born rule is axiomatic (and if not, whence it can be derived) is very much beside the point. As I have emphasized elsewhere (e.g., in Crull 2017), the Born rule is merely a guide to experimental expectations of the following sort: “for a position measurement on the system, one should (according to the Born rule) expect with probability P(X) the position of our system to appear as X.” One heaps superfluous baggage onto the Born rule by taking it to be a metaphysical assertion regarding the existence of real, definite outcomes; such a reading is neither motivated by the theory itself nor part of textbook quantum mechanics. Thus, the Born rule is only superficially the source of quantum indeterminism.
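For reference, the rule itself, read purely as the guide to experimental expectations described above, is just the standard prescription for weighting possible outcomes (textbook form, not a construction of my own):

```latex
% Born rule, read as a guide to experimental expectations:
P(a_i) \;=\; \lvert \langle a_i \vert \psi \rangle \rvert^{2} ,
% i.e., for a measurement associated with eigenstates |a_i>, expect outcome a_i
% with probability P(a_i). Nothing in this prescription, by itself, asserts that
% one ontologically definite value is thereby brought into being.
```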

What about the measurement problem, though? Following the deflationary picture above, if the question is how one might go from a prepared state of a superposition to a nonsuperposed state under unitary Schrödinger evolution, we have seen that this is only a puzzle if we ignore the crucial idealization that our system has remained coherent, effectively shielded from environmental decoherence. Once we drop the idealization that our system is isolated and account for external degrees of freedom, we are able to model precisely in what bases and at what rate coherence leaks into the environmental modes with which it is entangled. The indeterminism here is an artifact of zooming in on the system “alone” long after system‐environment interactions have begun, thereby ignoring nontrivial environmental influences. If the indeterminism is meant to arise in connection with nature's so‐called preferred bases of measurement, we again have scale‐sensitive microdynamical explanations for these preferences available from decoherence: continuous environmental monitoring renders certain bases more stable than others at different scales. If the question of indeterminism is tied to the Variables Problem (why did the path of the pollen grain appear quasi‐Newtonian? Why did its ultimate landing spot on the grass appear definite?), again decoherence has shown why the indeterminism is “hidden”: it is encoded by prodigiously damped (read: impossible to measure in practice) interference terms in the phase‐space basis and position basis of the pollen grain, respectively. The upshot of the foregoing analysis is this: the only place left for indeterminism to play a significant role is within the limited scope of answers to the Values Problem.

Problem Terms and the Quantum‐to‐Classical Transition

The problematic nature of terms like “observer” and “measurement” in quantum contexts has to do with the fictional notion that there exist distinct classical and quantum domains. This is often considered part and parcel of the Copenhagen orthodoxy, but none of the historical figures associated with this interpretation (in particular, Bohr, Heisenberg, Pauli, Hermann and Weizsäcker) believed in a singular point at which things “switched over” to classicality. In their more careful moments, they speak only of classical and quantum modes of description, and never as though there were a real, physical divide between these domains. Indeed, in their correspondence, lectures, publications and private discussions during the interpretive halcyon days before the war (1927‐1935), they consistently affirmed the asymmetry of the Heisenberg cut: one could in principle move the cut as far in the classical direction as one wanted (and so describe everything in the universe quantum‐mechanically), but there is a definite limit to how far one can push the cut in the direction of the system to be measured (and where this limit falls depends on the properties of that system and its environment). 6

Historical point aside, one must tread lightly around the question of the quantum‐to‐classical transition, for the relevant definitions of “measurement,” “observation,” and even “event” are dependent upon how one understands this transition. 7 The necessity of decoherence in any such story highlights a significant point: not one of these accounts of the emergence of classicality requires resolution of the Values Problem. This means that if one's theological question requires that the system of interest be suitably classical, its classicality can be guaranteed in virtue of interpretation‐neutral quantum mechanics; this just is the Variables Problem, which textbook QM‐cum‐decoherence resolves. It is only in (admittedly hard to imagine) cases where one's theological question requires an answer to the Values Problem—the system of interest must show a particular classical behavior (not just a behavior of the kind “classical”) or manifest a particular value (not just a well‐defined value of the right kind)—that one would need to invoke an interpretation of quantum mechanics. 8

This discussion of the quantum‐to‐classical transition is meant to illustrate the potential problems arising for any theological account wherein “measurement” or “observation” are considered primitive, unanalyzable terms. I have already touched on the first two of these terms in the previous section, arguing that even the most permissive definition of measurement imaginable—the operational definition of measurement as one system gaining information about another system—is sufficient to recover the entire decoherence picture. This operational definition likewise allows us to define “observer” as merely the system obtaining the information in the measurement process. Nothing more special than this is needed. Because decoherence models have been used to describe objects at scales from individual quanta up to humanly observable apparatuses like Josephson junctions in superconductors or the multi‐ton Weber bars used to detect gravitational waves, this full range of systems constitutes viable candidates for the role of either “system” or “environment.” The extreme triviality of designating some cluster of degrees of freedom “the system” and others “the environment” is further underscored by the point made above that it is utterly arbitrary, from a formal perspective, how we divide the joint Hilbert space of an entangled system into subsystems. There was even some theoretical work proposed in the early 2010s modeling the decoherence of a single electron, where the electron's spin was the “system” and its translational degrees of freedom served as the “environment”; that is, a single quantum was used for both system and environment in this model.

In order to describe ordinary macroscopic states of affairs one need not postulate special operations called “measurements” nor special observers called “agents”—nor indeed any observer whatsoever. Effective classicality can be explained by appeal to standard quantum mechanics with decoherence, ergo without reference to any specialized (or specially troublesome) terms.

The Nature of Quantum Events

What limited acquaintance I have with questions at the intersection of quantum physics and (Christian) theology has left me with the impression that two key features of this theory are especially relevant. The first is entanglement, a novel yet physically ubiquitous relation with potential for broadening the scope of the theological imagination. I explore this particular aspect of quantum theology (along with the related notions of nonlocality in space and time) in detail in my forthcoming Cambridge Element, God & the Problem of Quantum Physics. The second is the emergence of an apparently classical realm. I have described this issue above, but here wish to zero in on the role of quantum events (vs. classical events) especially for theological considerations. That events at all scales can be considered quasi‐localized (read: apparently classical) is due to decoherence by environmental degrees of freedom. Most theological questions, I venture to posit, do not further require that the apparent final states take any specific value (that is, require the resolution of the Values Problem); that the relevant values are effectively stable or effectively classical is sufficient for getting theological explanations going.

Since the apparent definiteness of events even in highly controlled environments can be explained using decoherence, this is certainly true also of events in the wild. Indeed, these will almost always appear robustly localized due to the sheer noisiness and scale of the environment. This is true also of events in the human brain—a decidedly noisy, hot environment. Consider the (very) toy example of serotonin being released from a presynaptic nerve ending, and ask whether or not a sufficient amount makes it across the gap to postsynaptic nerve ending receptors, where it can be absorbed and induce a feeling of happiness (and perhaps this happiness is part of a theological story I am telling). This case can be considered analogous to the pollen grain scenario: the trajectory of the serotonin from one nerve ending to another and its final landing place atop an active receptor are quantum processes, and the likelihood that a given serotonin molecule is in nonsuperposed position, momentum and energy states during even part of its transmission is infinitesimally small. Yet it is safe to assume that these molecules are entangled with a host of other systems in the noisy, hot brain environment (e.g., other serotonin molecules, neurons, the local electromagnetic field, etc.); thus any initially superposed state is quickly decohered into a nonsuperposed, approximately definite state. This means we would observe (if we could) a quasi‐classical trajectory for the serotonin molecules as they are transmitted across the nerve gap, and could verify empirically the distribution of molecules as they land in effectively localized positions, and so calculate the percentage of molecules that land on receptors, are absorbed, and induce the desired happy mood. Presumably neither the precise trajectory of each serotonin molecule nor its precise resting place on a given postsynaptic nerve—that is, answers to the Values Problem—is at all necessary. What is needed for “happiness” to arise in this little anecdote is just resolution of the Variables Problem, and this we have from interpretation‐neutral standard quantum mechanics plus decoherence.

Likewise, when the resurrected Jesus passed through the wall of the upper room after appearing to his disciples, this was surely a quantum “event.” For us to discuss this phenomenon in a theologically meaningful way while yet maintaining consilience with quantum physics, we surely do not need to know the precise energy values of all the wavefunctions of Jesus’ quanta at the moment he stepped through the wall—nor indeed do we even need to know that these energy values were ontologically definite. All that is required to explain this event is that the energy states of his collective quanta stabilized long enough and at sufficiently high (possibly indefinite) values to overcome the potential energy barrier created by the wall for the duration of his passage through. Decoherence (in principle) delivers an answer to this Variables Problem.

Conclusion

The interpretation neutrality I have argued for, based on the incorporation of decoherence into standard quantum mechanics, will, I hope, liberate theologians from having to engage with the interpretation question altogether while nevertheless doing “more than FAPP” physics. If theologians still wish to resolve the Values Problem and so must adopt a given interpretation of quantum mechanics, the incorporation of decoherence will at least enrich the explanatory package in a way that allows for increased theological imagining. In either case, the result is greater freedom.

Notes

  1. For example, interference phenomena were measured using fullerene molecules by Anton Zeilinger and his group in Austria in the late ’90s (cf. Arndt et al. 1999; Zeilinger was awarded the 2022 Nobel Prize in Physics for this and related research). Fullerene molecules contain 60 carbon atoms and are five orders of magnitude (that is, 100,000 times) larger than a typical quantum system like an electron.
  2. See Schlosshauer (2007) chapter 5 and references therein for an introduction to this crucial decoherence model.
  3. In this issue, Mark Harris tackles precisely the question whether the Copenhagen interpretation need necessarily be wedded to scientifically informed accounts of divine action. Hopefully what I write here only underscores Harris's thesis.
  4. One quick example to illustrate the problem with an uncareful phrase like “cancelling out quantum effects.” If “cancelling out” is read as “makes ineffective” then this aligns with decoherence processes, which do render system interference terms (i.e., the alluded‐to “quantum effects”) ineffective by damping them to negligible—but nonzero!—values. But “cancelling out” could also be read as “makes null,” in which case it most emphatically does not align with decoherence. To make this reading true requires invoking an interpretation to resolve the Values Problem so that the probabilities of all values, save the one measured, become exactly zero—not just approximately so.
  5. For a critical overview of attempts to derive the Born Rule, see Vaidman (2020). For attempts specifically within Bohmian Mechanics, see Callender (2007). For attempts within (but critical toward) Everettian interpretations, see Rae (2009).
  6. For in‐depth philosophical and historical analyses of primary source materials related to the incompleteness and nonlocality debate in the years 1927–1937, see Bacciagaluppi and Crull (forthcoming).
  7. Elsewhere (Crull 2022) I have provided a list of the most popular candidate “borders” between the classical and quantum regimes, then described how each fails to be either a sufficient or necessary condition for a suitably general notion of classicality. The point is to emphasize that quantum‐to‐classical transition stories are highly nuanced, yet all necessarily involve decoherence.
  8. A further point may be relevant here. Several prominent accounts of divine action impose a FAPP border between quantum and classical realms by appeal to the irreversibility of physical processes. This is a thermodynamic constraint, and one should be careful to distinguish between classical limits derived from quantum mechanics and limits defined based on thermodynamic considerations. The latter are appropriate only at large scales and for time‐averaged systems. Additionally, thermodynamical definitions of classicality are circular in the sense that the theory's variables (pressure, temperature, volume) are only well defined for ensembles of systems, and so are already irreducibly macroscopic concepts. In contrast, limits evaluable via decoherence models are defined in terms of quantum‐mechanical properties, and therefore represent the finest‐grained mapping possible for the quantum‐to‐classical transition (though such mappings can never be generalized, as they are highly contingent upon the details of the given system and environment).

References

Arndt, Markus, Olaf Nairz, Julian Vos‐Andreae, et al. 1999. “Wave–Particle Duality of C60 Molecules.” Nature 401:680–82.

Bacciagaluppi, Guido, and Elise Crull. Forthcoming. The Einstein Paradox: The Debate on Nonlocality and Incompleteness in 1935. Cambridge: Cambridge University Press.

Callender, Craig. 2007. “The Emergence and Interpretation of Probability in Bohmian Mechanics.” Studies in the History and Philosophy of Modern Physics 38 (2): 351–70.

Crull, Elise. 2017. “Yes, More Decoherence: A Reply to Critics.” Foundations of Physics 47:1428–63.

Crull, Elise. 2022. “The Philosophical Significance of Decoherence.” In The Oxford Research Encyclopedia of Physics. Oxford: Oxford University Press.

Crull, Elise. Forthcoming. God and the Problem of Quantum Physics. In series “Elements in the Problem of God,” edited by Michael L. Peterson. Cambridge: Cambridge University Press.

Fortin, Sebastian, and Olimpia Lombardi. 2014. “Partial Traces in Decoherence and in Interpretation: What Do Reduced States Refer To?” Foundations of Physics 44 (4): 426–46.

Gell‐Mann, Murray, and James B. Hartle. 2014. “Adaptive Coarse Graining, Environment, Strong Decoherence, and Quasi‐classical Realms.” Physical Review A 89 (5): 052125.

Rae, Alastair I. M. 2009. “Everett and the Born Rule.” Studies in the History and Philosophy of Modern Physics 40 (3): 243–50.

Rosaler, Joshua. 2015. “Is de Broglie‐Bohm Theory Specially Equipped to Recover Classical Behavior?” Philosophy of Science 82 (5): 1175–87.

Russell, Robert J. 2008. “Quantum Physics and the Theology of Non‐Interventionist Objective Divine Action.” In The Oxford Handbook of Religion and Science, 2nd edition, edited by Philip Clayton, 579–95. Oxford: Oxford University Press.

Saunders, Simon. 2022. “The Everett Interpretation: Structure” and “The Everett Interpretation: Probability.” In The Routledge Companion to Philosophy of Physics, edited by E. Knox and A. Wilson, 213–29 and 230–46. New York and London: Routledge.

Schlosshauer, Max. 2005. “Decoherence, the Measurement Problem, and Interpretations of Quantum Mechanics.” Reviews of Modern Physics 76 (4): 1267–305.

Schlosshauer, Max. 2007. Decoherence and the Quantum‐to‐Classical Transition, 2nd edition. Berlin: Springer.

Vaidman, Lev. 2020. “Derivations of the Born Rule.” In Quantum, Probability, Logic: The Work and Influence of Itamar Pitowsky, edited by Meir Hemmo and Orly Shenker, 567–84. Cham, Switzerland: Springer.

Vanney, Claudia E. 2015. “Is Quantum Indeterminism Real? Theological Implications.” Zygon: Journal of Religion and Science 50 (3): 736–56.