*Abstract*

*This paper presents realist approaches that provide more effective and fruitful conceptual frameworks for advancing research into the quantum world.*

**Keywords:** *Realism, Anti-realism, Quantum Mechanics, Scientific Knowledge, Copenhagen interpretation, Instrumentalism.*


**Alternative Interpretations of Quantum Mechanics**

The Copenhagen interpretation, despite all its conceptual difficulties, was a successful scheme as far as predictions and calculations were concerned. This success helped to create an environment in which the majority of physicists, mostly interested in practical problems, used the mathematical machinery of the theory without questioning its conceptual validity (Popper, 1967, p.8). Nevertheless, a number of philosophers and philosophy-minded scientists have continued the effort to find alternative interpretations ever since the Einstein-Bohr controversies. The interesting thing, however, is that there has been hardly any consensus among the advocates of the rival interpretations.

This situation has aptly been described by N. Herbert: “Quantum theory resembles an elaborate tower whose middle stories are complete and occupied. Most of the workmen are crowded together on top, making plans and pouring forms for the next stories. Meanwhile, the building’s foundation consists of the same temporary scaffolding that was rigged up to get the project started … Physicists’ reality crisis consists of the fact that nobody can agree on what’s holding the building up. Different people looking at the same theory come up with profoundly different models of reality …” (1985, pp.157-197).

Although many of the proposed interpretations have covert or overt anti-realistic leanings, this fact should not bring comfort to anti-realists. On the one hand, as we shall see, these anti-realist schemes suffer from considerable conceptual deficiencies. On the other hand, the very existence of a number of different interpretations, realist and anti-realist alike, provides an argument against the conviction of those anti-realists who regard the orthodox interpretation as complete and final. The following models are among the better-known interpretations of quantum mechanics.

**I.a. Consciousness-Created Reality**

This interpretation was prompted by von Neumann, who had concluded that if the predictions of quantum mechanics were correct, then the world could not be made up of ordinary objects possessing unobservable or hidden attributes. Being an ardent advocate of the orthodox quantum theory (OQT), von Neumann believed that there is a definite separation between measurement devices and quantum objects, and that the wave-function collapse occurs in some vague neighborhood between the two. He decided to calculate the size of this neighborhood.

However, to his surprise, it turned out that the collapse, as far as ordinary experiments were concerned, could occur virtually anywhere. As a result, von Neumann started thinking of human consciousness as a part of the long chain of measurement. However, while von Neumann himself did no more than allude to the role of the conscious mind in bringing about the collapse of the wave packet, this possibility was taken seriously by a number of physicists, chief among them Eugene Wigner (1961 and 1967, reprinted in Wheeler & Zurek, 1983). Wigner and others dramatized the situation by proposing a paradox in the form of a thought experiment which draws on Schrödinger’s famous cat paradox (Schrödinger, 1953).

The thought experiment involves a sealed and insulated box containing a radioactive source. The source has a 50-50 chance of triggering a Geiger counter during the course of the experiment, thereby activating a mechanism that causes a hammer to smash a flask of prussic acid, killing the cat. An observer has to open the box in order to collapse the wave function into one of the two possible states (cat=dead, cat=alive). A second observer (Wigner’s friend) is then needed to collapse the wave function of the larger system comprising the first observer, the cat, and the equipment. The problem is that the original observer, Wigner’s friend, and the equipment plus the cat now constitute a new system, which may itself require an ‘acquaintance’ to collapse its wave function, and so on (Casti, 1989, p.445).
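In modern state-vector notation (a standard textbook reconstruction, not a formula from the sources quoted above), the regress is easy to display. Before anyone looks, the box is described by the entangled superposition:

```latex
% State of the sealed box before any observation:
|\Psi\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(\,|\text{decayed}\rangle\,|\text{dead}\rangle
\;+\; |\text{undecayed}\rangle\,|\text{alive}\rangle\,\bigr)

% Once the first observer is included, Wigner's friend must write:
|\Psi'\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(\,|\text{decayed}\rangle\,|\text{dead}\rangle\,|\text{sees dead}\rangle
\;+\; |\text{undecayed}\rangle\,|\text{alive}\rangle\,|\text{sees alive}\rangle\,\bigr)
```

Each enlargement of the system reproduces the same two-term superposition one level up, which is precisely the regress that seems to call for ever further ‘acquaintances’.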

Wigner’s own solution is that, owing to the interaction between living minds and inanimate nature, the state of the original system changes from an indefinite one into a definite one as soon as *any* mind becomes conscious of the outcome of a measurement upon the original system. For Wigner, the conscious mind is the basic reality, and things in the world are no more than useful constructions built out of one’s past experiences, somehow coded into one’s consciousness. Wigner’s proposal has not been met with great enthusiasm among the majority of physicists (Belinfante, 1975, pp. xiv-xv). An exception, however, is the American physicist John Archibald Wheeler, who has taken Wigner’s approach one step further and declared that “no elementary phenomenon is a real phenomenon until it is an observed phenomenon” (Wheeler, 1979-1981). For Wheeler the essence of reality is meaning, and the essence of meaning is communication, defined as the joint product of all the evidence available to those who communicate. In this view meaning rests on action, which means decisions, which in turn force the choice between complementary questions and the distinguishing answers. This amounts to the basic idea of generating reality by the act of measurement. Einstein once wrote, “I cannot believe that a mouse can change the world by simply looking at it” (Bohr, 1949, p.237). Bohr tried to distance himself from this sort of interpretation. However, his equivocal remarks have no doubt contributed to the rise of this overtly subjective approach.

Apart from the strong idealistic connotations of this approach, there are a number of internal conceptual difficulties that undermine its soundness. In the first place, Wheeler has restricted the act of creating reality by observation to elementary particles alone and has denied it in the case of medium-sized and large-sized objects. But such a restriction seems quite arbitrary, with no convincing rationale. In fact, as other physicists of the same persuasion as Wheeler (e.g. Mermin, 1985, pp.38-47) have claimed, all entities – cats, oranges, rainbows, even the moon and stars – are not real until somebody looks at them. This is, of course, a very natural conclusion, which derives from Wheeler’s basic assumption.

The other difficulty with this scheme is that its advocates do not agree on what counts as an observation. Some of the followers of this school, including Wheeler himself, are of the view that the essence of measurement is *the making of a record*, and this can be done even by a robot. Others believe that only a *conscious observation* counts as a measurement. Here again, owing to the arbitrariness of the distinctions and the lack of objective criteria, no progress has been made.

**I.b. The Many-Worlds Interpretation**

This approach, due to H. Everett (1957), states that in any act of measurement, while one of the many possibilities latent in the wave function is actualized for the observer, the rest are simultaneously actualized in worlds parallel to, but inaccessible from, that of the observer. The important point in Everett’s scheme, which incidentally makes it attractive to some physicists, is that according to him no wave reduction takes place, and since there is no collapse of the wave function, there is no *measurement problem*. Another important feature of this scheme is that its interpretation seems to arise naturally out of the mathematical formalism, whereas the other approaches require additional assumptions associated with the distinction between the quantum system and the measurement apparatus (Rae, 1986, p.77).

In Everett’s interpretation, any isolated system is described by a wave function that changes only as prescribed by Schrödinger’s equation. If this system is observed by an external observer then, in order to discuss what happens, it is necessary to incorporate the observer into the system, which then becomes a new isolated system. The new wave function, which now describes the previous system plus the observer, is again determined for all times by the Schrödinger equation (Squires, 1986, p.69).
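The point can be made explicit with the standard von Neumann measurement scheme (a textbook sketch in modern notation, not Everett’s own formalism). If the apparatus starts in a ‘ready’ state, the Schrödinger equation alone yields

```latex
\bigl(\alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle\bigr)\otimes|\text{ready}\rangle
\;\longrightarrow\;
\alpha\,|{\uparrow}\rangle\,|\text{reads }{\uparrow}\rangle
\;+\; \beta\,|{\downarrow}\rangle\,|\text{reads }{\downarrow}\rangle
```

with no collapse anywhere: on Everett’s reading both terms persist, each constituting a branch in which a definite outcome is recorded.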

However, the scheme suffers from the following rather serious conceptual inconsistency. As noted above, according to Everett, reality is a wave function which always contains all possible outcomes, yet a conscious observer is capable of demanding a particular result and thereby selecting a ‘branch of the world’ in which he exists. This poses a problem, because on the one hand the wave function – with all its components – corresponds to the whole of reality, and on the other hand there exist a number of conscious observers over and above this whole reality who can cause the ‘branching’ process (d’Espagnat, 1979/83, p.172).

There are other reasons for doubting the credibility of the many-worlds model. These arise principally because it is not clear from the theory just when the alleged branching takes place. It is sometimes said that it happens whenever a ‘measurement-like’ interaction between a quantum system and a measuring apparatus occurs, but if this is the case then the many-worlds model has clearly failed to solve the *measurement problem*! Alternatively, branching may occur whenever any kind of interaction takes place between two component parts of the universe. But this means, among other things, that the electron and proton in a hydrogen atom, which are continually interacting, would be creating infinities of universes! The formalism of the many-worlds model does not clarify these difficulties (Rae, 1986, pp. 81-2).

**I.c. Hidden Variable Interpretation**

Among the realist physicists who took Bohr’s views seriously, David Bohm deserves special mention. Bohm’s attention was drawn to the notion of *indivisibility* between the quantum domain and the larger-scale domain in the Copenhagen interpretation. Time and again Bohr had emphasized that “…[T]he fundamental difference with respect to the analysis of phenomena in classical and in quantum physics is that in the former, the interaction between the objects and the measuring instruments may be neglected or compensated for, while in the latter this interaction forms an integral part of the *phenomenon*. The essential *wholeness* of a proper quantum phenomenon finds indeed logical expression in the circumstance that any attempt at its well-defined subdivision would require a change in the experimental arrangement incompatible with the appearance of the phenomenon itself”[i] (Bohr, 1958, p.72. Italics added).

Bohm, although sympathetic with this account, was uneasy about its apparent contradiction. In his inaugural lecture at Birkbeck College, University of London, he noted that “In reality, they are only one indivisible system. Yet, our very language asserts that they are two. Hence, there is a contradiction between our common language and the facts of the case. It is this contradiction that is at the root of our inability to find a single conceptual model of the movement and behavior of the observed system” (Bohm, 1963, p.10). In Bohm’s view, Bohr and Heisenberg had resorted to a sort of conventionalism and arbitrariness as a way out of this contradiction. They had suggested “a purely imaginary ‘cut’, at some place where classical physics is still adequate. The precise place is not significant, as long as it is still in the classical domain. On the large-scale side of the ‘cut’, it is evidently adequate to go on using our ordinary classical concepts. On the other side, we apply the laws of quantum mechanics, whose sole experimental meaning is however now the prediction of probable results on the observed classical side of the ‘cut’” *(ibid).*

Bohm, who was “dissatisfied with the self-contradictory attitude of accepting the independent existence of the cosmos while one was doing relativity and, at the same time, denying it while one was doing quantum theory” (1987, p.34), decided to produce an alternative micro-realistic interpretation of the formalism of quantum mechanics. His guiding thought was that, contrary to the claims of the Copenhagen school, a wave function, far from presenting a complete description of reality, captures “only certain aspect of what happens in a statistical ensemble of similar measurements, each of which is in essence only a single element in a greater context of the overall process” *(ibid)*.

A meeting with Einstein made Bohm interested in finding out whether a deterministic extension of quantum mechanics could be found (p.35). To this end, Bohm reformulated quantum mechanics in a language closer to that of classical physics. He wrote the complex wave function in the polar form R exp(iS/ℏ) and obtained two real equations: one is essentially a classical equation of motion containing an additional potential term, called by Bohm “the quantum mechanical potential”; the other is a continuity equation for the probability density (Bohm, 1952).
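In the standard textbook presentation (modern notation, one particle of mass m in an external potential V; this reconstruction is not verbatim from Bohm, 1952), the substitution and the resulting pair of real equations read:

```latex
\psi = R\,e^{iS/\hbar}
\quad\Longrightarrow\quad
\begin{cases}
\dfrac{\partial R^{2}}{\partial t}
  + \nabla\!\cdot\!\Bigl(R^{2}\,\dfrac{\nabla S}{m}\Bigr) = 0
  & \text{(continuity)}\\[2ex]
\dfrac{\partial S}{\partial t}
  + \dfrac{(\nabla S)^{2}}{2m} + V + Q = 0,
\qquad
Q \equiv -\dfrac{\hbar^{2}}{2m}\,\dfrac{\nabla^{2}R}{R}
  & \text{(modified Hamilton--Jacobi)}
\end{cases}
```

The second equation has the form of a classical equation of motion, with Q, the quantum-mechanical potential, as the sole non-classical term; letting ℏ → 0 recovers classical Hamilton-Jacobi dynamics.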

This second term would act as a *pilot wave*, spreading out at superluminal velocity and converging on wherever the quantum object is found, telling it how to move. The idea of a pilot wave which guides the quantum object was originally introduced by de Broglie at the Solvay conference of 1927 but rejected by the advocates of the Copenhagen school.[ii] It was deemed to be a real but in-principle unobservable entity whose function is to probe the environment and report its findings back to the particle, which *is* detectable. The particle then acts in accordance with the information provided by its associated pilot wave.

Bohm had called his version of quantum theory a hidden-variable theory, though he later came to regret the choice of term. Despite the fact that Bohm’s version agreed precisely with OQM in all its empirical predictions, physicists by and large did not look at it sympathetically. The reason, apart from the quasi-ideological dominance of the views of the Copenhagen school, has been the fact that it does not account for even a single experiment that is not already accounted for by OQM. The following quotation depicts the standard attitude of many present-day physicists towards this theory: “In the absence of experimental distinguishability between [Bohm’s version and OQM], the former becomes a substructure to OQM that is scientifically gratuitous. It would be based completely on philosophical grounds rather than empirical grounds” (Rohrlich, 1983, p.1252).

Another problem with this scheme is that the *pilot wave* must travel faster than light to serve its purpose, which is clearly contrary to the special theory of relativity. Some more positivistically inclined physicists have also objected that, since the pilot wave is not even in principle detectable, its existence is spurious.

Bohm’s partial answer to these difficulties has been that the pilot wave is not a wave of matter, but just a wave of active information. Its effects depend only on its form, not upon its magnitude; consequently, unlike matter waves whose effects diminish with distance from the source, the pilot wave can have a big effect at long distances (non-locality).[iii]

Perhaps Bohm’s greatest achievement, notwithstanding the cool response from the physics community, has been to mount a successful challenge to the seemingly absolute injunction against this sort of model imposed by von Neumann.[iv] In fact, it was exactly this interpretation, later renamed by Bohm the causal interpretation (Bohm, 1957), that led John Bell to develop his famous theorem.

In subsequent years Bohm developed his ideas concerning the underlying reality responsible for quantum effects still further. He was particularly attracted to the role of language in forming our conceptions, and to the significance of the notion of “order” in shaping our scientific ideas (Bohm, 1963/1971). Thinking about ways of reconciling the theory of relativity, which replaces “the concept of a permanent extended object by that of a continued structure of similar and related events, constituting a process taking place in a more or less tube-like region of space-time” (1963, p.12), with quantum theory, which “denies the notion of a connection and exactly specifiable process-structure, because the particle movement is always being disturbed by its interaction with the environment through indivisible quantum links associated to what would classically be its continuous field” *(ibid)*, Bohm came to appreciate a new notion of order which he dubbed the implicate order (Bohm, 1987).

In this new metaphysical-scientific model, the notion of extensionless point particles is replaced by an undivided seamless whole whose enfoldment and unfoldment give rise to the *explicate order*. This seamless whole has a super-quantum potential (hence the link with Bohm’s former model, namely the causal interpretation), and a wave function is assumed for the whole universe. “The general picture that emerges out of this is of a wave that spreads out and converges, again and again, to show a kind of average particle-like behavior, while the interference and diffraction properties are, of course, still maintained. … The whole universe not only determines and organizes its sub-wholes but also … gives form to what has until now been called the elementary particles out of which everything is supposed to be constituted. What we have here is a kind of universal process of the quantum potential such as to give rise to a world of form and structure in which all manifest features are only relatively constant, recurrent and stable aspects of this whole” (1987, p.43).

As is apparent, in this theory the quantum attributes are not *localized* in the quantum entity itself but reside in ‘the entire experimental set-up’, which may have to include not only the activities in the immediate vicinity of the entity’s actual detector but also action arbitrarily remote in time and space from the detection site. Ultimately the whole universe may be implicated in a simple measurement. In Bohm’s view, the quantum potential of ordinary physical systems should be regarded as the first implicate order, while the super-quantum potential is called the second implicate order (or the super-implicate order). In principle, according to Bohm, there could be an infinite series of implicate orders with growing degrees of subtlety and generality (pp.43-4).

Bohm’s implicate order, notwithstanding its possible heuristic merits, has not been developed into a full mathematical model. This is perhaps one of the main reasons why the theory has not been taken up enthusiastically by the physics community.

**I.d. Propensity Interpretation**

Another attempt to produce a realistic interpretation of quantum mechanics has been made by Popper (1959/68, 1967, 1983). In his (1967, pp. 7-44) he has produced thirteen theses which summarize his views on this issue. Popper, following Einstein, maintains that quantum theory is essentially a statistical theory which gives statistical accounts of the behavior of ensembles of quantum systems and does not deal with individual quantum entities. However, unlike Einstein, Popper is of the view that the real difficulties of quantum theory lie in “the interpretation of the calculus of probability”. This way of looking at the issue has led Popper both to his main objection against the Copenhagen school and to his own proposed solution to the apparent difficulties of the orthodox interpretation.

In Popper’s view the proponents of the Copenhagen school have committed a “great quantum muddle” which consists in “taking a distribution function, i.e. a statistical measure function characterizing some *sample space* (or perhaps some ‘population’ of events), and treating it as a physical property of the elements of the population” (p.19). Popper maintains that this same muddle is behind much confused talk about wave-particle duality. Many physicists, according to Popper, have taken the ψ-function as a physical property of the elements of the population, whereas, Popper says, “the wave shape (in configuration space) of the ψ-function is a kind of accident which poses a problem to probability theory, but which has next to nothing to do with the physical properties of the particles” (pp. 19-20). It is as if someone were called a ‘Gauss-man’ or a ‘non-Gauss-man’ in order to indicate that the distribution function of his living in a certain location has a Gaussian or non-Gaussian shape. For Popper, on the contrary, the ψ-function is only a probability distribution function, whereas it is the element in question which has the properties of a particle.

The reason behind the great quantum muddle, in Popper’s view, is the appeal of quantum physics to a subjective theory of probability. In fact, as his third thesis, Popper clearly states that “[I]t is this mistaken belief that we have to explain the probabilistic character of quantum theory by our (allegedly necessary) *lack of knowledge*, rather than by the statistical character of our problems, which has led to *the intrusion of the observer, or the subject, into quantum theory*” (p. 17).

To remedy this great misunderstanding, Popper has proposed an objective theory of probability which he has dubbed the propensity theory (1959/68, 1959b). Briefly stated, it is a theory for the application of the probability calculus to a certain type of ‘repeatable experiment’ in physics and related fields such as biology (1967, p.31). In Popper’s view probability statements (as against statistical statements) should be taken as statements about “some measure of a property (a physical property, comparable to symmetry) of *the whole experimental arrangement*” (p.32). This is a measure of a *virtual frequency* (i.e. one relating to infinite sequences of well-arranged experiments), while statistical statements correspond to frequencies in actual experiments (i.e. finite sequences of such experiments).

Propensities, according to Popper, are thus abstract physical properties of whole experimental setups. Every experimental arrangement is liable to produce, in the course of frequent repetition, a sequence with frequencies dependent on that arrangement. These virtual frequencies, or propensities, are probabilities. On this approach, quantum theory is seen not as a theory about dynamic processes in time, but as a probabilistic propensity theory that assigns weights to the various possibilities. For example, to assert that the probability of a photon passing through a semi-transparent mirror is one-half is to say that the entire experimental arrangement has a propensity of letting the photon pass through the mirror in 50% of the cases.

To show that this propensity interpretation solves the problem of the relationship between particles and waves, Popper has resorted to an analogy between a pinboard and a quantum system. Having explained the change in the probability distribution of those balls which actually hit a certain pin due to a change in the experimental arrangement (e.g. lifting one corner of the board), Popper goes on to claim in his ninth thesis that “[I]n the case of the pinboard, the transition from the original distribution to one which assumes a ‘position measurement’ (whether an actual one or a feigned one) is not merely analogous to, *but identical with, the famous ‘reduction of the wave packet’*. Accordingly, this is not an effect characteristic of quantum theory but of probability theory in general” (pp. 34-35).
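Popper’s point, that the ‘reduction’ is nothing but probabilistic conditioning, can be illustrated with a small Monte Carlo pinboard. (An illustrative sketch only: the board size, number of balls, and the particular intermediate ‘measurement’ are arbitrary choices of ours, not Popper’s.)

```python
import random
from collections import Counter

def drop_ball(rows=10, p_right=0.5):
    """One ball: a +1/-1 step at each row of pins; return the
    position after every row (path[0] is the entry point)."""
    pos, path = 0, [0]
    for _ in range(rows):
        pos += 1 if random.random() < p_right else -1
        path.append(pos)
    return path

random.seed(1)
paths = [drop_ball() for _ in range(100_000)]

# Unconditional distribution of final slots: binomial, centred on 0.
final = Counter(p[-1] for p in paths)

# A 'position measurement' partway down: keep only the balls observed
# at pin +2 after row 4.  The conditional distribution of final slots
# shifts accordingly -- plain conditioning, with nothing acting on the balls.
observed = Counter(p[-1] for p in paths if p[4] == 2)

mean_all = sum(k * v for k, v in final.items()) / sum(final.values())
mean_obs = sum(k * v for k, v in observed.items()) / sum(observed.values())
print(round(mean_all, 2), round(mean_obs, 2))  # close to 0.0 and 2.0
```

The passage from `final` to `observed` is exactly the transition Popper identifies with the reduction of the wave packet: the sample space is re-specified by the measurement, and no action is exerted on the balls themselves.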

Applying this approach, Popper has reasoned that Heisenberg’s interpretation of the famous example of photons passing through a semi-transparent mirror, first suggested by Einstein, is misguided. According to Heisenberg, if we find that the photon is reflected, “Then the probability of finding the photon in the other part of the packet immediately becomes zero. The experiment at the position of the reflected packet thus exerts a kind of action (reduction of the wave packet) at the distant point occupied by the transmitted packet, and one sees that this action is propagated with a velocity greater than that of light” (Heisenberg, 1930, p.39; quoted from Popper, 1967, p. 36). According to Popper, however, this reveals a conflation on the part of Heisenberg. The relative probabilities, namely,

- p(*a,b*) = p(*-a,b*) = ½, and
- p(*a,-a*) = 0, p(*-a,-a*) = 1,

where *a* refers to the photon passing through the mirror, *-a* to a reflection event, and *b* to the experimental arrangement, are independent of each other; each belongs to a certain experimental arrangement entirely different from the other. As Popper has put it, “No *action* is exerted upon the wave packet p(*a,b*), neither an action at a distance nor any other action. For p(*a,b*) is the propensity of the state of the photon relative to the original experimental conditions. This has not changed, and it can be tested by repeating the original experiment.”

Popper’s approach, though not without some intuitive appeal, suffers from a number of shortcomings. Apart from a lack of precision and rigor in the technicalities of the probability calculus and quantum mechanics, the major difficulty with his account is that it does not provide a truly micro-realistic interpretation of quantum mechanics. This is because, in Popper’s approach, propensities are attributed to whole experimental arrangements and not to quantum entities. This means, on the one hand, that Popper’s propensities are macro properties, and on the other, that they cannot be regarded as a natural generalization of the notion of dispositional properties prevailing in all branches of science. For such dispositions (e.g. fragility) are properties of the entities themselves and not features of an experimental arrangement.

Moreover, attributing propensities to experimental setups imports an element of arbitrariness as to what should be regarded as the proper setup in question. This arbitrariness in a way resembles the very arbitrariness in the Copenhagen interpretation as to where to draw the line between the macro world and the micro system. Furthermore, since in Popper’s account the problem is shifted from the domain of quantum physics to the realm of the probability calculus, even if it can account for the issue of measurement, it has no satisfactory reply to the issue of the non-locality of reality, which has become apparent from the results of Aspect’s experiments.

**A Proposal Due to Nicholas Maxwell**

The main point of this proposal is that the fulfillment of the basic aim of science, namely understanding the fundamental structure of nature, requires the development of a micro-realistic version of quantum theory. Such a version should be exclusively about micro entities and their interactions. Macro systems, and in particular *measuring instruments*, should not be lurking, in however concealed a fashion, in the background as far as the basic postulates of the theory are concerned (Maxwell, 1976, p.275).[v] The key to developing such a version is the realist conviction that micro entities exist independently of human perceivers, or in other words, that it is not the case that *esse est percipi*. To fulfill this requirement, any viable micro-realistic explanatory theory must have a definite, characteristic ontology of its own.

A realist guiding principle for probing the nature of unobservable entities (including quantum posits) is the general methodology of conjectural essentialism (Popper, 1972). In line with this methodology, two sensible assumptions, which set the basic framework of a micro-realistic approach, need to be introduced:

- In speaking of the *properties* of fundamental physical entities (such as mass, charge, spin) we are in effect speaking of the dynamical laws obeyed by the entities, and *vice versa*. Thus if we change our ideas about the nature of dynamical laws, we thereby, if we are consistent, *change* our ideas about the nature of the properties and entities that obey the laws; and
- The quantum world is fundamentally probabilistic in character; that is, the dynamical laws governing the evolution and interaction of the physical objects of the quantum domain are probabilistic and not deterministic (Maxwell, 1988, p.12).

The main conceptual tool of the proposed approach is the notion of *dispositional* properties. Physical entities, macro or micro, possess properties which are dispositional in character: their properties imply something about how the respective objects change, resist change, or effect change in other objects, in certain circumstances. From this point of view, the main difference between micro entities and macro entities is that while the latter have *deterministic* dispositional properties, which are accounted for in classical physics,[vi] the properties of quantum entities are probabilistic. These kinds of properties are called propensities[vii] and the objects which possess them are called propensitons (p.13).

Two kinds of probabilistic laws, namely *continuous* and *discrete* probabilistic laws, are considered by Maxwell. Corresponding to these two kinds of laws, two kinds of propensitons, *continuous* and *discrete* propensitons, are introduced. Maxwell has suggested that quantum entities (e.g. electrons, photons, …) are varieties of the second kind, the *discrete propensitons*. As long as the physical conditions for the probabilistic actualization of the propensities of these entities are not realized, they evolve in space and time *deterministically*. When these conditions are realized, they suffer an instantaneous, probabilistic change of state, determined probabilistically by the values of the relevant propensities at the instant in question. In order to specify the nature of these propensitons (i.e. the nature of the propensities they possess), three things need to be specified: i) the deterministic dynamical laws of evolution and interaction; ii) the precise conditions for a probabilistic transition to occur; and iii) the probabilistic laws governing the instantaneous probabilistic transitions (p.14).

This task has been carried out by Maxwell in a number of publications. The final result is a new version of quantum theory called propensiton quantum theory, or PQT for short (Maxwell, 1988, 1993b, 1994). PQT retains the dynamical equations of orthodox quantum theory (OQT) but rejects the Born probabilistic interpretation of ψ. Instead of interpreting ψ as containing information about the values of observables and about the outcomes of performing measurements on the system (or ensemble of systems) in question, PQT interprets ψ as specifying the actual physical state of the individual quantum system in physical space and time, even in the absence of *preparation* and *measurement*. PQT also regards all measurements as no more than special cases of a kind of probabilistic process occurring naturally throughout the universe. According to PQT, what exists potentially in one spatial region at an instant depends on what exists, potentially, elsewhere – a feature of the quantum world (i.e. non-locality) not encountered within classical physics, and confirmed by the outcomes of the experiments testing Bell’s inequality. PQT’s explanation of this fact is that *n* interacting particles do not have *n* distinct quantum states, but only a joint, quantum-entangled state as a whole. This whole undergoes probabilistic transitions when the condition for such a transition (collapse) is achieved. This way of describing quantum reality prepares the ground for an experimental test which can decide between the two rival versions, OQT and PQT.

For OQT, all quantum mechanical phenomena (or physical systems) are to be regarded as wholes, consisting of the measuring instruments and what is being measured. What is being measured, whatever it is, always exists in a superposition of states until an act of measurement, which causes a particular wave collapse, is carried out.[viii] As a real example, consider a rearrangement collision between spinless particles a, b, and c, with the following two-channel outcome:

(a + b) + c → (A) or (B), where

(A) = (ab) + c and

(B) = a + b + c

where (ab) is the bound state. According to OQT, the outcome of the interaction is a superposition of the two channel states, (A) and (B), and only on measurement can one of the two states be detected. According to PQT, however, the superposition of (A) and (B) collapses spontaneously and probabilistically (provided the condition for such a collapse is achieved), even in the absence of an act of measurement (Maxwell, 1988/1993b/1994).
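On either version of the theory, the probability of each channel outcome is given by the squared modulus of that channel's amplitude; the versions differ over when, and whether spontaneously, the collapse occurs. A minimal sketch of sampling a channel outcome, with purely hypothetical amplitudes (not drawn from any actual calculation):

```python
import random

def collapse(amplitudes):
    """Select one channel from a superposition, with probability |c|^2 for
    each complex amplitude c (amplitudes assumed normalized)."""
    channels = list(amplitudes)
    weights = [abs(c) ** 2 for c in amplitudes.values()]
    return random.choices(channels, weights=weights)[0]

# Hypothetical amplitudes for the channels (A) = (ab)+c and (B) = a+b+c;
# normalization check: |0.8|^2 + |0.6i|^2 = 0.64 + 0.36 = 1
superposition = {"(A)": 0.8 + 0.0j, "(B)": 0.0 + 0.6j}

random.seed(1)
outcomes = [collapse(superposition) for _ in range(10_000)]
frac_A = outcomes.count("(A)") / len(outcomes)  # theoretical value: 0.64
```

The sampling step models the probabilistic transition itself; what OQT and PQT dispute is whether it is triggered only by measurement or occurs naturally whenever the collapse condition is met.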

To examine the validity of this claim, Maxwell has suggested a crucial test which could establish the superiority of PQT over OQT. Although this test has not yet been carried out, it is in principle possible to perform it. The proposed scheme, as a well-known physicist has put it, provides an interesting research programme for advancing a realist theory of the quantum world (Squires, 1986, pp. 413-17).

The significance of Maxwell's interpretation and other realist approaches to quantum mechanics is that, contrary to OQT, they provide heuristic insights for delving further into the realm of quantum reality and thereby increase the chances of finding more effective models for comprehending deeper layers of reality. This is something that OQT, almost by definition, prohibits, and it thereby forecloses progress towards better understanding.

**Endnotes**

[i] It should be borne in mind that for Bohr the very concept of “phenomenon” refers only to the observations obtained under circumstances whose description includes an account of the whole experimental arrangement.

[ii] At the fifth Solvay conference, de Broglie delivered a paper entitled “The New Dynamics of Quanta” in which he presented, in an incomplete and diluted form, a simplified version of his original ideas concerning a pilot-wave theory. However, this theory was not received with enthusiasm by the audience. De Broglie himself admitted that it was partly due to this unfavorable reaction that he abandoned his own ideas and espoused the Copenhagen interpretation from then on. Twenty-five years later, however, Bohm’s paper and certain other developments in the general theory of relativity revived his interest in his original causal approach (*Cf.* Jammer, 1966, p.357).

[iii] “[T]he quantum potential [does] not depend on the intensity of the wave associated with [the quantum objects]; it depends only on the *form *of the wave. And thus, its effect could be large even when the wave has spread out by propagation across large distances” (Bohm, 1987, p.36).

[iv] Von Neumann in his (1955), after presenting this theorem against hidden variable theories, had emphasized that, “It should be noted that we need not go any further into the mechanism of the hidden parameters since we know that the established results of quantum mechanics can never be rederived with their help” (p.324). Although von Neumann’s proof was mathematically impeccable, he had made a harmless-looking technical assumption. Bohm, of course, had not used this assumption and was thus able to produce a viable version of hidden variable theory. It was John Bell who in 1964 clarified the issue and removed the imposed restriction.

[v] At the outset of his article, Maxwell has made it clear that in his approach, macro systems arise simply as the outcome of interaction between a vast number of microsystems. His main concern is to show that such an interpretation is, in *principle*, viable. That there is a *practical* problem in calculating the structure of macro systems out of the configuration of microsystems does not undermine the validity of the model.

[vi] Note that classical statistical mechanics is not a *fundamentally* probabilistic theory: it presupposes that the dynamical laws are *deterministic*. Probabilism enters into classical mechanics via probabilistic distribution of initial and boundary conditions in relevant ensembles of physical systems (Maxwell, 1988, p.12).
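This contrast can be made concrete with a toy ensemble: in the sketch below the dynamical law is strictly deterministic, and statistics arise solely from a probabilistic distribution over initial conditions (the free-particle law and the Gaussian spread are illustrative choices only, not taken from Maxwell):

```python
import random

def deterministic_flow(x0, v, t):
    """Deterministic dynamical law: free particle, x(t) = x0 + v*t."""
    return x0 + v * t

# Probabilism enters only through the ensemble of initial conditions:
# positions drawn from a distribution, dynamics fully deterministic.
random.seed(42)
ensemble = [random.gauss(0.0, 1.0) for _ in range(1000)]
positions = [deterministic_flow(x0, v=1.0, t=2.0) for x0 in ensemble]
mean_position = sum(positions) / len(positions)  # expected value: 0 + 1.0 * 2.0 = 2.0
```

Each individual trajectory is fully determined by its initial data; only the distribution over the ensemble is statistical, which is precisely why classical statistical mechanics is not *fundamentally* probabilistic.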

[vii] The notion of *propensity* was first introduced by Popper (1957).

[viii] It should be borne in mind that, according to OQT, the measuring device itself is also subject to the formalism of quantum mechanics if it is itself the object of observation by another piece of macroscopic apparatus.

**References**

Belinfante, F.J. (1975). *Measurement and Time Reversal in Objective Quantum Theory*. Oxford: Pergamon Press.

Bohm, D. (1951). “The Paradox of Einstein, Rosen, and Podolsky”, reprinted in Wheeler & Zurek, 1983.

———-, (1952). “A Suggested Interpretation of the Quantum Theory in Terms of ‘Hidden’ Variables (I & II)”. Originally in *Physical Review*, 85, 163-93.

———-, (1957). *Causality and Chance in Modern Physics.* London: Routledge & Kegan Paul.

———-, (1963). “Problems in the Basic Concepts of Physics”. An Inaugural Lecture delivered at Birkbeck College.

———-, (1987). “Hidden Variables and the Implicate Order” in B.J. Hiley & F.D. Peat (eds.), *Quantum Implications: Essays in Honour of David Bohm.*

Bohr, N. (1958) *Atomic Physics and Human Knowledge. *New York: Wiley.

———-. (1949). “Discussion with Einstein on Epistemological Problems in Atomic Physics” in A. Schilpp (ed). *Albert Einstein: Philosopher-Scientist*.

Casti, J.C. (1989). *Paradigm Lost.* London: Abacus.

D’Espagnat, B. (1979/83). “The Quantum Theory and Reality”. *Scientific American* 241 (5).

Everett, H. (1957). “‘Relative State’ Formulation of Quantum Mechanics”. Originally published in *Reviews of Modern Physics*, 29; reprinted in Wheeler & Zurek, 1983.

Herbert, N. (1985). *Quantum Reality: Beyond the New Physics*. London: Rider.

Jammer, M. (1966). *The Conceptual Development of Quantum Mechanics.* London: McGraw Hill Book Company.

Maxwell, N. (1976). “Towards a Micro Realistic Version of Quantum Mechanics, I & II”. *Foundations of Physics,* Vol. 6, Nos. 3 & 6.

————-, (1988). “Quantum Propensiton Theory: A Testable Resolution of the Wave/Particle Dilemma”. *The British Journal for the Philosophy of Science*, Vol. 39, No. 1.

————-, (1993a). “Induction and Scientific Realism: Einstein versus van Fraassen, Parts I & II”. *The British Journal for the Philosophy of Science*, Vol. 44, No. 1, pp. 61-102.

————-, (1993b). “Does Orthodox Quantum Theory Undermine, or Support, Scientific Realism?”. *The Philosophical Quarterly*, Vol. 43, No. 171, pp. 139-157.

————-, (1993c). “A Philosopher Struggles to Understand Quantum Theory: Particle Creation and Wave Packet Reduction”.

————-, (1994). “Particle Creation as the Quantum Condition for Probabilistic Events to Occur”. *Physics Letters A*, 187, pp. 351-355.

Mermin, N.D. (1985). “Is the moon there when nobody looks? Reality and the quantum theory”. *Physics Today*, Vol. 38, pp.38-47.

Popper, K. (1959/68). *The Logic of Scientific Discovery*. New York: Harper Torch Books.

————, (1959b). “The Propensity Interpretation of Probability”. *The British Journal for the Philosophy of Science*, 10, pp. 25-42.

————, (1963/1972). *Conjectures and Refutations.* London: Routledge & Kegan Paul.

————, (1967). “Quantum Mechanics without ‘The Observer’” in M. Bunge (ed.), *Quantum Theory and Reality*.

Rae, A. (1986). *Quantum Physics: Illusion or Reality?* Cambridge: Cambridge University Press.

Rohrlich, F. (1983). “Facing Quantum Mechanical Reality”. *Science*, Vol. 221, pp. 1251-1255.

Squires, E. (1986). *The Mystery of the Quantum World*. London: Adam Hilger Ltd.

Wheeler, J.A. & W.H. Zurek. (1983). *Quantum Theory and Measurement*. Princeton: Princeton University Press.

Wheeler, J.A., (1979-1981). “Law without Law”. Reprinted in Wheeler & Zurek (1983).

Copyright © 2019 Sharam Kohan. All Rights Reserved.