In The Nature of the Physical World, Eddington wrote:
The principle of indeterminacy. Thus far we have shown that modern physics is drifting away from the postulate that the future is predetermined, ignoring rather than deliberately rejecting it. With the discovery of the Principle of Indeterminacy its attitude has become definitely hostile.
Let us take the simplest case in which we think we can predict the future. Suppose we have a particle with known position and velocity at the present instant. Assuming that nothing interferes with it we can predict the position at a subsequent instant. ... It is just this simple prediction which the principle of indeterminacy expressly forbids. It states that we cannot know accurately both the velocity and the position of a particle at the present instant.
--end quotation
According to Eddington, then, we cannot predict the future of a particular particle beyond a level of accuracy related to the Planck constant. (We can, in QM, predict only statistics of the results for similar particles.) The outcome for a particular particle will fall within a range of possibilities, and this range can be predicted. But the specific outcome, regarding a particular particle, is, we might say, sub-causal, and not subject to prediction. So, is universal causality (the claim that every event has a cause and that when the same cause is repeated, the same result will follow) shown false, as Eddington holds?
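For reference, the quantitative form of the principle (in its modern Kennard statement) is:
Δx · Δp ≥ ħ/2, where ħ = h/2π ≈ 1.05 x 10^-34 J·s
That is, the product of the uncertainties in position and momentum cannot be reduced below a scale set by Planck's constant, however refined the measurement.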
Social science has always followed the mothership of science, physics. But in the last few decades, especially post-Heisenberg, physics has become comfortable with the quantum worldview, while social science researchers (who study or shape our perception of causality) are still stuck in a classical Newtonian worldview and are therefore having a tough time grappling with the physical-science discoveries around causality. There are five main aspects where this is becoming difficult:
1) Difference of perception of space-time characteristics by different observers (Theories of Relativity)
2) Unified Theories could exist (unified field theory, string theory..)
3) Effects with no physical/ observable influence medium (Quantum Entanglement)
4) Inseparability of microcosm and macrocosm (Single Electron Universe)
5) No measurement, or only probabilistic measurement (Deterministic Quantum Physics). Observation may not always translate into measurement. Even if the instruments cannot measure, an observation can still be useful.
Consequently, social science research outputs a narrow view of reality: the part of reality that is measurable through classical means driven by the Newtonian paradigm. Since reality has many more dimensions, experiential learning and practice are becoming divorced from social science research because of this bottleneck of the Newtonian cognitive framework of causality.
Attached is an 1876 English translation of a classical Sanskrit text, which was written to know reality through substance (not matter alone) and reasoning. We are trying to bring that into research methodologies for a better view of causality. Hope this is useful.
Here is a short quotation, along similar lines, on the "collapse of the wave function," from W. Pauli, in his Nobel Prize lecture of 1945, awarded for the discovery of the exclusion principle:
I shall only recall that statements of quantum mechanics are dealing only with possibilities, not with actualities. They have the form "This is not possible" or "Either this or that is possible," but they never say "That will actually happen then and there." The actual observation appears as an event outside the range of description of physical laws and brings forth in general a discontinuous selection out of the several possibilities foreseen by the statistical laws of the new theory (p. 30).
--end quotation
What Pauli calls a "discontinuous selection out of the several possibilities foreseen by the statistical laws," it would seem, is not a causal consequence of any particular or specifiable conditions of the prior state of the system. Is that right?
The Heisenberg Uncertainty Principle occasioned the downfall of classical mechanics, which was based on the assumption of finite universal causality. It was the first undeniable indication that the infinite divisibility of matter required the kind of causality implied by Bohm (Bohm, David, 1957, Causality and chance in modern physics: New York, Harper and Brothers, 170 p.).
The new form of causality is the Second Assumption of Science, causality (All effects have an infinite number of material causes). This is consupponible with the Third Assumption of Science, uncertainty (It is impossible to know everything about anything, but it is possible to know more about anything.) (Borchardt, Glenn, 2004, The ten assumptions of science: Toward a new scientific worldview ( http://www.scientificphilosophy.com/assumptions.html ): Lincoln, NE, iUniverse, 125 p.)
This form of causality is the reason we always have a plus or minus in all our measurements, and why it is impossible to determine the position and velocity of a particle at the same time.
This is subtler than it first appears. A simultaneous, precise position and momentum cannot be known, but this is not because the state of the particle is somehow incomplete or imprecise. It's because no possible quantum state settles both; a quantum state that specifies a precise position is maximally uncertain about momentum and vice-versa. The time development of quantum states is fully deterministic, but (on the standard interpretation) measurements 'collapse' the quantum state of the system into one of the possible (sub-) states that has a unique value for the measured property. In principle, the pure quantum development of a system seems even more deterministic than classical physics, since chaotic behaviour, in which systems with initially tiny differences diverge exponentially over time, doesn't arise in quantum mechanics.
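The deterministic time development referred to here is that of the Schrödinger equation,
iħ ∂ψ/∂t = Ĥψ,
which, being linear and first-order in time, fixes ψ(t) uniquely for any given initial state ψ(0). The uncertainty relation Δx · Δp ≥ ħ/2 is then a property of the state ψ itself, not a statement of our ignorance about it.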
The first of the ten assumptions reads: "The external world exists after the observer does not." Is this a scientific or a metaphysical assumption? I think it is purely metaphysical. The meanings of "observer" and "exists after..." are by no means scientific. Also: the difference between "actuality" and "possibility" is a metaphysical one. Does a possibility "exist" - if yes, in what sense? If everything we know about the universe rests - at least in part - on observations, and if observations are always a "collapse of the wave function" (i.e., an actuality, not a possibility) - then "possibility" exists only in the mind (or: as the mind) of the observer. Pauli was therefore close to Berkeley: "I am persuaded would Men but examine what they mean by the Word Existence they wou'd agree with me." (Phil. Comm. § 604). Thanks.
I agree with Martin that there are different issues going on here. As usual I have to quote Leibniz who pointed out that everything that ever happened or will happen in the universe can be considered predetermined, without this requiring 'determinacy' in the sense that the laws of physics require one possible future. For Leibniz the laws of God (let's say physics) allowed several possibilities but the possibility that is taken will never be other than the one it was going to be. In fact the idea that the universe 'might be different from what it was going to be' is incoherent. So it only makes sense to see the universe as predetermined or to say that this doesn't actually tell us anything because the alternative idea that it can somehow be 'altered' is an oxymoron.
So I think Eddington, imaginative and insightful as he was, has given us a complete non-sequitur. Heisenberg's principle tells us that WE cannot predict the future. That says nothing about whether or not it is predetermined. Nor in fact does it even tell us anything about whether the rules are 'loose', with options. The principle may simply be a reflection of the limits of acquisition of knowledge. It may also be, and probably is, a reflection of the false assumption that there is a 'particle' that actually has a position and a momentum in a naive realist sense. All the issues do turn out to be linked in QM, but I am not sure that HUP has to be directly linked to the fact that the linear deterministic progression of the wave equation gives several options of which only one is actual. (I am unimpressed by the von Neumann two-process analysis and the idea that the 'wave function collapses' 'after' it has progressed, as is often the way it is interpreted.) When I am sailing out on the sea I see waves go past whose position and wavelength are indeterminable beyond a certain degree, but I do not say 'that wave shows the universe is fuzzy', just that I am never quite sure when the boat is going to pitch as the wave overtakes us.
I think Eddington got in a muddle.
Dear Edwards, it strikes me that your comment above is, indeed, very much in the spirit of Leibniz. Though, in spite of that (and the genius of Leibniz), taking the uncertainty principle quite seriously will take us beyond the idea that it represents some difficulty about coming to know something unknown. That's the old idea that QM is incomplete, that there must be "hidden variables" determining outcomes or measurements. Regarding the precise simultaneous position and momentum of the particle--beyond the limits set in relation to Planck's constant, there is simply nothing there we could ever know--nothing known and nothing unknown. More on this later, however.
BTW: I've long thought that it is a major problem with Leibniz's philosophy that he has no genuine concept of accident, or the accidental. Even the Aristotelian concept of accident as coincidence is missing--as in, I go to the market for one purpose, and my friend for another, and we meet by accident. Or again, there is the distinction between "accident," "property," and "essential character." Isn't this lack of accident just a matter of Leibniz going overboard with the idea of divine foreknowledge, and the highly intellectualized, orthodox theology of the late middle ages? It is as though everything is equally deducible from the pre-established ideas in the mind of God. This would seem to be an ultimate, extreme version of the "logic of exposition," at best, which neglects the logic of inquiry.
It strikes me as of some import to insist that even if, as I suppose, there are open possibilities in the world, it still makes sense to speak of what the universe will be at some point in the future--which point might take us back to Aristotle and the sea-battle. The law of excluded middle cuts no ice on how one of two exclusive alternatives may come to be.
Eddington does get himself in various muddles, though I believe that what he says about indeterminacy is not one of them.
H.G. Callaway
Dear HG,
I don't quite follow the problem with accidents. I am sure Leibniz was happy that there are coincidences as perceived by us. He would argue that these would have always come about through sufficient reason but he makes it clear that there may be no possible way of describing that sufficient reason, it being infinitely complex. Thus the inexplicable accident would be nothing alien to him. As I understand it, however, he did not use or perhaps know of the use of the term accident in this sense. He talks of accidents as things acquired by entities in addition to their essential nature. My understanding is that he is right to deny accidents in this form at the level of dynamic indivisibles - for him monads and for us modes of field excitation. Mere aggregates of matter like stones might still acquire additions, as in the deposits he noted were added to rocks by dripping water in his mines. But rocks were not for him entities with an essence - they are arbitrary conceptions by us of aggregates. So my understanding is that Leibniz is merely rejecting accident in the technical Scholastic sense and he is right to do so. The main accident at issue was speed. He rightly noted that you cannot 'add speed' to a dynamic unit since all speed is relative.
I still think there is a potential confusion about the meaning of 'what is possible' here that goes beyond the old epistemic/ontic division and which is rooted in a universal tendency for us to conflate token events with types, but we have covered this before.
Dear Brian,
You are right to say that debate is a sign of health in the patient. However, for all the comments you give I could give ones on the other side. If I remember rightly Lee Smolin reports that Einstein in his last years began to think he was wrong after all. And with due respect to Roger Penrose, I think if he spent six months working in a neuropsychology lab he would come to realise that there isn't going to be a submicroscopic reality of that sort for reasons unrelated to any specific theory of physics. That sort of 'reality' is a set of arbitrary signs used by our brains for purposes of survival, nothing whatever to do with fundamental dynamics.
Dear Flanagan, Many thanks for your contributed quotations above. I think they are all more or less relevant to the question here, and I would like to see them discussed in detail. However, I also sympathize with Edwards, directly above, on the topic of what various famous people may have finally thought about GR and QM. Given the conflicts and tensions involved, some doubt would be in order from all sides. Perhaps that is the point you most want to make. If so, I certainly agree.
One of the best arguments of this sort starts from Einstein. He was never convinced about QM, and he continued working on his idea of a unified field theory, uniting gravitation and electromagnetism until the end of his days--so far as I know. Even if he did express some final doubt on his opposition to QM, this does little to undercut the force of his long-continued opposition or skepticism. That, we should take very seriously.
On the other hand, though, we have the subsequent developments leading to the standard model, which has substantially unified three of the four known forces of nature--electromagnetism, the weak nuclear force and the strong force--from the direction of quantum field theory. That helps create the expectation that a theory of quantum gravity should start out from the standard model in some fashion, and it makes the direction of Einstein's late work somewhat less plausible.
So, in general, I'm all for open-mindedness here, and if someone could explain, say, Dirac's doubts about the fundamental character of Planck's constant h, (h-bar, I take to be a special-purpose, if pervasive derivative of h), then that would be great. But without some detail on the grounds of the doubts, we make little progress here, since we can all likely agree that there is some justice in various and opposing doubts. That is just what makes the topic so engaging.
Or consider, Roger Penrose, for instance. Basically, I'm interested in anything the man has to say. I think he's brilliant. In a certain sense, he does think that QM is incomplete, but I do not know that this goes so far as to challenge the fundamental character of Planck's constant and the uncertainty principle. He argues that gravity is involved in the collapse of the wave function, or in decoherence, but I don't think he expects that the success of his suggestions along these lines, would be a full theory in answer to the "measurement problem." I think he wants to contribute a possible constraint on a theory of quantum gravity. I am aware of others who have doubts about this approach.
Again, I do not propose that we do physics here. That is best left to the physicists themselves, along with the details of the mathematics. But I see the present question as having broad interest, and I believe it possible to make improvements in our understanding of these matters. Eddington and Pauli are also good places to start. I trust the physicists may help keep us on the straight and narrow.
The uncertainty principle, I believe, tells us something about causality; if there are good reasons to doubt the uncertainty principle, that might also tell us something about causality. I think we lack any evidence of scientific deficiency of the uncertainty principle.
H.G. Callaway
Karl-Heinz:
You are right. According to Collingwood, all fundamental assumptions have opposites and neither can be proven true or false. For instance, as scientists, we believe that there are material causes for all effects, even though we could never prove that to be true for an infinite number of instances. That is why we can have interminable debates over determinism vs indeterminism, infinity vs finity, etc. The infinite universe forces us to use assumptions to get any work done.
Collingwood, R.G., 1940, An essay on metaphysics: Oxford, Clarendon Press, 354 p.
Also Cournot would have something to say about the failure of the classical concept of causality. His ideas apply also to the macro-world.
Below I paste an excerpt of my paper with Lungarzo about what we called "Cournotian Processes":
"The classical Newtonian schema of causation inspired the functional concept of causality (Mackie, 1972):
Effect = f (Cause 1, Cause 2, Cause 3,....)
where f specifies how the set of causes relate to the effect. Causes and Effect are conceived as physical changes that occur in the system, relative to its initial and boundary conditions.
Mathematical functions describe a univocal relation between the causes and the effect. How do the causes relate to each other? There are two possibilities:
a) they are previously correlated, such that there is a function F that deduces f;
b) they are not previously related, and therefore a function F that deduces f does not exist previously; i.e., f comes to existence only at the moment when the causes interact. This possibility is illustrated by the concept of “chance” as an absence of previous correlation of causal chains, according to the proposal advanced by A. Cournot (1838).
Biological processes in a cell (e.g. metabolic networks) or in the whole organism (e.g., different tissues and systems working autonomously) are parallel in a stronger sense than in classical computation, since they contain simultaneous phenomena that cannot be composed or decomposed into sequential processes. This aspect has been recognized in several approaches, such as probability theory, fuzzy logic, and the non-linear thermodynamics of dissipative structures.
Another typical factor present in biological computation is that the mechanisms amenable to description by algorithms have a property of self-organization, conferring a spontaneity on their dynamic evolution. A central aspect of such processes is the availability, for each parallel processing unit, of information about the states of the other units. However, because of architectural and (possibly) general physical constraints (e.g. the finite velocity of signal transmission), the information about the system available to each unit is partial. The global, self-organizing processes that characterize such systems are based on the partial information that each processing unit has about the others"
[ see: www4.pucsp.br/pos/tidd/teccogs/artigos/2009/edicao_1/3-a_cournotian_approach_to_the_emergence_of-relational_collectives-carlos_lungarzo-alfredo_pereira_junior.pdf]
Best Regards,
Alfredo Pereira Jr.
Well, the classic question is whether the uncertainty principle tells us that the particle does have a specific location and velocity, but WE cannot measure them with infinite precision, or whether the particle itself does not have a specific location and velocity even if there are no observers near it.
Dear Borchardt, I wonder what you may make of Popper's argument concerning the unpredictability of the growth of knowledge, in The Poverty of Historicism (1957). This seems to me to illustrate Popper at his best; and, of course, Popper is, after all, an advocate of indeterminism. (This argument does not turn on QM.)
The argument goes something like this: The growth and development of human knowledge cannot be predicted. For, if we had a method of predicting the development of knowledge, and thus the development of science, too, then we would not have to actually practice science, we could simply sit back and predict its outcomes. But that consequence is absurd. We cannot find out what results science will come to without actually practicing the sciences. Yet, what we will be able to do in the future depends, in part, upon what we will come to know, since what we are able to do depends upon what we know. In consequence, as Popper concludes, history is essentially unpredictable.
Now, I mention this argument partly because of your mention of apparently interminable debates and conflicts, Collingwood, etc. However, I want to restrict consideration here to the human debate concerning determinism and indeterminism. I submit that in the present context, whether we should adopt indeterminism or not depends upon outcomes and results of science--say, the debate between Einstein and Heisenberg. QM makes extremely accurate predictions, and we have no evidence against the uncertainty principle.
To simply insist, then, that arguments between determinism and indeterminism are incapable of any reasoned or scientific settlement amounts to a kind of dogmatism. You are implicitly predicting the outcome of the very discussion we are engaged in on the present question. You are implicitly predicting the outcome of the discussion--its future history, as it were, instead of engaging in it. Popper's argument above recommends against that.
I think there can be no serious doubt that the conflict between determinism and indeterminism is a central element in the differences between Einstein and Heisenberg, GR and QM. At the very least, the topic is an approach to the profound problems connected with the topic of quantum gravity.
H.G. Callaway
Dear Saridakis, the trick that affords a move from epistemological to ontological approaches is this:
a) Take into consideration a system with degrees of freedom (the operation of each part has a degree of independence from each other);
b) Instead of asking about the information that the scientific observer has about the system, ask for the information that each part has about the others;
c) If there are physical factors - such as quantum uncertainty - that limit the information that one part of the system has about the others, then the system is Cournotian, i.e., the causal factors that guide its evolution are not coordinated and therefore it is not Laplacian-deterministic (see the toy sketch below).
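A minimal toy sketch of (a)-(c), on my own reading of how one might operationalize them: each "part" updates by a deterministic local rule but sees the other's state only after a lag, so its information about the system is partial. The names and numbers here are hypothetical illustrations, not the model of the paper.

import random

DELAY = 3  # each part sees the other's state only after this lag (partial information)

def step(own, seen):
    # deterministic local law: the next state depends on one's own state
    # and the (stale) state one can "see" of the other part
    return (own + seen) % 10

history = [[random.randrange(10) for _ in range(2)]]  # two parts, uncorrelated starts
for t in range(1, 20):
    prev = history[-1]
    lagged = history[max(0, t - DELAY)]  # the only information available about the other part
    history.append([step(prev[0], lagged[1]), step(prev[1], lagged[0])])
print(history[-1])

An outside Laplacian observer holding the full history could still compute everything; the Cournotian feature in this toy is only that no part's own information suffices to coordinate its evolution with the other's.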
Alfredo, I could agree with you if I knew what "information" is. Do you know any scientific definition of "information"? (A formula would be even better.)
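The only formula I know of is Shannon's, which measures information as reduction of uncertainty:
H(X) = - Σ p(x) log2 p(x)
But it is not obvious to me that this epistemic quantity is something one "part of a system" can be said to have about another.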
Here are two short quotations, perhaps worth some discussion:
Stephen Hawking, 2005, A Briefer History of Time:
"...we know that the theory of general relativity must be modified. Because classical versions predict points of infinite density--singularities--it prognosticates its own failure."
Brian Greene, 1999, The Elegant Universe:
"The notion of a smooth spatial geometry, the central principle of general relativity, is destroyed by the violent fluctuations of the quantum world on short distance scales."
The following, short backgrounder on singularities may prove helpful:
http://en.wikipedia.org/wiki/Gravitational_singularity
It seems to me the basic point is that physics, generally, puts so much confidence in QM as to frequently put GR under very serious question. This is not simply a matter of generalized doubt, or the idea that anything can be questioned. The questioning of GR arises directly from QM.
Can you, for instance, accept the concept of "points of infinite density"? Or is it not, instead, that such infinities suggest defects in the theory? This is the kind of open discussion which is currently going on among experts. Can we follow it here?
H.G. Callaway
Dear HG,
I may be off the wall but I have a thought in response to your two quotes. My experience is that eminent physicists come in two types. The ones that write popular books and appear on the media see a rift between GR and QP (quantum physics in its current form). The ones I meet in private tend to say the problem may be more apparent than real. My suspicion is that those who see a problem are people who one way or another want to have a 'God's eye view' of things, despite the fact that there have always been good reasons for thinking there is none and these reasons have gathered even greater power in recent times. So my question about a point of infinite density might be 'for whom'? Who would measure this density and how?
I have a feeling the way forward is not to scratch heads about the ill-fitting of GR and QP but to point out that both theories are seriously incomplete in a quite different way. Alfredo P might agree with me here. Both theories emphasise a role for an observer, yet neither makes any attempt whatever to include such an observer within the theory. I think to understand the basic framework of physics we need to take observers much, much more seriously. Einstein complained that neither his theory nor any other in physics included 'now'. Maybe he should have made this a Minkowskian 'herenow'. Herenow is the observer within all theories. But we also have to be very careful because the observer as a human being is a vast collection of herenows and observing and the paradoxical nature of 'aboutness' in perception means that we have an experience about a 'herenow' that is not the 'herenow' of that experience in dynamic terms. Physicists may throw up their hands in horror at this point and say they are not muddying their theories with philosophy of mind garbage. But I don't think they are entitled to.
What I suspect we need is not to ask how we marry GR and QP but how do we marry both of these with a framework that starts with herenow, the obvious paradigm being Whitehead. Physics has to be about causal dynamic relations, not existence in the view of God. It can only be about causal relations because only those can cause us to know. Cause means sequence. Sequence means implicating two times, one after another. Physics does this as a B series time. But an observer has an A series with a herenow and physics needs observers. Moreover, we do not want observers to be any different from the rest of goings on in physics; we don't want some magic confined to human brains. So maybe all dynamic relations are herenow in some sense. But herenow is only one time and we want a sequence. So Whitehead says that every actual occasion is a meeting of, or relation between, present and immediate past. Whitehead came to this idea partly because of relativity but neither SR nor GR as I understand them address this 'present-on-pastness' at all. On the other hand QP does start to recognise the past - as what is 'determinate', and the present as 'superposed'.
None of these famous quotes on the GR/QP fracture seems to me to be expressed in sufficiently well-defined terms to assess whether or not the complainant has thought more widely about the present/past divide. Could it be that GR, like most physics before it, is effectively a theory of the past? And that QP exposes the past/present divide but as yet fails to realise this? Is the problem that the metric of the past and that of the present are not the same thing, even if they are always commensurable within an event? Is one continuous and the other not? This may be wide of the mark but I sense that something like that would solve the problem. I think Brian Greene is raising the issue of 'quantum foam' but is quantum foam really a description of reality if it has no relation to a specific herenow? Is it a description of real token dynamics or of notional types? Again the confusion between tokens and types (ensembles) seems rife in the popular exposition of QP.
I think physicists need to learn some neuropsychology and philosophy of mind - that might get them further than arguing about string theory.
All laws of physics are causal and therefore deterministic in the absence of perturbations, whether for simple variables or for the global variables of complex systems. When perturbations are present, for instance for non-controlled systems, it is manifest that the behavior of the physical system diverges from the behavior provided by the deterministic law, in the measure of the perturbation.
Heisenberg's Uncertainty Principle raises instead another question: nature, above all on the microphysical scale, is always indeterministic, even in the absence of perturbations. Indeterminism and uncertainty, that is, are inside nature.
Heisenberg's Uncertainty Principle is a theoretical concept based on a mathematical model concerning the photon and then extended to massive elementary particles. It is the mathematical model used, and not intrinsic nature, that leads to the indeterminacy relation. I have made use of another mathematical model for the photon and have derived a "Deterministic Quantum Physics". Determinism and indeterminism are therefore the result of the choice of mathematical model.
The question then becomes: is nature deterministic or indeterministic?
The answer to this question can be philosophical or scientific. The scientific answer depends on our ability to perform experimental measurements, above all in the microphysical world. But we already have evidence of the fallacy of the Uncertainty Principle, for instance the traces of trajectories of elementary particles photographed in accelerators.
Dear Edwards, It seems clear to me that theoretical physicists who write books for the educated public can be better or worse at what they do, just like anyone else. I certainly would not tend to see any generalized philosophical prejudice in these many, many books, and I see no reason to generalize about them in the terms you suggest. On the contrary, what you hear in private conversations may simply be an expression of a lack of deeper interest in conflicts between GR and QFT. For many people, I'm sure, it's not a matter that they would take up in a casual conversation--possibly touching on their professional credentials or standing. I think we should be very grateful for the many fine volumes published on related topics, which, as with Eddington's similar writings, are sure to challenge the non-specialists, and sometimes even the specialists, I suspect. Beyond Hawking, Penrose, Smolin and Greene, whom I mentioned of late, I would also mention Carlo Rovelli and Frank Close as making especially valuable contributions. If I gave it some thought, I'm sure many others would come to mind.
Regarding your theme of the "here-now," and related reflections, I recognize an influence of Whitehead on Eddington, but I am not at all certain that taking up Whitehead in any detail will benefit the present thread and answers to the present question about uncertainty and causality. It is very interesting, I think, to reflect on the fact that Russell, Eddington, and also W.V. Quine were students of Whitehead at one time or another. That is part of the continuing fascination with Eddington, I believe. Quine and Eddington are polar opposites in philosophy--in several ways--though, like Whitehead, both were strongly oriented to natural science. I imagine the old battles of prior generations may still be going on in some form.
I take it that every frame of reference assumes a zero-point of its coordinates, and the most convenient starting point will, of course, be here-now. But on the other hand, "here" and "now" are indexicals, and this means that their reference depends on the location of the speaker who uses the words at the time they are used. In spite of that, of course, we can easily imagine other speakers, in far-off times and places, who could also speak of "here-now," though we regard this as a matter of "there-then." Taking the matter more realistically, as when concerned with measurements, say, and not merely with mathematical exercises and imaginative examples, it is then important to establish the relationship between our own frame of reference and another we are interested in considering. These come in two general groups, according to Einstein--those moving at a uniform velocity in relation to our own, and those which are accelerating. It's true that physicists and mathematicians often say little about the point of origin of the coordinates, and some indexical element seems to be ineliminable, but I don't see that we really need to bring in Whitehead's metaphysics to deal with the related issues or problems. What's wrong with a common-sense approach here?
Part of the curiosity of QM is that it rests on empirical evidence, but the predictions are statistical in character. Still, every supporting measurement is a specific event, falling within a predicted range, with a predicted frequency or probability; nonetheless it is always a matter of some specific event or measurement. Though all the supporting evidence comes from specific events, the specifics of the event are not predicted. Again, leaving out the prospect of "hidden variables," the specifics seem not to be determined by the prior condition of the system. It seems clear to me that the predominant view is that there is genuine indeterminacy involved. QM is not "incomplete" in the sense of Einstein's claim; it is instead incompletable. The Bell inequalities, and Aspect's work in particular, have helped produce evidence of this. So, I suppose there is no going back to determinism.
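For concreteness: in the CHSH form, any local hidden-variable account bounds the correlation sum S = E(a,b) - E(a,b') + E(a',b) + E(a',b') by |S| ≤ 2, whereas quantum mechanics allows values up to 2√2 ≈ 2.83. Aspect's experiments observed violations of the classical bound, in line with the quantum prediction.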
Maybe it is just me, but I've always thought it significant that it is possible to build devices which produce random results--honest dice, roulette wheels, etc. I think you have to ask yourself what would count as evidence of indeterminacy. If your answer is that nothing could count as evidence of indeterminacy, then it seems that this amounts to treating determinism as a dogma. Heisenberg, and many others, have taken it that they found evidence of indeterminacy. If they are wrong, then it will be important to address the evidential claims.
H.G. Callaway
Dear H.G.,
What I have heard from eminent physicists in private conversation is certainly not an expression of lack of interest in the relation of GR to QP. I am thinking in particular of two physicists with a deep interest in foundational issues. They do not write books on 'The Trouble with Physics' because they do not see such trouble. I agree that there is a range of views in the popular books, but I still see a certain narrowness of vision. Rovelli is interesting because he does move a little towards Whitehead, I think. Whitehead may not have much to do with the determinism issue, but in your previous post you raised this new question about GR and QP, to which I think he is very relevant.
I don't actually think that taking oneself as the zero point of a set of co-ordinates has anything to do with herenow. I can have my observer frame and say my eye is at x=9.7, y=5.2, z=4.4, t=198.6, and it makes no difference to the fact that herenow is herenow. This cannot be such a trivial problem, because Einstein was truly saddened by his inability to find a place for herenow in his physics. So this has nothing to do with frames of reference in the sense of SR or GR. The 'common sense' approach was not good enough for Einstein, and I think for good reason.
I totally agree that the world is indeterministic in the sense that the rules of physics generate several possible outcomes for any situation type and this reflects the fact that such types do genuinely have several possible outcomes. But that is not inconsistent with Leibniz's claim that the possibilities that are actually the outcomes in the history of the world can only ever be what they were going to be. So if you want to think of the world in intuitive realist terms as a Minkowski block, and Aspect etc. suggests that that may be unavoidable if you are going to envisage it intuitively at all, then one may have to say that the whole history of the universe is predetermined, despite its being 'indeterministic'. Two quite different issues seem to get confused.
Dear Edwards, Just briefly for now, if I may.
You wrote, just above:
I totally agree that the world is indeterministic in the sense that the rules of physics generate several possible outcomes for any situation type and this reflects the fact that such types do genuinely have several possible outcomes. But that is not inconsistent with Leibniz's claim that the possibilities that are actually the outcomes in the history of the world can only ever be what they were going to be.
--end quotation.
It seems we agree about indeterminism in physics. But we differ, I believe, in our readings of the phrase you've introduced, regarding "the possibilities that are actually the outcomes in the history of the world" being only ever "what they were going to be." That suggests, in turn, that we don't actually agree about the meaning of something being physically indeterminate.
It seems to me clear that the idea you attribute to Leibniz is pretty empty. We can all agree that whatever things are going to turn out to be, they are going to be. (Que sera, sera.) But this is absolutely trivial, since it makes no claim to predict or forecast any particular results or outcomes. It is not clear to me that you do see the full triviality of this. Otherwise, I cannot understand how the point could appear as an apparent objection to anything. As trivially understood, it certainly does not suggest anything so methodologically weighty as a "principle of sufficient reason," say, or the idea that there must be a cause of any specific outcome, in all its specifics. This latter notion is just the negation of indeterminacy, while the phrase you brought in is mere tautology. As though to say, no matter what P may state about the future, if P, then P.
Again, if, no matter what P says about the future, P or not-P, this tells us nothing about how the actual future comes about. So, it is perfectly consistent with thinking that it comes about, partly, by purely indeterminate processes. It's no different than "if P, then P."
If the future is open and not yet determined, then still, whatever P may say about the future, if P then P.
H.G. Callaway
In the scientific meaning, determinism doesn't mean physics has to be able to forecast the future, always and anyway. Whoever believes that confuses physics with astrology. In the scientific meaning, determinism means that a physical law is able to forecast the future value of a physical quantity, but only in the absence of perturbations. It is not clear on what the following claim is based: "the world is indeterministic in the sense that the rules of physics generate several possible outcomes for any situation". This is true only in the presence of unknown and unexpected perturbations, and only for events that are stochastic in their essence. I think this view of physics also has a philosophic nature and doesn't consider the big results of physics in all fields, from nanotechnologies to space navigation, in which only one outcome, within a little margin of error, must happen in order to avoid disasters. I still think philosophic indeterminism and scientific indeterminism are two different things. Heisenberg's Uncertainty Principle, and the consequent probabilistic view of all physical phenomena, is now an old view.
In quantum mechanics, the Heisenberg uncertainty relation states that it is not possible to determine, at the same time and with arbitrary precision, the position and the momentum of a given object. That is, the more precisely one seeks to determine the position of a particle, the less one knows about its linear momentum. This physics introduced the epistemic subject into research. At the macroscopic level the base element of the quantum account is the observer: the mind that registers through the senses and builds instruments to confirm what it sees before it.
In fact, it has been demonstrated repeatedly that, in research on quantum phenomena, observation affects the object investigated, however one defines in advance what one wants to see. The hypotheses one can draw from this, for all kinds of science, including the human and social sciences, are these: the nature of the objects of research, which scientists can in some manner intuit, is associated with the way they interact with those objects; it matches, thus, the conditions, circumstances, perspective and scale of the stage of observation, among other factors. In other words, what the noumenon, or thing-in-itself susceptible to investigation, is, is shaped in advance by the manner of construction, conditions, circumstances, perspective and scale of the observation scenario.
Consequently, the construction of scientific knowledge (episteme), as a result of this interaction, depends on the nature (noumenon) of the intuited objects of research. In other words, all scientific theories about deliberately defined sets of phenomena, taken as objects of research, have underlying epistemological and ontological hypotheses that derive from, or influence, the development of the form and the content of those theories.
Therefore, all scientific theories can be regarded as views elected, or arrived at through historical or spatio-temporal circumstances, discarding countless other views that would also have been possible.
The work of Penrose and Hameroff, certainly much criticized, in which quantum physics is brought into coordination with micro-cell physiology, including that of neurons, has come to the understanding that quantum experiments have shown that a particle can appear from "no one knows where" and divide into many others traveling some distance [this involves distances within the atomic nucleus, sizes smaller than 1 fm; a femtometre (fm), also called a fermi, is the unit of length equal to one quadrillionth of a meter (1 fm = 1x10^-15 m)], and that those particles then collide and disappear into "nowhere". Some scientists have already formulated theories based on the possibility that there is still an unknown level of reality, alternate in character to the reality known to physics in general. This level has been conceived as a vacuum state, a "nowhere" from which particles and waves emerge and into which they disappear; an energy state of absolute potentiality, which becomes reality by acting in an infinity of possible universes, and, within these, countless possible micro-realities. It is as if it were a "big mind" from which everything that exists emerges. The universe in which we live is one of those possibilities actually realized as a derivation of that absolute, empty state of infinite potential possibilities.
However, when one talks about waves, one must ask: waves of what? What field produces the waves? As I said above, the existence of an alternate reality is assumed: a field, an ocean of pure potentiality, a field of abstract potential existence, now named the "unified field". From it arises, apparently, the connectivity of the particles of which everything that exists is made. That is the fundamental property of quantum mechanics.
Dear H.G.,
That is indeed a tautology, but it is not trivial. I have tried to explain that there are different issues here that can easily be conflated. I think you assume a 'progressing wavefront' universe or maybe a 'growing block'. I sympathise, but I accept that my preference for a 'progressing wavefront' is purely aesthetic - I cannot see 'the point' of a block universe, the temporal passing of which is merely an illusion played out in our minds. I have no better logical defence of a non-block universe than 'why bother' if it's a block. 'Delayed choice' and Aspect scenarios, if taken to their logical conclusion, do seem to imply that there must be a block. If a dynamic connection, like a pair of entangled photons, depends for its very nature as much on the final conditions of measurement as it does on its origin, then the measurements cannot be an afterthought. Since these measurements are parts of new dynamic connections whose very nature depends on their final state of interaction (perhaps in a Rovelli picture), it goes on for ever. The end of the universe has to be there at the beginning.
Now I happen to think there is a way out of this, but many would deny there is, and my way out requires the sort of metaphysical sidestep that Whitehead uses. But the basic point at issue is that Eddington's text is a non sequitur, because he talks of the future not being determined (as in a block) and then backs this up with the looseness of the steps along the way. The trouble is that a predetermined block can have just as loose rules of connection as a growing one. Ramon argues that HUP shows that the world is a growing block or progressing wavefront, but I don't think it addresses that at all. If anything, QP makes a predetermined block de rigueur unless you unpick time in a very radical way.
This is relevant to the suggestion that it is all about perturbation. To my mind there is no such thing as perturbation. Perturbation is supposed to be deviation from some 'normal' or destined path. But since perturbation can only occur by the action of potentials already in the 'normal' equations, and since nothing can ever deviate from its destiny - precisely because que será será is a tautology - there can be no such thing. One of the things that seems to me inconsistent about QP is that people still talk about delayed choice when physics has no way of accommodating 'choice' - it is essentially a supernatural concept.
I am afraid that I think this really is much more subtle than Eddington seems to have thought.
Dear Schmeikal,
It does seem that Edwards is very intent upon discussing Whitehead here. I take it his idea is that Whitehead is the only alternative to a "block universe," to use the term from William James. I believe that Whitehead has little importance for the present question.
Dear Edwards, I am unable to fathom the importance you attach to Whitehead in the present context--except that it reflects the importance you attach to avoiding various versions of the "block universe." As seems clear to me, however, we have already got around that.
H.G. Callaway
Although Einstein was opposed to the probabilistic and indeterministic interpretation of Quantum Mechanics, he himself contributed to the success of the indeterministic view in physics and in science with his Theory of Special Relativity. In fact, when Einstein added to the principle of relativity the principle of the constancy of the speed of light, he defined an invariance of the speed of light with respect to all systems of reference, and consequently an indeterminacy with respect to the reference frame, which becomes inessential for the assessment of the speed of light. This is in disagreement with the principle of relativity itself, which claims an invariance of the laws of physics, not an invariance of speeds.
Einstein introduced the second principle thinking it gave a theoretical solution in agreement with the Michelson-Morley experiment, but in actuality it caused numerous contradictions that can be surmounted only by giving the right meaning to the concept of reference frame and to the principle of relativity. I have sought to resolve this question through the Theory of Reference Frames, which defines a preferred reference frame that is completely different from the absolute reference frame of classical physics. This approach allows one to surmount difficulties that are present in Special Relativity and to define new transformations in the space-time-mass domain. I don't know if philosophy is interested in these speculations of physical research.
Dear H.G. Callaway:
Popper's argument concerning the unpredictability of the growth of knowledge is, of course, correct. The unpredictability stems from the fact that the universe is infinite, as we assume in the Eighth Assumption of Science, infinity (The universe is infinite, both in the microcosmic and macrocosmic directions). Although the opposing assumption finity, is conventional as well as mandatory for modern physics and cosmology, there have always been dissidents. The anti-Copenhageners, Bohm (1957), and Popper verged on adopting formal recognition of infinity. Like the others, Popper’s view relies on the presupposition now known as the Third Assumption of Science, uncertainty (It is impossible to know everything about anything, but it is possible to know more about anything). Popper was also famous for insisting that scientific theories could be falsified, but that they could never be completely proven. They could be supported by evidence, but one could never gather an infinite amount of evidence in support.
You mentioned that:
“To simply insist, then, that arguments between determinism and indeterminism are incapable of any reasoned or scientific settlement amounts to a kind of dogmatism. You are implicitly predicting the outcome of the very discussion we are engaged in on the present question. You are implicitly predicting the outcome of the discussion--its future history, as it were, instead of engaging in it. Popper's argument above recommends against that.”
Sorry, H.G., but I don’t read Popper that way, as you can see from my mention of his view of falsification. Per Collingwood (1940), we know that fundamental assumptions always have opposites and that neither can be completely proven. But if one is true, then the other is not: the universe cannot be both finite and infinite at the same time, even though we will never be able to completely prove which really applies. After extensive “reasoned or scientific settlement”, we can choose one or the other assumption, at which point we may rightly be called “dogmatic.” As scientists, our belief that “there are material causes for all effects” definitely “amounts to a kind of dogmatism.” We could not do any work without it, and once we make our decision, we do not debate or discuss it before using it.
Popper is correct because any prediction we can make will always have a plus or minus associated with it because of infinity. Laplace’s Demon was destroyed by Heisenberg, as recognized by Popper. The classical form of determinism upon which it was based required finity. Because infinity obtains, just as Bohm said, the Demon could not predict the future with perfect precision. Neomechanics and univironmental determinism are replacements for classical mechanics and classical determinism that expressly include the assumption of infinity (see "The Scientific Worldview").
Implicitly predicting outcomes is what we do in science, but we never expect to produce perfect predictions. That is because the universe (we assume) is infinite. The theoretical switch from finity to infinity is a big, big deal. That will not happen quickly, but as with Popper, Bohm, and the anti-Copenhageners, there are signs of welcome change. Do not worry about anyone being able to do a complete job of “predicting the outcome” or “future history” of knowledge. That is guaranteed (we assume) by the Third Assumption of Science, uncertainty (It is impossible to know everything about anything, but it is possible to know more about anything).
Here follow a few short quotations which, as I hope, may help this question along:
"As they are currently formulated, general relativity and quantum mechanics cannot both be right. The two theories underlying the tremendous progress of physics during the last hundred years...are mutually incompatible."
--Brian Greene, The Elegant Universe, p. 1.
"...the gently curving geometrical form of space emerging from general relativity is at loggerheads with the frantic, roiling microscopic behavior of the universe implied by quantum mechanics. ..this conflict is rightly called the central problem of modern physics.
--Greene, Elegant Universe, p. 5.
If we assume here that QM does tell us of the lack of smooth geometry at the sub-microscopic scale associated with the Planck constant, then it seems to follow that the continuous geometry of GR must be a special case of something more general, which cannot itself be generally described in terms of the smooth geometry of GR. The conflict quoted here, I submit, is the basis of contemporary research on quantum gravity. The conflict suggests that the evidence supporting QM is, in some fashion, evidence against GR, "as it is currently formulated."
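For scale: the regime where the two descriptions are expected to collide is the Planck length, l_P = √(ħG/c³) ≈ 1.6 x 10^-35 m, at which quantum fluctuations of geometry should become comparable to the geometry itself.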
H.G. Callaway
Dear H.G.,
I may not have been entirely clear but I did not intend to imply that Whitehead is the way out of a predetermined block universe, as seems to be required by extrapolating the Aspect result indefinitely, with each step in the 'causal chain' apparently being as dependent on the future as on the past. I was suggesting that to avoid a block universe one needs something as radical as Whitehead. I don't think Whitehead's proposal itself alters the problem. A more specific reformulation of the relation of quantisation to time seems needed.
But I am interested to hear you say that you think that we have already got around the predetermined block. How do you counter the implications of the findings on Bell's inequalities and 'delayed choice' scenarios?
I appreciate that The Elegant Universe was written for a popular audience, but phrases like 'gently curving geometrical form of space' and 'frantic roiling' do tend to suggest a lingering God's-eye-view type of naive realism. I am prepared to accept that there are more technical perceived problems with the consistency of the maths, but when I read the book I could not put my finger on exactly what they were. If the smooth continuity of the spacetime metric is the continuity of determinate fields of potentials, then maybe the 'roiling' possibilities of the quantum present, which are not determinate as such, do not disturb this smoothness. I have yet to understand what the problem really is, and, as I said, the physicists I most respect suggest it may be illusory. I think we need something more specific than Greene's man-in-the-street metaphors.
Dear Edwards, Many thanks for your reply and the emphasis on sources in physics--to which I certainly count Greene. If there is something wrong with the quotations, then I think the critical points ought to be made clear.
My experience is that we don't get far going at this one-to-one. So, let's step back and see what else may develop. There seems to be a good deal of interest in the question from many sources.
H.G. Callaway
Here follows a short quotation from physicist Max Born, writing in his Nobel Prize lecture of 1954, and defending Heisenberg's interpretation of QM. The entire lecture is available on line at the following address:
http://www.nobelprize.org/nobel_prizes/physics/laureates/1954/born-lecture.html
Quote Born:
How does it come about then, that great scientists such as Einstein, Schrodinger, and De Broglie are nevertheless dissatisfied with the situation? Of course, all these objections are leveled not against the correctness of the formulae, but against their interpretation. Two closely knitted points of view are to be distinguished: the question of determinism and the question of reality. (pp. 263-264).
--end quotation.
Though I have only quoted the opening, I would urge interested readers to read through Born's argument and the paper (12 pp.) completely. Born represents Heisenberg's argument for indeterminacy as following Einstein's example in its form, and I think the Heisenberg-Born argument compelling. At the same time, I suspect that Born overgeneralizes in the direction of a positivist notion of meaning.
There are times, given the array of evidence and the success of theory in prediction, that we definitely should do away with unobservables (usually posits of earlier theory), and this can have a quite revolutionary effect--as in Einstein's elimination of the Maxwellian aether and absolute simultaneity. Born and Heisenberg both see the argument for rejecting the full determinism of classical physics in similar terms. Yet, on the other hand, I want to suggest that such an elimination of unobservables be counted as an hypothesis. Elimination of unobservables can certainly count toward the relative simplicity of such an hypothesis. But, if we are dealing with an hypothesis, alternative hypotheses will usually need to be considered. Regarding indeterminacy, these came in the form of "hidden variable" theories. Thus the relevancy of Bell and Aspect to this question and thread.
H.G. Callaway
With regard to Born's quote ("all objections are leveled not against the correctness of the formulae, but against their interpretation"), I would specify that the interpretation of the formulae is very important and decisive in the history of science. For instance, Lorentz and Poincaré made use of the so-called Lorentz Transformations in order to save the aether, while Einstein made use of the same transformations in order to deny the aether and to save the Principle of Relativity, in a fictional way, because he misrepresented it. The same transformations are thus used for different objects and under different interpretations; consequently three cases are possible:
1. only one of two interpretations is correct
2. both interpretations are uncorrected
3. Lorentz's Transformations are uncorrected
I think my viewpoint is manifest.
Every mathematical relation isn't a cold sequence of numbers and symbols and must be accurately interpreted.
With regard to the statement "Born and Heisenberg both see the argument for rejecting the full determinism of classical physics": it is evident that full determinism is not real, because of unknown and unexpected perturbations that can change, strongly or weakly, the evolution of a physical system. Nature is fundamentally deterministic in the sense that the same causes always generate the same effects under the same physical conditions. When perturbations are present they tend to change the physical conditions, but fortunately they are not always present. Most natural and artificial physical systems, cosmological systems, and microscopic and submicroscopic systems always behave similarly.
Finally, unobservable or hidden variables have no physical meaning and belong more to the metaphysical domain than to the physical domain. In physics, only variables that are observable and measurable have meaning.
The main theories of modern physics (Special Relativity, General Relativity, Quantum Mechanics, the Standard Model) are today obsolete theories that had an important function in the twentieth century. Now, at the beginning of the third millennium, a new physics (contemporary physics) is coming to light.
I understand my words are strong, but that is my viewpoint.
I apologize, but I cannot respond to replies for one week.
Best regards.
Daniele Sasso
The uncertainty principle does not alter causality. That it leads to indeterminacy is a matter of degree. Physics likes to reduce action to its simplest form. The law of inertia tells us that a particle moving through space with no forces acting on it at time t1 will be at a known position at t2. The uncertainty principle does not change the conclusion. The real world may require us to state the error in our prediction, but in no way does it change the law of inertia.
We tend to forget that in applications (not theory) we use parameters, not fixed values. If we measure the distance traveled by a particle in the time t2-t1, we state the measurement as D. This D is a datum. If we report the measurement as D+/-d we have two parameters: the mean value D and its uncertainty d. All discussion or use of the estimated measurement D+/-d is parametric. This does not make the result indeterminate, but determinate within error. Our use of the parametric distance requires an answer that involves probability. This use of the parametric distance as an exercise in probability does not negate causation. As scientists we seek to reduce the uncertainty, to better define cause and effect.
In QM we cannot reduce uncertainty below a limit that involves Planck's constant. As in macro-physics, we use parameters and probabilities, here with a hard limit on how far uncertainty can be reduced, but we can still define cause and effect.
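A toy computation may make the parametric point concrete. Here is a minimal sketch in Python; the Gaussian error model and the numerical values are my own illustrative assumptions, not part of the argument above:

    import math

    # A measurement reported as D +/- d is a pair of parameters, not a single datum.
    D, d = 12.30, 0.05  # hypothetical distance and uncertainty, in metres

    def prob_within(k):
        # Probability that the true value lies within D +/- k*d,
        # assuming Gaussian-distributed error (an assumption, not a law).
        return math.erf(k / math.sqrt(2.0))

    for k in (1, 2, 3):
        print(f"P(true value within D +/- {k}d) = {prob_within(k):.4f}")

This prints roughly 0.6827, 0.9545 and 0.9973: probabilistic use of the estimate, within error, while nothing about the underlying cause-effect relation is thereby made indeterminate.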
The theory of error is a very useful theory that serves to define the uncertainty present in the process of measuring any physical quantity, due both to systematic errors and to accidental errors.
The Uncertainty Principle is different: it defines an uncertainty that is due not to the measurement process but to an intrinsic indeterminacy of nature. That indeterminacy does not concern a single physical quantity; it concerns simultaneously two different physical quantities whose uncertainties are connected through the Planck constant. This uncertainty is independent of the measuring instruments and is intrinsic to nature. The principle was first enunciated for the microphysical world; subsequent attempts to extend it to the macrophysical world had no success.
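For reference, the relation in question is usually written

\[ \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}, \]

where \( \Delta x \) and \( \Delta p \) are the spreads (standard deviations) of position and momentum in one and the same state; on the standard reading, no refinement of instruments brings their product below this bound.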
The law of inertia, for instance, defines the inertial behavior of a system until an external action (a perturbation) changes that behavior. It is valid for macrophysical and microphysical systems alike, but the Uncertainty Principle overshadows all the other principles and laws of physics.
The theory of error is a very useful theory; the Uncertainty Principle is, from my viewpoint, a useless principle, based moreover on an unsuitable mathematical model.
The Planck scale is based on the Planck length L = 1.616x10^(-35) m, and it is very strange to me that, if in the future we become able to measure the Planck length, we should then be unable to measure smaller distances. Besides, the Planck scale is based on the concepts of black hole, event horizon and Schwarzschild radius, which have a gravitational explanation in modern physics, whereas I propose a relativistic explanation for dark matter.
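For reference, the Planck length is not itself a measured quantity but a combination of constants:

\[ \ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.616 \times 10^{-35}\ \mathrm{m}. \]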
The modern physics of the twentieth century gives us a universe limited and closed by two constants: the speed of light at the macroscopic scale and the Planck length at the microscopic scale. Such a universe appears very confined indeed.
My few remarks are directed to a question logically prior to the present discussion: what form would a well-formed query about determinism take? My suggested answer is that, at least in the first instance, it take the form of a question about the representational structure of a theory: can it consistently represent, or at least be compatible with, a causally deterministic structure to its dynamics? This in turn can, as a first move, be turned into a mathematical question. Formulating things this way (where it can be done) imposes a useful discipline on philosophical speculation.
1. Unless we carefully specify what is to count as determinism we will simply stir up muddy waters (cf. http://plato.stanford.edu/entries/determinism-causal/). For instance, if determinism is just 'same follows same' then QM shows various deterministic features: A) every experimental situation produces the same kinds of experimental outcomes and statistics; B) the psi [wave function] evolution in time is deterministic in this sense; C) energy-momentum is strictly conserved across interactions. Etc.
2. Quoting formulae won't in itself help. It is a fundamental property even of classical waves that their wavelength and position cannot be jointly precisely defined. (Obtaining a single precise wavelength requires a pure sine wave, which is possible only over an infinite spatial extension; conversely, as Fourier analysis reveals, concentrating a wave at a point requires the superposition of waves of infinitely many different frequencies. A small numerical sketch of this trade-off appears at the end of this post.) The famous quantum non-commutation relation is just a mathematical expression of the same feature. But classical waves have no relevant indeterminism. On the other hand, quantisation (discretisation) is profound: if only distinct discrete energy values are possible and time is continuous, then during energy transitions there must be times at which the energy is unspecifiable. (If the E value is still the initial value, no transition has occurred; if it is the final E value, the transition is over; but if the transition takes no time then the E value is contradictorily both the initial and the final value. So the transition takes time, and during that time no E value is specifiable.) Quantising time to solve this problem has its own mathematical difficulties, as well as contradicting the Schrodinger equation.
3. The issue is how to tell whether QM statistics can consistently be given an ignorance modelling; that is, can it be understood in such a way that, while we may be ignorant of the precise values of the properties involved, and even unable to determine them all at once, they nonetheless can be consistently assigned such values, and the QM statistics arises from these values in various ways (averaging, selecting, etc.)? But we don't want to limit the entities that have precise property values to classical versions of current atomic particles--especially because it is very hard indeed to obtain any satisfactory version of such a convenient ontology (see below). Given this and the issues in 2 above, I suggest that the most useful criterion for deciding whether QM is deterministic is something like this: Is there a specifiable ontology of kinds of entities and their interrelations such that for every allowed global state of these entities there is exactly one possible succeeding dynamical state, and such that QM is homomorphically (structure-preservingly) embeddable in this ontology in such a way that all experimental results are reproducible, all statistics are provided with an ignorance model, and the dynamics unfolds in a single, globally coherent manner? (This last clause is to prevent solutions situation-by-situation but having no dynamical way to connect them--the 'measurement dependence' problem that dogged David Bohm's efforts.)
4. If that is the issue, then we have to say that at present there is no convincing ontology of this sort in the offing, and there are powerful hurdles in the way of easily creating one. One way to state the ignorance requirement is that there are underlying states characterised by sets of entities (usually particles), each entity having precise property values; but their states must form Boolean lattices, so that what is required is a way to give the QM theory an embedding into a Boolean lattice. But then we face the Kochen/Specker embedding theorem, which states that there is no homomorphic embedding of the lattice of QM states into a Boolean lattice. And for the lazily hopeful there is Bell's no-hidden-variables theorem, which tells us that you cannot avoid the K/S theorem simply by adding additional properties or particles to the classical ontology. (A useful review is 'Quantum Logic and Probability Theory', Stanford Encyclopedia of Philosophy: http://plato.stanford.edu/entries/qt-quantlog/, although that article has a different focus from determinism and ontology, having already shifted to QM as just a measurement theory.)
5. There remains the bare possibility that there is some new kind of ontology to be discovered that does the trick. But it is not going to be easily won by, say, tailoring the properties of a collection of non-classical entities just to suit, e.g. 'smearons' or 'here-nows' or the like. For the problem is inherently structural in the QM itself, as the K/S theorem, and my remarks on discreteness above, show. Until that structure is more deeply understood and faced it is unlikely anything useful can be said. (This includes the present discussion.) Further, it is clear that this does not exhaust the problem, as the appearance of apparently non-local relations like the Pauli Principle and EPR spin determinations shows. These further issues have never even been systematically catalogued (I had a brief crack at it once, see Brit. J. Phil. Sci. 42 (1991), 491; cf. the larger perspective in my essay in Faye, J. and Folse, H. J. (eds), Niels Bohr and Contemporary Philosophy, Dordrecht, Kluwer, 1994, 155-199.). All up, to make progress it looks as if we will have to deal in some new way with space and time itself, so that this project becomes bound up in the grander mess of how to unify general relativity and QM. What could its ontology be? Such proposals pop up regularly (here is the latest: 'The GRW flash theory: a relativistic quantum ontology of matter in space-time?' Esfeld, M. and Gisin, N., Phil. Sci. 81 (2), 2014, 248-264). But I don't know of any proposals that deal with the full suite of distinctively QM features.
That's it. None of this stuff is new, and I don't want to teach anyone's grandma to suck eggs, but I felt the need for sharper characterisations of the problem, and for its multi-faceted nature to be made clear. Discussion software like the one here already over-encourages the idea that anyone who has an opinion has an equal right to state it. No: all that superficial policy engenders is muddier water in which self-delusion and drowning compete for importance. Both science and intellectual vigour will be best served by respect for deep insight. My personal view is that, without a contribution to make to the formidable issue of grand unification, it is better to demand that those who know tell us as clearly as possible what the problem is and what are considered possible solutions, and otherwise to remain watchfully silent. (Think of the last sentence of the Tractatus Logico-Philosophicus, with 'usefully' inserted after 'cannot'.)
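Appendix, as promised in point 2 above: a small numerical sketch, in Python, of the purely classical position/wavenumber trade-off. The grid size and the packet widths are arbitrary choices of mine, made only for illustration:

    import numpy as np

    # A Gaussian packet exp(-x^2 / (4 sigma^2)) has position spread sigma and
    # wavenumber spread 1/(2 sigma): the product of the spreads is always 1/2.
    N, L = 4096, 200.0
    x = np.linspace(-L / 2, L / 2, N, endpoint=False)
    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)

    for sigma in (0.5, 2.0, 8.0):
        psi = np.exp(-x**2 / (4 * sigma**2))
        phi = np.fft.fft(psi)
        px = np.abs(psi)**2 / np.sum(np.abs(psi)**2)  # weights in position space
        pk = np.abs(phi)**2 / np.sum(np.abs(phi)**2)  # weights in wavenumber space
        dx = np.sqrt(np.sum(px * x**2))               # spread about the mean (0)
        dk = np.sqrt(np.sum(pk * k**2))
        print(f"sigma = {sigma}: dx * dk = {dx * dk:.3f} (analytic value 0.5)")

Narrow the packet and the wavenumber spread widens, and conversely; the product stays at 1/2. Nothing quantum is involved, which is exactly the point: the physics enters with the non-commutation relation, not with this piece of mathematics.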
Dear Cliff,
Your remarks are clearly very well informed. (I remember the remarkable richness of the Faye and Folse volume.) And there seems little to disagree with in your description of the problem. It does seem to be a structural issue within QM. And I agree that to make progress we will have to deal in some new way with space and time. But, with respect, and sincerely meant, I am not sure you can get away with the last paragraph.
What contribution are you personally making here? Or should you have remained watchfully silent? No, you should not; your post was useful, so the way forward in debate is a bit more complex. And I must object strongly to the idea that science and intellectual vigour are best served by respect for deep insight--if that implies demanding that 'those who know tell us'. Isn't the first lesson in science never to take anybody's word for it? Who 'knows'? In my experience most major progress in biomedical science comes from not respecting 'those who know'. It seems that respecting von Neumann's deep insight was not quite so wise.
These chat lists have their problems, of course. But at least they allow people to learn by arguing and making mistakes. This is not my field of professional expertise, but I am interested in learning its deep insights. In my own field of expertise (immunology) I welcome stupid questions and opinions from others, because the ensuing debate teaches both parties. That was the fun of teaching students, after all. And in this area of fundamental physics it seems to be far from clear who 'knows'. Did Einstein 'know better' about QM? If he was fallible, where do we go? What I think you are pointing out is that some of the blind alleys are known, and I agree wholeheartedly. But even pursuing blind alleys, as David Bohm did, may lead to everyone understanding the problem better.
So what would you want of a new way of dealing with space and time? You are rather dismissive of my neologism but it was not intended as a Procrustean bedstead into which a theory must be forced. It comes with Clay's caveat that it is specious, at least in its naive interpretation. Leibniz dissociated himself from Spinoza not just to maintain free will but because there appears to be a need to posit 'real indivisible substances' because of the existence of points of view. A Democritan style particle does not seem to give us dynamic indivisibility (without contradiction) or point of view, so, as you imply, that seems to be off all menus. Quantum theory actually looks much more promising for both, but again, as you say, there are all sorts of issues in the basement, even before worrying about GR, that suggest that space and time have to be re-invented (yet again). Do you not have a dangerous speculation to offer?
Dear Hooker, I appreciate your suggestions, for the most part; and I'd say that we have been through some of them. I can understand that not everyone who happens across this question and thread will find it of interest to continue to participate. Yet, reading through your note above, I found much to agree with and some places where I think an elaboration might be helpful.
In reply, for now, I will just say that I believe it belongs to the philosophy of science to attempt to understand and communicate, in ordinary language, or the language of the educated public, something of what is going on in science, and in theoretical physics in particular. That is sometimes a tall order, of course. Many in philosophy, inclined toward theoretical questions, are reluctant to take up a general discussion of the options abroad among specialists. Many physicists just want to do physics, and are skeptical of philosophical perspectives and alternatives --not without reason, I think.
Still, there is always better and worse, and the sorting out of this in discussion. The advantage of the present platform is that it stands to bring together, in terms of interest and inclination, many people who would likely never otherwise even know of each other. That is a kind of opportunity. Something may come of it. It's one way to drag ourselves down out of the theoretical clouds. (Socrates had Aristophanes, we may recall.)
I appreciate, too, that we are all busy with other work, and I suspect that the discussion of questions such as this one, about QM and causality, benefits from just that fact. For myself, I must admit, the answer seems rather clear and flat. There is no evidence for determinacy of specific outcomes in QM, though, as you note, and as has been noted here before, the formalism can be viewed as deterministically concerned with probabilities.
Many attempts have been made, over decades, to answer Heisenberg or "complete" QM with hidden variables. None of this has worked out. That is not to say that it's impossible for more to be learned about such things; and new, speculative theories do arise. But in the meantime, at least, I find it hard to sympathize much with stand-pat insistence on determinism without evidence. That seems to me a reasonable position. Just today, I was reading a review by the philosopher of science R. B. Braithwaite (1900-1990), published in 1940. Even then, only 15 years or so after the first publication of the uncertainty principle, Braithwaite had been won over. He conceded the scientific case.
One may come to suspect something quite strange is afoot in some of the resistance to this conclusion. If Heisenberg didn't prove quantum indeterminacy, then I am in doubt about what would count as a proof. The skeptics should tell us what they would count as a proof. (Do we need to take up Hume here, perhaps?) But, of course, this is an (appropriate) attitude toward heterodox alternatives, not any desire to see them banned from discussion or consideration. On the contrary, such discussion may prove very useful and enlightening. In doing so, it is of the first importance to distinguish between what is accepted in theoretical physics, and what may be (only) a matter of theoretical speculation to the contrary.
H.G. Callaway
I would like to specify that my comments describe and represent my viewpoint as a researcher in theoretical physics, and it is strongly critical towards the modern physics that is accepted by most present-day physicists.
1. Definition of determinism.
I specified in my preceding comment what determinism is in physics: "the same causes always generate the same effects under the same physical conditions". Consequently, in a deterministic experimental situation only one outcome, with a prospective range of error, is possible. The indeterminism of QM is full in all its different versions: Heisenberg's indeterminacy, Schrodinger's wave function, the Copenhagen interpretation, etc. QM theorizes, for instance, that even if a particle does not have sufficient energy it is nevertheless able to surmount an energy barrier of greater energy (the tunnel effect, etc.). That is, the effect is independent of the cause, the causal relationship is not valid, and this is full indeterminism.
2. Mathematical models.
The mathematical model used in QM for representing photons and elementary particles, as has justly been claimed, is Fourier analysis, and an important question concerning mathematical models has been raised. Nevertheless, the problem there has been directed toward the fact that energy is quantized, but this is true only inside the atom. In fact, free electrons and photons can in general assume any value of energy. The Planck relation (E=hf) gives a value of energy for any value of frequency, and only inside the atom are values of frequency and energy quantized. In QM the photon is represented through a packet of waves, and this model generates indeterminacy; besides, this representation of the photon is in contrast with the Planck relation, which instead assigns one single frequency to each photon. Schrodinger's wave equation then defines the dynamic behavior of a particle, allowing the particle to be at any point of space at every instant of time, generating still more indeterminacy. When quantities that are used normally in the macrophysical world are unobservable and unmeasurable in the microphysical world, it would be a good thing to avoid using those physical quantities and to use others instead. For these reasons I made use of a different mathematical model for photons and particles, one that removes the intrinsic indeterminism generated by an inappropriate mathematical model. I specified this new mathematical model in my paper "Basic Principles of Deterministic Quantum Physics", which I link here for the convenience of anyone interested in reading it:
https://www.researchgate.net/publication/211874241_Basic_Principles_of_Deterministic_Quantum_Physics?ev=prf_pub
Article Basic Principles of Deterministic Quantum Physics
As per this new interpretation of microphysical events it is also possible to give a new non-probabilistic explanation of the tunnel effect and similar effects, based on a more scrupulous analysis of the physical processes that happen inside physical systems.
3. Ontological question.
It seems to me that there is great disappointment over the insufficient answers given by physics to fundamental ontological questions. To that end suggestions and solutions have been offered in this discussion, for instance the embedding of QM in a Boolean lattice, the project of unifying General Relativity and Quantum Mechanics, a project of grand unification, a new definition of space and time, etc. I would specify, as per my personal expertise, that all my efforts as a researcher are directed in the reverse direction from the small or grand unifications predicted by modern physics, because the understanding of nature profits more from underlining the differences among physical events that have different characteristics than from unlikely unifying reasons. Besides, the ontological question concerns philosophy more properly than physics, which strives to distinguish itself from metaphysics.
I have been obliged to break my comment in two parts for technical reasons.
Dear Sasso, This strikes me as a good start on what you may have in mind. I agree that it is important to distinguish established science from speculations concerned with particular problems, the prospects of new theory, and similar new possible directions of research. These things can be fascinating, but they definitely cover a lot of territory and options. I'm not against related discussions, but I think it important to distinguish them from the elucidation of established science. As for metaphysics, as far as our question is concerned, I think it best to think of it as a possible result of physics, but not a firm presupposition. On the other hand, people disagree, at least in philosophy, regarding what counts as metaphysics in the negative sense--or at least, there has been a lot of debate on related topics over the years and centuries. But keeping close to established physics, we will naturally tend to avoid related problems.
You write:
As per this new interpretation of microphysical events it is also possible to give a new non-probabilistic explanation of the tunnel effect and similar effects, based on a more scrupulous analysis of the physical processes that happen inside physical systems.
---end quotation
For one, I'd be interested to hear more about this idea. Can you explain it in detail, in layman's terms? It would be especially interesting if you could relate the theme to causation and to the relation of QM to causation.
As for grand unification theories, quantum gravity and such, I think they can be respected as outlining directions of research. It would be a lot to expect that we could significantly evaluate them here, but it would be quite an accomplishment if we could better understand them--especially as this relates to QM and causation. The topic of causation seems to be clearly involved in the tensions between the determinism of classical physics, GR included, and the indeterminate or probabilistic character of QM.
H.G. Callaway
I would like to continue my preceding comment by considering the "physics question".
That question relates to the fact that the greater part of present physicists believes in modern physics (twentieth century), while a small part does not and would want to go back to classical physics (from the seventeenth to the nineteenth century). But in actuality both modern physics and classical physics are obsolete theoretical structures, and the future lies in contemporary physics and science, which will be the fundamental structure of the physics and science of the third millennium.
Dear Sasso, How do you see this contemporary physics and the "fundamental physics of the future"? Where do you see classical physics and modern physics as going wrong? Are there people in the field you would point out as on the right track?
H.G. Callaway
Dear Callaway, I know several people have a critical viewpoint with regard to one or another of today's predominant physical theories, but I don't know whether anybody is thinking about a new physics that would represent the evolution of modern physics, as modern physics represented the evolution of classical physics.
When I claim a physical theory is obsolete I don't say that the theory is necessarily wrong in everything, but that a few important aspects have to be changed, also in the light of new theoretical and experimental developments. I have indicated several times my critical viewpoint on the four main theories of modern physics (Special Relativity, General Relativity, Quantum Mechanics, the Standard Model) and I have provided new alternative theories.
I think modern physics is characterized by indeterminism, openly with regard to Quantum Mechanics and more covertly with regard to the other three theories. I think indeterminism and the consequent probabilistic development are indeed necessary in a few situations, but it seems to me that the idea is now catching on in all physical situations, even where it is not necessary; essentially this reflects an insufficient analysis of the physical situation, since a probabilistic mathematical model is always able to give some answer to a problem. From my viewpoint, indeterminism is very often a shortcut.
The alternative theories that I would propose for contemporary physics are the Theory of Reference Frames, Deterministic Quantum Physics and the Non-Standard Model, which we can discuss in subsequent comments.
Thanks for your kind attention.
Daniele Sasso
Dear Puneet Bindlish, thank you for the text. I believe that this classical text, together with the findings of QM and the experience of art, will give a new outcome for the future.
Many ways are blocked, and thinking will find something new that is again in harmony with the classical texts.
Something is still not entirely clear: you end your text with "a better view on causality", but in this way science becomes metaphysical, because pointing to the first cause is no longer the field of science. Also, I do not see the separation between the animal soul and the supreme soul, from the Sanskrit text, as a duality. There must be a gradation which is always changing; but motion is another problem, causal or not.
Dear Daniele, I agree with H.G. I quote: "But in actuality both, modern physics and classical physics, are obsolete theoretical structures." However, modern and classical physics alike obey strict, repeatable initial and boundary conditions. Consequently, physics of any walk is not so obsolete. But it is completely out of the question to endow physics with epistemological power; that is never claimed by any serious physicist.
Dear Verstraeten, I believe you misquote me. I did not write that "both, modern physics and classical physics, are obsolete theoretical structures." I believe this text comes from Sasso.
He went on to say, in his most recent note, directly above:
When I claim a physical theory is obsolete I don't say that the theory is necessarily wrong in everything, but that a few important aspects have to be changed, also in the light of new theoretical and experimental developments.
--end quotation
I've been waiting to hear more of what he thinks is wrong. He's mentioned several topics, but I think we have yet to get the needed arguments and details.
H.G. Callaway
Dear H.G., I quoted Daniele; I agreed with your response. Sorry.
@Rita De Vuyst
Thanks for the appreciation and hope. It is indeed a path of gradation. The ancient Vedic worldviews are at ease with the constantly changing nature of reality. In fact, a synonym for "world" is "jagat", meaning "that which is constantly moving or changing".
A philosophy for understanding the causalities and relationships among the various entities of reality is provided in the given text. I will be completing a research paper next month, with a review of this text and coverage of the Bhartiya research paradigm and methodologies. Then we can discuss it in a separate thread.
I am answering Puneet Bindlish.
The alternative in physics today is not between classical physics (the Newtonian worldview) and modern physics, but between the latter and the new contemporary physics.
With regard to the five points that have been underlined, I reply as follows:
1. The physicist must not consider only the perception of space-time; he has to investigate it in depth. It is manifest that different observers, motionless or in relative motion, can have different perceptions of physical quantities, and not only of space and time; but the object of physics is precisely to uncover general laws that are independent of the perception of the particular observer. It needs also to be said that among all possible observers there is always a preferred observer, who is different from the absolute observer of classical physics.
2. All theories are possible, including unified theories. Nevertheless it is necessary to verify:
a. whether those unified theories are concordant with the experimental evidence,
b. whether those unified theories are intrinsically coherent,
c. whether those unified theories simplify or complicate our worldview.
3. For lack of time I have not been able to study entanglement in depth as an object of research, but I am sure that it can be explained in a deterministic way within the order of the conservation laws.
4. I agree with the inseparability of macrocosm and microcosm; nevertheless I can also affirm, from my experience as a researcher, that not all physical laws valid in the macrocosm are also valid in the microcosm, and vice versa, which confirms my scepticism about unified theories.
5. On observation and measurement, I said that when a physical quantity cannot be measured it is better for physics to make use of other quantities. I was thinking, for instance, of position, which is easily measurable on a human scale while today it is hardly measurable on a submicroscopic scale. Anyway, I agree that observation is always useful.
My thanks, then, to H.G. Callaway and Guido J.M. Verstraeten for discussing my comments.
Dear all, I'd like to point out a new paper, by Hal M. Haggard and Carlo Rovelli, which has just been made available. The short title is "Black hole fireworks."
https://www.researchgate.net/publication/263663012_Black_hole_fireworks_quantum-gravity_effects_outside_the_horizon_spark_black_to_white_hole_tunneling?pli=1&loginT=uaZzfLioTCoNZeHtosZlDEIAOJJcAQDA1rjpImBETmo*&uid=8e320d55-ef43-47f2-a85f-bac5269dc83d&cp=re289_x_p12&ch=reg
(That's the RG address.) I think this paper interesting for the present question in so far as it concerns the relationship between GR and QM--and, of course, in relation to the theme of "black holes." As I understand it, this is a work in mathematical physics, related to, but differing from, Hawking's and Bekenstein's work on black holes and black hole radiation. I wonder if we might get some comments on the paper which would be helpful for present purposes. One basic idea is the "tunneling" of a black hole into a "white hole," and there are some related considerations of causality.
In substantial agreement with a point I take from Hawking, they write:
...certainly classical GR fails to describe Nature at small radii, because nothing prevents quantum mechanics from affecting the high curvature zone, and because classical GR becomes ill-defined at r = 0 anyway.
---end quote
Have a look.
H.G. Callaway
Article Black hole fireworks: quantum-gravity effects outside the ho...
My claim in a preceding comment, "classical physics and modern physics are obsolete", is at once a historical fact and an opinion that is not yet a historical fact.
The historical fact is that classical physics, based on the Newtonian concept of absolute space and time, and consequently on an absolute reference frame (the aether), has become obsolete, in the sense specified in a preceding comment, because of the theoretical and experimental evidence that generated modern physics.
The opinion is that modern physics too, which characterized the twentieth century, is by this time an obsolete view of the physical world, founded on the one hand on absolute indeterminacy (QM, the Standard Model, and consequent theories) and on the other on an unsatisfactory interpretation of the Principle of Relativity.
The paper indicated by H.G. Callaway on "black holes" belongs completely to modern physics, even though the authors strive to overcome a few critical points, like the behavior of nature at small radii within GR.
The black hole is presented in this paper according to the gravitational theory: i.e. a gravitational collapse of matter into singularities of space-time. Evident contradictions with GR are then surmounted by supposing the formation of a white hole at radii compatible with the Planck metric. The idea that the matter of space goes into a black hole and comes out of a white hole, by means of a kind of quantum tunnel effect, is a further example of the indetermination, supported in the ancient Greek world by Anaximander, that governs modern physics. For black holes there is experimental evidence; the same cannot be said for white holes. With regard to so-called black holes, the experimental evidence shows that nothing, whether matter or energy, comes out of those celestial bodies. Modern physics explains those events through the gravitational theory.
I strive to reason about them within the new contemporary physics, and I understand that the absence of observations of light and matter coming from those celestial bodies can have two different physical explanations:
1. The celestial body actually emits nothing, whether energy or matter, because of an extremely strong, singular gravitational field. The gravitational theory assumes this hypothesis, and those celestial bodies are called "black holes".
2. The celestial body actually emits matter and energy, but those emissions do not reach the observer because of a very high relative speed, greater than the speed of light, between the receding celestial body and the observer. This hypothesis is assumed by the "relativistic theory" formulated by myself within contemporary physics, and there those celestial bodies are called, more precisely, "dark bodies".
Two completely different logics are applied in these two explanations, and my fiducial ancient Greek philosopher is precisely Aristotle, the founder of logic.
Dear all, Here is a new paper, which just appeared on RG from arXiv, authored by John W. Moffat: "Quantum Gravity and the Cosmological Constant Problem"
You can find it at the following address:
https://www.researchgate.net/publication/263773944_Quantum_Gravity_and_the_Cosmological_Constant_Problem?ev=pubfeed_overview&trpid=53c66dbdd5a3f22c338b46d9_1
This paper tells us something about loop quantum gravity which is of considerable theoretical interest--that it is renormalizable. This point has been made by others, but it appears here in the context of a discussion of the "cosmological constant problem." The cosmological constant (lambda) was added by Einstein to GR in order to permit a static universe, balancing gravitational attraction at the cosmic scale. It was an addition which Einstein later regretted, but which has more recently seen a revival, e.g., in the form of theories of "dark energy"--a force invoked to explain the expansion of the universe and the growing distances detected among the galaxies and other large-scale structures. (Moffat, I should say here, makes no mention of "dark energy," but the themes are often connected.)
The theme of "re-normalization" has to do with the frequent occurrence of apparently nonsensical infinities in physical calculations, and this is connected with "perturbation theory." In particular, there is considerable temptation to identify "dark energy" and the "cosmological constant" with an expansion of space arising from quantum fluctuations. However, when the actual calculations are made, it typically turns out that the scale of the calculated force arising from fluctuations (on many theories/approaches) is many orders of magnitude above what fits with the observed expansion. (To simplify: if you've got fluctuations, then there will be fluctuations on fluctuations, etc., ad infinitum, and that is what produces the problems addressed by techniques of re-normalization.) So it is a virtue of the kind of theory which Moffat puts forward here that it is renormalizable, as is generally the case with LQG, as I understand the matter.
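To give a rough sense of the mismatch, here are my own back-of-envelope figures, not Moffat's: cutting the fluctuations off at the Planck scale yields a vacuum energy density near the Planck density, while observation gives something utterly smaller:

\[ \rho_{\mathrm{Planck}} = \frac{c^{5}}{\hbar G^{2}} \approx 5 \times 10^{96}\ \mathrm{kg/m^{3}}, \qquad \rho_{\Lambda}^{\mathrm{obs}} \sim 10^{-26}\ \mathrm{kg/m^{3}}, \]

a discrepancy of roughly 120 orders of magnitude--sometimes called the worst prediction in the history of physics.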
This paper represents a work in mathematical physics, as I take it, and it involves various idealizations or simplifications, such as restriction to consideration of Minkowski spacetime. But it builds on past results regarding quantum loops and gravity in connection with the "Standard Model" --which unifies 3 of the 4 known forces of nature.
What we see here is something of a snapshot of research in a developing paradigm. The general topic, I take it, will interest readers of this thread. I noticed in particular a statement in the Introduction:
An important feature of the field theory [here] is that only the quantum loop graphs have nonlocal properties; the classical tree graph theory retains full causal and local behavior. (p. 1)
--end quotation
The "nonlocal" features, I take it, have everything to do with QM, where prediction of non-local correlations may be involved, while the concern with "causal and local behavior" reflects the influence of GR. In general, I suppose that any theory of quantum gravity seeks to go beyond GR by treating GR as a special or limit case and modifying it, partly by reference to QFT, in the more general case.
H.G. Callaway
Article Quantum Gravity and the Cosmological Constant Problem
Dear Gruner & Schmeikal,
Many thanks for your thoughts. If we were simply reading Heisenberg, then I think there might be more room for your negative answers--as I read them. Causality is both a scientific and a philosophical concept, and there is plenty of talk of causality in physics and in philosophy. Eddington uses the phrase "principle of indeterminacy," not without some reason, I think, in preference to "uncertainty principle." He certainly does see some connection to causality.
So, imagine that we have a lump of some unstable isotope. The QM theory of nuclear decay will predict just what percentage of the nuclei will break down in a given period of time--explaining the half-life. However, there is no predicting which nuclei will break down at any given time or the sequence of breakdowns. This appears to be completely random. There is an aspect of the process of decay which can be predicted, and other aspects which cannot be predicted. All the nuclei are identical, and yet in the result of the process one breaks down at a given time rather than another, though there is no evident difference between them. So, it seems that we have something like, same cause --and different effect. There is an irreducibly random or statistical aspect of QM phenomena which does not fit in well with our usual concept of causality. Right?
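A toy simulation may help fix the idea. This is Python; the number of nuclei and the per-step decay probability are arbitrary choices of mine, not physical constants:

    import random

    # 10,000 identical "nuclei", each with the same decay probability per step.
    # The aggregate count is predictable; which ones decay, and in what order,
    # differs on every run.
    N, p = 10_000, 0.10
    alive = set(range(N))
    for step in range(1, 6):
        decayed = {n for n in alive if random.random() < p}
        alive -= decayed
        print(f"step {step}: {len(alive)} remain "
              f"(expected {N * (1 - p) ** step:.0f}); "
              f"first few decayed: {sorted(decayed)[:5]}")

Run it twice: the totals track the expected exponential closely, while the identity and order of the decayed nuclei differ from run to run--same cause, different effect, in miniature.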
If our general concept of causality does not map onto scientifically attested QM phenomena, then there is reason to doubt the general applicability of this concept of causality. --or so it seems. According to Eddington, I can tell you, after Heisenberg and Co, people lost interest in the general concept of causality and this in spite of Einstein's famous protest that God "does not play dice with the universe."
H.G. Callaway
There are scientific papers today that go beyond modern physics into that scientific-cultural area which can be defined as "post-modern physics", where concepts of modern physics acquire a flavor of scientific decadentism, losing all logical value and assuming simultaneously a meaning and its opposite. An illuminating example is the concept of the quantum vacuum, which means precisely "a vacuum which is not a vacuum". It is a consequence of Heisenberg's Principle, which represents the theoretical foundation of modern and post-modern physics.
In macrophysics the known laws of physics, in the absence of perturbations, fix a deterministic mathematical relationship of cause and effect, of input and output. It is manifest that in the presence of perturbations that deterministic mathematical relationship is not valid, and it is therefore necessary to make use of probabilistic, stochastic mathematical models.
In microphysics Heisenberg's Indeterminacy Principle fixes an indeterminacy relationship for pairs of physical quantities in which knowledge of the one quantity is proportional to ignorance of the other. It involves the invalidation of the known laws of physics, independently of the presence of perturbations, and therefore the necessary use of probabilistic mathematical models for which the causal relationship is not valid. Heisenberg's Indeterminacy Principle, in its truest meaning (like Eddington's), goes beyond the pure act of measuring and theorizes an intrinsic indeterminacy of nature, independent of the presence of the observer.
It is manifest that there are aspects of the physical world which appear random to the observer through evident ignorance of the physical conditions that have determined one output rather than another. Here the question can be solved by assigning a value of probability to every possible output. There are then other physical situations in which not even probabilistic models do well, and in those cases it is possible to make use of deterministic statistical models that work on global variables, bypassing the Indeterminacy Principle completely. This is the case, for instance, with the decay of a radioactive element, in which billions of nuclei can participate in the process, and it would truly be inhuman work to assign a value of probability to every nucleus. In that case we know that after the half-life half of the nuclei have decayed (with a hardly calculable value of probability) while the other half are sound, with a null probability of decay. In actuality, in these cases the physicist is not interested in knowing which nucleus has decayed; he is interested in knowing the law of decay, which is defined by a deterministic exponential law.
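The deterministic law in question, for the record, is the exponential decay law

\[ N(t) = N_0 \, e^{-\lambda t}, \qquad t_{1/2} = \frac{\ln 2}{\lambda}, \]

which fixes the surviving number N(t) as a global variable while saying nothing about which individual nuclei survive.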
That is, in physics it is not always necessary to know the local (pointwise) behavior of a system; often it is sufficient (above all in cases where the first option is impossible) to know the global behavior through global variables. The temperature of a body is a global variable connected in a causal manner with the amount of heat provided to the body: this does not mean that every particle of the body has the same temperature (it is surely impossible to measure the temperature of every particle); what is measured is the average temperature of all the particles. The consequence of this logic is that the random behavior of some physical situations is determined by an observer's principle of ignorance rather than by an intrinsic indetermination principle of nature.
Dear Sasso,
I think this is your best effort yet. It seems that you have learned a lot from the exchanges--IMHO.
Eddington says somewhere, as I recall, that assigning a temperature to an individual particle would be somewhat like reading the riot act to an individual--since temperature is usually defined as the mean kinetic energy of a collection of particles. This illustrates the statistical character of the concept of temperature, and need involve no quantum mechanics. We are close here to the concepts involved in the second law of thermodynamics. All very interesting, of course. It may suggest an analogy to phenomena such as half-life which do involve quantum mechanics. Your argument seems to depend on the analogy.
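For the record, in the kinetic theory of an ideal monatomic gas the relation is

\[ \langle E_{k} \rangle = \tfrac{3}{2} \, k_B T, \]

with k_B the Boltzmann constant: the temperature T characterizes the mean over the collection, and attributing it to a single particle is just the category mistake Eddington jokes about.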
I would suggest that the analogy is leading you astray here. In the case of a collection of unstable nuclei, if we can predict the decay of a particular percentage of them over a particular period of time, partly by means of QM, then each particular nucleus does have a particular (small) probability of decaying, and the point of the illustration--and it is only one among various possible illustrations--is that though the nuclei are identical and have an identical probability of decay, some one decays at any given time, and not another; there is no predicting which one will decay or the sequence of the nuclei which decay. So, we have same cause, different effect.
No doubt, our ignorance is responsible for some phenomena which mimic the phenomena of QM, but that is not a reason to identify them. There is a constant of nature, the Planck constant, h, involved in the inability to predict QM relevant phenomena, and this is not involved in cases where ignorance leads to statistical results. I take the word of the physicists for this. It seems you want to contest the reigning theory.
It is not my objective here to do physics (or to re-do it, as you perhaps suggest), but to draw out the philosophy of science related to established physics. It would simply be hubris to attempt to re-do contemporary physics here. The aim is to understand it.
Beyond these few remarks, I would observe that though you do not say much with which I am inclined to disagree (and it strikes me that you have clarified your views considerably), the overall rhetoric of your posting suggests that you are making a counter-argument. Maybe that is the key move of the "post-modern"? But in any case, aren't we all by now somewhat post post-modern?
H.G. Callaway
Dear Callaway, I don't understand your statement:
"Dear Sasso,
I think this is your best effort yet. It seems that you have learned a lot from the exchanges--IMHO. "
I would be glad if you would translate it into language I can understand.
Anyway, I am at last beginning to understand your philosophical-scientific thought.
Sasso
The principle of indeterminacy is a contradiction in terms. If you can assert that something is indeterminate, and hold that as a logically valid statement, then you thereby annul the indeterminacy principle. I think perspectivism solves the problem.
He may well be right, using his mathematical paradigm, but the issue is that he was able to make a determinate statement. How could he have sustained this?
Dear Opata,
Thanks for your comments and questions. I think that the general point is that making determinate (definite, clear, well established, theoretically compelling, etc. ) statements is not inconsistent with asserting QM indeterminacy. This is not the same thing as saying that statements about QM indeterminacy are (always) definite and completely clear. Certainly, there has been some considerable confusion about the matter. But consider the idea of de-coherence. If we hold that states of entanglement or superposition are often or frequently decaying due to random interactions, then this seems an improvement over the old Copenhagen view that an observer must be involved. We have to do with a quite definite statement about various sorts of quantum indeterminacies, and the mere fact that it concerns QM indeterminacy renders it no less definite, interesting or compelling as an element of the interpretation of QM.
We make many quite definite, and presumably true statements about matters which are themselves highly vague or otherwise problematic. I don't really see a problem in that. We don't have to interpret everything of interest in order to interpret anything of interest.
H.G. Callaway
"The uncertainty principle says only: the product of the spreads of location and of momentum always is greater than a certain quantity"
I think "inequaity" is all that may be structured into accountable theory. This raises controversy about Einsteins general theory. It should be intuitive that future historical events might only sometiimes be estimated from the path of the past. It is assumed that Heisenberg begins his investigation from like intuition from the fact of existence. Einstein's philosophical exploitation do not seem to contradict, he never resolved to elucidate a nature to innate motion. How might an expanding universe be witnessed and measured?
There seems to be a fascination with the phenomenon of entanglement. Einstein called it a "weird association", insisted on the 'event horizon' and common cause....here is where theory is obstructed..whereas Einstein is quoted "the experimenter is the most important part of the apparatus" no account is made at all other than "human individuals are composed of particles; they also invented the idea of elementary eternal particles.
A good example I think involve models that purport notions of startrek like teleportation. Total abstraction, but abstraction can also be employed in contradiction e.g. for such an event the world has to be the cause, and has to be both 'unlimited' or has to be of infinite possibility rather than composed of 'infinite possibilities' to cause, or approach to cause but one teleportation 'event' if an event horizon results and yet events (predicted coupled states) are uncaptureable to witness. "Infiniteness" and eternalness" have become seriously confused. Quoting Einstein "maybe the measuring rod and the clock are not the root elements."
If interested I am enclosing "Determining the Determined State : A Sizing of Size From Aside/The Amassing of Mass by a Mass"
to add: though the use of "inequality" to account for situations seems applicable, "(no witnessible event (1) + no witnessible event (2) + no witnessible event (e) ... = event horizon)" is not valid.
to add: This leads on to enable an argument challenging the idea that events leading to birth are witnessible.
Dear all,
Those who have been following this thread may find the following video, on "Einstein and Eddington" of interest:
https://www.youtube.com/watch?v=BG2sDVjL1wg
It just recently became available online, so far as I know; unfortunately, I am not sure in which countries it can be viewed. The video originated with the BBC.
I suspect the story is a bit romanticized, in this account, but it covers the period up to Eddington's eclipse observations of 1919, and their acceptance --resulting in great and lasting fame for Einstein of course.
H.G. Callaway
On Eddington and Einstein... Employing proper correspondences, relativity entails an event where "light got bent". Witness to such an event is precluded, thus an event horizon is precluded... i.e. where might one stand, from what perspective might one witness, from position A, unbent light, and (then) from position B, bent light? I think thus surfaces the advent of quantum mechanics and entanglement. There is a difficult conceptual obstruction to encompassing eternality in a unity. I think changes are needed at the axiom level: the world-universe beginning with light, a mirror and an inversion is unaccountable for; but followed by mass=number, axiomatic to rational interpretation, it is accountable (for matter) and supervenes over all.
to add: I think the chicken-egg paradox always remains; light might be gotten from mass, mass gotten from light. Though it is possible to account for the world in a rational manner that dismisses ideas of beginnings and ends, beginning with mass--the entity/self/first perspective--exists for testimony on the mass side of the equation. The oscillation, involving again number or mass, may be very large or small, and is always essential... of mass to produce light to produce mass; of alternating light-mass to produce the weave of space, a delay between potential and kinetic from which change from within emerges.
Certain aspects involving prediction need to be released... if causes exist, they have an open path on the horizon; if causes exist, similarly do effects, ad infinitum.
Philadelphia, PA
December 5, 2014
Dear Nagata,
Again regarding your comment here, I do not see any reason to think that the wave/particle duality of QM divides up as you suggest. Also, I think you may need to clarify what might be meant by the claim that "the property of the wave exceeds the property of the particle."
I see no reason to think that the uncertainty principle is more relevant to the wave conception than to the particle conception. So, if on your account, the uncertainty principle may be related to the wave conception, then it would seem to follow that it is equally relevant to the particle conception.
On the standard conception of causality, whenever the same cause is repeated, the same effect is expected to follow. But if there is a basic randomness in nature, then this seems not to be the case. So, consider a mass of some unstable isotope. QM enters into the account of its half-life, and it is possible to calculate exactly what percentage of the atoms will decay in a particular period of time. But though each atom is in a presumably identical causal condition, it is not possible to predict which atoms of the isotope will decay at any given point. So, apparently, we have "same cause" followed by various different effects in an unpredictable manner. The uncertainty principle, also sometimes called the indeterminacy principle, does appear relevant to causality, since it limits what we are able to predict on the basis of what we may know about initial conditions and the laws of QM. Moreover, this is not a matter of limitations to our knowledge which might be remedied. The idea is that the randomness is basic.
I see that you want to make some use here of the Bell inequalities, but it remains unclear what you want to show or how you aim to show it--in the present context of discussion.
H.G. Callaway
Indeed, Bernd, Jung and Pauli were very close to the truth; but is it not mainstream to connect Jung with archetypes and Pauli with the fourth quantum number?
It was Kepler who introduced the archetypes, and this goes back to Plato. The 4 goes back to the alchemists, who added the 4 as 1 to the 3. This was a major thought of Pauli; he was obsessed by the 4.
It is also not easy to think in 4 dimensions when for so long we have had to think in 3. Our brains are conditioned; that's a fact.
The "idea" is stronger than "thought" because it gets a vector, a certain direction, and a potential to manifest itself in a new spacetime dimension.
If we work more on the idea itself, we are driven to composers and poets much more than to the computer. It is the creative brain that brings the idea into the explicate order. (David Bohm did his best to explain this.)
Yes, I think we read the same books, Bernd, and I felt much sympathy for Pauli; I think his dreams are even more important than his publications. The next step for physics could be the difference between dreams and how they change during the individuation process. It is the man that has to develop first of all, and afterwards the hidden variables will be visually observed in the explicate order. The mechanism of bringing the hidden variables to the surface is what matters.
In my movie I show how I brought together 2 spacetime dimensions into 1 by the intuitive method alone, which afterwards can be objectively observed with the help of 2 archives. So 1 plus 1 is not always 2.
To penetrate or to fish in the implicate order is what can turn chaos into order.
Come over to Belgium, to the Afkikker, on December 27 at 9 p.m. The docudrama is explained in Dutch, which you can probably understand, and you step into the history of 1940 in France, and the way Pauli escaped.
It was in the book Lago Maggiore by Johan Daisne (which you can find) that I found the contact with a certain spacetime which led me to an unknown spacetime of my father. It turned out that Daisne and my father were both in Limoux (France) on May 24, 1940.
Actually it was Steve Lacy who formed me at the inside.
Can it be that, if an idea is strong enough, it becomes a spacetime of its own, that different spacetimes fuse within a certain affinity, and that we reach them by intuition?
Philadelphia, PA
December 6, 2014
I just call'em like I see'em.
H.G. Callaway