Perhaps the quintessential component of quantum mechanics, at least with respect to what makes it qualitatively different from classical mechanics and indeed classical conceptions of physics (and which is behind much of what makes up the "weirdness" of quantum mechanics), is how probability is calculated. Being someone who has studied mathematics, you are doubtless aware that probability is found everywhere in the sciences and in other disciplines, not to mention the role it plays in philosophy, interpretations of causality, epistemology, etc. However, nobody calculates probabilities the way quantum mechanics requires (i.e., calculating the probability amplitude, a complex number, and taking its mod square). We cannot do this using the structure of any real-valued space; rather, we require the structure provided by the complex plane (and, obviously, Hilbert space). Nor was the motivation for this method of computing probabilities mathematical, even in the sense that numerous aspects of the standard model were (i.e., once QM was sufficiently developed it was extended to incorporate classical electrodynamics and other fields of physics, such as field theory, mostly via mathematics rather than empirical study). The easiest way to see this (IMO) is via the classic double-slit experiment. Classical physics tells us that the probability that, e.g., an electron went through slit 1 or slit 2 is obtained by simply adding the two probabilities. Simple, easy, straightforward, and wrong. Quantum mechanics gives the right results provided that we instead add the complex amplitudes for the two paths and only then take the mod square (multiply the sum by its complex conjugate), which yields the two classical probabilities plus the interference cross terms that classical probability theory lacks.
Nobody knows why that's the rule, and nobody wanted it to be the rule. So if you can formulate a mechanics that reproduces the successes without reliance on complex numbers, I'll be in the first row when you receive your Nobel prize, and the one desperately trying to get your attention at the after-party I'm sure you'll throw, in order to thank you. However, I rather suspect that the only way one can do away with complex numbers and retain modern physics is by introducing mathematical spaces, structures, and/or operations that make quantum physics more complicated than it is when we use complex numbers.
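That amplitude rule is easy to check numerically. A minimal sketch (the amplitude values and relative phase below are made up purely for illustration):

```python
import numpy as np

# Illustrative (made-up) amplitudes for "electron went through slit 1/2".
# The relative phase encodes, e.g., a path-length difference.
psi1 = 0.6 * np.exp(1j * 0.0)
psi2 = 0.6 * np.exp(1j * 2.1)

# Classical rule: probabilities simply add.
p_classical = abs(psi1) ** 2 + abs(psi2) ** 2

# Quantum (Born) rule: add the AMPLITUDES, then take the mod square.
p_quantum = abs(psi1 + psi2) ** 2

# The difference is the interference term 2*Re(conj(psi1)*psi2).
interference = 2 * (np.conj(psi1) * psi2).real
assert np.isclose(p_quantum, p_classical + interference)
```

Changing the phase in `psi2` sweeps the interference term between fully constructive and fully destructive, which is exactly the fringe pattern the classical sum of probabilities cannot produce.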
The use of complex numbers is ubiquitous in electrical engineering as well. Both uses arise from finding solutions of wave equations, together with Euler's very important result:
EXP( i X ) = COS( X ) + i SIN( X ),
that at once unifies several branches of mathematics. A fun fact is that if X = PI then we find:
EXP( i PI ) + 1 = 0.
The only thing missing is the golden ratio ( PHI ). :-) I once saw a nice cartoon where ordinary probability theory was represented by the line segment [0,1], from which we imagine randomly picking numbers. QM is then imagined as an extension of this to 2D using the surface of the Bloch sphere, which works well for two-state (spin-1/2) systems. Interestingly, the Bloch sphere is an extension of the Poincaré sphere, which is used in optics to find polarization states.
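Euler's formula and the PI special case above can be verified numerically in a couple of lines; a quick sketch at an arbitrary test point:

```python
import cmath
import math

# Euler's formula: exp(i*x) = cos(x) + i*sin(x), checked at a generic x.
x = 0.7
lhs = cmath.exp(1j * x)
rhs = complex(math.cos(x), math.sin(x))
assert cmath.isclose(lhs, rhs)

# The special case x = PI gives the identity exp(i*PI) + 1 = 0
# (up to floating-point rounding).
assert abs(cmath.exp(1j * math.pi) + 1) < 1e-12
```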
Finally, always remember that not even the natural numbers "exist." In my world-view, there's no such thing as the number 1. We may observe a certain quantity of this or that thing. But the number itself is nothing more than an abstraction of the human mind's experience of the world. There is no PI in the sky :-D
Dr. Dadras raises an excellent point (although I myself am not sure where I stand on the whole nominalism/realism debate). Stephen Hawking "wrote" a book, God Created the Integers, which is essentially a collection of selected works by famous mathematicians going back to the ancient Greeks, each with an introduction Hawking wrote. The title is taken from a translation of a well-known quip by Kronecker ("Die ganzen Zahlen hat der liebe Gott gemacht, alles andere ist Menschenwerk", normally translated "God created the integers, all the rest is the work of man", although a more literal rendering might be "the whole numbers the dear God has made; everything else is man's work"). What is less well known is that this quip expressed his view that any and all work in mathematics involving numbers like pi, or the entire set of irrationals, was misguided: these were fancies that would eventually be resolved once mathematics was restored to its proper form, built only on the integers. Centuries before him, the concept of negative numbers proved very difficult to accept (and let us not forget that Cantor himself, on presenting his proof to Dedekind, wrote "Je le vois, mais je ne le crois pas" ("I see it, but I do not believe it")).
As important as complex numbers are to QM (and to numerous other fields, as Dr. Dadras pointed out), without Hilbert space we're just as done for as we would be without complex numbers. And while imaginary numbers were treated with skepticism for some time, that's nothing compared to the reception infinity has received, from Zeno to the evangelical apologist and scholar W. L. Craig. Hilbert space doesn't just "extend" infinitely but can do so along infinitely many dimensions, something that would probably have made many an ancient mathematician's head explode. Quantum logic as it is usually formulated (the most "canonical" form being von Neumann's) violates the LNC (law of non-contradiction) and tertium non datur (the excluded middle). Yet even those who have worked to produce quantum logics have found their own results deeply troubling (what, for example, can it possibly mean to say that A & ~A is true?).
In short (and to quit rambling), is there a philosophical or ontological objection to complex numbers playing the role they do (and if so, I would be very grateful if you would share it), or are you merely curious if QM could be improved somehow were the complex numbers removed?
While researching something else, I came across the following, which I hope may be useful:
http://en.wikipedia.org/wiki/Spherical_harmonics
Use in quantum chemistry
As is known from the analytic solutions for the hydrogen atom, the eigenfunctions of the angular part of the wave function are spherical harmonics. However, the solutions of the non-relativistic Schrödinger equation without magnetic terms can be made real. This is why the real forms are extensively used in basis functions for quantum chemistry, as the programs don't then need to use complex algebra. Here, it is important to note that the real functions span the same space as the complex ones would.
For example, as can be seen from the table of spherical harmonics, the usual p functions ( ) are complex and mix axis directions, but the real versions are essentially just x, y and z.
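The complex-to-real equivalence described in the excerpt can be checked directly for the l = 1 (p) functions. A sketch, with the standard (Condon-Shortley) Y_1^m written out by hand and evaluated at an arbitrary direction, showing that the usual real combinations are proportional to x/r, y/r, z/r:

```python
import numpy as np

# Complex spherical harmonics for l = 1 (Condon-Shortley convention),
# evaluated at an arbitrary direction (theta, phi).
theta, phi = 0.7, 1.3
Y1m1 =  np.sqrt(3 / (8 * np.pi)) * np.sin(theta) * np.exp(-1j * phi)
Y1p1 = -np.sqrt(3 / (8 * np.pi)) * np.sin(theta) * np.exp( 1j * phi)
Y10  =  np.sqrt(3 / (4 * np.pi)) * np.cos(theta)

# Real ("quantum chemistry") combinations spanning the same space:
px = (Y1m1 - Y1p1) / np.sqrt(2)        # proportional to sin(theta)cos(phi) = x/r
py = 1j * (Y1m1 + Y1p1) / np.sqrt(2)   # proportional to sin(theta)sin(phi) = y/r
pz = Y10                               # proportional to cos(theta)         = z/r

c = np.sqrt(3 / (4 * np.pi))           # common normalization constant
assert abs(px.imag) < 1e-12 and abs(py.imag) < 1e-12   # genuinely real
assert np.isclose(px.real, c * np.sin(theta) * np.cos(phi))
assert np.isclose(py.real, c * np.sin(theta) * np.sin(phi))
```

(Sign and normalization conventions for the real harmonics vary between quantum-chemistry codes; the combinations above are one common choice.)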
The simple answer would be: yes, you can, but why should you?
If you have a complex function f, you can simply split it up into real-valued functions a, b, via f = a + i*b or f = a*exp(i*b). Plugging this into the Schroedinger equation yields coupled equations for the real-valued functions a,b without any complex algebra. There are reasons why this might be fun (semiclassics, for example). However, usually the normal Schroedinger equation is just simpler.
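The splitting f = a + i*b can be demonstrated on a toy system. The sketch below uses a made-up real symmetric 2-level Hamiltonian rather than the full Schroedinger PDE (with hbar = 1): plugging psi = a + i*b into i d(psi)/dt = H psi gives the coupled real equations da/dt = H b and db/dt = -H a, which are integrated here with no complex algebra at all and compared against the exact complex evolution:

```python
import numpy as np

# Made-up real symmetric 2-level Hamiltonian, for illustration only.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])
psi0 = np.array([1.0, 0.0])
t_final, n_steps = 1.0, 50_000
dt = t_final / n_steps

# Real-valued evolution of the coupled system
#   da/dt = H b,   db/dt = -H a
# using forward Euler (adequate for this short demonstration).
a, b = psi0.copy(), np.zeros(2)
for _ in range(n_steps):
    a, b = a + dt * (H @ b), b - dt * (H @ a)

# Reference: exact complex evolution psi(t) = exp(-i H t) psi0,
# computed via the eigendecomposition of H.
w, V = np.linalg.eigh(H)
psi_exact = V @ (np.exp(-1j * w * t_final) * (V.T @ psi0))

assert np.allclose(a + 1j * b, psi_exact, atol=1e-3)
```

Exactly as Manuel says: the real formulation works, but it doubles the number of equations and obscures the structure that the single complex equation makes obvious.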
Manuel, glad you took a look at the table of spherical harmonics; the conversions there from complex to reals are clear, as is your well-laid-out cause-and-effect correlation with the reals. However, I do find that Wheeler's inference that information is fundamental still circularly, and still esoterically, begs the root-cause question. In other words, Einstein's nonlocal hidden variables may be self-similarly (cause = effect) right out in the open under our noses all along, although that is a topic for another thread.
So many nice answers each one no less thought-provoking than the rest.
The question of the existence of negative numbers, and more so of complex numbers, will always be there. Some are even skeptical about the existence of the natural numbers! We learn what we are fed with constantly. So any new way of thinking, even going back a bit to the old way, is a welcome endeavor.
However, as someone expressed, all probabilities may be quantum mechanical. Please see my article "Are all probabilities fundamentally quantum mechanical?" on my RG page, wherein a lot of issues involving probabilities that are usually hidden are brought out.
In fact the real numbers are stronger, because they form an ordered field. C has no ordering, and there is no compatible ordering. A lot of phenomena, like complex dynamical systems (usually called complex fractals), can be properly defined ONLY working with real numbers, because one has a condition given by a norm, followed by an inequality. Every algebraic system in complex numbers can be written in real numbers by doubling the number of variables. Finally, the conditions of complex differentiability, which are so strong that they imply complex analyticity, can also be written as a real system. So it is just a matter of taste whether one uses complex variables or not. They are simply more elegant in some situations, lead to more compact formulae, and enjoy some special properties, above all the fundamental property of the field of complex numbers: being algebraically closed.
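The "doubling the number of variables" point has a classic concrete form: each complex number x + i*y can be represented by the real 2x2 matrix [[x, -y], [y, x]], so that complex arithmetic becomes real matrix arithmetic. A small sketch (the sample values z, w are arbitrary):

```python
import numpy as np

def as_matrix(z: complex) -> np.ndarray:
    """Represent x + i*y as the real 2x2 matrix [[x, -y], [y, x]]."""
    return np.array([[z.real, -z.imag],
                     [z.imag,  z.real]])

z, w = 2 + 3j, -1 + 0.5j

# Multiplication in C corresponds exactly to real matrix multiplication:
assert np.allclose(as_matrix(z) @ as_matrix(w), as_matrix(z * w))

# And i itself becomes the 90-degree rotation matrix, whose square is -I:
i_mat = as_matrix(1j)
assert np.allclose(i_mat @ i_mat, -np.eye(2))
```

This is the precise sense in which nothing done with complex numbers is beyond the reals; the complex notation is the compact, elegant packaging of this real structure.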
Hello Natalia! Well, they shouldn't be so scared if one explains SQRT( ) as a geometric operation. If multiplication is "add the angles and multiply the lengths", then SQRT( ) is naturally "halve the angle and take the real SQRT of the length" - with a supplementary explanation that one finds two different angles because of the 2 PI periodicity... I find that geometrically SQRT(-1) makes a lot of sense.
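The "halve the angle, real-SQRT the length" recipe can be shown directly for SQRT(-1); a short sketch using the standard library:

```python
import cmath
import math

z = complex(-1.0, 0.0)                       # the dreaded sqrt(-1) input
r, theta = cmath.polar(z)                    # r = 1, theta = PI

# Halve the angle, take the real square root of the length:
root = cmath.rect(math.sqrt(r), theta / 2)
assert cmath.isclose(root, 1j, abs_tol=1e-12)        # principal root is i
assert cmath.isclose(root * root, z, abs_tol=1e-12)  # and it squares to -1

# The second root comes from the 2*PI ambiguity of the angle:
other = cmath.rect(math.sqrt(r), (theta + 2 * math.pi) / 2)
assert cmath.isclose(other, -1j, abs_tol=1e-12)
```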
Complex numbers are like the shadows of other numbers... :) and just as shadows move as the light moves, your shadow can turn around you even if you remain stationary. Without complex numbers it would be very complicated, though not impossible, to treat even simple problems: think of phasors.
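The phasor idea is easy to illustrate: two sinusoids of the same frequency add by adding their complex amplitudes, with the real part taken only at the end. A sketch with made-up amplitudes, phases, and test point:

```python
import cmath
import math

# Two sinusoids of the same frequency (all values arbitrary):
A1, phi1 = 2.0, 0.3
A2, phi2 = 1.5, -1.1
omega, t = 5.0, 0.42

# Direct trigonometric sum:
direct = A1 * math.cos(omega * t + phi1) + A2 * math.cos(omega * t + phi2)

# Phasor sum: one complex addition replaces the angle-addition identities.
phasor = A1 * cmath.exp(1j * phi1) + A2 * cmath.exp(1j * phi2)
via_phasor = (phasor * cmath.exp(1j * omega * t)).real

assert math.isclose(direct, via_phasor)
```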
"Le plus court chemin entre deux vérités dans le domaine réel passe le plus souvent par le domaine complexe" ("the shortest path between two truth in the real domain often passes through the complex domain")
Jacques Hadamard.
"Die ganzen Zahlen hat der liebe Gott gemacht, alles andere ist Menschenwerk" ("God made the integers, all else is the work of man.")
Leopold Kronecker
Really, the real numbers (and the p-adic numbers) are the hard ones to understand because you have to create and come to grips with limits, i.e. an essentially infinitary concept. Creating the complex numbers from the reals, however, is a breeze, and they crop up naturally all over, including in quantum mechanics where they are the most natural description for interference, obviously an essential part of quantum mechanics. Going over to a completely real description of quantum mechanics is sometimes useful, however, see the WKB method.
I was exaggerating somewhat with respect to the Nobel prize for the sake of levity (even humor!).
More importantly, thank you for pointing me to your paper, which I've just finished reading over and found intriguing and valuable. It is always a good idea to take a step back and question assumptions, and important (I think) to at least occasionally do so in radical ways, such as to "reject all merely postulated concepts such as mass, point particles, wave functions and even spacetime.”
Descartes is probably the most famous example of approaching ontology from a perspective of initial radical skepticism and building upon the absolute minimum, but he is by no means alone. In particular, as quantum physics developed, various philosophically inclined physicists provided numerous excellent philosophical & metaphysical treatments, both on the nature of physics and physical theory as well as on science and scientific theory more generally. Although there is some truth to Gell-Mann’s accusation that Bohr brainwashed a generation of physicists into thinking the measurement problem was solved, that came a few decades after Bohm’s beautiful text Causality and Chance in Modern Physics, several decades after Hugh Everett III’s radical “just follow the math” interpretation-less interpretation of QM, a couple of decades after Bell had delved into the infamous EPR paper, and so on. Basically, the “shut up and calculate” approach (often attributed to Feynman but actually a quote from N. David Mermin) never wholly dominated, nor should any single framework, epistemology, or ontology. So I applaud the effort taken to take so little for granted and work your way up. I am reminded of many related works by the founders of modern physics, from von Neumann’s Mathematische Grundlagen der Quantenmechanik & Dirac’s “The Relation between Mathematics and Physics” to later works by Bell, Wigner, etc.
But most of all your paper brings to mind EPR and Bohr’s response (the Phys. Rev. response, not the Nature paper). EPR was not only a critique of QM but a critique based upon a description of what EVERY theory in physics (“fundamental” or not) must provide, and in so doing the authors also defined “physical reality” (for a physical theory, according to EPR, is complete iff every element of the theory is explicitly distinct from reality AND corresponds directly to a unique element of it). Bohr’s response practically begins with EPR’s “criterion of reality”, namely that “physical reality cannot be determined by a priori philosophical considerations”. However, to put simply what has been and arguably still is an enigma in the history and philosophy of physics, Bohr argues that this criterion is itself an a priori philosophical consideration, and that Einstein sought to make nature compatible with his epistemology via a particular ontological perspective that is assumed, yet need not be (and, according to Bohr, is not) accurate.
In particular, Bohr argues that if we assume some metaphysical perspective of what a theory of physics must provide, we have made fundamental assumptions about the nature not only of theories but of reality, and we have therefore applied a priori philosophical considerations, which I think in your case are most concisely contained in your statements “In a fundamental theory the variables are supposed to represent entities of physical reality and hence they are entities that are characterized by their existence in time" & "Physical meaning depends on a constant quantity that could serve as a reference". This stands in rather stark contrast to Whitehead’s ontology and the influence it had among physicists and philosophers. In particular, the statement “we assume that physical reality can be described by the totality of all fundamental variables” takes as fundamental atemporality rather than process. Additionally, although you use the word “variable[s]” in your paper, the first reference to fundamental variables has a footnote in which the term used is changed to “fundamental constants”. I find this particularly interesting because it appears (and please correct me if I’m wrong) that you are equating the two, and thus treating as constants what you call variables, i.e., defining a class of “variables” that MUST NOT vary. Ironically, then, by requiring certain “fundamental” variables to be invariable, you take as fundamental the notion of time (for that which does not vary with time cannot be defined as such without assuming time exists, and in such a way as to allow reality to be characterized by it).
It seems you may be open to the same criticism leveled at EPR, namely that in assuming you have rejected all postulates except that measurements characterize physics, and (as patterns, repeatability, etc., characterize measurements) that “physics is irresolvably committed to time.” Of course, we need not rely on Bohr’s criticism alone, for Whitehead, Bohm, and others take almost opposite approaches to what is fundamental. In Bohm’s Causality and Chance in Modern Physics, it is no surprise that perhaps THE essential (or fundamental) component of physics and physical theory is causation, but while we can use causal relationships in their various ways to allow for an “adequate degree of approximation…as a constant background", we must face that “in reality, no perfectly constant background can exist”, and the approximation is contextualized. To the extent physics is “irresolvably” committed to anything, it would appear that the only reason time is one such thing (and the only reason that reproducibility and so forth matter) is to allow for causal relations (predictions, after all, are one way of saying that some set of processes and properties of a system or collection of systems will find itself in a certain state at time t because of a collection of causal processes that bring about this state). Patterns, however, do not require time, and causality, even for one like Bohm for whom it is so fundamental, must be understood contextually.
There is a fascinating text by Vesselin Petkov entitled Relativity and the Nature of Spacetime. There is much in it I disagree with, but again, radical interpretations, challenges, and questions are of vital importance. Petkov argues that relativity not only implies the absence of absolute simultaneity, but among many other interesting claims that it entails fatalism: spacetime MUST be understood ontologically, this MUST mean the universe is a “block universe”, and therefore the only reason something has or hasn’t happened is due to a particular reference frame (i.e., for any event E that has occurred, is occurring, or will occur, there is a reference frame R in which E has occurred and a reference frame R' in which it has not). While I view his conclusions as insufficiently supported, even a basic, textbook presentation of special relativity describes differences in measurements that cannot be judged against any invariant background, and so far experiments have demonstrated this to be so.
It seems that you are taking as necessary a constant background, some invariant property, “fundamental variables”, abstraction from/of nature, or simply some constancy, without addressing some of the arguments of the most influential physicists and philosophers of the 20th century, and thereby falling prey to the part of EPR that is now rarely considered given the number of citations of the paper: the belief that we can, let alone should, require that physical theories contain or be characterized a priori by anything other than physical reality, which can only yield physical theories through an approximately constant background and via contextualization. Both criticisms (which were developed in various ways by various philosophers and physicists even before the 20th century), while certainly not necessarily true, are rather serious challenges (in my view).
Some sciences (like economics) can live without complex numbers despite extensive use of other mathematics.
When I studied the theory of functions of a complex variable, it was really an introduction to a mystic world. Just remember the Riemann surface of the complex logarithm, an infinite staircase!
Applications are quite substantial in physics. It is much easier to describe an electromagnetic wave as a complex exponential, but formally one can also work with sine and cosine waves.
The dynamic Schrödinger equation in quantum mechanics is also written with the use of i = sqrt(-1), but of course all probabilities are real. Complex numbers provide a compact form for writing equations and finding solutions. So why not use them?