Is it sometimes worthwhile to try to unpack the mathematics? By that I mean: to look at what the mathematics is doing symbolically and to relate it to what is physically happening in the phenomenon being modeled. A mathematical derivation may take lines or pages, and it may be difficult to follow the mathematical reasoning. Even if one manages that, one may still ask: which steps in the mathematics correspond to physical events in the real world? Steven Weinberg, at page vi in the Preface to his Cosmology (2008), mentions that “Occasionally the formulas were wrong, and therefore extremely difficult for me to rederive.”
An example of physicists unpacking mathematical formulas is the book Spacetime Physics by Edwin Taylor and John Wheeler.
Entropy dS = dQ/T is often presented as a mysterious ratio, but it can be considered as the number of degrees of freedom in an amount of energy proportional to dQ relative to T. Mathematics as a succinct symbolic representation of the relationship between attributes that can be characterized numerically should permit seeing through to the essence of the relationship. Is it possible though that sometimes the mathematics obscures rather than illuminates the relationship?
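As an illustration of unpacking dS = dQ/T numerically: for a substance with constant heat capacity, integrating dQ/T along a reversible heating path gives ΔS = m·c·ln(T2/T1). A minimal sketch (the numbers for water are standard textbook values):

```python
import math

# Reversible heating of m kg of water from T1 to T2:
# dQ = m*c*dT, so  Delta S = integral of dQ/T = m*c*ln(T2/T1)
m = 1.0                  # kg of water
c = 4186.0               # J/(kg*K), specific heat (assumed constant)
T1, T2 = 293.15, 353.15  # K (20 C to 80 C)

delta_S = m * c * math.log(T2 / T1)
print(f"Delta S = {delta_S:.1f} J/K")  # about 780 J/K
```

The logarithm appears precisely because each increment of heat dQ is weighted by the temperature at which it is delivered, which is the point the bare ratio dQ/T tends to hide.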
Most theoretical physicists idealize math. Of course math is necessary in physics; that is not the point. The issue lies in how it is used. Inadequate usage can be very counterproductive. Nowadays math is very often used in a fanciful and unrealistic way, and in this respect, indeed, “mathematics sometimes obscures the physics”. Many theoretical physicists should leave the confines of the Universe and come back to Earth.
I know this will not please theoretical physicists, but that is the way it goes. Instead of just complaining, they should engage in some introspection. To exemplify my opinion, let me address the following comment to the question:
How much credibility can we grant to theoretical physics?
The image that theorists project is not all that upright. Let me illustrate this.
In December 2015 an excess of 3.6 local sigma was observed at 750 GeV in the diphoton channel (H → γγ), by pure chance, in both ATLAS and CMS (LCMF, Dec 15, 2015). This led to the publication of more than 600 theoretical articles on arXiv about the process (LCMF, Mar 22, 2016). Months later the excess disappeared (LCMF, 05 Aug 2016).
So, in a few months 600 theoretical articles were written justifying a spurious event. What credibility, then, can we grant to mathematical constructions, given that they can be made to justify a nonexistent result?
Theorists should be more careful about “not throwing so many stones against the roof of their own house”. Furthermore, “publish or perish” may not be so wise after all, and compulsive publishing may have the opposite effect.
Still, this question, posted on ResearchGate by someone else, may be quite relevant:
“A wrong turn developing new theories leads to a cul-de-sac. Without other perspectives we just creep to the end wall! How can we stop this happening?”
Dear Robert Shour
For now, only one quote:
"Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation, and eventually build a structure which has no relation to reality." Nikola Tesla, Modern Mechanics and Inventions, July, 1934
Yours sincerely
Idealized (Platonic) mathematics has not only obscured but destroyed theoretical physics, ever since Einstein's incorporation of mathematical idealism into the so-called "New Physics" at the turn of the 20th century. The following comment, made in another related RG forum, may also be relevant to this question. Article Free Fall in Gravitational Theory
[I was obliged to come back (again) to this forum to debunk the last-ditch, lame and pathetic effort to claim the validity and the credibility of an esoteric and mathematical idealism based theory that has no philosophical and scientific validity and does not represent positive knowledge of the world; other than a sophisticated mathematics driven mythology! False claim is a false claim, IS a false claim; no matter how many times it is repeated and especially when the false claim is based on a fiction caused by a fiction, all derived from a “mother fiction” and represents a “mother of all tautologies”!
Even if the “mother fiction” was a consistent story, still it could have some poetic or psychological value to some; but it is in fact “a mother of all confusion” as the proponent of the “mother fiction”, Albert Einstein himself said, “Who would imagine that this simple law has plunged the conscientiously thoughtful physicist into the greatest intellectual difficulties?” A. Einstein, in "Relativity, The Special and General Theory" (Three Rivers Press, New York, 1961). The 2 years-long (not to speak of the past more than hundred years) “debate” (i.e., “confusion” by another name) among the “mathematics-loving physicists” (both supporters and opponents) in this forum alone; shows how right and insightful the great Einstein was! Indeed, the fact that Einstein was a giant and a towering figure among the “position-hunting, mathematical cobweb-spinning and eclectic flea-cracking” pygmy physicists of the modern times he helped to breed; is demonstrated by his two other great insights; namely his assertion of the impossibility of gravitational waves and his following statement about a year before his death, “I consider it quite possible that physics cannot be based on the field concept, i.e., continuous structure. In that case, nothing remains of my entire castle in the air, gravitation theory included, (and of) the rest of modern physics” A. Pais, Subtle is the Lord …” The Science and the Life of Albert Einstein”, Oxford University Press, (1982) 467.
This “Castle in the Air” that Einstein talked about was not built through practice, but is based on axioms, esoteric premises, pre-suppositions, and during more than a hundred years of its existence, this Castle did not lead to even a single social/historical practice for humanity; - the only way to distinguish between positive knowledge of objective reality on the one hand and myth, mystery, fantasy etc., on the other. Parasitic monopoly capitalism in alliance with the obscurantist Vatican has turned this “Castle in the Air” into an intellectual kingdom of medieval type “scholasticism” that has no relevance to reality and is an end in itself – a perfect and secure ruling idea and a replacement for moribund theology.
The only justification for this scholasticism comes from centuries long innumerable so-called “experimental proofs”, all of which “proved” this "mother fiction” with “flying colours”; but still needs more and more without any end! It is like the story of a king who wanted to fill up a new pond with milk for his new and beautiful queen to bathe in. He ordered that all his subjects must pour in the pond a liter of milk each during the course of the night, to fill the pond. But each poured a liter of water instead; hoping that others would bring milk, so one liter of water will make no difference. By morning the pond was filled up only with water!
Contrived “proof” alone does not a scientific theory make, especially when that “proof” is a tautology and has gaping holes in it. Newtonian physics worked satisfactorily for centuries and still does, without even a single such proof, because it arose from practice and was “proved” through the further social/historical practice, technology, etc. that this physics brought forth!]
The fundamental relationships are important, but you don't necessarily gain a lot of insight from them alone. We are still unpacking these relationships and finding new meaning in them. One example, related to thermodynamics, is that the second law seems to act in reverse for a system within a heat bath. Some of my work relies on this idea, established, in part, through Jeremy England's work: Article Statistical Physics of Self-Replication
The so-called "entropy" does not exist at all; it was "derived" through historical mistakes.
In the process of deriving the so-called entropy, ΔQ/T cannot in fact be turned into dQ/T. That is, the so-called "entropy" does not exist at all; it is a concept that was derived by mistake.
It is well known that calculus has a definition, and any theory that uses it should follow the same principles of calculus; thermodynamics, of course, is no exception, for there is no other calculus. This is common sense.
Based on the definition of calculus, we know: for the definite integral ∫_T f(T) dQ, only when Q = F(T) is ∫_T f(T) dQ = ∫_T f(T) dF(T) meaningful.
As long as Q is not a single-valued function of T, namely Q = F(T, X, …), then ∫_T f(T) dQ = ∫_T f(T) dF(T, X, …) is meaningless.
1) On the one hand, we all know that Q is not a single-valued function of T; this alone is enough to make the definite integral ∫_T f(T) dQ = ∫_T (1/T) dQ meaningless.
2) On the other hand, in fact Q = f(P, V, T), so ∫_T (1/T) dQ = ∫_T (1/T) df(T, V, P) = ∫_T dF(T, V, P) is certainly meaningless (in ∫_T, T is a subscript).
We know that dQ/T is used for the definite integral ∫_T (1/T) dQ, and since ∫_T (1/T) dQ is meaningless, ΔQ/T cannot be turned into dQ/T at all.
That is, the so-called "entropy" does not exist at all.
In ΔQ/T, the relationship between Q and T is the ratio of ΔQ to T, or the product of 1/T and ΔQ.
But in dQ/T, the relationship between Q and T is not the ratio of dQ to T or the product of 1/T and dQ; rather, it is the problem of finding an antiderivative of 1/T in dQ/T = (1/T) dQ.
Since Q is not a single-valued function of T, there is no functional relationship Q = f(T); in fact Q = f(P, V, T), so dQ/T = (1/T) df(P, V, T) is meaningless in itself. That is, ΔQ/T cannot turn into dQ/T.
The premise of an integral is the existence of a function!
For a single integral, there must first be a function y = f(x); then ∫ y dx = ∫_x f(x) dx may be meaningful. Similarly,
∫ x dy = ∫_x x df(x) = ∫_x x f'(x) dx = ∫_x G(x) df(x) = ∫_x dF(x) is meaningful.
(For details, review the definition of the integral.)
But, as we know, Q is NOT a single-valued function of T; this alone already makes ∫ (1/T) dQ meaningless. In fact, as I pointed out in my paper, Q = f(T, V, P), so (1/T) dQ = (1/T) df(T, V, P) is meaningless; that is, ∫_T (1/T) dQ = ∫_T (1/T) df(T, V, P) is not a meaningful integral, or rather, is not an integral at all.
Maths obscures physics only when it is misunderstood, ergo: falsely applied.
Physics and engineering students often shallowly "chew" math without really "digesting" it, and thus get the impression that "it may be difficult to follow the mathematical reasoning".
This is not the case with experienced researchers. On the contrary.
And Weinberg is not complaining about mathematics; he was obviously just weary of redoing tedious work.
Can the mathematics sometimes obscure the physics?
Any field of physics must have at least two major aspects. It should have a logical foundation to explain what part of the natural world it addresses, and why the mathematics it uses or proposes should be valid. Classical physics does a good job of both, regardless of its correctness, but modern physics, such as quantum physics, the standard model of particle physics, the Big Bang model, special relativity, general relativity, etc., often greatly lacks such logical explanations.
Physics consists of little else than mathematics and verbiage, the linguistic interpretation of what is involved. Generally speaking, then, physics consists of at least two foundational parts: its mathematics, and the verbal interpretation of mathematical calculations and of related observations.
So one cannot say that mathematics obscures physics, because a major part of physics is the math; but one could say that the interpretations of the math and observations in one or more cases defy logic, and many or most such interpretations result in a poor understanding of the underlying physics involved.
Physics uses maths because its laws are built mathematically (using proportionality, for instance).
apropos "Nowadays math is very often used in a very fanciful and unrealistic way": that is very true. RT leads to singularities. I have used distributions theory to avoid singularities, say: operate even in singularities too. i.e.: classical singularity isn`t a singularity any more- if one uses maths.
Yes, mathematics can obscure the physics.
Example 1: The Lorentz transformation is simply a symmetry group of a wave equation, of every wave equation. Say you have water waves, or sound waves. Then, given a solution, you can use the Lorentz transformation (with c the speed of those waves) to construct the Doppler-shifted solutions.
In relativistic spacetime metaphysics, this symmetry group obtained a quite obscure status.
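The claim in Example 1 is easy to check symbolically: boost a plane-wave solution of the 1D wave equation using the Lorentz transformation built from the wave speed c, and the result still solves the same equation, with a Doppler-shifted wavenumber. A sketch using sympy:

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)
v, c, k = sp.symbols('v c k', positive=True)
gamma = 1 / sp.sqrt(1 - v**2 / c**2)

# Lorentz boost built from the wave speed c (not necessarily light)
xp = gamma * (x - v * t)
tp = gamma * (t - v * x / c**2)

# A right-moving plane wave, then the same wave in boosted coordinates
u0 = sp.sin(k * (x - c * t))
u = u0.subs([(x, xp), (t, tp)], simultaneous=True)

# The boosted wave still satisfies u_tt - c^2 u_xx = 0 ...
residual = sp.diff(u, t, 2) - c**2 * sp.diff(u, x, 2)
assert sp.simplify(residual) == 0

# ... because x' - c*t' = sqrt((1+v/c)/(1-v/c)) * (x - c*t):
# the solution survives with a Doppler-shifted wavenumber.
print(sp.simplify(u))
```

Nothing here refers to light or spacetime; the same check goes through for any wave speed c, which is the point of the example.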
Example 2: Probability theory as the logic of plausible reasoning (Jaynes). Plausible reasoning intuitively seems vague, and thus like something that cannot be governed by strict mathematics. But it can: these strict rules are the rules of probability theory. The precise character of the rules seems to have prevented their interpretation as rules for such an imprecise thing as plausible reasoning.
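A toy illustration of those strict rules governing a "vague" question (the numbers are made up for the example): Bayes' theorem updates the plausibility of a hypothesis given evidence.

```python
# Bayes' rule, P(H|E) = P(E|H) P(H) / P(E), as a precise rule for
# the "imprecise" question: how plausible is the hypothesis now?
prior = 0.01   # P(H): base rate of the condition (illustrative number)
sens = 0.95    # P(E|H): probability of a positive test if H is true
fpr = 0.05     # P(E|not H): false-positive rate

evidence = sens * prior + fpr * (1 - prior)   # P(E), total probability
posterior = sens * prior / evidence           # P(H|E)
print(round(posterior, 3))  # 0.161: a positive test makes H more
                            # plausible, but far from certain
```

The update is quantitative and exact even though the question ("how plausible?") sounds qualitative, which is Jaynes' point.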
Example 3: The foundations of thermodynamics. Essentially the same effect: entropy seemed to be such a precisely defined thing, following objective laws, that it had to be some objective thing. In modern Bayesian interpretations it merely describes our incomplete knowledge.
Dear R. Poznanski
I disagree.
The delta function is good mathematics; it is, by the way, an example where sloppy reasoning in physics (by Dirac) has improved mathematics: the sloppy use in physics was replaced by precise, well-defined definitions and constructions, which gave the formerly imprecise and sloppy "physical" notions a well-defined mathematical meaning.
This well-defined meaning was not the same as initially intended by the physicist who proposed it. What turned out to be well defined, after the mathematicians had evaluated it, were expressions like \int f(x) delta(x-a) dx, giving f(a); but nothing was proposed to make sense of \int delta^2(x-a) dx. (Even though Dirac would have liked to have a definition of \int delta^2(x-a) dx too, because some of the expressions one needs in QFT are of this type, this did not happen.)
Despite this, the range of applicability of the delta function was wide enough that mathematicians accepted even the notation, so that you can find the delta function today in many mathematical papers without any hesitation. Except that mathematicians will not even consider products of delta functions (except, possibly, in considerations about whether one can somehow make sense of such notation).
Your "where they teach physicists only classical mathematics without teaching modern mathematics" sound dubious for me. Quantum theory has, without doubt, had a large influence on the development of mathematics. A lot of functional analysis is the development of something which came from physics, namely quantum theory, similar to the delta function.
But after this there was nothing similar. Ok, with the exception of string theory, which, I quote from memory some Fields medalist, has, "given that it requires a 26-dimensional space, nothing to do with reality, but has nonetheless given excellent insights for mathematics". It is not an accident that a string theory guru has a Fields Medal (and for good reason, as far as I can see) but not a physics Nobel Prize (there would be no point in giving him one).
So, if we do not call "modern mathematics" mathematics that is more than a century old, there is no point in teaching physicists "modern mathematics". The only parts of modern mathematics I would consider important for applications like physics are those related to how to do computer approximations correctly: essentially those where "computations take over".
I strongly support the position expressed by Prof. Georges Sardin in his comment above. The revival of Platonic mathematical idealism and of the concept of a “spacetime continuous field” (Matter is a Myth and so is Motion!) as the basis of objective reality (if it exists at all for some "physicists"!) in modern physics is a regressive step in the face of the discovery of the “Evil Quanta” and the breakdown of causality at the turn of the 20th century.
This is a reactionary step by official physics (with Albert Einstein in the lead) to safeguard the venerable notions of rationalism/theology, namely causality, certainty, continuity, determinism, etc. Immanuel Kant did the same in philosophy by declaring "objective reality" an “unknowable thing-in-itself”. This is a complete negation and undoing of materialism, which was once the greatest merit of physics and natural science since the Copernican revolution.
But in the final analysis, Albert Einstein must of course bear the ultimate responsibility, which in fact he did admit by the end of his life but no one cared! It is because unlike the subjective idealism of Kant (for whom reality was an unknowable thing-in-itself) in philosophy; Einstein (who apparently followed Kant) eliminated the difference between what is ideal/rational (i.e., the thought content of the “logical categories”, “analytic functions” of mathematics, God of theology, etc.) and what is real; between the pure and ideal world of mathematics and material/physical reality that physics deals with; between pure mathematics, whose program is the exact deduction of consequences from logically independent postulates, and the applied mathematics of approximation needed for physics. Physics traditionally used approximate empirical data, which could be fitted on in various ways to analytic functions of pure mathematics, but the results are only valid in a narrow range of the data values for the argument.
An ideal property of the analytic functions, which impresses the worshipers of symmetry, beauty and aesthetics is that, such functions are known for all values of their argument when their values in any small range of the argument values are known. Thus, the proposition that the laws of nature involve analytic functions leads to a complete mechanistic determination of the world based on their experimentally determined (or even calculated from theory) value in a narrow range only. The folly of such an enterprise is brought home with the recognition of the "weird" quantum phenomena, which vindicates the assertion of materialist dialectics: “There can be no matter without motion and no motion without matter”:
Article The Philosophy of Space-Time: Whence Cometh "Matter" and "Motion"?
Article Real/Virtual Exchange of Quantum Particles as a Basis for th...
Einstein is responsible for turning theoretical physics into an impotent, medieval-type, antinomy-laden scholasticism and tautology, which is playing out now, and for initiating the mathematical-cobweb-spun Fairy Tales of cosmology. The Big-Name charlatans of official “theoretical physics” extended it to the present absurdities with their "proofs"!
Dear R. Poznanski
sounds like you have not understood my point, given that you argue that some modern mathematics "will make quantum mechanics look like kindergarten". My point was that these parts of modern mathematics don't have any applications in physics which make it worth to teach these mathematics to physicists.
Einstein faked his mathematical derivation in his theory of relativity. Very unfortunately, the equations of the theory of relativity have greatly dominated both physics and mathematics. So there are fundamental problems in the mathematics of physics and in pure mathematics. Please see:
https://www.researchgate.net/publication/330482341_Is_the_math_in_current_physics_beautiful
Regarding how much can we rely on modern mathematics:
Quarks, with their six flavors, their three colors, their two fractional charges, plus the eight types of gluons: all these entities are daughters of mathematics (of QCD), but they do not exist in reality, even though the CERN scientists interpret all their experimental results in total intellectual subjugation to the Standard Model. It used to be that free quarks could not exist, the official reason being that the gluon cohesive force increases with the distance between quarks. But that is the past, and now, fortunately, it turns out that the CERN scientists claim to detect free quarks as products of proton-proton collisions, through the Higgs boson decay (H → b b).
Yes, the Higgs boson was discovered. Of course, it had to be precisely the Higgs boson, not just any boson. But how is it that they are so sure that it is indeed the magical Higgs boson, the creator of mass, the one that breathes mass into the particles, the privileged “God’s particle”, and not an anonymous homeless boson? Well, no doubt it is the genuine Higgs boson, since apparently it carried a label bearing its name, though not its expiration date! What would CERN be without the Higgs boson! It was a matter of survival; at all costs it was necessary to justify how successful the enormous investments made have been. Still, it turns out that now, fruit of so much success, they will go for more, i.e. a new accelerator of 100 km in circumference. Poor taxpayers, who instead could be provided with housing, hospitals, children's nurseries, geriatric residences, renewable energy, etc., i.e. things that would be useful to them.
Furthermore, it turns out that they have apparently discovered another boson. But the bad news is that it was not foreseen and does not fit any current theory, though of course they are working hard to urgently develop one in which it will fit. If this is not achieved, it will evidently be necessary to make it disappear. What misfortune! But have you heard about this new boson? I am sure most of you have not. What a difference from the Higgs boson hype! How discreet this new boson is; and of course, if it fails to fit, in a very forced way as always, into a new approach beating all records of artificiality, what a short life that unfortunate boson is going to have! It will not become famous; it will remain anonymous without deserving any publicity. It may not even get a name, and it will be relegated to the rank of phantom particle. What bad luck it has: if it had been discovered before the Higgs boson, it would itself have been declared the Higgs boson, with all its glory. What bad fortune some particles have; they fail to reach Mount Olympus and live with the Greek gods!
But let's be rigorous. Those who really know tell us this: “Predictions for the probabilities of various decays of the lightweight Standard Model Higgs”.
60% of such particles would decay to bottom (b) quark / antiquark pairs
21% would decay to W particles
9% would decay to two gluons (g)
5% would decay to tau (τ) lepton / antilepton pairs
2.5% would decay to charm (c) quark / antiquark pairs
2.5% would decay to Z particles
0.2% would decay to two photons (γ)
0.15% would decay to a photon and a Z particle
Amazing! Yes, we also have or will have lightweight Higgs bosons. How lucky we are!
But now, without any joking, how sad it is to see how experimental physics is subjected to such a fanciful mathematics as QCD. What a pity!
There is not much “physics” left in modern theoretical physics; it is all usurped by monopoly capital and the Vatican, to preach mathematics guided theology - “The heavens sing the glory of the Lord and the firmament showeth His design.” Mathematics finds and physics “proves” those designs with contrived and manipulated “experiments” by hundreds of multi-national “scientist serfs” (To borrow an expression from the Bengali poet Tagore), lured by the promise of fame, fortune and funds. The “discovery” of the “God Particle”, “Gravitational Waves” etc. are such “proofs” to convince the poor and sinful mortals of the “handiworks” of the creator.
We discussed the “discovery of the God Particle” (especially in light of the Guardian blog “Life and Physics” by Prof. Jon Butterworth, Head of Physics, UCL, and leader of the British team with the LHC/ATLAS) in the following short RG forum: https://www.researchgate.net/post/How_much_and_how_does_a_Global_Positioning_System_GPS_depend_on_relativity_theories
The following is a quote from a person writing in the Guardian who works with the LHC/ATLAS, describing the modus operandi of the earth-shaking “discoveries”: “A hundred years (ish) ago there was lots of great experimental science going on. There was no clear idea what atoms were, but there was an understanding that they weren't solid balls. Ernest Rutherford and others had established, through experiments, that there was a nucleus in the center of an atom and had postulated that electrons were 'in orbit' around this nucleus. He came up with this description based purely on his experimental results: there was no theory at the time that predicted this.
I feel quite nostalgic for this sort of experiment-driven theory. In the era of the LHC it is not the done thing to come up with a new theory to describe what we see experimentally. In general, our results have to fit some theory that has already been proposed. When they don't (they don't) we tune the theoretical predictions to match our data, like twiddling a load of knobs (most of these theory predictors, which we call Monte Carlo, have twenty or thirty knobs) until we get some agreement. I am moaning about this, but really we don't have a choice.”
How mathematics obscures physics is exemplified by contrasting a long series of equations leading to a formula, such as (1) Stefan’s Law or (2) the 4/3 fractal envelope of Brownian motion, which uses the sophisticated mathematical tool of Loewner evolution processes. In both cases dimensional analysis can be used to derive the results more simply.
There is an argument that dimensional analysis in these two cases, as examples, is not merely an analytical tool for arriving, in a kind of algorithmic way, at the formula that models the physical relationship, but rather that the pertinent formula is a product of dimensional relationships fundamental to the physical processes involved. Both of these examples appear to involve the relationship between a 4-dimensional system and a corresponding 3-dimensional system.
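For the Stefan's-law example, the dimensional argument can even be mechanized. Assuming the blackbody energy density depends only on k_B·T, ħ, and c, one can solve for the exponents that yield units of J/m³, and the T⁴ dependence falls out of the linear system (a sketch, using sympy for the algebra):

```python
import sympy as sp

a, b, g = sp.symbols('alpha beta gamma')

# Seek u ~ (kB*T)^alpha * hbar^beta * c^gamma with dimensions J/m^3.
# Dimensions over (J, s, m): kB*T -> J, hbar -> J*s, c -> m/s.
sol = sp.solve(
    [
        sp.Eq(a + b, 1),   # powers of joules must give J^1
        sp.Eq(b - g, 0),   # powers of seconds must cancel
        sp.Eq(g, -3),      # powers of metres must give m^-3
    ],
    (a, b, g),
)
print(sol)  # {alpha: 4, beta: -3, gamma: -3}
# i.e. u ~ (kB*T)^4 / (hbar*c)^3: the T^4 of Stefan's law, from
# dimensions alone, without the long derivation.
```

The dimensionless prefactor (π²/15 for the energy density) is of course beyond the reach of dimensional analysis; only the functional form comes for free.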
Dr Schmelzer, in the third of his three examples above, mentions entropy.
How Clausius derived the entropy concept seems to me utterly remarkable. He was looking for an invariance that allowed an ideal Carnot heat engine to cycle repetitively.
Bimalendu N. Roy, in his marvelous text Fundamentals of Classical and Statistical Mechanics, at p. 29, says the concept of entropy is, so to speak, abstract and rather philosophical. He is right, but it need not be so.
Incidentally, the Royal Society of Chemistry includes this in part: “the entropy of a system corresponds to the molecular distribution of its molecular energy among the available energy levels”, which is a nice verbal description.
Part of the problem is the T (temperature in kelvins) in ΔS = ΔQ/T. What is this T? It is an invention; it is not intrinsic. But T itself is, for the system, proportional to an amount of energy, just as ΔQ is. In this way the original formulation of entropy seems to obscure the fact that an amount of energy proportional to ΔQ results in degrees of freedom, ΔS, proportional to a kind of scale factor based on T, T being itself proportional to an amount of energy. Might it be physically simpler to regard entropy as the degrees of freedom of a quantity relative to a specified quantity, to avoid the abstractness and philosophical aspects of entropy? The mathematical ratio formula using T makes it harder, not easier, to understand what the entropy concept describes, while being useful in cases where the Boltzmann constant plays a role.
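The "degrees of freedom" reading can be made concrete with Boltzmann's S = k_B ln W, counting microstates of an Einstein solid (a standard textbook model; the numbers here are purely illustrative):

```python
from math import comb, log

kB = 1.380649e-23  # J/K, Boltzmann constant

def entropy(N, q):
    """S = kB * ln(W) for q energy quanta among N oscillators."""
    W = comb(q + N - 1, q)  # multiplicity of the Einstein solid
    return kB * log(W)

# Adding energy opens up more microstates, i.e. more "degrees of
# freedom" over which the energy can be distributed:
S1 = entropy(100, 50)
S2 = entropy(100, 60)
print(S2 > S1)  # True
# The statistical definition 1/T = dS/dU then links this counting
# picture back to the Clausius form dS = dQ/T.
```

On this view T is not an extra invention but the conversion rate between added energy and added state-counting, which is close to the reading proposed above.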
Physics is nothing but applied mathematics.
Enjoy the distinguished mathematical physicist and string theorist Robbert Dijkgraaf explaining how geometry becomes physics, or vice versa, through Einstein's tensorial field equations of General Relativity (which, as an effective perturbative theory, is the correct theory of gravity) in this very short and intriguing YouTube video:
https://www.youtube.com/watch?v=-TGvde_mVXY
Those who say physics is nothing but applied mathematics have no understanding of any real physical process. Will they explain why Bragg diffraction happens? What are Bragg planes made of? Will they explain why a simple cubic metal cannot exist in nature? Why every mass rotates about its axis? Why Venus, Uranus, Neptune, Pluto, Eris, and the Kuiper belt rotate clockwise? Any ignorant mind can blabber as much nonsense as he or she wants, but nature is very, very simple. No complicated math is required to explain any real physical process. Complicated math is required only to explain fake virtual processes, but virtual processes have nothing to do with nature and its workings.
Outside view
I see a somewhat different problem in modern mathematical physics. Physics initially studied nature. But nature is complex and diverse, so the transition from reality to a model (reflecting only certain aspects of the reality) seems logical. After that, all the efforts of physicists were directed at the mathematical solution of models. And now, in my opinion, physics is engaged not with reality but with its own models. These models often have no natural basis; their main property is solvability (in the mathematical sense), and if nature does not correspond to the models, that is nature's problem...
For me, as a chemist, it was always interesting to go through the classical chemistry chain composition-structure-(defectiveness)-property and find out the effect of composition changes on the target property, using intermediate (for this chain) models. It would seem that physicists should be the first allies on this path. Unfortunately, this is not the case. The biggest shock of my scientific life was the case when I synthesized a number of chemical compounds that differ in only one parameter (defectiveness). There were 5 samples that demonstrated a linear dependence of the properties available to me (the crystal cell parameter, oxygen content, density, electrical conductivity, etc.). But when I gave these samples to physicists, they took spectra (I will not say what kind, but they were very specific ones) and described the 5 samples with five different models!!! You can say that these are bad physicists (or a bad chemist). Maybe, but there are no others! I tried to explain to them that we can assume one, even two, concentration phase transitions, leading to a single series of samples falling apart, but not five! I am not so brilliant as to plan the composition of the samples so that 5 different phases are obtained (which must be described by different models...). Moreover, the differences in composition were minute.
Therefore, I do not believe in the physics of mathematical models.
But I do not abandon attempts to get from them anything other than beautiful pictures and complex formulas.
The situation reminds me of a joke:
A passer-by (P) late at night saw a drunk man (D) who was crawling intently on the sidewalk.
P: What happened, can I help you?
D: I lost my watch and cannot find it.
The passer-by searches for the watch near the drunkard for a while, to no avail.
P: Tell me more precisely, where exactly did you lose it?
D: There, in the ditch.
P: But why then are you looking here?
D: Here there is a street lamp; here it is lighter.
Yes, mathematics can certainly obscure physics and propose models which are not physical.
Physics is an experimental science, dealing with the natural world, for which most of the math since Newton has been invented. Physics needs concepts, new concepts, which are also expressed through math (continuity, differentiability, integrability). Many current theories will sooner or later be falsified in favor of better or even very different models.
It means that physics is not math, but can be expressed by it, sometimes to a very good extent.
The key point in doing physics is to understand the physics; then you can use first logic, then math, as a universal, coherent and powerful language to express that understanding, share it, and perform experiments.
You can do physics through math, and in some cases this has been fruitful, but most of the time it is the only way for people who do not have a physical feeling and cannot judge whether what they write can be physical or not.
There is also the opposite reasoning though, i.e. mathematics often provides a much better insight into physics.
I’ll give an example from my own experience. According to the Hilbert space formulation of quantum mechanics, the state of a quantum system is described by an element – a so-called Dirac ket – of a separable complex Hilbert space. But what does it mean for a Hilbert space to be separable? The definition often given in mathematical-physics books, notes, etc., is that a Hilbert space is separable if and only if it admits a countable basis. A mathematically literate reader can easily understand that, here, "a countable basis" actually means that ALL bases are countable, because any two basis sets must have the same cardinality (the same "size"), and thus if one is countable, the other must be countable too. On the contrary, a reader without an adequate mathematical background may be left with the impression that a separable Hilbert space may have uncountable bases too, which is the wrong picture, especially in connection with the essential role of the position and momentum states. In cases like this, mathematics provides a crucial and invaluable insight into the topic.
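A minimal sketch of the cardinality argument mentioned above, in standard functional-analysis notation (this is the usual Parseval-based proof; the notation is my own, not the poster's):

```latex
% Let H be a Hilbert space with orthonormal bases {e_i}_{i \in I} and
% {f_j}_{j \in J}. Expanding each e_i in the basis {f_j}, Parseval gives
\[
  \|e_i\|^2 \;=\; \sum_{j \in J} \bigl|\langle f_j, e_i \rangle\bigr|^2 \;=\; 1 ,
\]
% so J_i := \{\, j \in J : \langle f_j, e_i \rangle \neq 0 \,\} is at most
% countable for every i. No f_j can be orthogonal to all the e_i
% (otherwise f_j = 0), so every j lies in some J_i, i.e.
\[
  J \;=\; \bigcup_{i \in I} J_i ,
\]
% a union of |I| countable sets; hence |J| \le \aleph_0 \cdot |I| = |I|
% for infinite I, and by symmetry |I| \le |J|, so |I| = |J|.
% In particular, if one orthonormal basis is countable, all of them are.
```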
Excessive use of mathematics in physics may have reached the point of imbalance. This is evident even at the basic levels of classical physics: solid deformation theory, the flow of fluids, etc. The development of the theory has led to very high-performance software, with which we can achieve very good research and design performance. As these programs have become more powerful and user-friendly, the physical and mathematical material included has become more and more voluminous ... but less well explained, and users are often unaware of the physical and mathematical assumptions behind the various models used by the software. The temptation to use ever more advanced software leads to users without sufficient knowledge (in either physics or mathematics). In the best case, the result is a funny one. In sad cases, failures occur because experimental validation is missing. Validation is a very expensive operation for complicated models, so often nothing is done ... and in that case the sadness is even greater ...
Many generations of young people live in the illusion of "virtual" simulators ... for which validation is extremely expensive ... that is why there is a tendency to distort reality so that it fits "well enough" into the theoretical model, into the virtual ...
Let us remember that we generally have five basic senses known to us. If we do not try, as much as possible, to relate the results of experiments validating the theory to those five senses, how can we explain our science to those who work so that we can have our everyday food ... first of all, of course, through applications which ease their work ... but is the gap between the level of these applications and the theoretical peaks not too great? Are we at the limits of understanding Ecclesiastes' words?
Our luck is that we are tolerant enough to understand each other (at least apparently), although our practical efforts to rigorously define any word (including "definition") are useless, at least in this age ...
God help us
Hello. We can find not a few occasions to apply geometry to physics, which I often find surprising. This, together with mechanical inventions, is what merits most of our attention; otherwise we lose our intelligence in playing with futile calculations, as well as the calculations of space-curvature theory.
@Stefano Quattrini is right in that mathematical results and findings may be physically unacceptable, but the historical development of physics teaches us that what is considered physically unacceptable today may be absolutely acceptable, and an important scientific discovery, tomorrow. There are plenty of examples of that.
In my view, and as an exercise in concision, I have selected parts of the most significant comments posted here:
I see a somewhat different problem in modern mathematical physics. Physics initially studied nature. But nature is complex and diverse. Therefore, the transition from reality to a model (reflecting only certain aspects of the reality) seems logical. After that, all the efforts of physicists were directed to the mathematical solution of models. And now, in my opinion, physics is not engaged with reality, but with its own models. These models often have no natural basis; their main property is solvability (in the mathematical sense), and if nature does not correspond to the models, these are problems of nature ... Vadim S. Gorshkov
Those who say physics is nothing but applied mathematics have no understanding of any real physical process. Gokaran Shukla
Excessive use of mathematics in physics may have reached the point of imbalance. This is evident even at the basic levels of classical physics: solid deformation theory, the flow of fluids, etc. The development of the theory has led to very high-performance software, with which we can achieve very good research and design performance. As these programs have become more powerful and user-friendly, the physical and mathematical material included has become more and more voluminous ... but less well explained, and users are often unaware of the physical and mathematical assumptions behind the various models used by the software. Cardei Petru
Physics is an experimental science, dealing with the natural world, for which most of the math since Newton has been invented. Physics needs concepts, new concepts, which are also expressed through math (continuity, differentiability, integrability). Many current theories will sooner or later be falsified in favor of better or even very different models…. You can do physics through math, and in some cases this has been fruitful, but most of the time it is the only way for people who do not have a physical feeling and cannot judge whether what they write can be physical or not. Stefano Quattrini
Yes, it can obscure the situation, particularly when the precise mathematical formalism chosen applies marginally, or not at all, to the physical problem under consideration. It takes substantial expertise in both disciplines to be able to correctly link these two aspects of the problem.
To me, mathematics is like a blank paper on which we can bring objects to life through our creative ingenuity and mathematical insight; its existence and beauty are independent and unconditional with respect to its applicability in the natural sciences.
N.B.: There is no such thing as "complicated" maths or "easy" maths, only the portions we understand and those we don't, at least not that well. What mathematics nature really follows will be clear only after we fully understand nature itself, and we are quite far from attaining that understanding as of today; until that day, all theories, concepts, etc. are nothing more than well-guessed speculations.
Mathematics, diagrams, graphs etc. are just tools to help illustrate patterns in nature. Although mathematics is perhaps the best tool, it is not foolproof. It is limited by the system of units in use. Conventional systems of units are incapable of giving a complete description of nature. Resolve this issue and everything will easily and seamlessly fall into place.
Mathematics not only obscures modern physics; it has become an impediment to further progress, a form of alienation. Alienation is a condition in which a creation of man (meant for his own needs), at a certain stage of its development, goes out of his control, as if coming from outside, and, like a Frankenstein monster, sets itself to control and haunt its creator!
This alienation is nowhere as powerful as in modern official cosmology. The mathematics monster now dictates what man should or should not know about the universe. This monster created other cosmic monsters like "Black Holes", "Dark Matter", "Dark Energy", etc. ad nauseam, which dominate the narrative on cosmology. This mathematics monster also made antimatter a taboo subject, in spite of the fact that antimatter is ubiquitous throughout the visible universe, both as isolated particles and as chance-accumulated small, large and giant aggregations that are the sources of X-rays, diffuse or concentrated gamma rays, and giant gamma-ray bursts (GRBs), due to annihilation processes with ordinary matter throughout the universe. Any evidence of antimatter, if admitted at all, is attributed to the mischief of the cosmic monsters: https://www.researchgate.net/post/Why_the_ubiquitous_presence_of_large_scale_antimatter_in_the_universe_does_not_lead_to_the_demise_of_the_Big_Bang_theory
Yes math can obscure physics. But that is in the hands of the physicists.
Math is the counting of things, or relating geometry. Physics adds that the counting is of standard units, with the assumption that the standards are constant. That is a physics assumption. For example, consider a pendulum clock. Move the clock or increase its altitude and the time is as Special Relativity says it is. We are examining the wrong thing.
Further, the meaning of the equals sign ("="). In math it means the counts are the same. Physics places many unstated meanings on "=". Changing the meaning over time induces confusion upon confusion. The original meaning of the field equation was just Riemann's method of solving a 4-dimensional equation by reducing it to 3 dimensions and a geometry - a transformation which needed to be inverse-transformed back to reality. Then the view of the left side changed to be spacetime. Then spacetime became real. So the calculation of the speed of gravity takes the left side as real and ends with a value inconsistent with some experiments. Another error is to note the same form, set an "=" sign between physically different things, and conclude they are related as the math says they are. For example, the form of masses attracting is the same as that of electric charges. The equality here is assumed; it is circular to then find some relation.
Another form of the "=" sign is where physics should use cause and effect. For example, the GR field equation could be read as: the right side (T) causes the left side (G). Or is it the reverse? Which, if either, is subject to experiment?
Just as physics attached standard units to the counts (and forgot what a standard means), physics needs to have different symbols for "=" for the type of calculation being done.
No; mathematics is the basis, or cornerstone, of physics as a subject. If a student is good at mathematics, that student will do well in physics.
The so-called "entropy" does not exist at all.
During the process of deriving the so-called entropy, in fact, ΔQ/T cannot be turned into dQ/T. That is, the so-called "entropy" does not exist at all.
The so-called entropy was a concept that was derived by mistake in history.
It is well known that calculus has a definition;
any theory should follow the same principles of calculus. Thermodynamics, of course, is no exception, for there is no other calculus; this is common sense.
Based on the definition of calculus, we know:
for the definite integral ∫_T f(T)dQ, only when Q = F(T) is ∫_T f(T)dQ = ∫_T f(T)dF(T) meaningful.
As long as Q is not a single-valued function of T, namely Q = F(T, X, …), then
∫_T f(T)dQ = ∫_T f(T)dF(T, X, …) is meaningless.
1) Now, on the one hand, we all know that Q is not a single-valued function of T; this alone is enough to determine that the definite integral ∫_T f(T)dQ = ∫_T (1/T)dQ is meaningless.
2) On the other hand, in fact Q = f(P, V, T); then
∫_T (1/T)dQ = ∫_T (1/T)df(T, V, P) = ∫_T dF(T, V, P) is certainly meaningless.
We know that dQ/T is used for the definite integral ∫_T (1/T)dQ; since ∫_T (1/T)dQ is meaningless, ΔQ/T cannot be turned into dQ/T at all.
That is, the so-called "entropy" does not exist at all.
Why did the wrong "entropy" appear?
In summary, this was due to the following two reasons:
1) Physically, people did not know that Q = f(P, V, T).
2) Mathematically, people did not know that AΔB cannot become AdB directly.
If people had known either of these, the mistake of entropy would not have happened in history.
Please read my paper, and the answers to the questions related to it, in my Projects.
https://www.researchgate.net/publication/230554936_Entropy_A_concept_that_is_not_a_physical_quantity
When A and B are any two quantities, you CANNOT turn a product of any two quantities AΔB into AdB directly and then say that B = f(A)!
E.g., when L is the cable length and I is the current intensity in the cable, could you turn LΔI into LdI?
NO, you certainly can't! For there is no functional relationship I = f(L) or L = f(I), so LdI is meaningless.
Similarly, could you turn (1/T)ΔQ into (1/T)dQ directly and then say Q = f(T)?
NO, you certainly can't, for there is no functional relationship Q = f(T) at all; in fact we know Q ≠ f(T).
And once we know Q ≠ f(T), then we immediately know that (1/T)ΔQ cannot be turned into (1/T)dQ.
In fact, Q = f(P, V, T); generally speaking, two of (P, V, T) are independent.
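For readers who want to see the premise "Q is not a single-valued function of the state" in numbers, here is a small sketch (assuming an ideal monatomic gas; the mole number, temperature and volumes are illustrative values I chose, not from the post): the reversible heat absorbed between the same two equilibrium states differs between two paths.

```python
# Heat Q absorbed by 1 mol of an ideal monatomic gas between the same two
# equilibrium states A = (T1, V1) and B = (T1, 2*V1), along two different
# reversible paths. It illustrates that Q depends on the path taken, i.e.
# Q is not a single-valued function of the state variables alone.
import math

R = 8.314          # gas constant, J/(mol K)
Cv = 1.5 * R       # molar heat capacity at constant volume (monatomic)
Cp = 2.5 * R       # molar heat capacity at constant pressure
T1 = 300.0         # temperature of both end states, K

# Path 1: reversible isothermal expansion at T1 from V1 to 2*V1.
# For an isothermal ideal-gas process, Q = n R T ln(V2/V1).
Q_path1 = R * T1 * math.log(2.0)

# Path 2: isobaric expansion from V1 to 2*V1 (temperature rises to 2*T1),
# followed by isochoric cooling back down to T1.
Q_path2 = Cp * (2 * T1 - T1) + Cv * (T1 - 2 * T1)   # = (Cp - Cv)*T1 = R*T1

print(f"Q along path 1: {Q_path1:.1f} J")   # ~1728.8 J
print(f"Q along path 2: {Q_path2:.1f} J")   # ~2494.2 J
```

The two heats differ by roughly 45%, so the total heat exchanged is indeed not fixed by the end states alone. Whether the companion quantity ∫dQ/T is or is not path-independent is exactly the point in dispute in the post above; this snippet only checks the path dependence of Q itself.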
If one starts with a false axiom, then to come to a correct conclusion one must make at least one error.
Dear Paul Pistea
That's wrong.
I can start with "A and B", where A is true and B is false; thus the axiom is false too. Then I make a completely correct derivation "A and B" => A, and obtain a correct conclusion.
Dear Ilja, a false axiom is only one, not A and B. I said A. Merely one!!
You would have to add that in defining your one axiom it is not allowed to use the logical "and" operator.
Whatever; I start with NOT(NOT A OR NOT B). Now it is one axiom, no?
(Hint: counting axioms is not a well-defined operation.)
1st, axioms need not be defined.
2nd, it is false that "false" is not "true"; it can be unknown (fuzzy).
3rd, you are right; I should have said: start with a false hypothesis and ARGUE, not combine axioms.
Dear Ilja, my proposition is not about combining axioms, or presuming presumptions; it is about making one presumption, arguing, and coming to the conclusion.
The directions in which Einstein searched are inapt, because of space contraction in those directions :))
Absolutely, but mathematics supports physics, so they can complement each other.
https://www.quantamagazine.org/secret-link-uncovered-between-pure-math-and-physics-20171201/
Many researchers and philosophers (including the famous science fiction writer and philosopher Stanislav Lem) believe that a mathematician can create a construction that does not have a material referent. In that case, mathematics could really “obscure” physics. However, another group of specialists, led by the great Plato (and I, the sinner, along with them), believes that the human brain, being part of nature, cannot invent anything that would not have a material referent anywhere and under any circumstances. Accordingly, any mathematical construction is a certain archetype, and it is only necessary to determine the physical conditions under which this archetype is realized in the material world. It is clear that SUCH mathematics cannot obscure physics in any way.
It is not just the mathematics which can obscure physics but also the system of units of measurement.
The SI system of units was designed around a dimensionless fine structure constant so that a number of theoretical radii could be created (i.e. the classical electron radius, the reduced Compton wavelength and the Bohr radius) to promote an orbiting-electron model of the atom.
I don't know if anyone has noticed, but the orbiting electron model has been replaced by the electron cloud model. Even though physicists have made this realization, physicists to this day continue to use a system of units and equations which were specifically designed for an orbiting electron model. Electrons don't jump from one orbit to another, so why is this system still used?
This represents the true flaw in physics. A system of units designed to show radii incremented by 1/137 obscures the true physical property which should be related to the fine structure constant.
Perhaps in the future, physicists will come to realize that it is the volume of the electron cloud which is related to the fine structure constant, and not the fictitious radii. Specifically, the fine structure constant is the quotient of a volume divided by velocity squared, giving it the dimensional units [m^1 s^2]. Presently, a theoretical volume cannot be calculated, because a dimensionless fine structure constant does not allow for volume calculations. This is why physicists must claim that an electron is a point particle with no spatial extent.
So, in summary, the mathematics in physics gets lost in a system of units which does not completely correspond to reality. In other words, some of the numbers used in physics are just numbers and don't correspond to anything physical which we can relate to. Physics is currently engaged in manipulating its way around a flawed system, rather than accepting that which is correct and rejecting those parts of the model which are obviously flawed. This means that all aspects of physics which pertain to an orbiting electron must be rejected or modified. Until this is done, best of luck unifying physics.
Dear Randy Sorokowski
The SI was not designed "around a dimensionless fine structure constant". It was designed around the actual best measurement devices for each particular unit, and it is modified when a better measurement device appears. All this has nothing whatsoever to do with the promotion of particular models.
I would recommend that you learn how the SI works before criticizing it.