Electromagnetic waves are an example of a phenomenon which is invisible (except for light or high-power fields) and which was discovered thanks to visible experiences with electricity and magnetism and the development of a theory joining them together.
I wonder if some scientists are working on new potential phenomena which are not related yet to any experiences but find their origin from hypotheses?
Thank you.
I would put
'Supersymmetry in particle physics' in top position.
'Proton decay' (diamonds are not forever) in second.
'Magnetic monopoles' may be the oldest unverified hypothesis.
'Superstrings' (but some traces of string behavior are visible as Regge trajectories, in the organization of elementary particle spectra).
Maxwell's equations indeed led to the acceptance of electromagnetic radiation, but it took several decades. In modern times, some astrophysicists are pretty confident they have discovered gravitational radiation, predicted by Einstein's general relativity dating back to 1916. So that's almost ten decades. But scientists nowadays are not encouraged to experiment and investigate without some solid theory behind them, so I don't think many research grants are going to be awarded for investigations based only on hypotheses. Unfortunately!
I am not sure whether you really mean “Are they research subjects on …?”. If that is what you mean, you are in a way doubting whether working on a theoretically predicted but not yet discovered phenomenon can be a research subject. In that case the answer is definitely yes: for instance, Einstein’s gravitational waves, recently confirmed experimentally, or the discovery of graphene by Geim and Novoselov.
Looking at your second question, “I wonder if some scientists are working on new potential phenomena which are not related yet to any experiences but find their origin from hypotheses?”, one can assume that there is a typo in your initial question, and that instead of “Are they …” it should be “Are there …”. If that is the case, it is still not clear whether it is a rhetorical question, or whether you are indeed asking for examples of such research subjects that are not yet resolved. The latter is a very interesting and important question in the professional career of any researcher.
I think there are several possibilities in choosing a research subject “on a theoretically predicted but not yet discovered phenomenon”. One option is relying on personal scientific background, actual past achievements, and motivation. Pursuing such a subject is a serious challenge and a high-risk endeavor; it is more typical for mature scientists skilled in weighing all the factors involved. Another option is that less tested scientists may choose, or happen, to be part of a team working on such a subject, and then a lack of personal strength in some aspects can be compensated by other team members. One thing is certain: there are, and will be, examples of research subjects on theoretically predicted but not yet discovered phenomena.
As far as your second question
“I wonder if some scientists are working on new potential phenomena which are not related yet to any experiences but find their origin from hypotheses?”
is concerned, look at the definition of a “phenomenon” below:
________
1. A fact or situation that is observed to exist or happen, especially one whose cause or explanation is in question.
Synonyms: occurrence, event, happening, fact, situation, circumstance, experience, case, incident, episode
2. Philosophy: the object of a person's perception; what the senses or the mind notice.
________
A number of scientists, whose domains are physics or other natural sciences, consider a phenomenon to be “A fact or situation that is observed to exist or happen”, so these categories of scientists do not typically work on “new potential phenomena which are not related yet to any experiences but find their origin from hypotheses”.
However, a number of philosophers and mathematicians, who are inclined to the other meaning of a phenomenon as “the object of a person's perception; what the senses or the mind notice”, do sometimes rely on and expect in their work “new potential phenomena which are not related yet to any experiences but find their origin from hypotheses”.
Hi Halim
String theory seems to be one of them:
https://en.wikipedia.org/wiki/String_theory
BR
Dear Dr. Halim Boutayeb,
let me express my viewpoint.
Faraday’s induction and the Faraday disc are the experimental basis of Maxwell’s equations. Demonstrating the feasibility of telecommunications was a remarkable achievement on the part of Heinrich Hertz. Since then electromagnetism has been successfully applied to “telecommunicated” (induction) signals, which are phenomena indeed (phenomenon = that which appears, is manifest, or manifests itself).
The wave equation describing the propagation of electromagnetic waves was mathematically derived from the system of Maxwell's equations. Although at that time interference was associated with light, wave propagation was not immediately considered reliable.
In my opinion, electromagnetic waves were and still are as invisible as light itself. As for light, it cannot be perceived directly. Before the invention of the laser it was argued that this happens because light vibrations are emitted at random. What we perceive instead are light sources, illuminated objects, and received electromagnetic signals.
Since electromagnetism was by no means conceived as a predictive theory (https://en.wikipedia.org/wiki/Predictive_power), but modern theories are, let me add something about predictive theories. A prediction is a statement about an uncertain event, i.e. an event whose effective appearance is ruled by chance. Measured data are the quantities assigned to the observed events. Since predictions are handled mathematically by probability theory, and Σpᵢ = 1, probabilistic theories require enumerating all possible events, or all possible data sequences. Thus, among all possible events, there can be predicted ones which have not been discovered yet. Moreover, each extension of the theory allows new, possibly still undiscovered, possibilities.
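The bookkeeping described above (enumerating all possible events, with probabilities summing to one, some of those events never yet observed) can be sketched in a few lines of Python. The event names and numbers below are purely illustrative assumptions of mine, not taken from the post:

```python
# Toy sketch (event names and probabilities are illustrative assumptions):
# a probabilistic theory enumerates all possible events and assigns
# probabilities with sum(p_i) = 1, so it can give nonzero probability
# to events that have never been observed so far.
theory = {
    "observed_event_A": 0.60,
    "observed_event_B": 0.35,
    "predicted_but_undiscovered": 0.05,  # allowed, though never seen yet
}
assert abs(sum(theory.values()) - 1.0) < 1e-12  # normalization

# Extending the theory enlarges the event space and redistributes
# probability, opening room for new, possibly undiscovered, events.
extended = {name: p * 0.9 for name, p in theory.items()}
extended["new_possibility"] = 0.10
assert abs(sum(extended.values()) - 1.0) < 1e-12
```

The point is only the normalization constraint: any extension of the event space must redistribute probability, which is how a theory can make room for events not yet seen.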
There is an additional point to consider. On the one side dark matter (https://en.wikipedia.org/wiki/Dark_matter) is a predicted event, beyond what was observed until now. On the other side, I myself cannot exclude that (electromagnetic induction) signals in new frequency ranges will be picked up from deep space. Thus, in a certain sense, also electromagnetism faces not yet discovered phenomena.
Dr. Halim, the mechanism of natural events is presented in a simple and convincing manner, and as Dr. Sara cleverly stated, "electromagnetic waves were and are still as invisible as is light itself" and "Thus, in a certain sense, also electromagnetism faces not yet discovered phenomena."
Many have asked: how are the electric field and the magnetic field combined to become electromagnetic radiation (EM-R)? What condition initiates it? And how is it produced?
If radiation is produced through acceleration, then any accelerator should continually radiate EM-R; but this does not happen, hence something is missing.
We suggested a Flip-Flop (F-F) mechanism in “The Electromagnetic Radiation Mechanism,” at: (http://fundamentaljournals.org/ijfps/downloads/68_IJFPS_Sept_2014_72_79.pdf), while the energy of such radiation is given in “Electromagnetic Radiation Energy and Planck’ Constant,” at: (http://www.ijirae.com/volumes/vol1/issue10/67.NVEE10087.pdf).
Another controversial issue is the quantum (photon). We found that Planck's energy formula E = hν clearly states that the energy is embedded in the EM-R, so we suggested that a force is also embedded in EM-R; this is given by Eq. (24) in "The Photoelectric Effects-Radiation Based With Atomic Model," at: (http://www.fundamentaljournals.org/ijfps/downloads/82_IJFPS_March_2015_18_31.pdf), which contains 5 tables showing data using the formula. The Compton effect, which endorsed Einstein's quanta (photons), is resolved using Eq. (24) in “The Compton Effect Re-Visited,” at: (http://crescopublications.org/pdf/JAAP/JAAP-1-004.pdf), which also explains how EM-R is pulled from an electron in an atom. It also helps in solving “The Double Slit Experiment Re-Explained,” at: (http://www.iosrjournals.org/iosr-jap/papers/Vol8-issue4/Version-3/M0804038698.pdf), which contains the structure of Planck's constant.
All these were facilitated by a magnetic force formula equivalent to the Lorentz formula, given by Eq. (8) in “The Magnetic Interaction”, at: (http://www.journaloftheoretics.com/Links/Papers/MY.pdf). Thanks.
I would say the effects of quantum vacuum fluctuations are observed in, for instance, atomic spectra, like the Lamb shift. Not so different from many other effects being 'observed'.
The Unruh effect, that the quantum vacuum looks hot to accelerating observers, has been used to explain why polarized electron beams can never become 100% polarized, in some beautiful papers by John Bell and Jon Magne Leinaas^*. Some details differ, due to differences between longitudinal and transverse accelerations.
The Casimir effect is probably a good demonstration that external geometric influences can change the vacuum energy, just as finite temperature can change it (by duality, the two are mathematically almost the same). It has been noted that the same cause makes two nearby ships in a rough sea attract each other; that is certainly observed.
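To give a rough sense of scale for the effect mentioned above, one can evaluate the standard parallel-plate Casimir pressure formula, P = π²ħc/(240 d⁴). This textbook formula and the numbers are my addition, not from the post:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light in vacuum, m/s

def casimir_pressure(d):
    """Attractive pressure between ideal parallel plates at separation d (meters)."""
    return math.pi**2 * hbar * c / (240 * d**4)

# At a 1 micrometer separation the pressure is of order millipascals:
p = casimir_pressure(1e-6)
print(f"{p:.2e} Pa")  # ~ 1.30e-03 Pa
```

The steep 1/d⁴ dependence is why the effect is only measurable at sub-micron separations.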
Horizons and the Unruh effect have close analogies in other systems; I vaguely remember that observations have been attempted in such systems.
Some predictions (the magnitude and form of fluctuations in the CMB) of the "big bang + inflation" model seem to come out right.
Dark matter seems to be observed by gravitational lensing ('Bullet cluster').
^* Leinaas and I once had the honor of being thrown out of John Bell's office at CERN, due to a heated discussion about the nature of the quantum wave function :-)
Electromagnetic waves describe light, which is very visible. There isn't any such notion of ``visible'' experiences, in distinction to ``invisible'' ones.
Any and all descriptions of natural phenomena have their origin in hypotheses, involve working out the consequences and are related to experiments. So that's everyday science.
This question is a bit a puzzle to me ! I understand the general meaning but not the details.
Maxwell’s equations are not theoretical predictions. They summarize experiments and observations in electricity and magnetism using the concept of electric and magnetic fields. Nor did Maxwell develop his equations to theoretically predict that light is an electromagnetic phenomenon. The fact that light is an electromagnetic phenomenon was deduced from combining an equation for the electric field with an equation for the magnetic field, both transformed into wave equations, which brought out a constant 1/(μ0·ε0)^(1/2) whose value is the speed of light c. There was no previous hypothesis that light was an electromagnetic phenomenon. And the nature of light had already been studied for centuries: it was not a new potential phenomenon.
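The deduction described above can be checked numerically: plugging the measured values of μ0 and ε0 into 1/(μ0·ε0)^(1/2) reproduces the speed of light. A quick Python check, using CODATA-style values for the constants:

```python
import math

mu0 = 4e-7 * math.pi     # vacuum permeability, H/m (classical defined value)
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# Maxwell's result: the wave speed 1/sqrt(mu0 * eps0) equals c
c = 1.0 / math.sqrt(mu0 * eps0)
print(f"c = {c:.0f} m/s")  # ~ 299792458 m/s, the speed of light
```

This numerical coincidence, between purely electromagnetic constants and the independently measured speed of light, is exactly what forced the conclusion that light is an electromagnetic wave.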
Yes, scientists do work on new potential phenomena which find their origin from hypotheses but are not « discovered » yet, as shown in several examples given in former posts. But these hypotheses on potential phenomena are never made gratuitously. They are forced by observations (in a broad sense). For instance, the not yet discovered dark energy corresponds to a hypothesis to explain the observed acceleration of the expansion of the universe (thus the pressure must be negative, thus there must be a kind of energy to allow for the acceleration, but we don’t know what it is, thus we call it « dark energy », etc.). Similarly, string theory is triggered by the need for quantum physics and general relativity to be merged into one theory to describe « observed » phenomena like the big bang (which is indirectly observed through the expansion of the universe and the CMB) and black holes (which are indirectly observed through gravitational motions, e.g. around the centers of galaxies such as the Milky Way).
No, scientists never work on new potential phenomena which find their origin from non-grounded hypotheses. This might be done in philosophy if well structured. But more commonly this is fantasy. Not science.
And one can usually not say now whether a potential phenomenon which finds its origin in a grounded hypothesis, and on which scientists are thus working hard, will be discovered at some point in the future. On the contrary, as demonstrated by the deduction from Maxwell’s equations that light is an electromagnetic phenomenon, future discoveries could result from the unexpected crossing of well-known, uncorrelated research paths.
In the case of the discovery that light is an electromagnetic phenomenon, two uncorrelated research paths unexpectedly crossed thanks to Maxwell’s synthesis: (1) experiments and observations on electricity and magnetism, and (2) studies on waves and on light as a wave (and the measurement of its speed in vacuum).
Nobody could have imagined that one would make this discovery before Maxwell’s synthesis, not even Maxwell himself!
There isn't any such notion as ``grounded'' or ``non-grounded'' hypotheses-there are hypotheses. These have consequences, that can be worked out-and the consequences matter, since they only can lead to experiments, that show whether the hypothesis does describe certain phenomena-or not. It's after the fact, not before. If the hypotheses are not consistent with what's already known, they will, inevitably, predict consequences that have been, already, excluded and, thus, will be eliminated.
So not only classical electrodynamics-i.e. the consequences of Maxwell's equations-but the Standard Model, i.e. the consequences of the quantum effects of the electroweak and strong interactions-along with the realization of the Brout-Englert-Higgs mechanism-is, now, the hypothesis, from which new consequences are worked out.
And these are explored not only at the LHC, but in experiments in atomic physics, since measuring the electric dipole moment of the electron or the neutron, for instance, can probe the effects of unknown particles.
Research in materials science, also, probes new effects.
Stam,
Of course the words "grounded" or "non-grounded" do not correspond to anything "official". I used them to insist on the fact that the hypotheses are not gratuitous: they are made on the basis of what is already known. As you put it: "If the hypotheses are not consistent with what's already known, they will, inevitably, predict consequences that have been, already, excluded and, thus, will be eliminated." It's just to preclude any fantasy. I was a bit afraid of what could be understood from the last sentence of the question, on "potential phenomena which ... find their origin from hypotheses", and wanted to forestall any misinterpretation.
Hypotheses *are* fantasy-by definition. It's by comparing their consequences to what's known already, theoretically and experimentally, and by working out those consequences, that it's possible to establish which fantasies are, in fact, relevant for describing which natural phenomena.
Ok, I understand! Yes, the playful aspect is also part of research :=)
Some physical phenomena are theoretically predicted, but not observed. Here the theory is motivated by its beautiful properties. Supersymmetry is such an example, but to fit observations its possible manifestations have had to be mutilated to such an extent that it is now hard to recognize any leftover beauty.
I would classify general relativity as a successful example where the theory came before observations.
In many other cases a theoretical "hypothesis" is developed to explain some (perhaps) observed phenomena. Cold dark matter is in my mind a very good example. It is a hypothetical common explanation of several unrelated observations. In my opinion, it would be an enormous triumph for theory if stable particles with the right properties of cold dark matter were discovered at CERN (or elsewhere).
The "discovery" of monojets in the mid-1980s was an interesting experience. There came a multitude of hypothetical explanations, each rapidly killed because it was in conflict with other observations. In the end no theoretical explanations were left. As amply verified by later experiments, in reality there were no effects to explain. (I am not sure if any observations were ever formally published; it was mostly rumors about low-statistics signals.) The more recent excitement about faster-than-light neutrinos was of a similar nature. The conclusion by Glashow et al., that it could not possibly have occurred, was correct.
Superstrings are different; they are so far from observable predictions that anything can be speculated -- within certain standards of mathematics and logic.
Supersymmetry isn't a theory-the Minimal Supersymmetric Standard Model is an example of a theory.
And it's misleading to speak of beauty-it's in the eye of the beholder and isn't impersonal. Physics is impersonal.
The motivation for the relevance of supersymmetry, once more, is quite practical-though, curiously, it's not stated this way:
The mere existence of the Higgs boson implies that its vacuum expectation value receives quantum corrections. If these are too large, perturbation theory breaks down. This was already clear, before its discovery. For perturbation theory not to break down, new particles had to contribute quantum effects to cancel the effects of the known particles. The new particles had to have spins related in such a way with the spins of the known particles, that the effects of known fermions would be cancelled by unknown bosons and vice versa. This implies transformations that can be shown to be realized by structures known as superalgebras.
After the discovery of the Higgs boson this became even more cogent, because it turns out that perturbation theory does describe it. This is an experimental fact.
However the cancellation isn't exact, the superpartners don't have the same mass as the known particles. How to look for them isn't known, because this, imperfect, cancellation is all that constrains them. On the other hand, the data analysis of the events at the LHC is extremely challenging, so measuring Standard Model processes to discovery precision takes much more time than many people thought-and some may not be possible to measure to discovery precision at the LHC.
Therefore it's simply wrong to speak of supersymmetry as if it were a theory-it isn't. And it isn't known how to realize it in theories that can describe elementary particles, in the most general way.
Yes, general relativity is a successful example where the theory came before observations. However, note that it was triggered as a generalization of special relativity. By the way, in the context of the present question, the existence of the cosmological constant is very interesting as a fruitful hypothesis. It was introduced by Einstein to achieve a static universe, according to the knowledge of his time. When the expansion was observed, he later declared it was his "biggest blunder". But Einstein was so creative that even his biggest blunder could be reclothed to become a key hypothesis of the present standard Λ-CDM model in cosmology, 100 years later!
By 'Supersymmetry' in my previous post, I meant 'Supersymmetry in particle physics' (as qualified in an earlier post). Sorry for being imprecise. But the study of groups is also very often referred to as 'group theory' (while I have never seen the analysis of euclidean plane geometry, or complex analysis, or the like, labelled as theories). The mathematical aspects of supersymmetry are well established, and profitably applied to some physical problems.
I stand by my opinion that beauty is an important aspect of a theory, and that this is much more than a subjective concept. Economy may, to some extent, be a comparable quality. From this standpoint, it becomes very difficult to accept 'Supersymmetry in particle physics' as an attractive theory: because it requires the introduction of either too many fundamental spin-0 fields or too many fundamental spin-1 fields, and because there are too many fundamental spin-1/2 fields in the standard model. A truly beautiful extension of the standard model must provide a believable explanation of the three families of fundamental fermions.
Guibert> it was triggered as a generalization of Special relativity.
And surprisingly many aspects of general relativity follow from the analysis of special relativity in coordinate systems other than inertial ones.
A theory always comes ``before'' observations-if it doesn't, it doesn't predict anything, it explains in other terms-and it's important to realize that, since there are many equivalent ways of explaining the same effects, the apparent differences are irrelevant; so a theory that doesn't provide for something that hasn't yet been measured is empty.
The cosmological constant isn't a hypothesis-while, when Einstein introduced it, it wasn't known, it is, in fact, an inevitable term of the Einstein-Hilbert action and of the Einstein equations. What's also known is that, within classical gravity, it's not possible to determine its value, because classical gravity is only sensitive to the ratio between the normalized Newton's constant and the cosmological constant.
``Supersymmetry in particle physics'' is a term empty of content, just like for any other symmetry, incidentally. The content is provided by the representation of the symmetry group and by the dynamics-unless the system is integrable, which means it's free. And it's not known which representation is appropriate and, since there are infinitely many, it's not possible to scan them. Absent any assumptions, there's no meaning that can be attached to ``how many fundamental particles are too many''-that's the size of the representation-or to the term ``fundamental'', for that matter. What does matter, is whether calculations, under controlled approximations, can be made.
The only thing that is known is that the discovery of the Higgs implies the existence of new particles. What their properties are remains to be discovered.
Vacuum energy has been measured-it's the cosmological constant.
Quantum vacuum fluctuations have been measured-they're known as the Casimir effect.
Black holes have been observed and what event horizons are, is understood, regarding astrophysics.
The Higgs particle has been discovered and its interactions have been measured and are being measured.
Dark energy can be described quantitatively by the cosmological constant. At the scale where it's relevant, classical gravity is sufficient.
It is known that to describe the mechanism of inflation and the Big Bang requires a theory of quantum gravity, which is, still, not known. However to compute the consequences of inflation requires just classical gravity and quantum field theory and this is being done.
It might be useful to learn the technical issues.
Stam,
>> “The cosmological constant isn't a hypothesis …. it is, in fact, an inevitable term of the Einstein-Hilbert action and of the Einstein equations….”
So one can say that Einstein discovered it while trying to solve his problem (relevant at that time, but now irrelevant): achieving a static universe in the equations. In fact Λ was there and should have popped up from applying GR to cosmology (Einstein's first attempt, in 1917 I think). But it didn't; thus the « biggest blunder » lies rather there. In fact Friedmann and Lemaître had quickly noticed the potential instability in the equations of GR. This showed their incompleteness, and Einstein correctly added the missing term (the contrary of a biggest blunder: a great insight!).
Does this mean that teaching nowadays that one can suppose Λ = 0 (like k = 0 for flatness) in Friedmann's equation (Robertson-Walker metric) is a mistake? Or is it mandatory to say that Λ ≠ 0 because observations (vacuum energy? acceleration of the expansion? WMAP and Planck results...) impose it? Similarly for k, which should be k = 0 all the time because observations (WMAP and Planck results, ...) impose it?
From the philosophy of effective field theories it is mandatory to add a cosmological term to the Einstein-Hilbert action. We have no way to estimate its magnitude. At the same time, quantum fluctuations tell us that there should be a contribution of similar form to the matter action, also of unknown but very large estimated magnitude.
What we observe is that these two possible contributions neatly cancel each other. Almost perfectly. Why is that? Why only almost? Did that cancellation also occur before the early-universe phase transitions, which should have changed the matter contribution significantly?
Yes-there isn't any preferred value for the cosmological constant-the Einstein equations admit solutions for any value. It can be measured-and has been found to be non-zero.
It's not an issue of philosophy-it's a mathematical requirement to include it.
The statement about quantum fluctuations of matter contributing to the cosmological constant is, actually, meaningless, because the cosmological constant is part of the classical action, the zero point fluctuations are quantum effects, computed in Minkowski spacetime and it isn't known how to calculate the reaction of the spacetime geometry to them-so it doesn't make sense to take them together.
Not to mention the fact that the matter contribution by no means includes all matter: it doesn't include dark matter and it doesn't include the contribution of particles that haven't, yet, been discovered.
Even had, by coincidence, the contribution of the zero point energy of known matter given a result that were compatible with the value of the cosmological constant, that would have been meaningless, also, to be considered as determining the value of the cosmological constant, for the same reasons. Any value of the cosmological constant would, still, have been allowed and the backreaction would still have needed to be computed.
In fact, despite its title, and some of its non-technical statements, the most relevant calculation was made by Weinberg, in 1987, http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.59.2607
While the title may have, (in)advertently ensured that the popular press would pay attention, the technical arguments are relevant, since they relate the cosmological constant to structure formation in the Universe-independently of whether humans are around at all, in fact. The arguments don't rely on the time humans *did* evolve, and even less on the time scale of human civilization. They only rely on the known properties of structure formation.
>> “… the cosmological constant … can be measured-and has been found to be non-zero.”
>> “Vacuum energy has been measured-it's the cosmological constant.”
Thus:
Vacuum energy (QFT) = Cosmological constant (GR)
Can one say this (« = ») with 100% certainty, although a theory of quantum gravity is still not known?
I thought that identifying the two (vacuum energy and the cosmological constant) is the simplest assumption for the present: only an identification, not a strict « = »!?
Or have two separate measurements indeed been performed: one of vacuum energy and another of the cosmological constant? And have the results proven to be equal? If that is the case, could anybody provide references?
But “… there isn't any preferred value for the cosmological constant-the Einstein equations admit solutions for any value”.
So what's the present truth: can the measured value of vacuum energy definitely be taken as the value of the cosmological constant to be put into the Einstein equations?
Black holes as astrophysical objects have been observed and have properties consistent with general relativity. Once more, what any person offers as opinion is irrelevant-what is offered as calculation is. It's known that it's not possible, in practice, to observe Hawking radiation from astrophysical black holes. Hawking radiation is, also, a quantum effect and since it's not known how to compute the backreaction, either, it doesn't make sense to discuss it.
What is a solid result of Hawking's work, however, is that black holes have a finite entropy, when quantum effects can be described in a controlled way. This is, for the moment, not applicable to astrophysical black holes, however.
Once more:
In the absence of gravity-on a flat background-the vacuum energy of any matter system is undefined. This is due to global time translation invariance.
In the presence of gravity-on a curved background-the vacuum energy of any matter system is equal to the cosmological constant of the background.
So the question is what fixes the value of the cosmological constant to something other than zero in our Universe. What Weinberg argues is that it can be related to observed properties of structures in the Universe.
These are, of course, statements in classical gravity. For cosmology, it suffices.
Guibert> Vacuum energy (QFT) = Cosmological constant (GR)
Can one say this (« = ») with 100 % certainty
As I indicated in a post above, I consider them to be different. The cosmological constant is truly a constant. Vacuum energy (more precisely, the local energy-momentum tensor of the vacuum) may vary. For instance, quintessence has been introduced as one possible form of dark energy.
The LHS of the equality is, in fact, meaningless in flat spacetime-hence the confusion. In curved spacetime it's an identity; where it refers to the energy density, of course. More precisely it's a definition of the (ambiguous) LHS.
Dark energy is just another term, it doesn't mean anything. Quintessence implies that unknown scalar fields are involved. They don't seem necessary. And even if they were present, it wouldn't be possible to distinguish their contribution to the vacuum energy density from that of a cosmological constant. They might have been useful for describing other effects.
>> “… So the question is what fixes the value of the cosmological constant to something other than zero. What Weinberg argues is that it can be related to observed properties of structures in the Universe.”
Ok, I understand that this refers a.o. to the path followed in the standard Λ-CDM model with :
- Cosmological constant = dark energy (= "another term")
- Dark energy corresponds to 70 % of the critical density, as given by a best fit on SN Ia measurements (apparent magnitude vs. redshift).
The value of the cosmological constant (as well as −p) can then be calculated from 3H²/8πG, using a value of H as measured by the WMAP or Planck missions.
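The arithmetic sketched above can be made concrete. With an illustrative Planck-like value H ≈ 67.4 km/s/Mpc and Ω_Λ ≈ 0.7 (both numbers are my assumptions for the sketch, not from the posts), a few lines of Python give the critical density and the corresponding Λ:

```python
import math

G = 6.674e-11      # Newton's constant, m^3 kg^-1 s^-2
c = 2.99792458e8   # speed of light, m/s
Mpc = 3.0857e22    # one megaparsec, m

H0 = 67.4e3 / Mpc  # Hubble constant in s^-1 (67.4 km/s/Mpc, illustrative)

# Critical density: rho_c = 3 H^2 / (8 pi G)
rho_crit = 3 * H0**2 / (8 * math.pi * G)

# Cosmological constant for Omega_Lambda ~ 0.7: Lambda = 3 Omega_L H^2 / c^2
Omega_L = 0.7
Lam = 3 * Omega_L * H0**2 / c**2

print(f"rho_crit ~ {rho_crit:.2e} kg/m^3")  # ~ 8.5e-27 kg/m^3
print(f"Lambda   ~ {Lam:.2e} m^-2")         # ~ 1.1e-52 m^-2
```

The tiny value of Λ in these units is the quantitative face of the "dark energy is 70% of the critical density" statement.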
Indeed. Regarding quintessence, this paper by Weinberg is of interest: http://xxx.lanl.gov/pdf/astro-ph/0005265v1 . Once more, it's of sociological interest to note the usage of ``anthropic''-but the technical arguments don't depend on it. It's amusing to keep in mind that human presence in the Universe is much shorter than the error bar on the age of the Universe...
Halim, for centuries scientists have been familiar with the magnetic field and magnetic lines of force, but did anyone know there are elements of the magnetic lines of force?
"ELEMENTS OF THE MAGNETIC LINES OF FORCE" at:
(http://www.journaloftheoretics.com/links/Papers/MY-E.pdf)
In the List of unsolved problems in physics (https://en.wikipedia.org/wiki/List_of_unsolved_problems_in_physics), mentioned by Alimjan Abla, among the unsolved problems is:
Nuclei and nuclear astrophysics: What is the nature of the nuclear force that binds protons and neutrons into stable nuclei and rare isotopes?
We answered this in “The Spinning Magnetic Force”
(http://www.journaloftheoretics.com/links/papers/my-s.pdf)