The classical electron radius is well known and effectively represents the charge radius: 2.82 × 10^-15 m.
The "physical" radius of the free electron has yet to be determined experimentally, but it is known to be less than 10^-18 m.
Has anybody got a more accurate experimental value for the electron radius?
Or has anyone got a handle on the theoretical free electron radius?
Theoretically it is zero.
Experimentally, they are confident that there is no *measurable* structure down to 10^-16 cm.
@Robert
Does that mean there could be structure somewhere close to, but below, 10^-16 cm?
And do you have a reference for this experiment, and what are the error bars?
Dear Andrew,
Among physicists there is much confusion concerning the notion of an "electron radius". I have always found it strange that most people say that in QED the electron is point-like. What is really local in this theory are the basic interactions. But the electron has two form factors, usually called F_1(q^2) and F_2(q^2), which are computed at least at the 1-loop level in any serious textbook on QED or particle physics. These form factors enter in many processes, for instance in e-e scattering, and the extraordinary agreement of QED with experiment shows that any possible deviation from the QED prediction has to be extremely tiny. In this sense the one-electron state describes an extended object. You may say that this is semantics, and you probably have a kind of "naked" electron radius in mind. But how would you define this conceptually? Of course, one can introduce ad hoc modifications of the form factors and then compare with experiments. This must have been done in the literature. I cannot point out a reference, but rough bounds should be easy to obtain. However, what would that mean?
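For orientation, here is the standard decomposition of the electron-photon vertex that defines these two form factors, quoted only to fix notation (the value of F_2(0) is Schwinger's one-loop result):
$
\Gamma^\mu(q) = \gamma^\mu F_1(q^2) + \frac{i\sigma^{\mu\nu} q_\nu}{2 m_e} F_2(q^2), \qquad F_1(0) = 1, \qquad F_2(0) = \frac{g-2}{2} = \frac{\alpha}{2\pi} + O(\alpha^2)
$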
"Conceptually? Maybe a naked singularity (Kerr-Newman black hole type) surrounded by a cloud of virtual particles that could not be observed when e and positron annihilate.
@Norbert
I am guessing F_1(q^2) is the electron charge radius at 2.82 × 10^-15 m,
and F_2 is the electron Compton wavelength used in renormalization.
Interestingly, the phase wave velocity of the electron charge spin can be calculated as c/alpha.
The complementary group wave velocity, alpha·c, yields a real radius of 1.5 × 10^-19 m.
I would be interested to see if you think this is correct.
The upper limit of the "physical radius" you cite is a heuristic one, indicating that no substructure of the electron has been found up to some maximum collision energy; that energy is converted to a momentum (relativistic formula), and further to a wavelength ("maximum possible size") by the de Broglie relation.
There is no unique way to define the radius of the electron. The most obvious one is its Compton wavelength, 137 times the classical electron radius. The form factors mentioned are other possibilities: they reflect the fact that in QED a pointlike "bare" electron sometimes splits quantum mechanically into two electrons and one positron (or more complicated configurations); the size distribution of all such configurations determines the form factors, which corresponds to a size much larger than the indicated 10^-18 m. However, such a size is deduced from a pointlike "bare" electron, taking into account all known computed quantum fuzziness. The cited upper radius indicates that any substructure of the bare electron must be smaller than 10^-18 m; otherwise it would have shown up as a deviation between observed and computed behavior.
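For concreteness, a quick numerical comparison of the length scales being discussed (a sketch with CODATA values; the variable names are mine). Note the factor of 137 above refers to the reduced Compton wavelength ħ/(m_e c):

hbar  = 1.054571817e-34   # reduced Planck constant, J s
c     = 2.99792458e8      # speed of light, m/s
m_e   = 9.1093837015e-31  # electron mass, kg
alpha = 7.2973525693e-3   # fine-structure constant

lambda_C_bar = hbar / (m_e * c)      # reduced Compton wavelength, ~3.86e-13 m
r_classical  = alpha * lambda_C_bar  # classical electron radius, ~2.82e-15 m
print(lambda_C_bar, r_classical, lambda_C_bar / r_classical)  # ratio ~137 = 1/alpha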
The closest experimental value is zero: the electron, like the other leptons, behaves like a point up to the highest energies probed to date, which can be consistently described by the Standard Model, a quantum field theory of points, not extended objects. It's useful to contrast this with the proton: that is not a point in the Standard Model, since it's a bound state of quarks, and its radius can be computed by lattice techniques, e.g. http://www.sciencedirect.com/science/article/pii/S0370269314003852 though there are some subtleties.
That it is possible and useful to define different "form factors" for the electron doesn't mean that the electron isn't a point, since these form factors don't imply that the electron is a bound state of other particles that can themselves be consistently described as points, which is, for instance, what happens with the proton and quarks.
@Kare
Many thanks for a useful answer. Can you give me a reference for the value 10^-18 m?
The Standard Model is a provisional approximation at best. I can give you 7 good reasons why everybody knows this. Just ask. Oh, OK, I sense a coming request. Here they are:
The Standard Model of particle physics is our best conventional model for what is going on at subatomic scales.
However, here are a few well-known problems with the SM.
1. The Standard Model is primarily a heuristic model with 26-30 fundamental parameters that have to be “put in by hand”.
2. The Standard Model did not and cannot predict the masses of the fundamental particles that make up all of the luminous matter that we can observe. QCD still cannot retrodict the mass of the proton without considerable fudging, and even then it is only good to within 5%. As for retrodicting the mass of the electron, the SM cannot even make an attempt.
3. The Standard Model did not and cannot predict the existence of the dark matter that constitutes the overwhelming majority of matter in the cosmos. The Standard Model describes heuristically the "foam on top of the ocean".
4. The vacuum energy density crisis clearly suggests a fundamental flaw at the very heart of particle physics. The VED crisis involves the fact that the vacuum energy densities predicted by particle physicists (microcosm) and measured by cosmologists (macrocosm) differ by up to 120 orders of magnitude (roughly 10^70 to 10^120, depending on how one ‘guess-timates’ the particle physics VED).
5. The conventional Planck mass is highly unnatural, i.e., it bears no relation to any particle observed in nature, and calls into question the foundations of the quantum chromodynamics sector of the Standard Model.
6. Many of the key particles of the Standard Model have never been directly observed. Rather, their existence is inferred from secondary, or more likely, tertiary decay products. Quantum chromodynamics is entirely built on inference, conjecture and speculation. It is too complex for simple definitive predictions and testing.
7. The standard model of particle physics cannot include the most fundamental and well-tested interaction of the cosmos: gravitation, i.e., general relativity.
The only real answer to the electron substructure issue will come when we can observe the electron at high resolution in the lower energy regime. I know most physicists would say that is impossible because it certainly is for now. But we may learn a few new tricks in the next 50 years, like we did with observing atoms, protons, etc.
Robert L. Oldershaw
http://www3.amherst.edu/~rloldershaw
Discrete Scale Relativity/Fractal Cosmology
Dear Stam,
You repeat what many people say again and again, but, as I already wrote, I find this terminology rather strange. Our successful quantum field theories (of the Standard Model) are so far all local, but this property should not be confused with point particles. The particles we know are excitations of the interacting quantum fields and have structure, described by form factors etc., that has been measured, in the case of QED with enormous precision. (As you certainly know, for the electron and muon even the fourth-order radiative corrections to the form factor F_2(q^2) had to be computed, because F_2(0) gives the anomalous moment of the electron, which has been measured with extreme precision. Nobody will ever see a "naked" electron that has no anomalous magnetic moment, or a structureless "point" electron. This exists only as a zeroth-order approximation of the physical electron, and is just an expression of the local nature of the basic couplings.)
We physicists are (usually) precise when we use mathematics, but often sloppy in our way of talking. Pauli and others of my great teachers and esteemed colleagues in quantum field theory never used the word "point particle" for the electron or other particles related to the basic quantum fields of the theory.
Andrew> can you give me a reference for the value 10^-18 m
I am not sure there exists any formal reference, since this is a rather heuristic concept. One did not discover any sign of substructure of the electron at LEP, with collision energies up to E = 115 GeV. This roughly corresponds to a maximum size of
λ = ħc/E = (197/115) × 10^-18 m ≈ 1.7 × 10^-18 m.
More precise statements must very likely refer to explicit models for substructure.
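A one-line check of that heuristic, assuming ħc ≈ 197 MeV·fm:

hbar_c = 197.327  # hbar*c in MeV fm
E = 115e3         # LEP collision energy, 115 GeV in MeV
lam = hbar_c / E  # heuristic "maximum size" in fm (1 fm = 1e-15 m)
print(lam * 1e-15, "m")  # ~1.7e-18 m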
Robert> 7 good reasons
You are more than welcome to seriously^* improve on any of these points, without violating any of the verified predictions of the Standard Model.
^* To say that it is primarily a heuristic model does not indicate any serious knowledge of the Standard Model.
And gravity is not the most well-tested interaction of the cosmos.
You are entitled to your opinions and I to mine. Empirical evidence alone will decide who is more correct and has better intuition when it comes to nature, as opposed to models thereof.
@Rob
Herein are the answers to most of your 7 points.
I am currently working on a 3D "lighthouse" model of the electron, where the electron charge is a projection of the charge at a particular averaged distance, which is the classical radius of the electron.
It yields some very interesting answers, but does not readily fit with the half-integer spin. Don't get me wrong: h/2 fits the maths, but the concept is still elusive. Unless, of course, the lighthouse is a twin beam and has to turn twice to achieve symmetry.
Article Harmonic quintessence and the derivation of the charge and m...
One should, indeed, distinguish physics from mathematics, but also ambiguous words from less ambiguous ones.
Electrons and quarks are described in the Standard Model as points, not extended objects; the form factors don't affect this. Renormalization doesn't affect this, either: the "bare" electron is as point-like as the "dressed" electron. That certain quantities receive quantum corrections doesn't change this. The quantum fields of the Standard Model describe point excitations, and this is a mathematically and physically non-trivial statement.
And, indeed, it is possible to distinguish this situation from that of the proton, which, in the energy range where quarks (described as point excitations) must be taken into account to describe its properties, is not a point, whereas it can be described as a point in the limit where an effective theory of meson exchange can be used. Just as the hydrogen atom isn't described as a point when electromagnetic interactions can be resolved and the proton and the electron can be identified. Incidentally, "point particle" is an unfortunate term, since it says the same thing twice.
@Stam
It is more likely that an electron has a radius but that it cannot be measured easily.
Somewhat like trying to measure a smoke ring with a physical instrument.
Stam,
When the standard model can retrodict the masses of the major subatomic particles observable in the real world at the 1% level, I will be impressed.
The parameters of the Standard Model can be determined at a much higher level of accuracy than 1% on the calculational side, cf. http://lepewwg.web.cern.ch/LEPEWWG/ or http://www.slac.stanford.edu/cgi-wrap/getdoc/slac-pub-13830.pdf
And the consequences of the Higgs have been measured already, for certain processes, beyond this accuracy too: http://arxiv.org/pdf/1503.07589.pdf
Faith doesn't have anything to do with either mathematics or physics; what matters is that the calculational rules are well defined and the response of the experimental apparatus understood.
Dear Stam,
Here is a true story. My father was a dedicated public school teacher, without a higher education in science. When Weinberg's book on the first three minutes appeared in a German edition, I gave it to him. He read it with great interest, but at one point he came to me with the following worry. He had read that the electron is a point particle, and found that very strange. He said: a point is a mathematical abstraction, but an electron passing through a wire is a physical object. My immediate reaction was: you are completely right, such a statement makes no sense. What the man (SW) means is something else, but his words are really misleading. Forget this statement; apart from this, the book is really excellent.
Robert L. Oldershaw
Not true if you mean MASSES of fundamental particles.
Before the LHC came online, the GUESS for the mass of the putative Higgs particle was somewhere between 100 GeV and 800 GeV, and even all the way up to the enormous Planck mass.
Sure, certain things like the electron's gyromagnetic factor can be determined to 9 decimal places, but the majority of the crucial parameters of the SM cannot be accurately retrodicted, far less predicted.
You must know these facts but choose to ignore them because they conflict with your liturgy.
Kåre Olaussen
Robert
Your liturgy should inspire you to quantitatively explain any fact of nature covered by the Standard Model, however small, better than already done. Without increasing the number of adjustable parameters, and without destroying any of the already established quantitative explanations.
Robert L. Oldershaw
Well I have already identified the new fundamental principles required to accomplish that task, and also demonstrated via observational evidence that nature gives its blessing to those principles. It would be productive to have several generations of physicists willing to develop the new paradigm. There is only so much one person can do, especially in the face of the usual resistance to new paradigms.
Unfortunately, we are hampered in our quest for a better understanding of nature by our unquestioned devotion to absolute reductionism and a remarkably persistent refusal to give more than lip-service to the serious problems with currently fashionable models.
Andrew Worsley
Hi Rob/Norbert,
Norbert, looks like your dad beat Hawking to it.
Rob, please restate the new fundamental principles.
By the way, the dimensional analysis is slightly out in the paper I quoted above; I will publish the corrected analysis, plus the Bohr magneton ratio derived from first principles, soon.
Robert L. Oldershaw
The 3 fundamental principles of Discrete Scale Relativity are found in the "Main Ideas" section of http://www3.amherst.edu/~rloldershaw .
Briefly, here is a synopsis:
1. Nature is organized hierarchically.
2. The hierarchy is grouped into discrete scales (...Subquantum, atomic, stellar, galactic, metagalactic,...) and there are probably a countably infinite number of scales.
3. The cosmological scales are either strongly self-similar to each other or (as I would prefer) exactly self-similar, but that can only be determined empirically.
If these 3 principles are correct, and the website has a lot of material that supports this possibility, then we have a new and radically different paradigm for physics/cosmology. GR, EM, much of QM, thermodynamics, hydrodynamics, nonlinear dynamical systems, etc. are retained, but modified to incorporate the discrete conformal geometry that is the underlying reason for the ubiquity of self-similarity in nature.
This cannot be absorbed and evaluated in a brief reading but takes sustained effort. There is no way to get around that. It is required in the case of any new paradigm.
Andrew Worsley
That paradigm is quite simple: the fundamental mass m = h/c^2, 42 magnitudes smaller than the Planck mass.
Kåre Olaussen
Andrew> m = h/c^2, 42 magnitudes smaller than the Planck mass
In which units did you calculate that?? h/c^2 has dimension mass*time [kg s], not mass.
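To make the dimensional objection concrete (my own arithmetic, SI values assumed): the bare SI number h/c^2 is indeed about 42-43 orders of magnitude below the Planck-mass number, which is presumably where the "42 magnitudes" comes from, but it carries units of kg·s, not kg:

import math
h = 6.62607015e-34      # Planck constant, J s
c = 2.99792458e8        # speed of light, m/s
m_planck = 2.176434e-8  # Planck mass sqrt(hbar*c/G), kg

x = h / c**2  # ~7.37e-51, with dimensions kg*s, NOT kg
print(x)
print(math.log10(m_planck / x))  # ~42.5, but this compares bare SI numbers of unlike dimensions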
Robert L. Oldershaw
The corrected Planck mass of DSR is 1.2 × 10^-24 g, which is about 0.72 times the proton mass, rather than the horrendously large conventional Planck mass of 2 × 10^-5 g!!!
The corrected Planck length is 2.934 × 10^-14 cm (~0.4 times the proton radius).
The corrected Planck time is also intimately associated with the size of the proton.
This is all calculated on the website: see #9 of the "Technical Notes" section.
Andrew Worsley
@Kåre
Well spotted.
Of course, everything in the quantum Universe has a frequency, which brings the dimensions back to [M].
André Michaud
For what it's worth, and if I am not mistaken in the calculation: if you integrate the energy of the electron rest mass from infinity to λ_C and calculate the volume that this energy would occupy if it were immobilized into the smallest spherical volume possible, assuming that energy is incompressible and has isotropic density, you get a volume of
$
V_e = \frac{\lambda_C^3 \alpha^5}{2 \pi^2} = 1.497393267 \times 10^{-47}\ \mathrm{m}^3
$
which gets you a theoretical electron radius of
$
r_e = \sqrt[3]{\frac{3 V_e}{4 \pi}} = 1.529029116 \times 10^{-16}\ \mathrm{m}
$
This of course would not be the actual radius of the electron. Similar in fact to bundling up all of the leaves on a tree into the smallest uniformly isotropic sphere possible to more easily calculate the limit volume and density of the material that the leaves are made of.
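For what it's worth, the arithmetic in the two formulas above does check out numerically (a quick verification with CODATA values; this says nothing about the physical interpretation):

import math
lambda_C = 2.42631023867e-12  # electron Compton wavelength, m
alpha    = 7.2973525693e-3    # fine-structure constant

V_e = lambda_C**3 * alpha**5 / (2 * math.pi**2)  # ~1.4974e-47 m^3
r_e = (3 * V_e / (4 * math.pi)) ** (1 / 3)       # ~1.5290e-16 m
print(V_e, r_e)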
Andrew> Of course everything in the quantum Universe has a frequency
That is not "of course", so that is where your theory starts to differ. What is your frequency?
With the greatest of respect,
Einstein once said:
"Two things are infinite: human ignorance and the (observable) Universe, and I am not sure about the latter."
So in modern terms the phrase is:
"go figure"
So you have postulated the existence of a fundamental mass m_W, which can be used to define the Worsley units. For instance, the Worsley frequency is f_W = m_W c^2/h. What exactly is the value of m_W in SI units? You only gave an order-of-magnitude relation to the Planck mass. Could it perhaps be that f_W = 1 Hz exactly? That sounds very fundamental.
The computation of the self-energy of quarks and leptons in the Standard Model in particular, and for any quantum field theory in general, is in all textbooks on the subject, so there's no point in rehashing historical debates that have been resolved; cf. for instance the paper by S. Coleman, https://www.rand.org/content/dam/rand/pubs/research_memoranda/2006/RM2820.pdf , which is not as well known as it should be.
Only gravitational interactions imply that the notion of a particle, i.e. a point, is problematic when these become relevant, since a point with finite mass (defined, as usual, from the conserved charges in gravitation, at infinity) would be a naked singularity of the gravitational theory. While mathematically well-defined (cf. the Hawking-Penrose theorems), the physics at such scales requires a consistent description of quantum effects that are beyond the scope of classical gravity. And the notion of the electron only makes sense within the Standard Model; at energies where gravitational interactions can't be neglected, it must be replaced by other degrees of freedom that resolve this issue, and how to do this isn't yet known.
The Planck units are the result of dimensional analysis and lack, for the moment, any dynamical origin. In electromagnetism it is possible to show that the dielectric constant and the magnetic permeability of the vacuum define a velocity, numerically equal to the speed of light, but this, by itself, doesn't imply special relativity, which imposes constraints on the dynamics of the electromagnetic field and of charged matter. So any proposal for units that, inevitably, can be expressed in terms of Planck units requires an explanation of the dynamics. There have been many proposals that attempt to show that quantum gravitational effects could be relevant at scales parametrically smaller than the Planck scale, e.g. http://arxiv.org/abs/hep-ph/9807344
The Planck scale most certainly does have a dynamical origin. It is the scale at which General Relativity must play a major role in the microcosm.
The problem with the conventional Planck scale is that it assumes that G is an absolute constant, the same on all scales, rather than having distinct values on different scales, which differ by a factor of ~10^38 for neighboring scales.
RLO
http://www3.amherst.edu/~rloldershaw
No: general relativity, a dynamical theory, implies that an object of size less than the Planck length, with mass greater than the Planck mass, would be a black hole. However, that assumes that quantum effects can be neglected, which Hawking's work shows they can't; this is consistent with the fact that these units depend on Planck's constant, which sets the scale for quantum effects that are beyond the scope of general relativity. Also, quantities that aren't constant are meaningless as units. Any parametric dependence of Newton's constant on an unknown scale requires a model for the dynamics that defines that scale and is consistent with general relativity at the scales where the latter has been tested, just as a consistent description of refraction, which implies that the speed of light in a medium is parametrically different from that in vacuum, requires a theory of the origin of the refractive index, i.e. electrodynamics.
@Kåre
Correct.
The frequency of harmonic quintessence is
f_q = 1
QM follows directly, and everything in the Universe is constructed from it, including dark energy.
You are quite right, it is fundamental.
Frequency is related to time, so a frequency scale implies a time scale. 1 Hz isn't selected by any dynamical mechanism, however; the frequency of the spectral line that the speed-of-light standard relies on might be, but its parametric dependence on the Hz unit relies on distinguishing quantum electrodynamic effects (more generally, non-gravitational effects due to the Standard Model and eventual beyond-the-Standard-Model effects) from gravitational effects.
Regarding GR and the Planck scale, if one is unwilling to question fundamental (but untested) assumptions and make the effort to explore alternative ideas, then one is probably not going to make any great progress in theoretical physics. Witness the "nightmare scenario" at the LHC and general stagnation of theoretical modeling for 4 decades.
If you like mediocrity, continue repeating the same mantras.
If you want to move forward, think anew.
There hasn't been any "nightmare scenario" at the LHC (a meaningless term), nor any "general stagnation" of theoretical modeling (another term that's empty of content). What isn't appreciated by non-experts is the difficulty of performing data analysis at the LHC, as well as the effort it takes to design, calibrate and operate experimental setups in high energy physics: it takes twenty-five years from design to commissioning and, after each shutdown, the accelerator and the detectors have to be tested. This activity is at odds with a 24/7 news cycle and the blogging of non-experts. There are many known Standard Model processes that haven't been measured to discovery precision at the LHC yet; some can't be measured directly at the LHC at all, and their contribution must be checked by indirect measurements, which do, nonetheless, constrain them; this isn't as well known as it should be. And the data collected make any detection of beyond-the-Standard-Model processes, as yet, problematic, and this is well known.
Robert> If you want to move forward, think anew.
and don't bother acquiring any standard knowledge of the field you move into ;-)
Look pal, it was Steven Weinberg who coined and defined the expression "nightmare scenario" for exactly what occurred in the LHC's first run.
One tires of this endless denial of problems and rationalizations for mediocre models that surely are ready for the scrap heap.
@Stam/Kåre
Please (if it is you) stop putting negative points in.
I would like to assist you, but Robert is correct. Standard knowledge is fine, but does not equal progress.
The last statements have absolutely nothing to do with the topic, which is what value can be assigned to the radius of the electron, so it's not possible to see their relevance here. The statement that the current theoretical description of the electron, along with current experimental measurements at the LHC, is consistent with zero radius relies on both and can be shown in an impersonal way.
(Incidentally, Dirac did study "an extensible model of the electron", http://rspa.royalsocietypublishing.org/content/268/1332/57 , which is classical, and found how it was inconsistent with what was known about the electron and the muon, more than fifty years ago. He attempted to describe the muon as an excitation of the electron. The paper is now considered a precursor of the description of extended objects more generally. That's why the action used is called the Dirac-Born-Infeld action.)
HMMM, my retrodiction for the mass of the electron also has a crucial [alpha]^2 component.
I get 0.5131 MeV using only a Kerr solution of General Relativity and a bit of intuition.
@Stam
Already trying to change a beautiful theorem into the old hag that lives in Gormenghast castle, which is the SM.
There is clearly far more subtlety in this than you could possibly grasp.
@RLO,
Can you show me your workings on this? By the way, I get the mass to 8 d.p. using the speed of light.
Hi Andrew,
If you go to http://www3.amherst.edu/~rloldershaw and choose #1 of the New Developments page, you will find a discussion entitled "Retrodicting the Electron and Neutron Masses".
If you are interested in #1, you might also take a look at #2, which is entitled "Understanding the Particle Mass Spectrum (100 MeV - 1860 MeV)".
I would welcome comments/criticism.
Rob O
It is a bit too self-critical to call it a retrodiction; it is more of a derivation.
However, I still think your Planck mass is way too high; it should be h/c^2.
The usual way to get the Planck mass is [ħc/G]^(1/2), and I propose that the exceedingly weird conventional value is due to the use of a wildly inappropriate value for G in this context.
You will have to explain to me, or direct me to an available explanatory source, why you use M = h/c^2; probably you mean M = hw/c^2.
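For reference, the conventional value quoted earlier follows from dimensional analysis alone (a minimal check with CODATA values):

hbar = 1.054571817e-34  # reduced Planck constant, J s
c    = 2.99792458e8     # speed of light, m/s
G    = 6.67430e-11      # Newton's constant, m^3 kg^-1 s^-2

m_planck = (hbar * c / G) ** 0.5
print(m_planck)  # ~2.18e-8 kg, i.e. the conventional 2e-5 g quoted above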
While the experimental uncertainty on the value of Newton's constant isn't as small as could be wished, it is known with considerable precision, cf. http://perso.ens-lyon.fr/sergio.ciliberto/Teaching/Cours_physique_experimentale/gravitation.pdf
And tests of general relativity, http://arxiv.org/abs/0806.1731 , are sensitive to its value, more precisely to the fact that this value is a constant. However, that the Planck mass has the value it has doesn't imply, for the moment, anything more than dimensional analysis. It must be established separately that physical effects exist there that don't occur at any other energy scale.
Of course, since it is a dimensionful quantity, it's possible to choose a system of units in which it is equal to 1 (or any other non-zero numerical value), just as for the speed of light or Planck's constant; for the latter it is more *convenient* to set it equal to 2π (i.e. hbar = 1; h and hbar have the same dimensions, since 2π is dimensionless). So what matters is that these can be taken to be *constants*. All *experimental* evidence is consistent with this statement, so any proposal that they are not must provide an explanation.
Dear Robert L.,
Your suspicion that the value of G might be inappropriate for some uses is well founded. It is even easy to demonstrate the fact mathematically, and I wonder why this isn't discussed more. Good for astronomical purposes, but otherwise...
You can see the demonstration in the first part of this paper, published in an engineering journal in 2013:
http://www.ijerd.com/paper/vol6-issue6/F06062734.pdf
Nobody has ever measured the value of G within an atomic scale system, and for the foreseeable future nobody will. The word within is critical here.
Likewise nobody is currently able to measure the value of G independently for galactic scale systems. The best one can do is measure the combination GM on the galactic scale.
We have excellent evidence for the conventional value of G external to atomic scale systems and within the entire Solar System.
We have no empirical evidence that compels us to assume that the conventional G is an absolute constant and exactly the same in any context other than that mentioned above.
Discrete Scale Relativity offers a wealth of evidence for the idea of discrete self-similarity between analogue systems on different cosmological scales. This implies that absolute scale is a myth, albeit a widely embraced one. If one postulates discrete (aka broken) relative scale, then one can empirically use self-similarity to calculate how lengths, times and masses for analogues on different scales are related. It also allows one to show how "constants" vary for different scales if they are dimensional (like G), or do not vary if they are dimensionless or involve L/T (like alpha and c).
Is it not time to move on from the rubbish of the last millennium and into the future of a unified physics?
RLO
http://www3.amherst.edu/~rloldershaw
Actually, there have been tests at submillimeter distances, cf. http://pdglive.lbl.gov/DataBlock.action?node=S071DGF , which test much more. Such tests are, precisely, sensitive also to whether G is a constant, since if it isn't, this can be parametrized and bounds can be set. Such tests of extra dimensions also test whether the four-dimensional G is a constant.
Cf. also https://einstein.stanford.edu/RESOURCES/KACST_docs/KACSTlectures/KACST-IntroPhysicsInSpace..pdf Were G not a constant, there would have been additional fields to parametrize this variation; such fields have consequences that can be probed, e.g. scalar-tensor theories do precisely that, to cite but one example.
Proclamations aren't useful, calculations are; calculations have been done and are being done. These count. One doesn't "move" by someone's exhortation, but by someone's calculations and experiments. For the moment the people doing the "exhorting" can't seem to deliver either consistent calculations or decisive experiments that can support the exhortations. (Fractal structures in cosmology have been studied for decades, incidentally, e.g. http://pil.phys.uniroma1.it/twiki/bin/view/Pil/AstrophysicsCosmology . And this doesn't have anything to do with any absolute scale. The description of cosmological structures doesn't have anything to do with the subject of this thread.)
Discrete scale effects are inconsistent with fractal structures, since self-similarity breaks down at scales below the scale set by the scheme. This raises the question of what sets this scale and what preserves it from corrections. Technical terms have content.
I do not bother to argue very long with those who are often wrong but never in doubt, so here is a last attempt.
I said that no human has ever measured the value of G WITHIN an atomic scale system. Can you read? It has NEVER been done. Forget about distances and focus on whether the measurement is INSIDE an atom, proton, etc., or BETWEEN UNBOUND objects that NEVER BECOME BOUND. Do you understand the difference, and how that might be of critical importance in General Relativity? Do you know why the observable universe expands, but galaxies, stars and atoms DO NOT undergo the same type of universal expansion? There is some physics there that you need to think about long and deep.
At subatomic scales gravitational effects are suppressed by factors of E/E_Planck, where E is the typical subatomic energy scale one is using, so such a measurement makes no sense; that's why it's never been done. One idea behind the extra dimensions, indeed, was to test whether the effective scale where gravitational and quantum effects become comparable could be lower than E_Planck. In the early Universe (before inflation) it's known that general relativity isn't the consistent description, and it's not yet known what takes over. During and after inflation, measurements by the Planck satellite show that general relativity and the Standard Model work very well, which explains all the physics at those scales.
On the other hand, the energy levels of neutrons in a gravitational potential have been measured: http://arxiv.org/pdf/hep-ph/0502081.pdf and they are, of course, sensitive to Newton's constant and its putative variations, which were among the motivations for doing the experiment.
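For orientation on the scale involved there: measurements of that type concern a quantum particle bouncing on a mirror in a linear gravitational potential, whose spectrum follows from the zeros of the Airy function. A minimal sketch (standard constants; the first level comes out around 1.4 peV):

# Quantum bouncer: a neutron above a mirror in Earth's gravity has
# energy levels E_n = (hbar^2 m g^2 / 2)^(1/3) * a_n, a_n = zeros of Ai(-x).
hbar = 1.054571817e-34    # J s
m    = 1.67492749804e-27  # neutron mass, kg
g    = 9.81               # m/s^2
eV   = 1.602176634e-19    # J

E0 = (hbar**2 * m * g**2 / 2) ** (1 / 3)
for n, a_n in enumerate([2.33811, 4.08795, 5.52056], start=1):
    print(n, E0 * a_n / eV * 1e12, "peV")  # E_1 ~ 1.41 peV, E_2 ~ 2.46 peV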
1. If you use an incorrect paradigm as a guide to answer questions about nature, you will be highly likely to get incorrect answers.
2. If physicists could measure the value of G inside an atom, they most certainly would try very hard to do so. They cannot - at least not yet.
Your implication that it has not been done because no one thinks it is worth trying is 15-watt thinking, not to mention false.
RLO
http://www3.amherst.edu/~rloldershaw
Fractal Cosmology
Robert> If you use an incorrect paradigm as a guide to answer questions about nature, you will be highly likely to get incorrect answers.
Not that paradigms are enough; it is a good idea to interact with a variety of scientists with knowledge also.
Yes we agree on that.
Einstein did a lot of work on basic principles on his own, but needed key input from others to fully develop them into mature theories, especially in the case of General Relativity.
Quantum mechanics was an interesting combination of individual effort (e.g., Bohr, Dirac) and model-building by committee.
RLO
http://www3.amherst.edu/~rloldershaw
(absolute scale guaranteed an eventual train wreck)
The fact that the speed of light does set an "absolute scale" is described mathematically by global Lorentz invariance and, in the presence of gravity, by local Lorentz invariance. These statements don't imply that Newton's constant similarly sets any such scale, but they do allow a parametrization of deviations, just as the running of the coupling constant in quantum field theory doesn't mean that Planck's constant sets any absolute scale; it means that quantum effects are controlled in a certain way. What isn't known, indeed, is how the variation of Newton's constant with scale, its renormalization group flow, can be consistently described. All these facts are part of standard courses in quantum field theory, incidentally.
But perhaps you can imagine that a new and radically different paradigm might base itself on new and radically different principles and assumptions and proceed to offer a new and less balkanized vision of how nature actually works, and answer questions that the old failing paradigm can no longer answer.
If your position is that you have no intention of ever considering alternative principles and assumptions, then you can never break free of the confines of the old paradigm.
You would counter that we must retain what has worked well in the past. I would agree until it stops working and new ideas seem to be called for. You are entitled to choose stasis and I am entitled to explore nature for a better paradigm. Just don't tell us your theories and models with their many untested principles and assumptions are the only game in town. That would belie an imperfect understanding of science history and practice.
RLO
http://www3.amherst.edu/~rloldershaw
They are the only game that consists in controlled calculations and experiments, not in preaching sermons and relying on rhetoric. History of science is a distinct discipline from science itself. There are plenty of alternatives that are being considered, and it is known and understood how to judge them; in any case, any "new and radical approach" must show how it fits with what's already known: it has to be consistent with what's known and predict new things that can be checked where previous approaches can't. The Standard Model does that for the non-gravitational interactions and for *parametrizing* inflation, and general relativity does that for all gravitational phenomena beyond inflation. And both define a framework for testing their extensions. How to describe phenomena before inflation isn't known; as BICEP2 showed, the largest background, for the moment, is our ignorance of the detailed composition of interstellar dust. Incidentally, no one is hindering the production of *new* *technical* alternatives (not lamentations) by whoever wants to produce them; where are they?
We can and must do better than the Substandard Model.
Robert L. Oldershaw
http://www3.amherst.edu/~rloldershaw
Discrete Scale Relativity/Fractal Cosmology
Robert> have never been directly observed
That's right! Robert has never been directly observed, only light scattered from his body.
Robert> the most well-tested interaction
ROFL! You really know how to pull jokes.
With respect to the standard model, the gaps are not "small".
And here is a longer list of LHC results.
So far the LHC has found:
no string/brane exotica,
no sparticles,
no WIMPs (or any other particle dark matter),
no supersymmetry exotica,
no extra-dimensions,
no magnetic monopoles,
no mini-black holes,
no Randall-Sundrum 5-D phenomena (gravitons, K-K gluons, etc.),
no evidence for ADS/CFT duality,
no colorons,
no leptoquarks,
no “excited” (or bored) quarks,
no lazy photons,
no fractionally charged particles.
We can and must do better, and this will require radically new ways of understanding the cosmos.
Why not a fractal model that is in agreement with observations?
More natural; less abstract/cheesy.
The answer appears to be that the free electron is a cloud, like the orbital electron, but the averaged radius of the free electron is the classical electron radius, i.e. the charge radius.
In this case we should not expect to see any particulate nature, i.e. no experiment will detect a solid radius.
My research suggests that the free electron is closely approximated by a naked singularity, but when it becomes bound to an atomic nucleus the situation is radically different and much changed.
A free electron with a classical radius (Compton?) is a non-starter in total disagreement with many observations.
By classical we do not mean the Compton or the Bohr radius; it is an averaged charge radius. But maybe I need to stop using the term "classical radius", as it appears non-QM, and stick to "charge radius".
Yes, I clearly misinterpreted "classical radius" as the classical Compton radius, which is on the order of a horrendous 10^-12 cm.
Charge radius is much less likely to be misinterpreted. And defining it numerically would be even safer.
With the greatest of respect, it is defined numerically in the question.
P.S. Your list of SM deficiencies also includes most of particle physics.
Like my man Bernie Sanders says: "We need a revolution!"
The system is badly broken and the "powerbrokers" are either corrupted by their power, or in total denial.
@RLO
Truthfully, does the SM recognize the unknowables? It asks why we know what we know, and ignores the vital unknowables. For example, what is the reason for the mass and charge of the electron? There is an answer, and it is not blowing in the wind.
I hear no coherent answer.
1> no string/brane exotica, (in agreement with the standard model)
2> no sparticles, (in agreement with the standard model)
...
22> no lazy photons, (in agreement with the standard model)
23> no fractionally charged particles, (no results are inconsistent with the existence of quarks with charge 2e/3 and -e/3)
I don't think it is wrong of the Standard Model of Particle Physics to explain all observed non-gravitational physics. Disappointing to many, perhaps, that we have a simple model which works so well --- a theory of almost everything (discovered so far).
Well Kåre, if your model did not even predict the existence of dark matter, and you have not a clue what the dark matter even IS, then you are ignorant about somewhere between 85% and 99% of the entire cosmos.
If you are happy with that state of ignorance, good luck to you Sir! And enjoy your little house of cards.
RLO
http://www3.amherst.edu/~rloldershaw
Discrete Scale Relativity
@Kåre
Let's be a little less subtle here. Science knows what it does not know: what about the mass of the electron, muon and tau, just for starters?
But it would never dare to admit it. How totally blinkered.
Those are a few of the masses that are "put in by hand" with no understanding of why they have their measured masses or why the more massive leptons exist at all.
This is not satisfactory!
Part of scientific inquiry is learning what sort of questions it makes sense to ask. Why leptons exist isn't a meaningful question, since what's assumed isn't clear. What their properties are and how to describe them are meaningful questions, and they have found answers within the Standard Model, which predicts how the interactions affect the leptons' masses and, from that, can predict the existence of new particles and their properties, like the prediction of the mass of the top at LEP, before it was discovered at Fermilab. Similarly, the statement that there are as many lepton doublets as quark doublets is a prediction of the Standard Model and of its extensions; so, when the tau lepton, the first particle of the third family, was discovered in 1975, it was clear that the corresponding neutrino should exist (it was, indeed, discovered in 2000), and that two additional quarks should exist, with definite charges; these were, indeed, discovered in 1977 (the b quark) and in 1995 (the top quark), though their effects could be monitored sooner, and were. And this meant that CP-violating effects could be described in a certain way, which is a major part of the program that probes for the effects of particles beyond the Standard Model.
Dark matter is a new form of matter, whose properties are beyond the Standard Model, since its major interactions are gravitational and the scale where gravitational effects become comparable to quantum effects isn't yet known. It isn't required by the Standard Model, but it can be described by it; that's the basis of dark matter searches, which try to detect the weak interactions of the particles that make up dark matter.
@Andrew, Robert
Do you think it is a great fallacy of Newtonian mechanics and gravity that it is unable to predict the few parameters required to determine the positions of the planets? Or do you think it is a great success of Newtonian mechanics and gravity that it is able to predict all planetary orbits from those parameters, and how these orbits vary with time (to the extent of revealing the theory's own limitations)? Each planetary orbit sort of represents an infinite number of parameters.
The Standard Model is much like that, only much more impressive. It depends on a small number of dimensionless parameters, with which it is able to correlate and predict an astonishingly large amount of continuously described phenomena.
It is of course a dream of many to predict some or all dimensionless parameters of the Standard Model (I have tried myself, with astonishing success --- although the calculation was wrong). But it could be that this is equally impossible as calculating the initial conditions of our planetary system.
For Nature's sake, Kåre! The Newtonian paradigm has been totally replaced. Sure, it is used in approximate calculations because that is easier, but it is NOT how nature works. Absolute time, absolute space, instantaneous action-at-a-distance? Laughable.
The Newtonian paradigm was Ptolemaic physics at its best: a credible model that gave very good answers but was NOT how nature worked. Einstein proved that.
Welcome to the 20th century, if you still cannot enter the 21st century.
RLO
http://www3.amherst.edu/~rloldershaw
Robert> The Newtonian paradigm has been totally replaced.
That was grumpy bullshitting. Within its range of validity, i.e. most of everyday phenomena, Newtonian physics works as well today as before 1900.
But you are just trying to talk yourself away from my arguments, by trying to divert attention in other directions.
Yes, as well and as badly as before 1900.
But its principles are not nature's principles.
It is an incorrect model that works surprisingly well, but serious physicists no longer view it as an accurate understanding of nature. Something much, MUCH better has replaced it.
The same fate awaits the standard models of both particle physics and cosmology.
The same awaits all descriptions; however, dark matter doesn't contradict the Standard Model, nor do any measurements in cosmology contradict general relativity.
Just one remark connected to this never-ending discussion: for somebody like me, who began his scientific career in high energy physics in the early 1960s, the SM is an enormous achievement. Before that we had no real theories to work with. There were only all sorts of fragments, such as approximate higher global symmetries, Regge poles, current algebras, unrenormalizable Fermi-type weak interactions, dispersion relations, etc. Many of us (I was a CERN fellow from 1964-65) were really unhappy about the situation.
Of course, the SM has many limitations and weaknesses, as most people who know the subject are aware, and since about 1980 we have been stuck. But real progress cannot always be as fast as during the first third of the twentieth century. There are other fields in which progress is very impressive, in particular astrophysics and cosmology.
Stam: But no theory predicted DM before Zwicky discovered the first evidence for it. Moreover, the SM and other fashionable theories cannot specifically retrodict what it is. DSR does, and so far its predictions are borne out via microlensing.
GR will be the foundation of the next cosmological paradigm, and I mean paradigm, which is a general conceptual model for how nature actually works.
Norbert: I hear you and agree with you and sympathize with you regarding the SM, but we cannot view it as the unalterable word of god, which unfortunately many do.
Nothing has been stuck since the 1980s; on the one hand, the Standard Model became subject to ever more refined tests, and has passed them; on the other hand, the experiments have become so elaborate that the time from conception to commissioning takes decades. Not to forget that the techniques for evaluating the backgrounds for the experiments require non-trivial work: even had the LHC and the detectors been, by magic, materially ready in 1990, it would have been impossible to evaluate the signals generated, since the techniques for evaluating the backgrounds weren't available then. The work of the lattice community shouldn't be left unmentioned, either.
That neutrinos are now known to be massive and to have non-trivial flavor mixing points to physics beyond the Standard Model that, hopefully, will become elucidated in future runs. It's not the models that are lacking, but reliable data.
Insofar as dark matter can be probed through its gravitational interactions, its effects are described by general relativity; its particle content can be readily described by many extensions of the Standard Model, depending on the *additional* properties it possesses; which extension is the appropriate one depends on future data. Gravitational lensing has been a standard tool in cosmology for decades, in different contexts.
Robert> serious physicists no longer...
Yeah, sure!!!
So you think Feynman advocated the use of GR for analyzing how Challenger fell to earth? And QED to describe the properties of cold rubber? Because that was required for an accurate understanding of nature?