Thank you for your Sept 17 reply; very interesting and insightful. In effect, in cosmology (and quantum mechanics), some innovative theorizing is rejected because it does not conform to the accepted description of what physics is ignorant about. That is reflected in the related question on RG:
Dr. di Filottrano's remark that you quote, “purposely made-up ingredients as dark matter and dark energy, up to 96% of the content of the universe, without having an agreed explanation for those and even more lacking experimental evidence” is revealing.
In science, particularly in cosmology, it’s tempting to dismiss a theory if it doesn’t align with current knowledge or well-established models. However, dark energy is a domain where so much remains unknown that adhering strictly to present cosmological frameworks could hinder innovation. The key criteria for discounting a theory should rest not on its deviation from current models but on its failure to meet certain scientific principles:
Falsifiability: A theory must provide predictions that can be tested through observation or experimentation. If it can't be tested, it's not scientifically useful.
Consistency: While we are open to new ideas, the theory should still show internal consistency and align with fundamental principles, like causality and coherence with established physics, unless it proposes a compelling reason to challenge them.
Predictive Power: The theory should offer predictions that explain not just dark energy but other cosmological phenomena better than existing models.
So, a theory shouldn’t be dismissed merely because it conflicts with the unknown. Instead, we should measure it by how well it expands our understanding of the universe while maintaining scientific rigor. Some of the most groundbreaking ideas in physics were initially at odds with prevailing knowledge, and it was their eventual empirical success that determined their value.
In short, it’s not the unknown that invalidates a theory—it’s the failure to extend our known frontiers of testable, predictive science that does.
Remarks about cosmology that echo those of Dr. di Filottrano, whom Peter Guynn mentioned above, are outlined in:
Michael J. Disney, "Modern Cosmology: Science or Folktale?", American Scientist, September-October 2007, Vol. 95, No. 5, p. 383. DOI: 10.1511/2007.67.38
Excerpts:
... theoreticians have had to create heroic and yet insubstantial notions such as "dark matter" and "dark energy,"
Outsiders are bound to ask whether they should be more impressed by the new observations or more dismayed by the theoretical jinnis that have been conjured up to account for them.
The three successful predictions of the concordance model (the apparent flatness of space, the abundances of the light elements and the maximum ages of the oldest star clusters) are overwhelmed by at least half a dozen unpredicted surprises, including dark matter and dark energy.
Further to the preceding references is the article "What kind of science is cosmology?" by Hubert Goenner, Annalen der Physik, vol. 19, issue 6, pp. 389-418, 2010, http://arxiv.org/abs/0910.4333. From section 6.2 of the arXiv version: "It is difficult, from the theoretical point of view, to make transparent the web of assumptions, logical deductions, and empirical input spun by cosmologists if the explanatory value of the cosmological model is to be evaluated."
To answer “What criteria determine when a theory of dark energy should be discounted?” it is necessary to understand that the “dark energy” cosmological postulate was formulated only as an ad hoc interpretation of experimental observations, aimed only at describing what is observable in Space,
- despite that in the rest of physics no “dark energy that expands Matter’s spacetime” exists, and so this “energy” has no real physical grounds; besides that, it can be described by introducing the “lambda term” into the GR equations, which describe changes in Matter’s pseudo-Riemannian 4D spacetime metric under the impacts of “masses”, and how the masses that compose gravitationally coupled systems move in the so “curved spacetime”, etc. The equations, though, allow introducing, besides the masses, arbitrary parameters that impact the spacetime, and the lambda term is such a parameter; its standard form is sketched just below.
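For reference, a minimal standard-GR sketch (textbook form, not specific to any post in this thread) of how the lambda term enters the field equations:

\[
  G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
\]

Here \(G_{\mu\nu}\) is the Einstein tensor built from the metric \(g_{\mu\nu}\), \(T_{\mu\nu}\) is the stress-energy tensor of the masses, and \(\Lambda\) is exactly the additional arbitrary constant parameter referred to above: the equations remain mathematically consistent for any constant value of it.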
Really, as is rigorously scientifically proven in Shevchenko-Tokarevsky’s philosophical 2007 “The Information as Absolute” conception (see the recent version of its basic paper) and, more concretely relating to Matter, and so to the physics that studies Matter, in the SS&VT Planck-scale informational physical model (in this case it is enough to read https://www.researchgate.net/publication/383127718_The_Informational_Physical_Model_and_Fundamental_Problems_in_Physics ):
- Matter’s spacetime is the fundamentally absolute, fundamentally flat, fundamentally continuous, and fundamentally “Cartesian”, (at least) [4+4+1]4D spacetime with metric (at least) (cτ,X,Y,Z, g,w,e,s,ct), which [metric] fundamentally cannot be impacted, including “expanded”, by anything in Matter,
- since the metric is fundamentally determined only by the degrees of freedom at changes of states of Matter’s ultimate base, the primary elementary logical structures: (at least) [4+4+1]4D binary reversible fundamental logical elements [FLE], which compose the (at least) [4+4+1]4D dense FLE lattice that is placed in the Matter’s spacetime above; the FLE “size” and “FLE binary flip time” are the Planck length, lP, and the Planck time, tP, so the speed of light c = lP/tP.
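For reference only, the standard Planck-unit definitions (ordinary dimensional analysis, not specific to the SS&VT model) show that the quoted relation c = lP/tP holds identically:

\[
  l_P = \sqrt{\frac{\hbar G}{c^{3}}}, \qquad
  t_P = \sqrt{\frac{\hbar G}{c^{5}}}, \qquad
  \frac{l_P}{t_P} = \sqrt{\frac{\hbar G}{c^{3}}\cdot\frac{c^{5}}{\hbar G}} = c.
\]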
Everything in Matter consists only of specific disturbances in the lattice, which [besides the lattice’s FLEs, which are mostly fixed in the 3D XYZ space] constantly and always move with 4D speed of light in the 4D space with metric (cτ,X,Y,Z), and simultaneously, in parallel, in the real absolute time ct-dimension.
Really, the fundamentally infinite spacetime appeared absolutely obligatorily just when the first FLE was created at the initial step of Matter’s creation, so that the lattice had room to expand; and, as seems to follow rather well and scientifically rationally from cosmological observations, that really happened at least two times: first an exponential expansion [the inflation epoch, which has no rational explanation in the mainstream; in this case even GR isn’t applicable], and further a more moderate expansion that is, rather possibly, really observable, and is illusorily described by the lambda term.
This illusion, in principle, makes some sense, since the lattice is uniform and isotropic, as the logical “empty container” with metric (at least) (cτ,X,Y,Z, g,w,e,s,ct) is, while the lattice expansions above were made by spending some portions of energy. However, that is pure phenomenology, from which no really scientific physical consequences follow.
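For context, a hedged standard-cosmology aside (the Friedmann equations, not the model above): a constant lambda term alone, with matter neglected and flat spatial sections assumed, gives exponential expansion of the scale factor \(a(t)\), which is why a lambda term can phenomenologically describe an observed accelerating expansion:

\[
  \left(\frac{\dot a}{a}\right)^{2} = \frac{\Lambda c^{2}}{3}
  \quad\Longrightarrow\quad
  a(t) \propto \exp\!\left(\sqrt{\frac{\Lambda c^{2}}{3}}\; t\right).
\]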
For more, see the now unique really scientifically rationally grounded SS&VT initial cosmological model of Matter’s creation in the 2nd link, section “Cosmology”; here note only that practically everything in cosmology can become really understandable only after Matter’s topology in the at least [5]4D spacetime with metric (cτ,X,Y,Z,ct) is developed; again see the SS&VT initial model above.
In addition to my remark that dark energy is nonsense: the data on cosmological acceleration can be naturally explained, without uncertainties, in the semiclassical approximation to quantum theory, without assumptions about space-time, the value of Λ, etc. In fact, this problem arose for historical reasons: Einstein said that introducing Λ was the biggest blunder of his life, and for many physicists Einstein’s words are law.
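For context, the conventional account that this remark argues against (standard ΛCDM bookkeeping; nothing here assumes the semiclassical approach mentioned above) attributes the acceleration to the Λ term in the Friedmann acceleration equation:

\[
  \frac{\ddot a}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right) + \frac{\Lambda c^{2}}{3},
\]

so \(\ddot a > 0\) once the Λ term dominates the matter term. The claim above is that this attribution is unnecessary.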
It looks worthwhile to add here also some points about the general criteria for a theory to be a really scientific one; for more, see the SS post on page 29 in
Brilliant question! Very timely as well. Now, despite PG's & dF's useful views, I think a simpler, more general answer is best. For instance, if astrophysics, plasma-physics, and energy science are important enough, to warrant truly scientific, generally applicable theories, praxis, good practice, and real progress, then:
#1: We can discard any hypotheses & conjectures (promoted as credible theorems) that conform/accord with no (zero) evidence of actual reality/observed phenomena.
#2: The longer a bogus "theorem" is promoted for no good, truly scientific reason, the stronger the sign of its "mystic" (unscientific) falsity/absurdity.
#3: We want theory with great explanatory & predictive potentials. The "dark" energy & "dark" matter conjectures explain nothing and are considered "dark" because the promoters & fans have no idea what they're talking about.
#4: We also want good theorems because they eliminate anomalies & other artifacts of ignorant confusion and/or misperception & misinterpretation (of the data).
#5: As both SM physics & Scientistic Cosmogeny have rambled along their linear chronologies (for nearly 100 years), they have caused a nonlinear increase in anomalies & shibboleths & spectacularly busted predictions.
Does all that seem sufficient and/or necessary for the evolution of real science, progressing beyond pop-scientism & Pluralistic Opinion Theory promoted as valid scientific thought, etc.? Thanks ~ M
The thread question is rather essentially clarified in the SS posts on page 1, though it is worth pointing out more concretely that the at least two FLE lattice expansions really aren’t, of course, some “space expansions”; rather, they happened by pumping into the lattice some real energy, which was a small, but non-zero, part of the whole energy that was spent at Matter’s creation.
A couple of recent SS posts in https://www.researchgate.net/post/NO50Should_the_Entire_Universe_Have_any_Symmetry_Can_a_Finite_Universe_Avoid_a_Centre/1 are relevant to this thread question.
As explained in doi:10.3390/axioms13030138 and my other publications, dark energy is meaningless not only on theoretical grounds but also because it is not needed: the phenomenon of cosmological acceleration can be clearly explained, without uncertainties, as a consequence of de Sitter symmetry in the semiclassical approximation.