In the 21st century it is safe to say that virtually all modern scientific achievements rest on the successes of modeling theory, from which practical recommendations useful in physics, technology, biology, sociology and other fields are extracted. Moreover, applying the principles of measurement theory to the determination of the fundamental constants allows us to check the consistency and correctness of the basic physical theories. In addition, the quantitative predictions of the main physical theories depend on the numerical values of the constants that enter them: each new significant digit can reveal a previously unknown inconsistency or, conversely, eliminate an existing inconsistency in our description of the physical world. At the same time, scientists have come to a clear understanding of the limits of our efforts to achieve very high measurement accuracy.
The very act of measurement already presupposes a physical and mathematical model that describes the phenomenon under study. Measurement theory focuses on the process of experimentally determining values with special equipment called measuring instruments. This theory covers only the analysis of the data and the procedure for measuring the observed quantity, that is, the stages after the formulation of a mathematical model. Thus, the uncertainty that exists before any experiment or computer simulation, caused by the limited number of quantities recorded in the mathematical model, is usually ignored in measurement theory.
Dear Sergey P. Klykov,
Thank you for the quick answer.
It is great!
I will be pleased to see the practical results of your idea.
Keep pushing your line!
Dear Sergey P. Klykov,
Indeed, I will wait until the "applications in biology" are published.
I wish you success in your activities!!!
Dear Preston Guynn ,
Thank you for your detailed explanations of your idea.
I am not a specialist in SR. That is why I cannot give you any positive or negative remarks.
At the same time, I draw your attention to the following (based on my scientific experience, intuition and knowledge):
1. Your papers (it seems to me that I read them) have not been checked by prestigious journals, like many other papers posted on RG by different participants.
2. In science, a new idea must meet at least three conditions:
a. Include, as a special case, the previous theory;
b. Be reproducible by different groups of scientists;
c. Be accepted (have a consensus) in the scientific community.
These three conditions have been around for at least 350 years.
In any case, I wish you further success in proving your ideas!
The information approach, which assesses the model's mismatch with the physical phenomenon under study, has introduced an additional limit on measurement accuracy that is more stringent than the Heisenberg uncertainty principle. It also turns out that the "fuzziness" of the observed object, strangely enough, depends on the personal philosophical preconceptions of scientists, which are based on their experience, acquired knowledge and intuition. In other words, when modeling a physical phenomenon, one group of scientists may choose quantities that differ fundamentally from the set of quantities taken into account by another group. The fact is that the same data can serve as the basis for radically opposite theories. This situation suggests that a conscious observer weighs candidate quantities with equal probability when choosing a model. A possible, though controversial, example is the treatment of the electron as a particle or as a wave, which are described by different physical models and mathematical equations. Indeed, it is not at all obvious that we can describe physical phenomena with a single picture or a single representation in our mind.
Laser experiments suggest helium rain falls on Jupiter
Compressing a hydrogen and helium mixture with lasers shows that the two elements separate at pressures found within gas giant planets.
https://www.sciencenews.org/article/helium-rain-jupiter-pressure-laser-experiments-physics
Sprinkles of helium rain may fall on Jupiter.
At pressures and temperatures present within the gas giant, the hydrogen and helium that make up the bulk of its atmosphere don’t mix, according to laboratory experiments reported in the May 27 Nature. That suggests that deep within Jupiter’s atmosphere, hydrogen and helium separate, with the helium forming droplets that are denser than the hydrogen, causing them to rain down (SN: 4/19/21).
Jupiter’s marbled exterior is pretty familiar territory, but it’s still not clear what happens far below the cloud tops. So researchers designed an experiment to compress hydrogen and helium, reaching pressures nearly 2 million times Earth’s atmospheric pressure and temperatures of thousands of degrees Celsius, akin to inner layers of gas giants.
“We are reproducing the conditions inside the planets,” says physicist Marius Millot of Lawrence Livermore National Laboratory in California.
Millot and colleagues squeezed a mixture of hydrogen and helium between two diamonds and hit the concoction with a powerful laser to compress it even further. As the pressure and temperature increased, the researchers saw an abrupt increase in how reflective the material was. That suggests that helium was separating from the hydrogen, which becomes a liquid metal under these conditions (SN: 8/10/16). At even higher pressures and temperatures, the reflectivity decreased, suggesting that hydrogen and helium began mixing again.
The researchers calculated that hydrogen and helium would separate about 11,000 kilometers below the cloud tops of Jupiter, down to a depth of about 22,000 kilometers.
The results could help scientists explain observations made by spacecraft Galileo (SN: 2/18/02) and Juno (SN: 3/7/18), such as the fact that Jupiter’s outer layers of atmosphere have less helium than expected.
The rise of data-driven modelling
Nature Reviews Physics, volume 3, page 383 (2021)
https://www.nature.com/articles/s42254-021-00336-z?utm_source=natrevphys_etoc&utm_medium=email&utm_campaign=toc_42254_3_6&utm_content=20210609&WT.ec_id=NATREVPHYS-202106&sap-outbound-id=8105BF4ED4450CAB15CE1812D710DAEF086E7ED0
The number of physics articles making use of AI technologies keeps growing rapidly. Here are some new directions we find particularly exciting.
The use of machine learning is no news to physicists, who have been early adopters of AI technologies. For example, looking back at the 2011–2012 analysis of the Large Hadron Collider data underlying the discovery of the Higgs boson, machine learning enabled an increase in sensitivity equivalent to collecting 50% more data1. But the number of physics papers using machine learning posted on the arXiv preprint server, or abstracts submitted to the American Physical Society March and April meetings keeps growing. At the March meeting, the fraction of presentations with “machine learning” in the title or abstract increased from 0.3% in 2015 to 3.4% in 2021, and at the April meeting from 0.085% to 3.3%. Is this trend just reflecting the overall explosion in AI applications, or is there something else going on in physics?
When thinking of AI and neural networks, the first application that comes to mind is classification: does this image represent a cat or a dog, does this jet of particles come from a quark or a gluon? Neural networks are powerful classifiers that have already had a big impact in data-rich fields such as particle physics, astrophysics or X-ray free electron laser experiments2. But they are more than that: neural networks can approximate any function with arbitrary precision (here is an intuitive explanation why).
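That universal-approximation property can be made concrete with a toy sketch (my own illustration, not from the editorial): a single hidden layer of tanh units, trained with plain batch gradient descent, learns y = x² on [-1, 1]. All names and hyperparameters here (HIDDEN, LR, EPOCHS) are illustrative choices, not anything from the article.

```python
import math

# Toy illustration of universal approximation: a one-hidden-layer tanh
# network trained by plain batch gradient descent to fit y = x^2 on [-1, 1].
HIDDEN, LR, EPOCHS = 8, 0.05, 5000
xs = [i / 10 - 1.0 for i in range(21)]   # 21 grid points in [-1, 1]
ys = [x * x for x in xs]                 # target function values

# Deterministic "random-looking" initialisation so the run is reproducible.
w = [math.sin(1.7 * (i + 1)) for i in range(HIDDEN)]        # input weights
b = [math.cos(2.3 * (i + 1)) for i in range(HIDDEN)]        # hidden biases
v = [0.1 * math.sin(0.9 * (i + 1)) for i in range(HIDDEN)]  # output weights
c = 0.0                                                     # output bias

def predict(x):
    return sum(v[i] * math.tanh(w[i] * x + b[i]) for i in range(HIDDEN)) + c

def mse():
    return sum((predict(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

initial_loss = mse()
for _ in range(EPOCHS):
    gw, gb, gv, gc = [0.0] * HIDDEN, [0.0] * HIDDEN, [0.0] * HIDDEN, 0.0
    for x, y in zip(xs, ys):
        h = [math.tanh(w[i] * x + b[i]) for i in range(HIDDEN)]
        err = 2 * (sum(v[i] * h[i] for i in range(HIDDEN)) + c - y) / len(xs)
        gc += err
        for i in range(HIDDEN):
            gv[i] += err * h[i]
            dh = err * v[i] * (1 - h[i] ** 2)  # gradient through tanh
            gw[i] += dh * x
            gb[i] += dh
    for i in range(HIDDEN):
        w[i] -= LR * gw[i]
        b[i] -= LR * gb[i]
        v[i] -= LR * gv[i]
    c -= LR * gc

final_loss = mse()
print(initial_loss, final_loss)  # the loss drops by a large factor
```

This is only a sketch of the idea; real work would use a framework, stochastic optimisers and validation data, but even this bare-bones net visibly converges toward the target function.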
Thinking of neural networks as universal function approximators is particularly empowering for physicists. It is hard to think of a field of physics that does not use partial differential equations. Neural networks can approximate the solutions to partial differential equations much faster than traditional numerical methods. Furthermore, deep neural networks (neural networks with multiple layers) can approximate operators, meaning that they can solve families of partial differential equations.
Neural networks can approximate complicated, ugly functions like many body wavefunctions or interatomic potentials and therefore can be readily integrated into well-established numerical methods such as quantum Monte Carlo or molecular dynamics simulations, overcoming some limitations of traditional methods and speeding up calculations. This approach is likely to push forward the capabilities of current state-of-the-art methods and enable new insights.
This is just the beginning of what may turn out to be a new paradigm: data-driven modelling. Some fields such as fluid dynamics have already made important advances, others are just starting (see for example this recent Perspective). The prospect of not being intimidated by ugly, complex models and not being limited by an incomplete understanding of the underlying physics is certainly attractive, but there is no such thing as a free lunch. Here comes the small print: neural networks can — given enough training data — approximate any function with arbitrary precision.
The availability of data is not necessarily a showstopper. One can use a combination of experimental data and/or surrogate training data from other computational methods. For example, the HEPMASS Data Set containing Monte Carlo simulations of 10.5 million particle collisions and CAMELS, a data set of over 4,000 cosmological simulations, are available for training machine learning algorithms. In some cases, no training data is necessary (see this Tools of the trade piece). There are other emerging directions.
In a Review in this issue, George Em Karniadakis and colleagues discuss physics-informed machine learning in which the algorithm incorporates prior knowledge of the physical laws coming from the observational or theoretical understanding of the world. This approach makes the most of the imperfect data and incomplete knowledge of the model. Moreover, it promises the ability to discover previously unknown physics and to tackle high-dimensional problems.
Machine learning and traditional numerical methods will coexist, complementing each other. Data-driven modelling will provide faster or computationally cheaper, sometimes lower-accuracy simulations that can be used for parameter estimation, in multi-scale simulations for the parts that do not require high resolution, for surrogate models and for uncertainty quantification3.
These are early days and the field of data-driven modelling is yet to be defined: a consistent terminology and a taxonomy of the sub-topics needs to be developed by its practitioners. Different directions are waiting to be mapped. We are keen to document the developments in this new area and offer a forum for interdisciplinary dialogue and collaboration in our pages.
Celebrating Sadi Carnot
BY ANDY PEARSON, PH.D., C.ENG., FELLOW ASHRAE
June 1, 2021, is the 225th birthday of Sadi Carnot, an original thinker who was not afraid to ask the questions that nobody else had thought to ask. His attempts to answer them were hampered by deficiencies in the scientific theory of his day, and sadly he died in a cholera outbreak at the age of 36, before he had managed to untangle himself from the contemporary understanding of the nature of heat that was holding him back.
Bob Hanlon has produced an excellent (and very accessible) textbook called Block by Block: The Historical and Theoretical Foundations of Thermodynamics. The extract from Réflexions quoted above and much of the historical context come from Bob's book, which he says took 20 years to write. It should be on every refrigeration engineer's bookshelf.
A new ocean has appeared on Earth
National Geographic, known for broadcasting popular science films, began producing maps of the world in 1915.
Four oceans were marked on them: the Atlantic, Pacific, Indian and Arctic. But as of June 8, a fifth ocean, the Southern Ocean, will be put on the maps, the channel's website says.
National Geographic Society geographer Alex Tait said that scientists have long recognized the Southern Ocean, but it was never made official because there was never an international agreement.
Geographers debated whether the waters around Antarctica were unique enough to merit their own name, or whether they were simply cold southern extensions of the Pacific, Atlantic and Indian Oceans.
The International Hydrographic Organization recognized the existence of the Southern Ocean as early as 1937, but this decision was reversed in 1953 due to pressure from the scientific community.
While the other oceans are defined by the continents that enclose them, the Southern Ocean is defined by the current. Scientists estimate that the Antarctic Circumpolar Current appeared about 34 million years ago. This happened when Antarctica separated from South America. Thanks to this, an unobstructed flow of water began to go around the bottom of the Earth.
DNA Jumps Between Animal Species. No One Knows How Often.
The discovery of a gene shared by two unrelated species of fish is the latest evidence that horizontal gene transfers occur surprisingly often in vertebrates.
https://www.quantamagazine.org/dna-jumps-between-animal-species-no-one-knows-how-often-20210609/?utm_source=Quanta+Magazine&utm_campaign=a4b0044c43-RSS_Daily_Biology&utm_medium=email&utm_term=0_f0cb61321c-a4b0044c43-389723505&mc_cid=a4b0044c43&mc_eid=aed3226bfd
To survive in the frigid ocean waters around the Arctic and Antarctica, marine life evolved many defenses against the lethal cold. One common adaptation is the ability to make antifreeze proteins (AFPs) that prevent ice crystals from growing in blood, tissues and cells. It’s a solution that has evolved repeatedly and independently, not just in fish but in plants, fungi and bacteria.
It isn’t surprising, then, that herrings and smelts, two groups of fish that commonly roam the northernmost reaches of the Atlantic and Pacific Oceans, both make AFPs. But it is very surprising, even weird, that both fish do so with the same AFP gene — particularly since their ancestors diverged more than 250 million years ago and the gene is absent from all the other fish species related to them.
A March paper in Trends in Genetics holds the unorthodox explanation: The gene became part of the smelt genome through a direct horizontal transfer from a herring. It wasn’t through hybridization, because herring and smelt can’t crossbreed, as many failed attempts have shown. The herring gene made its way into the smelt genome outside the normal sexual channels.
Laurie Graham, a molecular biologist at Queen’s University in Ontario and lead author on the paper, knows she’s making a bold claim in arguing for the direct transfer of a gene from one fish to another. That kind of horizontal DNA movement once wasn’t imagined to happen in any animals, let alone vertebrates. Still, the more she and her colleagues study the smelt, the clearer the evidence becomes.
Terraformation gets $30M to fight climate change with rapid reforesting
https://techcrunch.com/2021/06/08/terraformation-gets-30m-to-fight-climate-change-with-rapid-reforesting/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAAElhAy3Fr0mDxaNf8kO6Mg2sj9NyZ1xZv-KoZ9Fuowp3XA2eF0T4asKlZUocT9qx7mCsEx1WU1OhIyWAg8TB2UQG6JzHzMr4wX90d_Gg4JJ7HTcNonPRgWJZGANGyotxHA6h2iU-xvM7YdJrS_S-uByY5Dq_cNfIouoMbRZ1b1b_
Every startup is trying to fix something, but Terraformation is tackling the only problem that must matter to all of us: climate change.
This is why it’s in such a big hurry. Its mission — as a “forest tech” startup — is to accelerate tree planting by applying a startup-y operational philosophy of scalability to the pressing task of rapidly, sustainably reforesting denuded landscapes — bringing back native tree species to revive former wastelands and shrinking our carbon emissions in the process.
Forests are natural carbon sinks. The problem is we just don’t have enough trees with roots in the ground to offset our emissions. So that at least means the mission is simple: Plant more trees, and plant more trees fast.
Terraformation is targeting the main barriers to successful reforesting: Through early research and pilots it says it’s identified three key bottlenecks to large-scale forest restoration — namely, land availability, freshwater and seed. It then seeks to address each of these pinch-points to viable reforesting — identifying and fashioning modular, sharable solutions (tools, techniques, training etc.) that can help shave off friction and build leafy, branching success.
How a Simple Arithmetic Puzzle Can Guide Discovery
Playing with numbers can lead to deep mathematical and scientific insights.
https://www.quantamagazine.org/how-a-simple-math-puzzle-can-guide-discovery-20210528/
In the April Insights puzzle, I tried to guide readers down a path that might be best described as “experimental mathematics.” The goal was to rediscover two constants by iterating simple arithmetic procedures. Readers found that the procedures ended in a repeating cycle — either in a single number (a “cycle” of one) or in a cycle of two or more numbers. The first constant, 6174, was discovered in 1946 by the Indian mathematician D.R. Kaprekar through pen-and-paper arithmetic explorations. The second, δ = 4.6692016…, was discovered in 1975 by the mathematical physicist Mitchell Feigenbaum with the aid of an HP-65 programmable calculator.
Both of these constants remain somewhat mysterious. The first is an interesting curiosity in recreational number theory, while the second is a universal constant central to many chaotic phenomena in the real world.
Since you’ve already done the work of trying to discover these constants, let’s sit back and experience their magic.
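For readers who want to replay the first experiment, here is a minimal sketch (my own code, not from the column) of Kaprekar's routine: write the number with four digits, subtract the ascending-digit arrangement from the descending one, and repeat. Every four-digit number with at least two distinct digits reaches the fixed point 6174 within seven steps.

```python
def kaprekar_step(n: int) -> int:
    """One step of Kaprekar's routine on a zero-padded 4-digit number."""
    digits = sorted(f"{n:04d}")            # digits in ascending order
    asc = int("".join(digits))             # smallest arrangement
    desc = int("".join(reversed(digits)))  # largest arrangement
    return desc - asc

def steps_to_6174(n: int) -> int:
    """Iterations until the routine reaches the fixed point 6174.
    Requires at least two distinct digits (repdigits like 1111 collapse to 0)."""
    count = 0
    while n != 6174:
        n = kaprekar_step(n)
        count += 1
    return count

# Example trajectory: 3524 -> 3087 -> 8352 -> 6174
print(steps_to_6174(3524))   # 3
print(kaprekar_step(6174))   # 6174 is a fixed point: prints 6174
```

Running `steps_to_6174` over all of 1000–9999 (excluding repdigits) confirms the famous bound of at most seven iterations.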
Scientist sees deep meaning in black holes after Event Horizon Telescope’s triumph
https://www.universetoday.com/151425/scientist-sees-deep-meaning-in-black-holes-after-event-horizon-telescopes-triumph/
Why are black holes so alluring?
You could cite plenty of reasons: They’re matter-gobbling monsters, making them the perfect plot device for a Disney movie. They warp spacetime, demonstrating weird implications of general relativity. They’re so massive that inside a boundary known as the event horizon, nothing — not even light — can escape its gravitational grip.
But perhaps the most intriguing feature of black holes is their sheer mystery. Because of the rules of relativity, no one can report what happens inside the boundaries of a black hole.
“We could experience all the crazy stuff that’s going on inside a black hole, but we’d never be able to tell anybody,” radio astronomer Heino Falcke said. “We want to know what’s going on there, but we can’t.”
Falcke and his colleagues in the international Event Horizon Telescope project lifted the veil just a bit two years ago when they released the first picture ever taken of a supermassive black hole’s shadow. But the enduring mystery is a major theme in Falcke’s new book about the EHT quest, “Light in the Darkness: Black Holes, the Universe, and Us” — and in the latest installment of the Fiction Science podcast, which focuses on the intersection of fact and science fiction.
The Mystery at the Heart of Physics That Only Math Can Solve
https://www.quantamagazine.org/the-mystery-at-the-heart-of-physics-that-only-math-can-solve-20210610/
Over the past century, quantum field theory has proved to be the single most sweeping and successful physical theory ever invented. It is an umbrella term that encompasses many specific quantum field theories — the way “shape” covers specific examples like the square and the circle. The most prominent of these theories is known as the Standard Model, and it is this framework of physics that has been so successful.
“It can explain at a fundamental level literally every single experiment that we’ve ever done,” said David Tong, a physicist at the University of Cambridge.
But quantum field theory, or QFT, is indisputably incomplete. Neither physicists nor mathematicians know exactly what makes a quantum field theory a quantum field theory. They have glimpses of the full picture, but they can’t yet make it out.
“There are various indications that there could be a better way of thinking about QFT,” said Nathan Seiberg, a physicist at the Institute for Advanced Study. “It feels like it’s an animal you can touch from many places, but you don’t quite see the whole animal.”
Mathematics, which requires internal consistency and attention to every last detail, is the language that might make QFT whole. If mathematics can learn how to describe QFT with the same rigor with which it characterizes well-established mathematical objects, a more complete picture of the physical world will likely come along for the ride.
“If you really understood quantum field theory in a proper mathematical way, this would give us answers to many open physics problems, perhaps even including the quantization of gravity,” said Robbert Dijkgraaf, director of the Institute for Advanced Study (and a regular columnist for Quanta).
iep.utm.edu/simplici/
Simplicity in the Philosophy of Science
The view that simplicity is a virtue in scientific theories and that, other things being equal, simpler theories should be preferred to more complex ones has been widely advocated in the history of science and philosophy, and it remains widely held by modern scientists and philosophers of science. It often goes by the name of “Ockham’s Razor.” The claim is that simplicity ought to be one of the key criteria for evaluating and choosing between rival theories, alongside criteria such as consistency with the data and coherence with accepted background theories. Simplicity, in this sense, is often understood ontologically, in terms of how simple a theory represents nature as being—for example, a theory might be said to be simpler than another if it posits the existence of fewer entities, causes, or processes in nature in order to account for the empirical data. However, simplicity can also be understood in terms of various features of how theories go about explaining nature—for example, a theory might be said to be simpler than another if it contains fewer adjustable parameters, if it invokes fewer extraneous assumptions, or if it provides a more unified explanation of the data.
Preferences for simpler theories are widely thought to have played a central role in many important episodes in the history of science. Simplicity considerations are also regarded as integral to many of the standard methods that scientists use for inferring hypotheses from empirical data, the most common illustration of this being the practice of curve-fitting. Indeed, some philosophers have argued that a systematic bias towards simpler theories and hypotheses is a fundamental component of inductive reasoning quite generally.
However, though the legitimacy of choosing between rival scientific theories on grounds of simplicity is frequently taken for granted, or viewed as self-evident, this practice raises a number of very difficult philosophical problems. A common concern is that notions of simplicity appear vague, and judgments about the relative simplicity of particular theories appear irredeemably subjective. Thus, one problem is to explain more precisely what it is for theories to be simpler than others and how, if at all, the relative simplicity of theories can be objectively measured. In addition, even if we can get clearer about what simplicity is and how it is to be measured, there remains the problem of explaining what justification, if any, can be provided for choosing between rival scientific theories on grounds of simplicity. For instance, do we have any reason for thinking that simpler theories are more likely to be true?
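The curve-fitting illustration mentioned above can be made concrete with a small sketch (my own, with made-up numbers): five training points are generated from a straight line plus alternating noise, then fitted both with a two-parameter least-squares line and with the degree-4 polynomial that interpolates every point exactly. The flexible model reproduces the training data perfectly yet predicts a held-out point worse, which is the standard motivation for preferring the simpler hypothesis, other things being equal.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (the 'simple' hypothesis)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def lagrange(xs, ys, x):
    """Degree n-1 interpolating polynomial (the 'complex' hypothesis):
    it passes through every training point exactly."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Five training points from y = 2x + 1 plus alternating "noise" of +-0.5
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.5, 2.5, 5.5, 6.5, 9.5]

a, b = fit_line(xs, ys)
x_test, y_true = 3.5, 2 * 3.5 + 1                  # held-out point, true y = 8.0
line_err = abs((a * x_test + b) - y_true)          # about 0.10
poly_err = abs(lagrange(xs, ys, x_test) - y_true)  # about 0.81
print(line_err, poly_err)
```

The interpolating polynomial has zero training error but roughly eight times the prediction error of the straight line at the held-out point: the extra parameters were spent fitting noise.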
Science is "full of secrets" and cannot dispense with them, as V. Bibler writes in the following convincing words: "When I think, I formulate with equal force both what I understand in the object and what I fundamentally do not understand in it, what remains an original mystery for me, something beyond the limits of my understanding.
The concept of an object includes both what is understandable in it (the object of knowledge as an idealized object) and what is incomprehensible (the object of knowledge as the impossibility of an idealized object). Only when both are present do we have a concept, not a term. 'I know what I do not know' - the emphasis here is different from that of Socrates' aphorism. But it was Socrates' aphorism that marked the beginning of 'dialogics' as the only real logic of thinking.
And the point here is not the trivial claim that 'a scientist should know what exactly he does not know about the subject, what he still has to learn, understand and find out'... More precisely, not only that. This (trivial) formula must conceal, and in fact always conceals, a different, paradoxical meaning: knowledge of an object includes its non-knowledge; there is knowledge of this object as not included in my knowledge, knowledge of its existence outside my knowledge, of its illogicality in the light of the logic I am actually developing."
http://lc.kubagro.ru/aidos/Works_on_identification_presentation_and_use_of_knowledge.htm
Eugene Veniaminovich Lutsenko
Boris Michailovich Menin
As Eugene probably knows from his professional work, understanding someone else's reality as a system that can be replicated in an automated system is an art form. One is successful only when the other person you have been modelling approves the output from your system. When the system works is the only reality!
As an art, the work of modelling systems can rarely be appreciated by others who do not have the perspective or expertise. They see that something works, but they do not know, or care, how.
But in doing this the analyst gains an understanding of reality, and of how we use it, that other people do not have. Esoteric knowledge is what it is. It is secret because others cannot see it, their lives being dependent on a different perspective.
It is basically the ability to hold in one's mind a combination of different realities and to switch smoothly from one to the other.
L Kurt Engelhart!
Most people mistake their own and others' models for reality itself. But there are people, and this discussion confirms it, who understand that we are always dealing only with models of reality. Usually these are the people who develop such models. These models can be more or less successful, have different areas of applicability, and so on. The development of society, technology and consciousness leads to an increase in the quality of these models of reality and to growth in their number.
Dear Eugene Veniaminovich Lutsenko , Sergey P. Klykov
and L Kurt Engelhart , I am proud to inform you that a final version of my article has been published by JASIST:
Construction of a model as an information channel between the physical phenomenon and observer
Boris Menin
I have the right to post on RG only the first page (attached).
Following the restrictions of the Wiley Rights Department, I can send the full version only by private mail (if you want it :-) ).
Boris Menin!
A very good (wonderful) title for the article. Probably the content corresponds to the title. In this regard, I advise you to look at my article: UNIVERSAL INFORMATION VARIATIONAL LAW OF SYSTEM DEVELOPMENT: https://www.researchgate.net/publication/331501836
Boris! Your article relates to the theory of knowledge. I also have several articles in this area, an (incomplete) list of which I quote below. Pay attention to the article: Lutsenko E. V. System generalization of the Ashby principle and increasing the level of systemicity of the model of the object of cognition as a necessary condition for the adequacy of the process of its cognition / E. V. Lutsenko // Polythematic network electronic scientific Journal of the Kuban State Agrarian University (Scientific Journal of KubGAU) [Electronic resource]. - Krasnodar: KubGAU, 2020. – №09(163). P. 100-134. - IDA [article ID]: 1632009009. - Access mode: http://ej.kubagro.ru/2020/09/pdf/09.pdf, 2,188 y. p. l.
The works of Prof. E. V. Lutsenko & Co. on the identification, presentation and use of knowledge, logic and methodology of scientific cognition: http://lc.kubagro.ru/aidos/Works_on_identification_presentation_and_use_of_knowledge.htm
All these works are on RG, but it would take a long time to find the links there, so I give links to the place of their publication.
I think we are not exploring reality itself, but only our models of reality, which we most often mistakenly and wrongly take for reality. This also applies to ourselves, i.e. our ideas about ourselves. These models become more and more adequate as the form of consciousness increases. Different forms of consciousness are supported or limited by different structures (bodies). These bodies have various informational possibilities of interaction with the surrounding and internal world. This imposes restrictions on the models of reality created under these forms of consciousness. The true model of reality is the limit to which the models of reality created under various forms of consciousness strive with an unlimited increase in the level of consciousness.
The theory of the expanding universe is a model based on the current state of science. Look at how science has changed in 200 years. How seriously do you take the point of view of scientists from 200 years ago today? You almost completely ignore their opinion. I think the same fate awaits the theory of the expanding universe. As an option, I can offer the point of view that the Universe is indeed expanding, but only on the screen of a world supercomputer, in one of thousands of simulations that God has called "Your Universe" (in the sense of ours). And we, too, are just images with artificial intelligence in this simulation.
You say everything correctly and quite competently, but from today's point of view. You point out the correspondence principle that new theories must satisfy. There is also the principle of relativity, which says that the laws of physics do not change under a transition to another frame of reference, to another place, at another time or in different directions (symmetries, Noether's theorems, etc.). But this holds under one very significant condition: that the particular theory is adequate in its field. And it may not be adequate at all. Maybe the redshift is an illusion of consciousness. You may object that it is observed independently by many scientists and is therefore objective (the principle of observability). And I will tell you that this then amounts to a mass hallucination; it even looks like a pandemic. And what do you say to that? No one has yet managed to logically refute subjective idealism (a prominent representative is Bishop Berkeley). The correspondence principle is the most important methodological principle of cognition. From the point of view of philosophy, it says that a new, more general theory should not deny the previous ones, but sublate them in the sense of Hegel.
But here it must be borne in mind that the areas of adequacy of different theories can be related like sets in set theory: they may coincide, partially coincide or not coincide at all, and one may include the other completely or partially. The correspondence principle is true only for the case when the area of adequacy of the new theory completely includes the area of adequacy of the previous theory. The consistent application of the correspondence principle leads to a spiral of cognition based on Hegel's law of the negation of the negation (thesis - antithesis - synthesis). It turns out that the new theory may be more like its grandmother than its mother. An example is the sequence from the caloric theory of heat to the kinetic theory of heat to the quantum theory of heat with sound quasiparticles, phonons. From the modern standpoint, the caloric theory of heat is in a sense more correct than the kinetic theory: heat is a quantum quasi-liquid consisting of phonons. The old theory may also be wrong in general, in which case it will not be part of the new theory in any form. For example, the theory of a flat Earth standing on three whales (turtles or elephants) was not included in the new theory, which says that the Earth is a ball flying in unsupported outer space. The new theory does not settle the question of what is more true: whales, turtles or elephants. There is nothing about them in the new theory at all. The same applies to the ether in SR. And by the way, the Earth still turned out to be at the center of the universe, but not in the sense in which it was thought before.
Lutsenko E. V., Loiko V. I., Laptev V. N. Systems of knowledge representation and acquisition: textbook. Krasnodar: Ekoinvest, 2018. 513 p. ISBN 978-5-94215-415-8. https://elibrary.ru/item.asp?id=35641755
Orlov A. I., Lutsenko E. V. System fuzzy interval mathematics: monograph. Krasnodar: KubGAU, 2014. 600 p. ISBN 978-5-94672-757-0. http://elibrary.ru/item.asp?id=21358220
Lutsenko E. V. Mathematical and numerical modeling of the dynamics of the probability density of human consciousness states in evolution using the theory of Markov random processes // Polythematic Network Electronic Scientific Journal of the Kuban State Agrarian University (Scientific Journal of KubGAU). 2005. No. 07(015). P. 59-76. IDA [article ID]: 0150507004. http://ej.kubagro.ru/2005/07/pdf/04.pdf
Lutsenko E. V., Loiko V. I., Makarevich O. A. Automated technologies of knowledge management in an agro-industrial holding // Scientific Journal of KubGAU. 2009. No. 08(052). P. 98-109. Information register code: 0420900012\0088. IDA [article ID]: 0520908007. http://ej.kubagro.ru/2009/08/pdf/07.pdf
Lutsenko E. V., Korzhakov V. E., Ladyga A. I. Intellectual consulting system for identifying technological knowledge and making decisions on its effective application based on system-cognitive analysis of business processes // Scientific Journal of KubGAU. 2010. No. 05(059). P. 79-110. Information register code: 0421000012\0091. IDA [article ID]: 0591005007. http://ej.kubagro.ru/2010/05/pdf/07.pdf
Lutsenko E. V., Trunev A. P., Trunev E. A. Development of the intellectual system "Eidos-astra", which removes restrictions on the dimension of knowledge bases and the resolution of cognitive functions // Scientific Journal of KubGAU. 2011. No. 05(069). P. 353-377. Information register code: 0421100012\0159. IDA [article ID]: 0691105031. http://ej.kubagro.ru/2011/05/pdf/31.pdf
Lutsenko E. V. Methodological aspects of identifying, presenting and using knowledge in ASK-analysis and the intellectual system "Eidos" // Scientific Journal of KubGAU. 2011. No. 06(070). P. 233-280. Information register code: 0421100012\0197. IDA [article ID]: 0701106018. http://ej.kubagro.ru/2011/06/pdf/18.pdf
Lutsenko E. V., Korzhakov V. E. Method of cognitive clustering, or clustering based on knowledge (clustering in system-cognitive analysis and the intellectual system "Eidos") // Scientific Journal of KubGAU. 2011. No. 07(071). P. 528-576. Information register code: 0421100012\0253. IDA [article ID]: 0711107040. http://ej.kubagro.ru/2011/07/pdf/40.pdf
Lutsenko E. V. Universal information variational principle of systems development // Scientific Journal of KubGAU. 2008. No. 07(041). P. 117-193. Information register code: 0420800012\0091. IDA [article ID]: 0410807010. http://ej.kubagro.ru/2008/07/pdf/10.pdf
Lutsenko E. V. Do socio-economic phenomena obey some analogs or generalizations of the principle of relativity of Galileo and Einstein, and are the Noether theorem and conservation laws fulfilled for them? // Scientific Journal of KubGAU. 2013. No. 07(091). P. 219-254. IDA [article ID]: 0911307014. http://ej.kubagro.ru/2013/07/pdf/14.pdf
Lutsenko E. V. Formation of subjective (virtual) models of physical and social reality by human consciousness and unjustified giving them an ontological status (hypostasis) // Scientific Journal of KubGAU. 2015. No. 09(113). P. 1-32. IDA [article ID]: 1131509001. http://ej.kubagro.ru/2015/09/pdf/01.pdf
Lutsenko E. V. Principles and prospects of correct meaningful interpretation of subjective (virtual) models of physical and social reality formed by human consciousness // Scientific Journal of KubGAU. 2016. No. 01(115). P. 22-75. IDA [article ID]: 1151601003. http://ej.kubagro.ru/2016/01/pdf/03.pdf
Lutsenko E. V. Problems and prospects of the theory and methodology of scientific cognition and automated system-cognitive analysis as an automated method of scientific cognition that provides meaningful phenomenological modeling // Scientific Journal of KubGAU. 2017. No. 03(127). P. 1-60. IDA [article ID]: 1271703001. http://ej.kubagro.ru/2017/03/pdf/01.pdf
Lutsenko E. V., Pechurina E. K., Sergeev A. E. Cognitive veterinary medicine – veterinary medicine of the digital society: definition of basic concepts // Scientific Journal of KubGAU. 2019. No. 08(152). P. 141-199. IDA [article ID]: 1521908015. http://ej.kubagro.ru/2019/08/pdf/15.pdf
Lutsenko E. V. About the higher forms of consciousness, the prospects of man, technology and society. http://lc.kubagro.ru/aidos/LC_young-3/LC_young-3.pdf
Lutsenko E. V. Total lie as a strategic information weapon of society in the period of globalization and augmented reality (is the principle of observability applicable in modern society as a criterion of reality?) // Scientific Journal of KubGAU. 2014. No. 07(101). P. 1410-1427. IDA [article ID]: 1011407091. http://ej.kubagro.ru/2014/07/pdf/91.pdf
Lutsenko E. V., Pechurina E. K., Sergeev A. E. Complete automated system-cognitive analysis of the periodic criteria classification of forms of consciousness // Scientific Journal of KubGAU. 2020. No. 05(159). P. 22-93. IDA [article ID]: 1592005003. http://ej.kubagro.ru/2020/05/pdf/03.pdf
Lutsenko E. V. Existence, non-existence and change as emergent properties of systems // Quantum Magic. 2008. Vol. 5, issue 1. P. 1215-1239. http://quantmagic.narod.ru/volumes/VOL512008/p1215.html
Lutsenko E. V. System generalization of the Ashby principle and increasing the level of consistency of the model of the object of cognition as a necessary condition for the adequacy of the process of its cognition // Scientific Journal of KubGAU. 2020. No. 09(163). P. 100-134. IDA [article ID]: 1632009009. http://ej.kubagro.ru/2020/09/pdf/09.pdf
Lutsenko E. V. Efficiency of the management object as its emergent property and increasing the level of consistency as the goal of management // Scientific Journal of KubGAU. 2021. No. 01(165). P. 77-98. IDA [article ID]: 1652101009. http://ej.kubagro.ru/2021/01/pdf/09.pdf
Dear Prof. Eugene Veniaminovich Lutsenko
I read your article with a strong interest.
The examples of applying your UNIVERSAL INFORMATION VARIATIONAL LAW are impressive. Really, I follow your publications.
I am not sure that I can make a deep analysis of your achievements.
At the same time, I have got at least two remarks:
1. You mentioned that until now a link between energy and information is not discovered. At this point I do not agree with you.
In the scientific community, the prevailing viewpoint is that information is immaterial and does not have mass (Burgin, 2010); however, at the same time, information may be a specific invisible substance. Landauer's principle
showed that information is physical in nature (Landauer, 1991). It is highly probable that the new principle of “mass–energy–information equivalence” will also be true, and information does have mass. This perspective is shared by a number of scientists (Lloyd, 2000; Vopson, 2019).
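As a back-of-the-envelope illustration of the two claims above (the Boltzmann constant and the speed of light are exact SI values; the mass-per-bit formula is Vopson's conjecture, not established physics, and the script itself is only my sketch):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
c = 2.99792458e8    # speed of light, m/s (exact SI value)
T = 300.0           # room temperature, K

# Landauer limit: minimum energy dissipated when one bit is erased
E_bit = k_B * T * math.log(2)   # about 2.87e-21 J at 300 K

# Vopson's conjectured rest mass of one stored bit: m = k_B * T * ln(2) / c^2
m_bit = E_bit / c**2            # about 3.2e-38 kg

print(f"Landauer energy per bit at {T} K: {E_bit:.3e} J")
print(f"Conjectured mass per bit:         {m_bit:.3e} kg")
```

The point of the sketch is only that, if the conjecture holds, the mass carried by a bit at room temperature is some 37 orders of magnitude below a kilogram, which is why it has not been detected experimentally.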
2. In my view, the best way to verify the real power of a new idea like yours is to apply (compare) your method to the CODATA data on the results of different methods of measuring the values of the fundamental constants, such as G, H0, k, h. If you can do this, it will be great!
3. As I suggest to every researcher (if somebody asks me :-) ), it is recommended to "push your line" and try to publish the idea in a prestigious journal. RG is not a good platform to check the scientific level of an idea.
In any case, I wish you success in your research way.
Best regards
Boris Menin!
Thank you for your support! I have an article: Lutsenko E. V. Efficiency of the management object as its emergent property and increasing the level of consistency as the goal of management // Scientific Journal of KubGAU. 2021. No. 01(165). P. 77-98. IDA [article ID]: 1652101009. http://ej.kubagro.ru/2021/01/pdf/09.pdf. In this article I try to reveal the mechanism of management efficiency. To do this, I apply the basics of the information theory of systems, to which I have made a contribution: http://lc.kubagro.ru/aidos/Work_on_emergence.htm.
PS
For us, publication in reputable journals indexed in Scopus and WoS takes years, and it requires an advance payment in an amount exceeding our wages. I have only a few such publications in my entire life:
RSCI: https://www.elibrary.ru/author_profile.asp?id=123162
Scopus: https://www.scopus.com/authid/detail.uri?authorId=57188763047
Web of Science: https://publons.com/researcher/1596347/eugene-lutsenko/
Web of Science ResearcherID S-8667-2018
ORCID: https://orcid.org/0000-0002-2742-0502
E-mail: [email protected]
Home URL: http://lc.kubagro.ru/
Personal Blog: https://www.researchgate.net/profile/Eugene-Lutsenko
Skype: eugene_lutsenko
https://www.youtube.com/channel/UC_QF84d8SCaWxsnXnexNFzg
Personal pages on university websites and in the RAE ("Famous Scientists" portal of the Russian Academy of Natural History):
https://kubsau.ru/education/chairs/comp-system/staff/3965/
https://kubsu.ru/ru/public-portfolio/39926
http://www.famous-scientists.ru/17314/
http://www.famous-scientists.ru/school/1608
Dear Eugene Veniaminovich Lutsenko
Thank you very much for this very interesting article.
Unfortunately, for other participants it is not readable: it is in Russian :-)
Hello, Boris! There are many interesting things written in Russian, as in other languages too, except for the language of the Tibia Coast (and that only because such a language, like the coast itself, simply does not exist). In any case, there are translators for this, including automated and online ones.
Dear Eugene Veniaminovich Lutsenko
I would like to agree with you, BUT
most of the participants have no patience and desire to bother translating articles :-) !!!
As I already told YOU, the best way is to find money and publish an article (after peer review) in a prestigious journal.
Otherwise: there is no prophet in his own country !!!
The importance of uncertainty
But sometimes an idea makes it further than that. Much of the work scientists put into publishing a scientific result involves figuring out how well they know it: What’s the uncertainty and how do we quantify it?
“If there’s any hallmark to the scientific method in particle physics and in closely related fields like cosmology, it’s that our results always come with an error bar. A result that doesn’t have an uncertainty attached to it has no value.”
In a particle physics experiment, some uncertainty comes from background, like the data Narain’s group found that mimicked the kind of signal they were looking for from the top quark.
This is called systematic uncertainty, which is typically introduced by aspects of the experiment that cannot be completely known.
“When you build a detector, you must make sure that for whatever signal you’re going to see, there is not much possibility to confuse it with the background. All the elements and sensors and electronics are designed having that in mind. You have to use your previous knowledge from all the experiments that came before.”
Careful study of your systematic uncertainties is the best way to eliminate bias and get reliable results.
“If you underestimate your systematic uncertainty, then you can overestimate the significance of the signal. But if you overestimate the systematic uncertainty, then you can kill your signal. So you really are walking this fine line in understanding where the issues may be. There are various ways the data can fool you. Trying to be aware of those ways is an art in itself, and it really defines the thinking process.”
Physicists also must think about statistical uncertainty which, unlike systematic uncertainty, is simply the consequence of having a limited amount of data.
“For every measurement we do, there’s a possibility that the measurement is a wrong measurement just because of all the events that happen at random while we are doing the experiment,” Takai says. “In particle physics, you’re producing many particles, so a lot of these particles may conspire and make it appear like the event you’re looking for.”
You can think of it as putting your hand inside a bag of M&Ms. If the first few M&Ms you picked were brown and you didn’t know there were other colors, you would think the entire bag was brown. It wouldn’t be until you finally pulled out a blue M&M that you realized that the bag had more than one color.
Particle physicists generally want their results to have a statistical significance corresponding to at least 5 sigma, a measure that means that there is only a 0.00003 percent chance of a statistical fluctuation giving an excess as big or bigger than the one observed.
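The quoted 0.00003 percent can be checked directly: it is the one-sided tail probability of a standard normal distribution beyond 5 sigma. A minimal sketch using only the standard library:

```python
import math

def one_sided_tail(n_sigma: float) -> float:
    """P(X > n_sigma) for a standard normal random variable X."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

p5 = one_sided_tail(5.0)
print(f"5-sigma one-sided tail probability: {p5:.3e} ({100 * p5:.7f} %)")
```

The result is about 2.87e-7, i.e. roughly 0.00003 percent, matching the figure in the text.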
It's easy to just write down a set of equations and say, "I think the conventional equations are wrong, and my new ones are right." Anyone can do this. For most of them, the error can be found on the first page, because there is an experiment or technology that could not work if the new equations were correct.
It is extremely difficult to invent equations that fit all the experiments done and the technologies invented. This is the high standard of science and nature.
Maybe it will be interesting for you???
Preprint Informational restrictions in the formulation of physical la...
Dear prof. Eugene Veniaminovich Lutsenko
I strongly suggest visiting the website of the upcoming conference
Theoretical and Foundational Problems (TFP) in Information Studies
and maybe participating.
https://tfpis.com/
It does not require any fee!
Someday we may face a situation in which Einstein's equations will not work, and they themselves will have to be extended.
Perhaps the first hints of this will appear in experiments. Or through the realization of theoretical inconsistencies.
But so far the SRT equations proposed by Einstein to describe E, p, m and v for objects moving without the influence of external forces (as well as the speed limit contained in these equations: no object can, from the point of view of an outside observer, move faster than c) work without any conflicts.
Hello, Boris Menin! I looked at the conference website. I understand that I have worked on the same problems to which the conference is dedicated, and I have something new to say in all sections. But there are a few problems. The first problem is that I don't speak English well enough to communicate at a conference in real time. The second problem: all the key dates have already passed.
Dear Eugene Veniaminovich Lutsenko
It is a pity :-)
In any case, you can ask Prof. Mark Burgin, he speaks Russian...
There is a mail address there...
Here is our idea to improve modeling: There are a lot of different ways to minimize the residual sum of squares, but using it for curve fitting means that your data must not suffer from systematic errors. Using a loss function which assures that the correlations within a curve or a series of curves are preserved, allows you to correct such systematic errors. See: Article Hybrid 2D Correlation-Based Loss Function for the Correction...
Thomas Mayerhöfer
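The idea of a correlation-preserving loss can be illustrated generically. The following is only a minimal sketch of adding a correlation term to the residual sum of squares, not the actual hybrid 2D-correlation loss of the cited article; the function names and data are invented for illustration:

```python
import numpy as np

def rss(y_pred, y_obs):
    """Plain residual sum of squares."""
    return np.sum((y_pred - y_obs) ** 2)

def corr_penalized_loss(y_pred, y_obs, alpha=1.0):
    """Toy hybrid loss: RSS plus a penalty for loss of linear
    correlation between prediction and observation."""
    r = np.corrcoef(y_pred, y_obs)[0, 1]
    return rss(y_pred, y_obs) + alpha * (1.0 - r)

x = np.linspace(0.0, 1.0, 50)
y_model = 2.0 * x + 1.0   # curve predicted by the model
y_obs = y_model + 0.1     # observations with a constant systematic offset

loss = corr_penalized_loss(y_model, y_obs)
print(loss)  # RSS term dominates (50 * 0.1**2 = 0.5); correlation term is ~0
```

With a purely constant systematic offset, the correlation between prediction and observation stays at 1, so the correlation term contributes nothing; a loss that weights the correlation term more heavily than the RSS would therefore tolerate such systematic errors instead of distorting the fitted curve to absorb them.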
In my opinion, rational wholeness is the only criterion useful for evaluating a model of a system.
Your 2D method for correcting systematic errors seems to fall into improving what I call the "probability of the model's success."
Thesis Metamodel for human systems'
Thanks,
Kurt
L Kurt Engelhart Actually, you can use 2D correlation analysis even to evaluate whether a model is appropriate. This is why we think it is particularly useful as a loss function in neural networks.
All the best,
Thomas
Maybe this is a bit of diamond dust to improve measurements?
http://www.physicsjournal.net/article/view/30/3-3-11
Dear Koen Van de Moortel
Thank you for your link. I read your article.
Regarding your new version of ‘least squares regression’.
The least-squares approach is widely used in the CODATA methodology for establishing the best values of the fundamental physical constants.
In my view, at this moment, CODATA's version is carefully verified.
That is why I suggest checking your version on measured values of the physical constants.
If it works, it will be a real achievement!
What do you think?
BR
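For readers unfamiliar with how least squares enters the CODATA adjustments: at its simplest, combining independent measurements by inverse-variance weighting is already a least-squares estimate. This is only a toy sketch with invented numbers; the real CODATA adjustment is a large, correlated, multivariate fit:

```python
import numpy as np

# Hypothetical independent measurements of one constant, with 1-sigma uncertainties
values = np.array([6.67430e-11, 6.67384e-11, 6.67554e-11])
sigmas = np.array([0.00015e-11, 0.00080e-11, 0.00016e-11])

w = 1.0 / sigmas**2                     # inverse-variance weights
best = np.sum(w * values) / np.sum(w)   # weighted mean = least-squares estimate
u = 1.0 / np.sqrt(np.sum(w))            # standard uncertainty of the estimate

print(f"adjusted value: {best:.5e} +/- {u:.1e}")
```

The combined uncertainty is always smaller than that of the best single measurement, which is the statistical payoff of the adjustment.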
Dear Thomas Mayerhöfer and L Kurt Engelhart,
As I mentioned in a previous answer to Koen Van de Moortel, at this moment any method for selecting the best model can and should be checked against the CODATA approach.
This is explained by the fact that the CODATA scientists have achieved great success in establishing the exact values of the physical constants.
Be healthy!
Dear Boris Michailovich Menin, I did check my multidirectional least squares by determining the absolute zero temperature (Gay-Lussac experiment), and the result was better than with classical OLS. In all the experiments I did, I got better results.
Dear Koen Van de Moortel
Thank you for your answer.
Everyone hears what he sees
Everyone sees what he wants...
Boris Michailovich Menin, yes, the examples don't prove much by themselves, but the fact that with my method you can switch T and p and get the same result is the most important thing. That's how it should be theoretically.
Now, what do you suggest I should look at on codata.org?
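The asymmetry of classical OLS that motivates a symmetric ("multidirectional") fit can be demonstrated on synthetic Gay-Lussac-like data. This sketch is not Van de Moortel's actual method; it only shows that swapping the variables changes the ordinary least-squares slope, while a classical symmetric alternative, the geometric-mean slope, is swap-invariant (all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
T = np.linspace(0.0, 100.0, 20)                        # temperature, deg C (synthetic)
p = 1.0 + 0.00366 * T + rng.normal(0.0, 0.01, T.size)  # noisy Gay-Lussac-like pressures

# Classical OLS slope of p on T:
b_pT = np.polyfit(T, p, 1)[0]
# Swap the roles of the variables, fit T on p, and invert the slope:
b_Tp = 1.0 / np.polyfit(p, T, 1)[0]

# The two estimates differ: ordinary least squares is not symmetric in x and y.
print(b_pT, b_Tp)

# Geometric-mean (reduced major axis) slope, identical under swapping:
b_sym = np.sign(b_pT) * np.sqrt(b_pT * b_Tp)
print(b_sym)
```

With noisy data the two OLS slopes differ by roughly the factor 1/r^2, so the bias grows as the scatter grows; any estimator claimed to be "multidirectional" should at minimum pass the swap-invariance check that b_sym passes here.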
Dear Koen Van de Moortel
I suggest options according to my experience, knowledge and intuition:
1. Take data from measurements of some physical constant to check your method. I did the same when I applied my information approach to measurements of Planck's constant, the gravitational constant, Boltzmann's constant, and the Hubble constant.
2. Try to prepare your manuscript for a prestigious journal and get its reviewers' comments.
3. RG is not a good platform for professional critique.
For me, the diamond idea is the idea of a systematic generalization of mathematics: http://lc.kubagro.ru/aidos/Work_on_emergence.htm
Dear Prof. Eugene Veniaminovich Lutsenko
Thank you for your link.... BUT
If you don't praise yourself, who will notice?
:-) :-) :-)
May be you are right...
I am not your judge
Boris Menin! I'll continue in my terrible style. I, like many researchers who consciously conduct research, believe that we do not know the object of knowledge directly, as it is. We are only building a model of the object of knowledge. Then we get to know the object of cognition by examining its model. And then, if we do well, i.e. the model demonstrates high reliability, we make a terrible methodological mistake in the process of cognition: we begin to seriously believe that our model of the object of cognition is the object of cognition itself (reality itself). This error is called hypostasis: http://lc.kubagro.ru/aidos/Works_on_identification_presentation_and_use_of_knowledge.htm
Dear Eugene Veniaminovich Lutsenko
I do not understand you, or I do not agree with you:
As far as I know (it is my feeling), no known scientist has declared that "the object of cognition is the reality itself".
Each step in the cognition of Nature is not its final description.
Using any model, the observer cuts many links which seem to him less important. At the same time, there are unknown unknowns which will be discovered in the future.
There is no end to this process...
Boris Menin! They didn't talk about it just because they thought it was already clear and there was no need to say anything more about it. But they thought that they were exploring reality itself and that their models were reality itself. This is evident from their statements about the non-existence or impossibility of something. For example, about the impossibility of a perpetual motion machine or aircraft heavier than air, or meteorites. They claimed that all this is impossible and does not exist. At the same time, it was meant that this is impossible and does not exist in reality itself. But in fact it was only in their models of this reality. Moreover, the models are not quite adequate. This is called hypostasis and proves that I am right, not you.
Dear Eugene Veniaminovich Lutsenko
It is a pity that we do not understand each other.
You write about something for which there is no evidence.
You credit yourself with a proof that never existed.
Well - the flag is in your hands !!!
Boris Menin! In this case, the proof is logic. If a person denies that something is possible in reality, and then it turns out to be possible, then he had an incorrect model of reality, which he mistakenly took for reality itself. What's not clear here? I don't understand:)
Dear prof. Eugene Veniaminovich Lutsenko
You attribute to scientists what they did not say and call logic to help.
In this case it doesn't work because you are following YOUR logic.
Let me remind you of a childhood joke...
Petya stands in the school corridor and cries.
The director passes and asks him why he is standing in the corridor.
The boy replies:
"I farted in class. The teacher kicked me out of class. So I am in the hallway, and the fart is in the classroom... Where is the logic?"
Scientists have not said what I attribute to them, simply because they do not understand it. Just like you don't understand it and many others. But they behaved in such a way that it can be logically explained the way I explained it. And what are your explanations for the fact that they consider something impossible, and then it turns out to be possible? By the way, atoms don't say anything either, but they behave in a certain way and physicists explain why they behave this way and not otherwise.
P.s.
1. Not all scientists keep silent about it: I, for example, am talking about it.
2. Maybe you even think that when you look around, you see reality itself?
Eugene Veniaminovich Lutsenko
You observe the fact of most people accepting their reality as the thing in itself, because it is just easier that way!
This is a valid hypothesis as observed from the evidence.
Kurt
Dear L Kurt Engelhart and Eugene Veniaminovich Lutsenko Koen Van de Moortel
Our comments are real proof that in science there are many different interesting opinions.
Only time can confirm who is right!
Time will hardly help here... There are creatures (species) that have remained virtually unchanged for millions of years. They felt great even under the dinosaurs. And now they also feel great in our modern apartments (mainly in the kitchens). And I am sure they will feel great millions of years after the disappearance of mankind from the face of the Earth. And you can't prove anything to them, because they don't listen. And if they listen, they don't understand.
Dear Eugene Veniaminovich Lutsenko, Ali J. Abboud, L Kurt Engelhart, Koen Van de Moortel and Thomas Mayerhöfer,
It is a very interesting paper for your consideration...
Preprint Estimation of the information contained in the visible matte...
Melvin M. Vopson
The information capacity of the universe has been a topic of great debate since the 1970s and continues to stimulate multiple branches of physics research. Here we used Shannon's information theory to estimate the amount of encoded information in all the visible matter in the universe. We achieved this by deriving a detailed formula estimating the total number of particles in the observable universe, known as the Eddington number, and by estimating the amount of information stored by each particle about itself. We determined that each particle in the observable universe contains 1.509 bits of information and there are 6 × 10^80 bits of information stored in all the matter particles of the observable universe.
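A quick consistency check of the quoted numbers (my own arithmetic, not from the preprint): dividing the total information by the information per particle should recover a particle count of the order of the Eddington number, about 10^80:

```python
total_bits = 6e80          # total information in visible matter, as quoted
bits_per_particle = 1.509  # information per particle, as quoted

n_particles = total_bits / bits_per_particle
print(f"Implied number of particles: {n_particles:.2e}")
```

The implied count, roughly 4e80, is indeed of the order of the Eddington number, so the two quoted figures are mutually consistent.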
This would have interested my good friend Fons Wils, who recently passed away.
See e.g.: https://www.researchgate.net/project/Scientific-approach-of-self-organization-ectropy-and-coherence-in-matter
Unity formulas for the coupling constants and the dimensionless physical constants
Stergios Pellis
Preprint Unity formulas for the coupling constants and the dimensionl...
In this paper, unity formulas for the coupling constants and the dimensionless physical constants will be presented. The theoretical value of the strong coupling constant, αs = Euler's number / Gelfond's constant, is the key that solves many problems of physics. We will present the recommended theoretical value for the weak coupling constant. The formula for the fine-structure constant in terms of the golden angle, the relativity factor and the fifth power of the golden mean will be presented, as well as a simple expression for the fine-structure constant in terms of the Archimedes constant. Also presented are an exact mathematical expression for the proton to electron mass ratio using Fibonacci and Lucas numbers, two other exact mathematical expressions for the proton to electron mass ratio, and new formulas for the Planck length and Avogadro's number. We give the unity formulas that connect the fine-structure constant and the proton to electron mass ratio, the formulas that connect the strong coupling constant and the fine-structure constant, and the unity formulas that connect the strong coupling constant, the weak coupling constant and the fine-structure constant. Mathematical formulas will be presented that connect the strong coupling constant, the weak coupling constant, the proton to electron mass ratio, the fine-structure constant, the ratio of the electric force to the gravitational force between electron and proton, Avogadro's number, the gravitational coupling constant for the electron and the gravitational coupling constant of the proton. We will also find formulas for the gravitational constant. It will be shown that the gravitational fine structure constant is a simple analogy between atomic physics and cosmology. Finally, we will find the expression that connects the gravitational fine structure constant with the four coupling constants. Perhaps the gravitational fine structure constant is the coupling constant for the fifth force.
In this work we will assume the theoretical value of the strong coupling constant. This value fits perfectly in the measurement of the strong coupling constant. Also we followed the energy wave theory and the fractal space-time theory.
Informational Restrictions in the Formulation of Physical Laws by Researchers Boris Menin
https://www.mdpi.com/2504-3900/81/1/31
By combining the information-oriented and theoretically proven method with the construction of the realized SI, it is possible to formulate the accuracy limit of any physical law or formula describing the observed phenomenon. This has never been described in the literature. An example is given.
The concept of information is becoming a pillar of modern science [1]. There is great potential for modeling physical processes using the concepts and mathematical apparatus of information theory, taking into account the qualitative and quantitative sets of variables in the model. However, over the centuries it has proved difficult to choose and define a system of units for the study of natural and technological processes and phenomena. Since each variable selected from the system of units contains a finite amount of information about the object of interest [2], scientists and engineers may consider using the concept of the "amount of information" contained in the model to achieve a minimum threshold discrepancy between the model and the phenomenon or process under study. By combining the information-oriented, theoretically proven method with the construction of the realized international system of units, SI for short (the two look like unrelated branches of science), it is possible to formulate the accuracy limit of any physical law or formula describing an observed phenomenon. This has never been described in the literature. The purpose of this research article is to provide a theoretically substantiated treatment of the random choice of variables observed when formulating a model of any physical process. The article is based on the use of the basic element, the finite information quantity (FIQ) [2], and the implementation of the information method described in [3,4]. Examples are provided.
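The idea of a finite "amount of information contained in the model" can be illustrated with a toy calculation. This is my own illustration, not Menin's FIQ formula (which the cited papers [2–4] develop in terms of the SI system of quantities): if building a model amounts to choosing z variables out of N candidate quantities, the Hartley information of that choice is log2 C(N, z) bits.

```python
import math

def choice_information_bits(n_candidates: int, n_chosen: int) -> float:
    """Hartley information (in bits) of selecting an unordered subset of
    n_chosen variables out of n_candidates candidate quantities."""
    return math.log2(math.comb(n_candidates, n_chosen))

# Example: a model that records 5 variables out of 100 candidate quantities.
bits = choice_information_bits(100, 5)
print(f"{bits:.1f} bits")  # log2(C(100, 5)) = log2(75,287,520) ~ 26.2 bits
```

The point of the toy model is only that the information carried by the act of variable selection is finite and computable, which is the premise behind bounding the achievable model accuracy.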
Simplicity of Physical Laws: Informational-Theoretical Limits
BORIS M. MENIN
To assess the required simplicity of a physical law, a finite information quantity (FIQ)-based approach is proposed. The approach has proved reliable and accurate in analyzing the results of measurements of physical constants. The method is based on the idea that accounting for the finite amount of information used in the model enables one to calculate the smallest preliminary and unremovable comparative uncertainty (and, correspondingly, relative uncertainty) as a function of the qualitative and quantitative set of variables. The method does not require the constraints usually applied to the input data, and works without the numerous statistical assumptions commonly imposed: normality of the probability distributions of the data and observations, absence of outliers, etc. This paper provides researchers with a tool for analyzing the required level of simplicity of the resulting formulas. The FIQ-based approach is applied to verify the required level of simplicity of several physical laws.
RECOMMENDED resource for those looking for a concise summary of the relatively most feasible concepts for free energy:
Article Perpetual Motion Executive Summary Report 2022-11-17
For more TOE-type categories, see: https://theoryofeverything.quora.com/Useful-Excel-Files-2022-04-28