In this part, other well-known subjects are revisited. Please support or refute the following arguments in a scientific manner.
1) There is still no convincing theory, with a low range of uncertainty, for calculating the response of the climate system, in terms of globally averaged surface temperature anomalies, to the total feedback factors and to changes in greenhouse gases. In the classical formula applied in the models, a small variation in the positive feedbacks leads to a considerable change in the response (the temperature anomaly), while a large variation in the negative feedbacks causes only small variations in the response.
2) NASA satellite data from 2000 through 2011 indicate that the Earth's atmosphere is allowing far more heat to be emitted into space than computer models have predicted (e.g. Spencer and Braswell, 2011, DOI: 10.3390/rs3081603). According to this research, "the response of the climate system to an imposed radiative imbalance remains the largest source of uncertainty. It is concluded that atmospheric feedback diagnosis of the climate system remains an unsolved problem, due primarily to the inability to distinguish between radiative forcing and radiative feedback in satellite radiative budget observations." So the contribution of greenhouse gases to global warming is exaggerated in the models used by the U.N.'s Intergovernmental Panel on Climate Change (IPCC).
3) Ocean Acidification
Ocean acidification is one of the consequences of CO2 absorption by seawater and a main cause of severe destabilisation of the entire oceanic food chain.
4) The IPCC reports, which are based on a range of model outputs, suffer from a high degree of uncertainty because the models are not able to represent appropriately several large-scale natural oscillations, such as the North Atlantic Oscillation, the El Niño–Southern Oscillation, the Arctic Oscillation, the Pacific Decadal Oscillation, deep-ocean circulations, the Sun's surface temperature, etc. The problem with correlating historical observations of globally averaged surface temperature anomalies with greenhouse-gas forcing is that the correlation is not compared against all the other natural sources of temperature variability.
5) If we look at the micro-physics of carbon dioxide, theoretically a certain amount of heat can be trapped in it as increased molecular kinetic energy, through enhanced vibrational and rotational motion of the CO2 molecule, but nothing prevents that energy from escaping into space. Within a specific relaxation time, the energised carbon dioxide returns to its ground state.
6) Some alarmists claim that there exists a scientific consensus among scientists. Yet even if this claim is true, asking scientists to vote on whether global warming is caused by man-made greenhouse-gas sources does not make sense, because scientific issues are not settled by consensus; indeed, the appeal to majority/authority is a fallacy, not a scientific approach.
%%----------------------------------------------------------------------------------------------------------------------------%%
Link to the discussions of Global Warming (Part 1):
https://www.researchgate.net/post/Global_Warming_Part_1_Causes_and_consequences_of_global_warming_a_natural_phenomenon_a_political_issue_or_a_scientific_debate
Link to the discussions of Global Warming (Part 2):
https://www.researchgate.net/post/Global_Warming_Part_2_A_growing_threat_or_nothing_to_worry_about_An_effect_of_greenhouse_gases_or_a_natural_climate_change
Link to the discussions of Global Warming (Part 4):
https://www.researchgate.net/post/Global_Warming_Part_4_Causes_and_consequences_of_global_warming_a_natural_phenomenon_a_political_issue_or_a_scientific_debate
For this question, there is no simple yes/no answer.
To what extent do we trust and value the opinions of the experts on the issue of climate change? That is the issue.
My opinion is that it is due to both greenhouse gases, mostly CO2 and CH4.
You ask for scientific replies
but you do not present scientific questions, because there are no scientific references for your citations or remarks.
As for your subquestion 5: what is its relation to the full question?
CO2, after absorbing radiation, loses this energy by collisions with air molecules.
If that is the question, then this is documented in simplified schemes,
for instance here at a "denier"'s blog
http://clivebest.com/blog/?p=4265
Harry ten Brink
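A rough order-of-magnitude sketch of that point about collisional energy loss: near the surface, an excited CO2 molecule undergoes enormously many collisions before it would spontaneously re-radiate. Both time scales below are assumed, textbook-level orders of magnitude, not measurements.

# Order-of-magnitude sketch (Python): why an excited CO2 molecule near the surface
# is far more likely to hand its energy to N2/O2 by collision than to re-radiate it.
# Both time scales are assumed, textbook-level orders of magnitude.

t_collision = 1e-9   # s, mean time between molecular collisions at ~1 atm (assumed)
t_radiative = 1.0    # s, radiative lifetime of the CO2 15-micron bending mode (assumed)

print(f"collisions per radiative lifetime ~ {t_radiative / t_collision:.0e}")
# ~1e9 collisions occur before spontaneous emission would take place, so in the
# dense lower atmosphere the absorbed energy is thermalized, as stated above.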
In this subquestion, and also in the context as a whole, I would like to focus on an argument related to the role of the negative feedbacks of greenhouse gases and other tracer fields. The existing formulation/parametrisation of feedbacks, applied in almost all models, does not represent the correct response; see equations 1 and 10 in the following paper:
Article Radiation Transfer Calculations and Assessment of Global War...
Mathematically, there is a singularity for the positive feedback; that is, \Delta T increases abruptly as the radiative forcings accumulate, while the negative feedbacks, for example by water vapour, albedo, the lapse rate, clouds, or greenhouse gases, do not give a similarly strong response (as illustrated in the sketch further below). So it seems that the negative-feedback contribution is underestimated in the models.
Here is the part of this article related to this argument:
"... this calculus method suffers from the fact
(i) that the Planck sensitivity λ_S has to be adapted from other models;
(ii) that it is difficult to distinguish between different forcings with different feedbacks;
(iii) that this method only considers a radiation balance at the tropopause or TOA, but not an additional radiation and energy balance at the surface;
(iv) that it does not consider the feedback of the atmosphere to the surface and vice versa, caused by radiation changes as well as changes of sensible and latent heat;
(v) that some feedback processes and their evaluation are not really retraceable from other sources;
(vi) and that for simplicity reasons often sw feedback effects are completely neglected."
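To make the asymmetry discussed above explicit, here is a minimal numerical sketch (Python) of the standard zero-dimensional feedback relation dT = lambda_S * dF / (1 - f); the values of lambda_S and dF below are illustrative assumptions, not outputs of any particular model.

# Minimal sketch of the classical feedback formula dT = lambda_S * dF / (1 - f).
# All numbers are illustrative assumptions.

lambda_S = 0.3   # Planck sensitivity, K per (W/m^2); assumed value
dF = 3.7         # radiative forcing for doubled CO2, W/m^2; commonly quoted figure

def delta_T(f, lam=lambda_S, forcing=dF):
    """Equilibrium temperature response for a total feedback factor f (f < 1)."""
    return lam * forcing / (1.0 - f)

for f in (-1.0, -0.5, 0.0, 0.5, 0.7, 0.9):
    print(f"f = {f:+.1f}  ->  dT = {delta_T(f):.2f} K")

# The output illustrates the asymmetry: moving f from +0.5 to +0.9 (a change of 0.4)
# multiplies dT by 5, while moving f from -0.5 to -1.0 (a change of 0.5) reduces dT
# by only a factor of about 1.3; dT diverges as f approaches 1.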
See also:
R. S. Lindzen and Y.-S. Choi, “On the observational determination of climate sensitivity and its implications,” Asia-Pacific Journal of Atmospheric Sciences, vol. 47, no. 4, pp. 377–390, 2011
I scanned through this publication, but I got nauseated by the introduction by this know-it-all, so I had better leave it without addressing the statements he makes, apart from one gross remark:
CO2 emissions are 95 % natural
OK over and out for me
Just my 2-cent doubt about a link between temperature and driving force.
Coming back to the fundamental laws, we must express global conservation relations for mass, momentum, total energy and other extensive quantities, formulated over a certain volume around the Earth. Temperature is not an extensive variable; we should more properly talk about the internal-energy equation and formulate the correct "energy" fluxes over the bounding surfaces (that is, atmosphere and ground) as well as the production terms in the volume.
I often read about a direct relation between energy fluxes and the evolution of the temperature (an intensive variable), which appears to me not fully representative of a correct model.
A comment on your #5.
When CO2 absorbs an IR photon, that increases its internal bond-vibration energy, which it can in turn impart to other (N2, O2) molecules through molecular collisions. This is how greenhouse warming is spread throughout the atmosphere, but it is not the major mechanism of overall global greenhouse warming.
The major mechanism of greenhouse warming relies on the fact that the RATE at which IR energy is emitted scales as the Stefan-Boltzmann constant times the fourth power of the temperature. When IR photons emitted from the warmer Earth's surface are of a wavelength at which they are not absorbed, energy loss to space is maximized. However, when a CO2 or H2O molecule in the higher, colder atmosphere (colder by the lapse rate) absorbs photons of the appropriate wavelength, that energy is then radiated at a lower RATE, according to the T^4 relationship. More of the outgoing energy is kept and transferred as kinetic energy to other molecules, rather than being lost to space.
This is the simple reason greenhouse gases warm the atmosphere.
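A rough numerical illustration of the T^4 point above (Python), treating the surface and a colder emission layer as ideal blackbodies via the Stefan-Boltzmann law; the two temperatures are assumed round values, and the calculation ignores all spectral detail.

# Rough sketch of the T^4 argument: flux emitted by the warm surface vs. flux
# emitted by a colder layer high in the troposphere (blackbody idealization).
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W m^-2 K^-4

T_surface = 288.0        # ~15 C, typical global-mean surface temperature (K)
T_emission = 255.0       # assumed colder emission level, set by the lapse rate (K)

F_surface = SIGMA * T_surface**4
F_emission = SIGMA * T_emission**4

print(f"surface emission        : {F_surface:6.1f} W/m^2")
print(f"cold-layer emission     : {F_emission:6.1f} W/m^2")
print(f"difference (energy kept): {F_surface - F_emission:6.1f} W/m^2")

# The colder layer radiates roughly 40% less than the surface, so energy absorbed
# aloft and re-emitted there leaves for space at a lower rate.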
The question is moot: ice coverage is shrinking, glaciers are receding, coral reefs are bleaching, and so on, so the scientific evidence of impact is unequivocal.
If anthropogenic sources are the major cause, we have to cut them to start getting back to the natural balance; and if a background natural trend is the major cause, we still need to cut our own contribution to try to mitigate or counteract that trend.
I believe that arguing about causation will be disastrous. At the end of the day, someone living on an island or shoreline that is now below sea level doesn't really care about the "why" but about "what will we do about it?". Determining the why will lead to solutions, but they will be long-term. I don't think there is a switch that will, in one fell swoop, stop the change, if in fact we can stop it at all, because the Earth is dynamic. It seems our first priority should be to deal with, say, the next century of impacts based on current trends. This may mean raising levees, building seawalls, pulling back from coasts, relocating food production, and so on. At the same time, but in separate processes, we find ways to change greenhouse-gas production and so on. As long as we insist on doing them both together, I think we'll fail.
Incoming particles from exploding stars have been a threat, are a threat, and will be a threat. If we don't learn how to protect ourselves against them, the USA could be destroyed as a nation in 2083. The bigger problem is a collision with an interstellar meteor, due to its speed of 70% of the speed of light; an asteroid is not the problem. Until scientists accept these things as possibilities or probabilities and design a system of defense, humanity is doomed. Hopefully your spirit will live forever, whatever happens. We simply try to protect the world by publishing the TRUTH.
https://independent.academia.edu/WilliamSokeland is a location for my published and unpublished papers on the SNIT theory. At this time climate change is a political football, and I am sorry politics are involved. President Trump has made the right moves, but scientific reasoning is lacking. I cast my vote toward Dr. Roy Spencer, but he has no concept of nova and supernova debris streams causing things like the death of the Saiga antelope or the warm Alaskan winters of the last two years. The energy source that is causing the problems produces local effects, like the melting of the Bering Sea ice in the last two years and the early melt of the Nares Strait in 2017 and 2019. A CO2 model could cause the ice caps to retreat toward the poles, but could not produce a localized heating event like the recent Alaskan winters. As long as mankind recognizes the new source of energy, he has a chance to stop the harmful effects of the new phenomenon. Besides the two problems mentioned in my previous statement, an ice age could result from the absence of impacting debris streams. I proposed this as the reason for the Little Ice Age in my first paper. Warmth could hurt us, but there is no record of it doing so. There is a record of cold in the northern hemisphere killing thousands through starvation. What happens happens; I have warned the world, and that was my job. If the world is so uneducated that it can't recognize the TRUTH, it gets what it deserves.
Masoud Rostami is close to the truth in his doubts.
1. Albedo changes several times over when clouds are present. Clouds are produced from water vapour, and more vapour is produced by warmer water. This is a very strong negative feedback, dramatically exceeding the CO2 effect.
2. Of course, during the night the clouds shield the Earth from heat loss, but this positive feedback is weaker than the negative one.
3. " The rate of mean temperature of the earth has been increased almost twice with respect to 60 years ago, it is a fact (Goddard Institute for Space Studies, GISS, data) " . In other words : Love exist (Romeo & Julieta, W. Shakespeare).
I meet practically every week in mass media regular propaganda of increasing of 'mean temperature of the earth', but no explanation what is it. This is not accidentally The notion is not simple. At first, one must applies multidimensional integral for the definition, and at second, discovers that meteorology station are very far from nessessary exactness . So, 0.74 °C of 'mean temperature' for a century is a pure fantasy. See, I don't affirm that it is a lie.
4. For me the most intresting iprobem is hidden away in 6) from part 3. The votinng for the heating (9:1) is a part of every day propaganda for the heating.
Who pay for the propaganda? Who pay for a plenty of departmens in American univercities studying global heating? Who gets better salary in
Mr Miheev
Ad 1, more clouds: water vapour is a strong greenhouse gas.
Ad 2, more clouds during the night? Give a reference for that.
Ad 3: global temperature: https://www.ncdc.noaa.gov/monitoring-references/dyk/anomalies-vs-temperature
Global temperature http://www.drroyspencer.com/latest-global-temperatures/
William Sokeland
In my opinion, that figure showing the trend since 1979 means nothing (in the sense that it cannot provide useful information on climate change) when compared to a statistically meaningful period. It is just a measure of a residual around a somewhat arbitrary steady temperature over a very short period of time.
Mr Sokeland
This is NOT the global temperature as measured at weather stations,
but the temperature derived from satellites, with quite some uncertainty.
BUT:
Roy Spencer is one of the best-known "deniers" and thus not biased towards exaggerating the increase in temperature (0.1 C per decade).
You're welcome to your opinion, and except for the energy source that produces the temperatures, I believe he is right! Note the marked locations of exploding stars on the plot. When you blame CO2, you are chasing a red herring. I just hope that USA society doesn't die in 2083 because the political world is ignorant. Sorry!
There is nothing new about CO2:
It has been known for over 100 years that it is a greenhouse gas. That means it absorbs infrared (heat) radiation that is emitted by the surface of the atmosphere.
The presence of CO2 is the reason that we have an average surface temperature of 15C instead of far below freezing.
And this is NATURAL CO2 (in combination with water vapour)
Correction
CO2 absorbs the infrared radiation that is emitted by the surface of the Earth into the atmosphere (towards space)
WHAT IF THERE IS ANOTHER SOURCE? Exploding-star debris streams add carbon that becomes CO2 in our atmosphere, and the debris stream adds energy. The contribution of the exploding star could be much larger than the CO2 and heat energy produced by mankind. Then wouldn't the problem be to stop the incoming debris stream? You are blind and do not see.
If it had not been for the incoming energy from exploding stars over the years, it would have been very cold on Earth. We are hit once every three years. If the frequency goes down, we cool off. If the frequency goes up, we heat up. It is that simple.
What form of energy from exploding stars are you advocating, and in what form does it arrive?? You realize that any extra energy added to Earth's surface in a pulse would quickly be radiated away, and Earth would return to the equilibrium state with our Sun's radiation.
Re: Masoud Rostami
Masoud, most of your claims are just that - "claims" that you did not support with falsifiable facts/arguments. Let's have a look:
1) "Still there is no convincing theorem, with a low range of uncertainty, to calculate the response of climate system in terms of the averaged global surface temperature anomalies with respect to the total feedback factors and greenhouse gases changes."
Nor is there a "convincing theorem" that everything will be hunky-dory. Uncertainty cuts both ways: things may turn out not as bad as, or WORSE than, predicted in the absence of feedbacks. And given that we know of at least several positive feedbacks (and the question is not whether they happen, only how strong they will be):
- ice albedo feedback
- release of methane/CO2 from permafrost/ocean hydrates,
- the weaker ability of the future ocean to take up atmospheric CO2 mentioned above (Revelle factor effect),
- weakened formation of deep waters
- increase in wildfires releasing CO2 into the atmosphere
All these point to the probability that the ACTUAL climate change, if anything, might be WORSE than the IPCC's usually very conservative predictions.
2) As mentioned by another person here, the satellite data do not measure surface temperature. This fact is recognized by the people carrying out these measurements, but conveniently omitted by the climate-change denialists who use these data instead of instrumental temperature measurements, not because satellite data are inherently more reliable, but because their estimates of global temperature do not increase as dramatically as the averages calculated from the surface measurement data. So the problems with the "satellite radiative budget observations" may lie with... these "satellite radiative budget observations", rather than with the models that do not use these observations.
3) Ocean acidification - true, its effects on food webs are not integrated into the models, BUT this means that the actual changes will likely be WORSE than predicted,
- as the disrupted ecosystems will take up LESS CO2 from the atmosphere, hence a bigger portion of anthropogenic CO2 will remain in the atmosphere, free to affect the climate.
On top of that, acidification of the water (plus warming and stratification) makes the ocean LESS likely to take up atmospheric CO2, hence the overall effect of acidification will be even worse than the effect of acidification on biology alone.
4) "The IPCC reports not able to implement appropriately a few large scale natural oscillations : North Atlantic Oscillation, El Nino, Southern ocean oscillation, Arctic Oscillation, Pacific decadal oscillation, deep ocean circulations, Sun's surface temperature, etc.)". Which would have been a problem if we dealt with timescale of a few years, or a decade - but instead we are talking about the existing data set since XIX century and our projections are to the end of XIX century. Over this time scale, your "large-scale natural oscillations" ... average themselves out.
5) "but nothing prevents it from escaping into space".
Huh? When IR is absorbed by a molecule of CO2 - the molecule will re-emit it in all directions. Some of the emission will be up, some of it will be down. That which goes down is reabsorbed by the Earth's surface and heats it up in the process. Hence, this IR has just been "prevented from escaping into space"... ;-)
6) appeal to majority/authority fallacy is not a scientific approach.
You are fighting a paper tiger you created yourself. The statement that a massive majority of the papers published in reputable scientific journals is consistent with the climate changing, and changing in major part due to us, IS NOT USED to dispute somebody's models or data; it is used to communicate the current state of knowledge to society at large and to the politicians. Because political decisions should be informed by the best knowledge available at the time, which is approximated by the "majority" of the publications in a given area.
And what's the alternative? To throw our hands up in the air and say that since no knowledge is absolutely certain, we should do nothing, as we don't know "for sure"? I.e. stop the treatments for HIV, because there were scientists questioning the link between HIV and AIDS; stop vaccinations, because there was a researcher who published in The Lancet a paper linking vaccinations with autism; not implement measures against antibiotic resistance, because evolution is "only a theory" and relying on it is an "appeal to majority/authority fallacy [which] is not a scientific approach".
Piotr
@ P.T. On your point #2.
Actually, the AIRS device on the AQUA satellite has measured true surface temperature since 2002. Except for the 2016-17 El Nino, the data are mostly flat, like other satellite data.
Satellite data ARE more accurate than directly measured temperatures. Land stations are poorly distributed across the surface, and increasingly many are in urban heat islands (like airports). Ocean temperatures are only recently numerous, and because of its large heat capacity, the bulk ocean has only warmed by
@Donald Bogard " Actually the AIRS device on the AQUA satellite has measured true surface temperature since 2002. "
1. It is clear that the original author did not mean AIRS (his data range is 2000-2011)
2. What he probably meant are MSU data and for comparing these with ground data there is a detailed discussion in http://www.realclimate.org/index.php/archives/2016/05/comparing-models-to-the-satellite-datasets/
3. You: "Except for the 2016-17 El Nino, the data are mostly flat."
Well, that's not the impression I get from the April 2019 paper "Recent global warming as confirmed by AIRS": https://iopscience.iop.org/article/10.1088/1748-9326/aafd4e
I quote:
“[the AIRS and surface (GISTEMP) data] are very consistent with each over the past 15 years [i.e. over the lifetime of AIRS]. Both data sets demonstrate that the Earth’s surface has been warming globally over this time period, and that 2016, 2017, and 2015 have been the warmest years in the instrumental record, in that order”.
If anything AIRS shows: “ Notably, surface-based data sets may be underestimating the changes in the Arctic.”
There is NO obvious temperature trend in AIRS data over 2003-2014. It rises very slightly in 2014-15, then has a 2016 El Nino temperature peak. After that it falls 0.2C, much of the way back to its earlier value. GISTEMP, HadCRUT4, ECMWF, and Cowtan&Way temperature trends over this time period are very, very similar. See Susskind et al., Environmental Research Letters, 2019
Global temperature responds to BOTH greenhouse gases and natural forcing. The natural part produced the many temperature variations over the past several thousand years (including the recent Little Ice Age), when greenhouse gases varied little. These natural factors are still at work, although they are poorly understood. (Shame they are studied so little.) Natural factors are responsible for the cooling in 1940-1970 and the flat temperature since. Greenhouse gases are responsible for warming over 1970-2000. Both are responsible for warming over 1910-1940, when atmospheric CO2 increased but little.
Climate change is complicated.
@Donald Bogard
“There is NO obvious temperature trend in AIRS data over 2003-2014. It rises very slightly in 2014-15, then has a 2016 El Nino temperature peak. […] See Susskind et al., Environmental Research Letters, 2019”
That's the one I was referring to! The title of that paper is "Recent global warming [sic!] as confirmed by AIRS", and the least-squares trends for the data in both cases come to global warming of ca. +0.25 C/decade. Which was not exactly the impression I got from the words "flat", "NO obvious change" and "much of the way back to its earlier value".
But let us get back to the original claim by @Masoud Rostami, namely
“2) NASA satellite data from the years 2000 through 2011 indicate the Earth's atmosphere is allowing far more heat to be emitted into space than computer models have predicted”
So your bringing attention to the AIRS data only strengthens my argument and makes Masoud's implication, that we can't trust the predictions of models based on ground data, even less credible. To sum up the problems with Masoud's argument 2:
a) His "NASA satellite data" are NOT measuring surface temperature (since they are NOT from the AIRS sensor), so in effect he questions the apples, asking them why they are not like oranges.
b) Fluctuations in the narrow time range (2000 to 2011) are NOT at the scale long enough for talking about “climate” – but merely reflect sub-climate “noise” (NAO, ENSO etc).
c) The data from the AIRS satellite sensor, which, unlike Masoud's satellites, does measure surface temperature, CONFIRM the validity of the ground-based observations over the period 2003-2017.
d) And since the ground data are confirmed over 2003-2017, they are likely to be correct also during the several decades preceding 2003, i.e. at the time scale at which we can talk about climate change.
@Donald Bogard: Climate change is complicated. Yes and no. Yes, because natural fluctuations may, at medium time scales, obscure the human-caused trend. No, because these natural oscillations come and go, so they get averaged out when a longer, i.e. climatic, time-scale averaging is used. This means, counter-intuitively, that we can be more confident in the predicted trends for the next century than for the next decade. That makes climate change less complicated than it would be if we also needed to resolve the short-to-medium-scale natural "noise".
Piotr Trela
" No, because these natural oscillations come and go, so they get averaged out when a longer, i.e. climate, time-scale averaging is used. "
I've already tried to discuss elsewhere the implications of statistical assumptions like this one. If the global temperature is a function of time T(t) that is assumed to be the sum of a constant T0 and a fluctuation (anomaly) T'(t), you can define T0 by the asymptotic limit
T0 = lim_{Dt -> +Inf} (1/Dt) Int[t0, t0+Dt] T(t) dt
If you instead assume a statistical averaging over some ensemble, the convergence of this mean to the temporal average requires the ergodicity assumption.
For a process that is not statistically steady in time, and if the time averaging is localized over a finite period of time, the residual you get is not zero. This contribution could be relevant.
So the question is: what approximation is involved in assuming that the fluctuation effects are lost, in a statistical sense, over a 100-year period? It seems to me not fully correct.
Please remember that climate is defined as the average weather (parameter) over a period long enough to average out variations; generally periods of 30 years are used, although there appear to be longer-term oceanic oscillations of around 60 years. In summary, a change from one year to the next, say from 2014 to 2015 to 2016, is meaningless with respect to climate CHANGE.
Such a change would ideally be the difference between two periods of 60 years, or possibly between two periods of 30 years; thus a sliding moving average of 10-15 years provides a much better view of a trend.
One might then better omit the year after a major volcanic eruption, like Pinatubo in 1991, and El Nino years.
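Both points above (that a finite-time average of a non-stationary record depends on the averaging window, and that a 10-15 year sliding mean gives a much clearer view of a trend than year-to-year changes) can be illustrated with a small Python sketch on a synthetic series; the trend, oscillation and noise parameters are invented for illustration only.

# Sketch: a synthetic "temperature" series = linear trend + ENSO-like oscillation + noise.
# All parameters are invented for illustration; nothing here is real data.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1900, 2020)
trend = 0.01 * (years - 1900)                            # assumed 0.1 C/decade trend
enso = 0.15 * np.sin(2 * np.pi * (years - 1900) / 4.5)   # ~4.5-yr ENSO-like oscillation
noise = 0.1 * rng.standard_normal(years.size)
T = trend + enso + noise

# 1) The mean over a finite window depends on the window when a trend is present:
for a, b in [(1900, 1930), (1960, 1990), (1990, 2020)]:
    sel = (years >= a) & (years < b)
    print(f"mean over {a}-{b}: {T[sel].mean():+.2f} C")

# 2) Year-to-year changes are dominated by the oscillation and the noise, while a
#    15-year centred running mean tracks the underlying trend:
window = 15
running = np.convolve(T, np.ones(window) / window, mode="valid")
print("std of year-to-year changes        :", round(float(np.diff(T).std()), 3))
print("std of running-mean yearly changes :", round(float(np.diff(running).std()), 3))
print("true underlying trend per year     :", 0.01)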
http://www.drroyspencer.com/wp-content/uploads/UAH_LT_1979_thru_April_2019_v6.jpg
data from the "denier" DR(!) Roy Spencer lower troposphere
http://www.drroyspencer.com/latest-global-temperatures/
YOU need to tell the people who are starving in Alaska due to the warm winters of 2017-18 and 2018-19 that the change is meaningless.
@ Filippo Maria Denaro
I am not sure about the value of trying to squeeze a complex thermodynamic system (affected by biogeochemistry) into a simplistic representation by formal statistics. For instance, the effects of El Nino, the NAO, solar cycles etc. on the Earth's temperature are, because of the complexity of the systems involved, evaluated numerically, and I am not sure how you would express all of them in a single, or even several, fluctuating component terms in your statistical formulation and still get meaningful and _realistic_ insights.
To bring it back to basics: if you take the global temperature data and calculate a moving average, it will by definition be less variable than the actual data, because the short-term oscillations get averaged (filtered) out. Yet the original author seems to be throwing the baby out with the bathwater by implying that if we can't model the short-term fluctuations, then we can't say anything about the multidecadal climate trends.
Piotr Trela
Yes, time-filtering is a way to smooth a function and remove spikes, but as a result you get a class of filtered functions that depend on the filter width. The real problem is that the relevance of the filtered "events" still acts on the filtered temperature.
I agree with you that the complexity of the system requires more than a simplistic statistical analysis.
I think that there are still too many flaws in the modelling of the climate-change phenomenon to get any plausible prediction.
There is more under the sun than statistical analysis:
it is called attribution, and it takes known weather phenomena into account and removes them, or otherwise takes care of them, like volcanic eruptions.
See, for instance, Chapter 10 of the IPCC WGI report.
Agreed; indeed, the concept of time filtering is not a purely statistical approach.
@Filippo Maria Denaro: “But the real problem is that the relevance of the filtered "events" acts on the filtered temperature” and “I think that there are still too flawness in the modelling of the climate change phenomenon to get any plausible prediction”.
And I think the onus of proof is on you. For the short-term oscillations to call into question the presence of the long-term warming trend, the short-term oscillations filtered out by averaging would have to become increasingly asymmetrical (from one cycle of the oscillation to the next): the "up" part would have to become progressively weaker than the "down" part, and furthermore this difference would have to be strong enough to matter, i.e. to significantly lower the trend of 0.15-0.20°C per decade observed from 1975 onward.
Piotr Trela
First of all, I would like to be sure we are talking about the same issue, so let me formalize my thought better.
Let us assume we know the global temperature over the Earth during the last 104 years, that is, we know the function Tg(t). Now we can define the temperature anomaly as the function Ta1(t) = Tg(t) - Tb, where
Tb = (1/104) Int[tpresent-104, tpresent] Tg(t) dt
is a constant.
Clearly the function Ta1(t) has strong oscillations, due to being a residual with respect to a constant function.
Now let us define a filter width Dt as a parameter and define the filtered temperature:
Tf(t;Dt) = (1/(2*Dt)) Int[t-Dt, t+Dt] Tg(tau) dtau
Of course, Tf is now a smooth, continuous function of time that depends on the filter-width parameter. In terms of the transfer function, the action of the filter can be seen from the spectral representation G(k;Dt) = sin(k*Dt)/(k*Dt); thus the low-wavenumber components are not filtered.
Now the issue is in the definition of the anomaly as Taf(t;Dt) = Tg(t) - Tf(t;Dt).
My question is whether the temperature anomaly you are referring to is Taf(t;Dt) or not.
Then, given the full set of initial data, it is clear that to obtain a prediction of the anomaly after tpresent we face fundamental differences depending on which approach we use. The modelled equations will be different and require very different approximations.
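For concreteness, here is a small Python sketch of the box-filter transfer function G(k;Dt) = sin(k*Dt)/(k*Dt) quoted above, evaluated for a few illustrative periods; the choices of filter half-width Dt and of the periods are arbitrary assumptions.

# Sketch of the top-hat (box) filter transfer function G(k;Dt) = sin(k*Dt)/(k*Dt):
# components with periods much longer than the filter width pass almost unchanged,
# while short-period oscillations are strongly damped.  Filter widths are arbitrary.
import numpy as np

def transfer(period_years, Dt_years):
    """Attenuation of a sinusoid of the given period by a box filter of half-width Dt."""
    x = 2.0 * np.pi / period_years * Dt_years      # x = k * Dt
    return np.sinc(x / np.pi)                      # numpy sinc(y) = sin(pi*y)/(pi*y)

for Dt in (5.0, 15.0):                             # half-widths of 5 and 15 years
    for period in (4.5, 11.0, 60.0, 200.0):        # ENSO-like, solar-like, multidecadal, secular
        print(f"Dt = {Dt:4.1f} yr, period = {period:6.1f} yr -> G = {transfer(period, Dt):+.2f}")

# With Dt = 15 yr an ENSO-like 4.5-yr component is attenuated to a few percent of
# its amplitude, while a 200-yr component passes essentially untouched: this is the
# sense in which "the low-wavenumber components are not filtered".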
@ Filippo Maria Denaro
Clearly the function Ta1(t) [=Tg(t)-Tb] has strong oscillations due to the fact of being a residual over a constant function.
I think there is some confusion with terminology. The whole point of the original argument was the distinction between a man-made long-term ("climatic") trend (AGW) and the natural oscillations (ENSO, NAO, AO, the solar activity cycle, etc.). An "oscillation" is a regular variation in magnitude or position around a central point. If we allow for a trend, and denote the temperature expected from the trend alone as Ttr(t), then "the central point" for a given oscillation cycle is Ttr(t), and not Tb. So if we denote by To the effect of the oscillation, then To(t) = Ta1(t) - Ttr(t).
Hence your statement above ("Clearly the function Ta1(t) [=Tg(t)-Tb] has strong oscillations due to the fact of being a residual over a constant function") does not make sense, because the oscillations To(t) are not around the constant function y = Tb but around y = Ttr(t). And we can't assume a priori that the oscillating part To(t) is "strong" (or "weak", for that matter).
Which brings us back to the original author's claim: that if we can't model To(t) precisely, then we can't say anything about Ttr(t). This is a fallacy, because unless the oscillation is asymmetric, and increasingly so with time, To(t) has no influence on Ttr(t), as the sketch below illustrates.
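A minimal sketch of that last point, under the assumption of a purely symmetric oscillation: fitting an ordinary least-squares line to a synthetic series made of an assumed linear trend plus a symmetric oscillation recovers the trend essentially unchanged, so a symmetric To(t) by itself does not bias the estimate of Ttr(t). All numbers are invented for illustration.

# Sketch: a symmetric oscillation does not bias the estimated long-term trend.
# Synthetic series with an assumed trend of 0.18 C/decade plus a symmetric
# 9-year oscillation; all values are invented for illustration.
import numpy as np

years = np.arange(1975, 2020)
true_trend = 0.018                                          # C per year (assumed)
Ttr = true_trend * (years - years[0])                       # trend component
To = 0.12 * np.sin(2 * np.pi * (years - years[0]) / 9.0)    # symmetric 9-yr oscillation
Tg = Ttr + To

fitted_slope = np.polyfit(years, Tg, 1)[0]
print(f"true trend   : {true_trend * 10:.3f} C/decade")
print(f"fitted trend : {fitted_slope * 10:.3f} C/decade")
# The fitted trend differs from the true one only through the small end-effect of
# the incomplete last oscillation cycle; only a progressively asymmetric oscillation
# could materially change the estimated multidecadal trend.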
@ Filippo Maria Denaro:
Now the issue is in the definition of the anomaly as Taf(t;Dt) = Tg(t) - Tf(t;Dt).
My question is whether the temperature anomaly you are referring to is Taf(t;Dt) or not.
Then, given the full set of initial data, it is clear that to obtain a prediction of the anomaly after tpresent we face fundamental differences depending on which approach we use. The modelled equations will be different and require very different approximations.
Piotr: I don't recall even using the word "anomaly" in this thread. If I were to use it, it would be in the sense of your initial Ta1(t) [= Tg(t) - Tb].
Further, I am not sure why the results of your "different approaches" need to be so different, since for a _climate_ change all we look for is the long-term trend; would your different approaches not just converge onto the same solution, i.e. this long-term trend?
Piotr Trela
No: time-filtering is different from statistical time-averaging, and the latter converges to the ensemble average (and other statistical averages) only under the ergodicity hypothesis. If an "oscillation" is defined as the residual with respect to a time-filtered function, it is not uniquely defined but depends on the choice of the filter width (that is, the period of time over which the filter acts). In other words, if you filter over a different time interval, the oscillation is different. This is not an arbitrary assumption but a mathematical consequence of the definition of the filtered function. If the filter width tends to zero, the oscillations tend to zero; if the filter width tends to infinity, the oscillations are maximal. The correlation between the oscillation and the "central point" function you defined is not zero for a finite filter width.
Thus, the issue is how one sets up the problem for the prediction of the oscillation starting from a known set of initial data. Even if you look for a long-term prediction, assuming that the predicted oscillations converge to the same solution is not rigorous; you need a further statistical argument. I asked elsewhere (https://www.researchgate.net/post/Numerical_simulation_of_climate_change_What_about_the_state_of_the_art) about the equation for the temperature that is solved numerically, but I did not get a clear response.
At present I am still not sure whether we are using different terminologies for the same problem, since we come from different fields, or whether there is a real confusion between different approaches, each implying different assumptions.
Filippo Maria Denaro Is it possible to explain your answer in simple language for the proverbial intelligent reader who is technically challenged? For example, me. I am widely read, but at a shallow level, in many disciplines. Probably no different from a typical civil servant who has to make some decisions on climate change.
Then at least we can try and contribute to the debate.
Einstein said that if you cannot explain something in simple language, it probably suggests that you do not understand it well enough.
Joseph Tham
If this discussion is on scientific ground, then we should invoke the tools of mathematics, physics and whatever else we need to be rigorous. I have done that, as I am not sure we use a common terminology across tools that come from different fields of science (I am from the fluid-dynamics field, not from climatology).
From a didactic point of view, simply imagine a problem where you have a function of time (in this case the global temperature) that is known (in some way...) from some point in the past up to the present. If this known function is the result of some statistical tool and you want to provide a prediction for the future (say, 100 years), you must solve some mathematical problem that describes the evolution of this kind of function. Generally, this equation contains many models and approximations, and the way we introduce such approximations depends on the type of function (filtered, time-averaged, ensemble-averaged). An "event" like ENSO can be smoothed out of the known set of initial data, but that does not mean that its contribution is irrelevant to the evolution of the future temperature.
In practice, the predictions currently made for future temperatures are computed numerically, but a large part of the uncertainty in the solution depends on the physical and mathematical model. So I wonder whether we are consistent in using all the statistical tools.
Masoud Rostami, this is one of the world's problems that humans have so far not been able to overcome. Ice in the Arctic is increasingly melting because the sea temperature has risen by about 2 degrees, but all the efforts made by humans or scientists have not been able to deal with it seriously. The impact will be felt within the next five years: many small islands disappear because of rising sea levels, and the number of storms increases when the rainy season arrives. But the increase in the Earth's temperature is caused by very complex factors. In Indonesia, many small islands in the middle of the sea have long since been lost, but this is not well reported, so many parties do not know.
@ Filippo Maria Denaro
I think we have got as far as we could. I am not particularly interested in, or qualified for, arguing fine points of formal statistics, and I don't think it would be very productive in the context of the original questions posted by @Masoud Rostami.
(BTW, since posting he hasn't responded to any of the comments nor clarified his original statements, so the question arises whether these were questions he genuinely wanted answered, or just a package of assorted denialist claims put up for the public to question the reality of climate change.)
Of his 6 points I have challenged all 6. With you, we discuss only one: @Masoud Rostami seems to imply that models based on temperature observations cannot reconstruct short- to medium-term oscillations (ENSO, NAO, solar activity cycles) and therefore are not believable in determining the long-term (multidecadal = "climatic") TREND.
I argue that to determine whether anthropogenic global warming is happening, and how quickly, you don't have to resolve the short-term oscillations, because by being short-term and by being oscillations (going up and down around a central point) they are not likely to negate the long-term upward trend ("global warming").
And we may discuss for weeks how many statistical angels can dance on the tip of a climate model needle – but to show that what I argue is not the case - you would have to demonstrate:
1. That the data during a given oscillation do not meet the ergodicity criterion.
2. That they fail to meet it strongly enough to substantially influence the global warming trend computed on timescales several times longer than the period of the oscillations (see the figure at the end); otherwise, why bother.
3. That the effects of non-ergodicity of the various oscillations are in the same direction (otherwise they would counter each other, thus lowering the cumulative impact).
4. That, to support the denialist argument repeated here by @Masoud in the opening post, the cumulative effect of the non-ergodicity would have to be not only significant, but also in the downward direction. The statistical uncertainty cuts both ways: the future global warming may be smaller or larger than predicted, but I have still to meet a climate-change "skeptic" who would acknowledge the latter, because their entire use of uncertainties in the models rests on the unspoken implication that the processes behind these uncertainties would reduce or cancel* the global warming.
In other words, looking at the global temperature:
https://phys.org/news/2017-01-earth-global-temperature.html
would you agree with the implication of the denialist argument @Masoud presented above, namely that the short-term oscillations mean we can't say anything one way or another about the future climate, and that therefore we can continue the fossil-fuel business as usual?
Piotr
* You may recall the popular skeptics' count-up line: "it's already [17] years and [5] months since the end of global warming" (the numbers were updated monthly). Their reference point was fixed at the top of an ENSO cycle, typically at the warmest month of the massive temperature spike during the 1997/1998 El Nino (see the 1998 outlier in https://phys.org/news/2017-01-earth-global-temperature.html). After a heat wave, everything looks like cooling, eh? Or: give me a reference point, and I can prove anything! ;-)
Piotr Trela
Piotr, the graphic you posted is one of the reasons for the debate... Looking at a longer time period, it represents only a further shorter-term oscillation superimposed on a long-term behaviour that shows a decrease in temperature.
Yes, the temperature measured over the last 120 years shows an increasing trend, but this is not the predicted global long-term period. That would be the same as looking at the function sin(x) over the interval [0, pi/6] and then predicting over [0, 2pi] that it is a monotonically increasing function.
An interesting graphic is here, in a companion discussion: https://www.researchgate.net/profile/Brendan_Godwin/post/Global_Warming_Part_1_Causes_and_consequences_of_global_warming_a_natural_phenomenon_a_political_issue_or_a_scientific_debate/attachment/5ce68caacfe4a7968da2d42e/AS%3A761692963209218%401558613162643/download/From+Isotopes+to+Temperature+Using+Ice+Core+Data.pdf
There you will also find a link to a paper that analyses ergodicity within the framework of climate change.
Among all the questions in the original post, I would focus on the scientific debate and, on this ground, I see a lot of incongruence in the way the problem is posed. And yes, maybe I am approaching the issue from my personal experience in a field (CFD and turbulence) that is not climatology, but that can also be a way to discuss from different points of view. ;-)
We need to increase greenery and reduce gases from factories and vehicles.
Filippo Maria Denaro: “a long term behavior that show a decreasing in the temperature”.
On what time scale is that cooling: since the end of the Medieval Warm Period? Since the beginning of the Pleistocene? Since the PETM?
All of these are irrelevant to the problem at hand, namely providing decision-makers with likely scenarios for, say, 100 years. A cooling trend on the scale of many centuries or millions of years would be poor comfort if our civilization falls apart within several decades or a century. As the saying goes in Polish, "Myslal indyk o niedzieli a w sobote leb ucieli": the turkey was planning for Sunday, and on Saturday its head was chopped off (it sounds better in Polish because it rhymes...).
Filippo Maria Denaro: "I see a lot of incongruence in the way the problem is posed". I would be surprised if you didn't: analytical statistics is ill-suited to the study of climate change. It looks only at the result (here: the global temperature) but KNOWS NOTHING about the processes that determine it. That would be fine if there were only one forcing of the temperature, but not when there are many, some of which combine with each other through feedbacks. This means that forcings that were dominant 200 or 5,000 years ago may not be dominant today or during the rest of the century. For instance, the CO2 concentration today is higher than it has been in the last few million years.
To illustrate this with your own tool, the analysis of ergodicity: in the previous post I listed 4 conditions concerning ergodicity that would have to be met for a noticeable impact on the climate trend. I don't see how, without thermodynamics and numerical climate models, you could quantify even one of these four conditions, much less ALL 4 of them.
That's why I said (echoed by another poster) that a statistical analysis which ignores the different forcings and their feedbacks is close to useless in making predictions of the future climate based on past climates with different forcings. In fact, it may be worse than useless; it may be misleading, by offering false knowledge and promising insight where there is little or none.
Piotr Trela
Piotr, the trend of the "central temperature" has been slowly decreasing on the scale of the last hundreds of years, despite the positive oscillations of the last 100 years. And the IPCC report showed a large discrepancy between the measured temperatures of the last 10 years and the model predictions: the models largely overpredict the real temperature.
Then
" - analytical statistics is ill suited to the study of the climate change – it looks only at the result (here: global temperature) but KNOWS NOTHING about processes that determine it. "
This statement makes me wonder whether you understand exactly what I mean. When you talk about a global temperature, a function only of time, you are by definition introducing the first statistical tool, namely the averaging of the temperature over the whole Earth! When you talk about numerical climate models, you are not considering that the statistical approach is built into the solution of the governing equations (having a temperature means you are solving the equation for the internal energy of the system). The statistical approach does not act on its own but is intrinsically introduced into the equations and into the forcing terms. In other words, the forcings that contribute to the averaged temperature are themselves subject to the statistical tool. So you contradict yourself. However, this discussion has started to become too technical; some months ago I opened a specific discussion about this: https://www.researchgate.net/post/Numerical_simulation_of_climate_change_What_about_the_state_of_the_art
All climate variability is due to solar variations; I have explained and quantified them in my papers and discovered the mechanisms.
Filippo Maria Denaro I am most grateful for your answer. Let me think about it.
Is there any commonly agreed criterion for comparing and assessing the accuracy (performance) of different competing forecasting models? Some sort of complex RMSE (root mean squared error) calculation? Or confidence intervals for the point estimates? Forgive me if this question reeks of my ignorance. I apologize.
Joseph Tham
The IPCC report also provides a range of confidence levels for its sections. However, there is much more to it than this; the way in which the modelling is done is very crude, for example as reported here:
http://www.globalwarmingequation.info/
I don't want to go into details here, but writing the evolution equation for the temperature involves much more than using a simple zeroth-order model. Again, still on didactic ground, the global temperature of the atmosphere is actually governed by a complex set of PDEs: https://ima.org.uk/688/predicting-climate-change/
In plain words, when you integrate the pointwise temperature equation over the volume surrounding the Earth, the equation for the averaged temperature is not closed until you introduce a somewhat arbitrary closure model. And that model is linked with all the other physical variables involved in the phenomenon.
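As a purely didactic illustration of the simplest possible closed form of such a globally averaged equation, here is a zero-dimensional energy-balance sketch in Python, C dT/dt = S0*(1 - alpha)/4 - eps*sigma*T^4, in which the single effective emissivity eps is exactly the kind of crude closure being discussed; all parameter values are assumed, textbook-style numbers.

# Zero-dimensional energy-balance sketch: C dT/dt = S0*(1 - alpha)/4 - eps*sigma*T^4.
# The "closure" here is the single effective emissivity eps; all parameter values
# are textbook-style assumptions for illustration only.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant, W/m^2
ALPHA = 0.3       # planetary albedo (assumed)
EPS = 0.61        # effective emissivity standing in for the greenhouse effect (assumed)
C = 4.0e8         # effective heat capacity, J m^-2 K^-1 (~100 m ocean mixed layer, assumed)

def step(T, dt=86400.0):
    """One explicit Euler step (dt in seconds) of the global-mean energy balance."""
    dTdt = (S0 * (1.0 - ALPHA) / 4.0 - EPS * SIGMA * T**4) / C
    return T + dt * dTdt

T = 255.0                          # arbitrary cold start, K
for _ in range(200 * 365):         # integrate ~200 years of daily steps
    T = step(T)
print(f"equilibrium temperature ~ {T:.1f} K")   # ~288 K with these assumed parameters

Everything that makes the real problem hard (clouds, the lapse rate, transport, feedbacks) is hidden inside the two assumed constants ALPHA and EPS, which is precisely the closure issue raised above.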
Yes, we have to worry about global warming and its threats to our Earth.
Global warming is mainly caused by human activities (greenhouse gases); warming driven by nature alone is very slow, taking thousands of years, consistent with Milankovitch cycles.
Nature also contributes to rising global temperatures, beyond what our science and technology can control.
Just to get an idea of the present approaches and modelling, have a look at Chapter 3 here: http://www.climate.be/textbook/pdf/
What happens to your methods and logic if greenhouse gases enter our biosphere from a different source that is not man-made but natural?
WS
The greenhouse effect does not depend on the source of atmospheric gases, only that they increase.
Along with the new CO2 added by humans, which is the primary cause of the CO2 increase, there is a much larger CO2 exchange equilibrium among the atmosphere, the oceans and plants/soil. So long as the CO2 concentration in the reservoirs does not change, the exchanges are at equilibrium, with equal amounts of CO2 moving into each reservoir as out of it. Increasing atmospheric CO2 has changed that, and CO2 is attempting to establish a new equilibrium. Thus, about half of the increase in atmospheric CO2 is moving into the oceans and plants.
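A toy numerical sketch of that partitioning, under simple assumptions: emissions grow at a fixed rate and the ocean/land sinks remove each year a fixed fraction of the atmospheric excess over the pre-industrial level. The emission and uptake numbers are rough assumptions chosen only to show how an airborne fraction of roughly one half can arise; this is not a real carbon-cycle model.

# Toy two-box sketch of the CO2 partitioning described above.  Each year a fixed
# fraction of the atmospheric excess over pre-industrial is absorbed by the
# ocean/land sinks.  Emission growth and uptake constants are rough assumptions.
PREINDUSTRIAL = 280.0   # ppm
emission = 0.25         # ppm per year at the start of the run (assumed)
growth = 1.02           # assumed ~2 %/yr growth of emissions
uptake = 0.022          # fraction of the atmospheric excess absorbed per year (assumed)

atm = PREINDUSTRIAL
emitted_total = 0.0
for year in range(150):                              # ~150 years of emissions
    atm += emission
    emitted_total += emission
    atm -= uptake * (atm - PREINDUSTRIAL)            # sinks respond to the excess
    emission *= growth

airborne_fraction = (atm - PREINDUSTRIAL) / emitted_total
print(f"atmospheric CO2 after 150 yr : {atm:.0f} ppm")
print(f"airborne fraction            : {airborne_fraction:.2f}")
# With these assumed numbers roughly half of the emitted CO2 stays in the
# atmosphere; the rest has moved into the ocean and land reservoirs, in line
# with the "about half" noted above.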
I really appreciate all the researchers who have participated actively, with their constructive remarks, in this discussion series. Here is the link to Global Warming (Part 4), with some new topics for further discussion:
https://www.researchgate.net/post/Global_Warming_Part_4_Causes_and_consequences_of_global_warming_a_natural_phenomenon_a_political_issue_or_a_scientific_debate
Following the end of the last ice age 20,000 years ago, in common with the three previous minima (based on the analysis of Antarctic ice-core samples), global temperatures started to increase. Based on the three previous events, this rise would have been expected to continue for around 10,000 years to a maximum temperature of around 3°C above the 1880 datum. Then temperatures would have declined for around 100,000 years to a minimum of around 8°C below the 1880 datum.
However, rather than increasing rapidly for 20,000 years, temperatures steadied out after 8,000 years and until recently have remained steady for the last 12,000 years. Such extended periods of relatively constant temperature were not seen in the previous 400,000 years, so something had changed.
This is well known by all followers of this question, but the discourse is necessary to set up my questions.
Why did the temperature stabilise? Agricultural societies started to emerge 12,000 years ago, and although the world population was only around 5 million, could this be the first anthropogenic effect on global climate?
If we could understand the reasons for the stabilisation, would this help improve understanding of the current global warming phenomenon?
The 20,000 years have now passed, and according to the ice-core records global temperatures should be close to declining and, as such, offsetting the effects of atmospheric CO2/H2O. Is there any evidence such a decline is imminent? If so, what initiates the decline?
You covered a lot of ground. I have discussed the demise of the last ice age in my paper, GLOBAL WARMING AND COOLING: FRIEND AND FOE TO MANKIND, May 21, 2018, in the section titled "Destruction of the last ice age". It would be difficult to explain here, but the key is to understand that geothermal heat was the large energy input that melted most of the ice before the formation of the Black Sea. The real question is what opened the strait that filled the Black Sea about 10,000 years ago. My paper is free for everyone to read at https://independent.academia.edu/WilliamSokeland.
W.S.
The average heat flow rate out of the Earth's surface from internal generation is about 0.1 W/m^2 -- a bit more for continents, a bit less for ocean basins. The average energy received at the Earth's surface from solar insolation is about 240 W/m^2. By comparison, geothermal energy is puny, except in certain places where it is concentrated, like rift zones, and even there the typical heat flow rates are higher by only a few factors.
Past glaciation events began in the far northern hemisphere. When the last inter-glacial (Eemian) ended, solar insolation above 65 deg-north latitude decreased by ~100 w/m^2 over ~125 kyr to ~114 kyr ago. When the last glaciation ended ~18 kyr ago (insolation peaked ~9 kyr ago and has been slowly decreasing through the Holocene, e.g., your graph), solar insolation increased by ~50 w/m^2. The effect of Earth's orbital variations on solar insolation is the main trigger for glacial cycles, although it is not the only factor sustaining those cycles.
The Younger Dryas event is not well understood. Temperature had risen almost to its present level, when it plunged again. Many think a sudden change in melt-water flowing into the Arctic Ocean, perhaps via the MacKenzie River, was the trigger; but there are other ideas. Two of the main glacier ice spreading centers were located on either side of Hudson Bay, but there was much less ice along this river drainage. The point here is that events other than insolation can and do produce sudden changes in glacial conditions.
If you read my paper, which I doubt, you didn't understand the concept of the Black Sea being an exit crater. Try again after you have read my explanation.
John W.
The interglacial three cycles before this one lasted from ~419 kyr to ~399 kyr ago, or ~20 kyr (as measured by temperatures exceeding the current temperature). However, that interglacial had a double peak in northern-hemisphere insolation, at ~425 kyr and ~409 kyr. The effect of Milankovitch cycles on insolation is not always smooth.
Temperature in the Eemian showed a double peak, at ~128 and ~124 kyr, although the insolation change was smooth and peaked at ~125 kyr. As I mentioned in my answer to W.S., events like the Younger Dryas do occur and alter temperature, though not insolation.
Two interglacials before this one, temperature also had a double peak, at ~333 kyr and ~328 kyr, although insolation had a smooth peak at ~332 kyr.
The Eemian has been unusual in that both the insolation and the temperature have changed smoothly.
Human activity has been influential recently, and possibly it played a factor even earlier, as you suggest.
John W.
I might add to the above that the Milankovitch-cycle insolation in the northern hemisphere has been decreasing only for the past ~9 kyr and will decrease by only some 35 W/m^2 before it starts up again. (When the past glaciation began, NH insolation decreased by almost 100 W/m^2 in only ~11 kyr.) The Earth is entering an extended period without major insolation swings. This is also likely a factor in the length of the Holocene.