The paper, published on Wednesday in the journal Nature Geoscience, explores the performance of a climate forecast based on data up to 1996 by comparing it with the actual temperatures observed since. The results show that scientists accurately predicted the warming experienced in the past decade, relative to the decade to 1996, to within a few hundredths of a degree.
The forecast, published in 1999 by Myles Allen and colleagues at Oxford University, was one of the first to combine complex computer simulations of the climate system with adjustments based on historical observations to produce both a most likely global mean warming and a range of uncertainty. It predicted that the decade ending in December 2012 would be a quarter of a degree warmer than the decade ending in August 1996 – and this proved almost precisely correct.
The study is the first of its kind because reviewing a climate forecast meaningfully requires at least 15 years of observations to compare against. Assessments based on shorter periods are prone to being misleading due to natural short-term variability in the climate.
For a short-term forecast, create three columns in Excel or SPSS: year, CO2 and yearly average temperature. Enter data going back 30 years. Then use Excel's or SPSS's regression function, assuming that the relationship is linear; the software will return the fitted equation.
Next, I suggest repeating the regression using only CO2 and average temperature (leave out the year); you will get a second, simpler equation.
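A minimal sketch of these two regressions in Python with numpy, for readers without Excel or SPSS; the 30 years of year, CO2 and temperature values below are assumed, illustrative placeholders, not real observations.

```python
# Minimal sketch of the two linear regressions described above, done with numpy
# instead of Excel/SPSS. The data are assumed placeholders: substitute 30 years
# of actual year, CO2 (ppm) and annual-mean temperature values.
import numpy as np

i = np.arange(30)
years = 1983 + i                                          # 30 calendar years
co2 = 342.0 + 1.5 * i + 0.01 * i**2                       # assumed CO2 series, ppm
temp = (14.0 + 0.018 * i
        + np.random.default_rng(0).normal(0.0, 0.1, 30))  # assumed temperatures, deg C

# Regression 1: temperature on year and CO2 (the three-column version).
A = np.column_stack([np.ones(30), years, co2])
coef1, *_ = np.linalg.lstsq(A, temp, rcond=None)
print("temp ~ %.2f + %.4f*year + %.4f*CO2" % tuple(coef1))

# Regression 2: temperature on CO2 alone (drop the year column).
B = np.column_stack([np.ones(30), co2])
coef2, *_ = np.linalg.lstsq(B, temp, rcond=None)
print("temp ~ %.2f + %.4f*CO2" % tuple(coef2))
```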
More sophisticated models are only necessary if you want spatial detail, long-term forecasts or non-linear regressions.
The truth is, you can get much the same fit by using GDP or population instead of CO2, because all of these series trend upward together over the period.
Earlier it was believed that climate models are unreliable because they are full of fudge factors fitted to the existing climate, so that the models more or less agree with the observed data; there is no reason to believe the same fudge factors would give the right behaviour in a world with different chemistry, for example a world with increased CO2 in the atmosphere. Climate models are mathematical representations of the interactions between the atmosphere, oceans, land surface, ice – and the sun. This is clearly a very complex task, so models are built to estimate trends rather than events. For example, a climate model can tell you it will be cold in winter, but it cannot tell you what the temperature will be on a specific day – that is weather forecasting. Climate trends are weather, averaged out over time – usually 30 years. With an available model such as MODTRAN (Infrared Light in the Atmosphere) or the Kaya Identity Scenario, one can forecast global warming; however, Dr Robert Istvan Radics rightly highlighted SPSS and Excel for short-term forecasts.
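For context, the Kaya Identity behind that scenario tool expresses CO2 emissions as the product of population, GDP per capita, energy intensity of GDP and carbon intensity of energy. The sketch below shows only that arithmetic; the input values are assumed round numbers for illustration, not output from the actual Kaya Identity Scenario model.

```python
# Sketch of the Kaya Identity: CO2 emissions as the product of population,
# GDP per capita, energy intensity of GDP and carbon intensity of energy.
# The inputs are assumed, round illustrative values, not a published scenario.

def kaya_emissions(population, gdp_per_capita, energy_per_gdp, co2_per_energy):
    """CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)."""
    return population * gdp_per_capita * energy_per_gdp * co2_per_energy

# Illustrative global-scale inputs (assumed):
emissions = kaya_emissions(
    population=7.0e9,        # people
    gdp_per_capita=1.0e4,    # $ per person per year
    energy_per_gdp=7.0e6,    # J per $
    co2_per_energy=6.0e-11,  # t CO2 per J
)
print("Annual CO2 emissions: %.1f Gt" % (emissions / 1e9))
```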
The dynamic, chaotic and spontaneous nature of natural events makes it difficult for any type of climate model (numerical, statistical or physical) to forecast climate change accurately. Secondly, the high degree of interconnection among climate elements working in parallel to produce extreme events is another reason climate models may not reproduce severe weather accurately. A further reason is that understanding of the critical role played by climate forcings is still in its infancy.