Doogie
Kennel Lizard Lord
Joined: Oct 6, 2012
Messages: 9,923
Reaction score: 12,427
Or, to put it simply, modelling CO2 outputs as an input into the model(s) is driven by human behaviours, not a system dynamics approach. Getting the inputs right is a best guess. Simple GIGO. Plus a fair chunk of those models were energy balance models, which characterise feedback loops such as methane emissions or, for example, eruptions. How do you account for something like Tonga? That has a worldwide cooling effect, but you have to somehow work out if it's going to happen in the next 20 years. Good luck. What's an acceptable margin of error?
The last 50 years have seen an increase in Earth's temperature of 0.13 degrees every 10 years. The previous 50 years saw 0.07 degrees every 10 years. That works out to a total rise of roughly 0.6 degrees over the last 50 years (0.13 x 5 decades).
The IPCC 1995 temperature rise modelling average for the 10 years 2007 to 2017 was the same as their 2007 modelling and was inaccurate by 0.20 degrees compared to the NASA observations.
The IPCC 1990 modelling for the same 10 years was inaccurate by 0.25 degrees
The IPCC 2001 modelling for the same 10 years was the most inaccurate at 0.30 degrees.
My view is that errors of 0.20 to 0.30 degrees against a total rise of roughly 0.6 degrees, that is, between 33% and 50%, aren't acceptable.
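The percentages above fall straight out of the figures quoted earlier in the post. A minimal sketch, using only the numbers as stated here (they are the post's figures, not independently verified climate data):

```python
# Warming rates quoted above, in degrees C per decade
rate_recent = 0.13  # last 50 years
rate_prior = 0.07   # previous 50 years

# Total rise over the last 50 years = rate x 5 decades
total_rise = rate_recent * 5  # 0.65, rounded in the post to ~0.6

# IPCC model errors vs NASA observations for 2007-2017, as quoted
errors = {"IPCC 1990": 0.25, "IPCC 1995/2007": 0.20, "IPCC 2001": 0.30}

# Express each error as a share of the ~0.6 degree rise
for name, err in errors.items():
    print(f"{name}: {err / 0.6:.0%}")
```

Run against the rounded 0.6-degree figure, the three errors come out at roughly 42%, 33%, and 50%, which is where the "between 33% and 50%" claim comes from.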
Another valid question is why the 2007 modelling reverted to a result very similar to the 1995 modelling. My view is that the 2001 methodology was proven to be more inaccurate than the previous methodology and was therefore "updated", which is of course acceptable methodology. Or maybe the 2001 modelling was overstated, caused the IPCC some embarrassment, and the 2007 modelling was "tempered" as a result.
Even if we ignore the discrepancies against the actuals, the fact is that three versions of the IPCC modelling over the last 32 years have varied noticeably. Also, with each version except 2001, the predicted temperature rise has decreased. In fact, the decrease in later modelling is consistent with that seen since the Broecker modelling in 1975, which was inaccurate by over 1.0 degrees by 2017, and the Exxon modelling in 1981, which was inaccurate by 0.4 degrees. I find the movements in the Hansen modelling from 1981 to 1988 the most interesting. The 1981 modelling was widely criticised because it was so low, crediting only a 0.7 degree increase between 2007 and 2017. It was revised upwards in 1988, which resulted in that version being almost as inaccurate as the Broecker version from 1975. Maybe he should have stayed with his 1981 modelling, as it has proven to be the closest to the actuals.
More recently, "the IPCC report 2022 warned that the world is set to reach the 1.5ºC level within the next two decades", but the 2001 modelling said that we would already be there by 2021.
There seems to be a strong case that political and peer group influence results in changes to the modelling, which most often are not accurate. Having followed the modelling and the actual results since the mid-90s, I remain skeptical about a science so hugely complex that the modelling has to change almost every year to reflect what has actually happened. None of which means that I am not taking my own measured actions to reduce my carbon footprint; it seems to me that many of the IPCC-demanded actions are far from measured, bordering on totally unrealistic.
Always a Bulldog
Politics and peers influence the communication of the models, but the results are the results.
You've chosen to do whatever about your footprint, fair play, your choice. But you're doing that on the basis that because the models weren't perfect, they must be incorrect and CC isn't occurring. Weird argument bro.