Climate Models Can Never Work, Says Computer Modeller

There are plenty of data sets. It is clear you have no clue how computer modelling works.

Ok, suit yourself. Even Gavin Schmidt, head of NASA GISS and an expert on climate models, has acknowledged that the newer CMIP6 models are running way too hot, due in part to their inability to model cloud behaviour. But you just carry on being sanctimonious and pompous; I wouldn't expect otherwise from you.
 

Sighs. When you start off with a non sequitur, it wrecks your entire article.

You need to refresh your knowledge of logic.
 
Comparing a lottery machine to climate modelling???

 
You've turned into a cliché-bot; time to order up some new ones. I haven't forgotten how you declared, with all the authority you could muster, that the Kansas oil spill would take months or even years to clean up. In fact it took only three weeks before the pipeline was returned to service. I still can't stop laughing about that.

I never argued about the time it took the pipeline to go back into service.

I merely didn't agree with your claim that the spill was cleaned up in just three short weeks!


Next!
 

Yeh yeh, but you know full well it wouldn't have been allowed back online otherwise. Accept that you made a mistake and move on, ffs.
 
I accept your concession. Bye bye.

I never offered it, ffs. The guy who wrote that is a former climate modeller, and I'm sure he knows a fuckton more about the subject than either of us!!

If you cannot make a model to predict the outcome of the next draw from a lottery ball machine, you are unable to make a model to predict the future of the climate, suggests former computer modeller Greg Chapman, in a recent essay in Quadrant. Chapman holds a PhD in physics and notes that the climate system is chaotic, which means “any model will be a poor predictor of the future”. A lottery ball machine, he observes, “is a comparatively much simpler and smaller interacting system”.

Most climate models run hot, a polite term for endless failed predictions of runaway global warming. If this were a “real scientific process”, argues Chapman, the hottest two-thirds of the models would be rejected by the Intergovernmental Panel on Climate Change (IPCC). If that happened, he continues, there would be outrage among the climate science community, especially from the rejected teams, “due to their subsequent loss of funding”. More importantly, he adds, “the so-called 97% consensus would instantly evaporate”. Once the hottest models were rejected, the projected temperature rise to 2100 would be about 1.5°C since pre-industrial times, mostly due to natural warming. “There would be no panic, and the gravy train would end,” he said.
 
There is no need to read the rest because the first sentence invalidates the whole article.

"If you cannot make a model to predict the outcome of the next draw from a lottery ball machine, you are unable to make a model to predict the future of the climate, suggests former computer modeller Greg Chapman, in a recent essay in Quadrant."

1. Lottery numbers are 100% random.
2. What does a computer modeller know?
3. If it is useless, how are meteorologists able to make predictions?

Climate is not weather, Poncho.
 

There is no way to predict the next lottery numbers. However, you can predict the number of wins in, say, a year with reasonable accuracy.

This computer modeller and you are claiming that those scientists have been lying to keep getting funded. That's a wild claim. That's tantamount to grant fraud.
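The point above can be shown with a quick sketch: a toy 6-from-49 lottery simulation (the lottery format and draw count are illustrative assumptions, not anything from the thread). No model can call the next draw, yet every ball's long-run frequency sits close to its expected value.

```python
import random

random.seed(0)

# Simulate a simple 6-from-49 lottery. No model can predict the next
# draw, but aggregate statistics over many draws are quite stable.
def draw():
    return random.sample(range(1, 50), 6)

n_draws = 100_000
counts = {n: 0 for n in range(1, 50)}
for _ in range(n_draws):
    for n in draw():
        counts[n] += 1

# Each ball should appear in about 6/49 of draws.
expected = n_draws * 6 / 49
max_dev = max(abs(c - expected) for c in counts.values()) / expected

# Individual draws are unpredictable, yet no ball's long-run frequency
# strays far from the expected value.
print(f"expected {expected:.0f} per ball, worst deviation {max_dev:.1%}")
```

That is the distinction being drawn: chaotic or random systems can defeat point predictions while still yielding stable statistical behaviour.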
 
Climate is not weather, Poncho.

cli·mate
noun (plural: climates)

1. The weather conditions prevailing in an area in general or over a long period. "our cold, wet climate"
Similar: weather pattern, weather conditions, weather, atmospheric conditions

2. A region with particular prevailing weather conditions. "vacationing in a warm climate"
 
You have no clue how to even spell modeling. :palm:

Depends where it is used.

The word was derived from the Middle French modelle, itself derived from the Italian modello, which can be traced all the way to the Latin root modus, which means “manner” or “measure.”
 

You're such a simpleton, and truly naive despite your pretensions otherwise. They are grossly exaggerating the true situation, as they know full well that the grants would dry up otherwise.

Here is the original article by Greg Chapman in Quadrant magazine for you to ignore as well.

Global extinction due to global warming has been predicted more times than the Labor Party has claimed it can cool the planet with a new tax. But where do these predictions come from? If you thought it was just calculated from the simple, well-known relationship between CO2 and solar energy absorption, you would only expect to see about a 0.5°C increase from pre-industrial temperatures as a result of CO2 doubling, due to the logarithmic nature of the relationship. [1]

The runaway 3–6°C and higher temperature increase model predictions depend on coupled feedbacks from many other factors, including water vapour (the most important greenhouse gas), albedo (the proportion of energy reflected from the surface – e.g. more/less ice or clouds, more/less reflection) and aerosols, just to mention a few, which theoretically may amplify the small incremental CO2 heating effect.
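The logarithmic relationship mentioned above can be sketched in a few lines: each doubling of concentration adds the same fixed temperature increment. The 0.5°C per doubling used here is simply the article's no-feedback figure, and the 280 ppm baseline is an assumed pre-industrial value for illustration; neither is endorsed here.

```python
import math

# Logarithmic CO2 relationship: warming grows with log2 of the
# concentration ratio, so each doubling adds the same increment.
def delta_t(c_ppm, c0_ppm=280.0, per_doubling=0.5):
    return per_doubling * math.log2(c_ppm / c0_ppm)

# Successive doublings from 280 ppm each add the same 0.5:
print(round(delta_t(560), 2), round(delta_t(1120), 2), round(delta_t(2240), 2))
# prints 0.5 1.0 1.5
```

Whatever the per-doubling value actually is, the shape is the same: the effect of each extra tonne of CO2 diminishes, which is why the large projected increases hinge on the feedback terms rather than on CO2 alone.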




“The world has less than a decade to change course to avoid irreversible ecological catastrophe, the UN warned today.” — The Guardian, Nov 28, 2007

“It’s tough to make predictions, especially about the future.” — Yogi Berra

Because of the complexity of these interrelationships, the only way to make predictions is with climate models. But are they fit for purpose? Before I answer that question, let’s have a look at how they work.

How do Climate Models Work?

In order to represent the earth in a computer model, a grid of cells is constructed from the bottom of the ocean to the top of the atmosphere. Within each cell, the component properties, such as temperature, pressure, solids, liquids and vapour, are uniform.

The size of the cells varies between models and within models. Ideally, they should be as small as possible, as properties vary continuously in the real world, but the resolution is constrained by computing power. Typically, the cell area is around 100 × 100 km², even though there is considerable atmospheric variation over such distances, requiring all the cell properties to be averaged. This introduces an unavoidable error into the models even before they start to run.

The number of cells varies between models, but the order of magnitude is around 2 million.
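The two-million figure is easy to sanity-check with back-of-envelope arithmetic. The Earth's surface area is roughly 510 million km², and the number of vertical levels used below (40, combined atmosphere and ocean) is an assumed, illustrative figure rather than one from the article.

```python
# Back-of-envelope check of the quoted cell count.
earth_surface_km2 = 510e6        # ~510 million km^2 total surface
cell_area_km2 = 100 * 100        # 100 x 100 km cells, as quoted
vertical_levels = 40             # assumed atmosphere + ocean levels

columns = earth_surface_km2 / cell_area_km2   # surface cells: 51,000
cells = columns * vertical_levels             # ~2 million cells in total

print(int(columns), int(cells))  # prints 51000 2040000
```

So with plausible level counts the total does land at the stated order of magnitude of about two million cells.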

Once the grid has been constructed, the component properties of each of these cells must be determined. There aren’t, of course, two million data stations in the atmosphere and ocean. The current number of data points is around 10,000 (ground weather stations, balloons and ocean buoys), plus we’ve had satellite data since 1978, but historically the coverage is poor. As a result, when initialising a climate model starting 150 years ago, there is almost no data available for most of the land surface and oceans, and nothing above the surface or in the ocean depths. This should be understood to be a major concern.


Once initialised, the model goes through a series of timesteps. At each step, for each cell, the properties of the adjacent cells are compared. If one such cell is at a higher pressure, fluid will flow from that cell to the next. If it is at higher temperature, it warms the next cell (whilst cooling itself). This might cause ice to melt, but evaporation has a cooling effect. If ice melts, there is less energy reflected and that causes further heating. Aerosols in the cell can result in heating or cooling and an increase or decrease in precipitation, depending on the type.
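The neighbour-exchange idea in that paragraph can be sketched as a toy one-dimensional heat-diffusion loop. This is only the temperature part of the story, in one dimension with fixed boundary cells; real models couple pressure, moisture, ice, aerosols and more in three dimensions, so treat this purely as a picture of the timestep mechanism, not of any actual climate code.

```python
# Toy 1-D version of the per-timestep neighbour exchange: each interior
# cell gains or loses heat in proportion to its temperature difference
# with its two neighbours (explicit diffusion; k <= 0.5 keeps it stable).
def step(temps, k=0.1):
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] += k * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    return new

cells = [0.0, 0.0, 10.0, 0.0, 0.0]   # one hot cell in the middle
for _ in range(50):
    cells = step(cells)

# The hot spike has spread out toward the fixed cold boundary cells.
print([round(t, 2) for t in cells])
```

Each timestep only looks at adjacent cells, which is exactly why both the cell size and the timestep length constrain how faithfully the real system can be followed.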

Increased precipitation can increase plant growth, as does increased CO2. This will change the albedo (reflectivity) of the surface as well as the humidity. Higher temperatures cause greater evaporation from the oceans, which cools the oceans and increases cloud cover. Climate models can’t model clouds due to the low resolution of the grid, and whether clouds increase or reduce surface temperature depends on the type of cloud.

Of course, this all happens in three dimensions and to every cell, resulting in lots of feedback to be calculated at each timestep. It’s complicated!

The timesteps can be as short as half an hour. Remember, the terminator, the point at which day turns into night, travels across the earth’s surface at about 1700 km/hr at the equator, so even half hourly timesteps introduce further error into the calculation. Again, computing power is a constraint.
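The quoted ~1700 km/h figure checks out with simple arithmetic, and it also shows how far the day/night boundary sweeps within a single half-hour timestep relative to a 100 km cell:

```python
# Rough check of the terminator speed and the distance it covers
# in one half-hour timestep, measured in 100 km grid cells.
equator_km = 40_075                  # Earth's equatorial circumference
speed_kmh = equator_km / 24          # ~1670 km/h at the equator
km_per_halfhour = speed_kmh / 2      # ~835 km per 30-minute timestep
cells_crossed = km_per_halfhour / 100

print(round(speed_kmh), round(km_per_halfhour), round(cells_crossed, 1))
# prints 1670 835 8.3
```

So the day/night boundary crosses roughly eight equatorial cells between consecutive timesteps, which is the averaging error the paragraph is pointing at.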

While the changes in temperatures and pressures between cells are calculated according to the laws of thermodynamics and fluid mechanics, many other changes aren’t calculated; they rely on parameterisation. For example, the albedo forcing varies from icecaps to Amazon jungle to Sahara desert to oceans to cloud cover and all the reflectivity types in between. These properties are simply assigned, and their impacts on other properties are determined from look-up tables, not calculated. Parameterisation is also used for cloud and aerosol impacts on temperature and precipitation. Any important factor that occurs on a subgrid scale, such as storms and ocean eddy currents, must also be parameterised, with an averaged impact used for the whole grid cell. Whilst the impacts of these factors are based on observations, the parameterisation is far more a qualitative than a quantitative process, often described by modellers themselves as an art, which introduces further error. Direct measurement of these effects, and of how they are coupled to other factors, is extremely difficult.
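Parameterisation by look-up table, as described above, can be sketched in a few lines. The albedo values here are assumed round-number figures for illustration only, not those of any actual climate model.

```python
# Illustrative parameterisation look-up table: reflectivity is assigned
# by surface type, not calculated from first principles.
ALBEDO = {
    "fresh snow": 0.85,
    "desert": 0.40,
    "forest": 0.15,
    "ocean": 0.06,
}

def absorbed_fraction(surface_type):
    # The model simply looks up the assigned albedo for the cell's
    # surface type and keeps whatever energy is not reflected.
    return 1.0 - ALBEDO[surface_type]

for kind in ALBEDO:
    print(f"{kind}: absorbs {absorbed_fraction(kind):.0%} of incoming energy")
```

The point of the sketch is the mechanism: whatever number sits in the table propagates through every timestep, so errors in these assigned values accumulate rather than being corrected by the physics.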

Read more: https://quadrant.org.au/opinion/doomed-planet/2022/10/garbage-in-climate-science-out/
 
 

I get the argument. It is based on the number of stations: apparently there is not enough data to make an accurate prediction.

That would be the statisticians' job.
 

Much more than that: they cannot model clouds to any great extent, and that's a huge shortcoming; if you can't do that, your model is next to useless. The IPCC freely admit that in their reports, yet the real fraud is when they draw up their Summary for Policymakers and circulate it to NGOs, government departments and the likes of Greenpeace, who then proceed to sex it up prior to distribution. That is fraud; surely even you would admit that.
 