FPI / March 4, 2024
Source: FreePressers

Climate activists and world governments are relying on problematic, corrupted temperature records to build models that forecast catastrophic global warming, according to multiple scientists who have published recent studies on the issue.

Problems with temperature data include a lack of geographically and historically representative data, contamination of the records by heat from urban areas, and corruption of the data introduced by a process known as “homogenization,” scientists from around the world said, citing peer-reviewed studies.

The flaws are so significant that they make the temperature data — and the models based on it — essentially useless or worse, three independent scientists with the Center for Environmental Research and Earth Sciences (CERES) told The Epoch Times.

The scientists said that when data corruption is considered, the alleged “climate crisis” supposedly caused by human activities disappears.

Instead, natural climate variability offers a much better explanation for what is being observed, the scientists say.

The Biden Administration relies on the National Climate Assessment report as evidence that global warming is accelerating because of human activities. The document states that human emissions of “greenhouse gases” such as carbon dioxide are dangerously warming the Earth.

The UN Intergovernmental Panel on Climate Change (IPCC) holds the same view, and its leaders are pushing major global policy changes in response.

“For the last 35 years, the words of the IPCC have been taken to be gospel,” according to astrophysicist and CERES founder Willie Soon. Until recently, he was a researcher working with the Center for Astrophysics, Harvard & Smithsonian.

“And indeed, climate activism has become the new religion of the 21st century — heretics are not welcome and not allowed to ask questions,” Soon told The Epoch Times. “But good science demands that scientists are encouraged to question the IPCC’s dogma. The supposed purity of the global temperature record is one of the most sacred dogmas of the IPCC.”

Data taken from rural temperature stations, ocean measurements, weather balloons, satellite measurements, and temperature proxies such as tree rings, glaciers, and lake sediments "show that the climate has always changed," Soon said.

“They show that the current climate outside of cities is not unusual,” he said, adding that heat from urban areas is improperly affecting the data. “If we exclude the urban temperature data that only represents 3 percent of the planet, then we get a very different picture of the climate.”

One issue that scientists say is corrupting the data stems from an obscure process known as “homogenization.”

According to climate activists working with governments and the UN, the algorithms used for homogenization are designed to correct, as much as possible, various biases that might exist in the raw temperature data.

These biases include, among others, the relocation of temperature monitoring stations, changes in technology used to gather the data, or changes in the environment surrounding a thermometer that might impact its readings.

For instance, if a temperature station was originally placed in an empty field but that field has since been paved over to become a parking lot, the record would appear to show much hotter temperatures. As such, it would make sense to try to correct the data collected.
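In its simplest form, the idea behind homogenization can be illustrated with a short sketch. The Python snippet below is written for illustration only and is not drawn from NOAA's code or the CERES team's analysis; it assumes a single breakpoint whose date is already known (the month the field was paved over, in the example above) and removes the resulting step change by shifting the later readings.

```python
# Illustrative sketch only: a toy "homogenization" adjustment, not the
# algorithm NOAA or any other agency actually uses. It assumes one known
# breakpoint and shifts the later segment so its mean matches the earlier one.

import numpy as np

def adjust_for_breakpoint(temps: np.ndarray, breakpoint: int) -> np.ndarray:
    """Shift readings after `breakpoint` so the two segments share a mean.

    temps      : monthly mean temperatures for one station
    breakpoint : index where the station's environment changed
    """
    before = temps[:breakpoint]
    after = temps[breakpoint:]
    offset = after.mean() - before.mean()   # estimated artificial step
    adjusted = temps.copy()
    adjusted[breakpoint:] -= offset          # remove the step change
    return adjusted

# Synthetic example: ten years of data with a +1.5 C jump halfway through,
# mimicking a station whose surroundings were paved over.
rng = np.random.default_rng(0)
raw = 15 + rng.normal(0, 0.5, 120)
raw[60:] += 1.5
homogenized = adjust_for_breakpoint(raw, breakpoint=60)
print(f"step before adjustment: {raw[60:].mean() - raw[:60].mean():.2f} C")
print(f"step after adjustment:  {homogenized[60:].mean() - homogenized[:60].mean():.2f} C")
```

Operational algorithms are considerably more involved: NOAA's pairwise homogenization approach, for example, infers breakpoints statistically by comparing each station with its neighbors rather than relying on documented station changes, which is part of what the CERES scientists say they set out to scrutinize.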

"Virtually nobody argues against the need for some homogenization to control for various factors that may contaminate temperature data," The Epoch Times report notes.

But a closer examination of the process as it now occurs reveals major concerns, Ronan Connolly, an independent scientist at CERES, said.

“While the scientific community has become addicted to blindly using these computer programs to fix the data biases, until recently nobody has bothered to look under the hood to see if the programs work when applied to real temperature data,” he told The Epoch Times.

Since the early 2000s, various governmental and intergovernmental organizations creating global temperature records have relied on computer programs to automatically adjust the data.

Soon, Connolly, and a team of scientists around the world spent years looking at the programs to determine how they worked and whether they were reliable.

One of the scientists involved in the analysis, Peter O’Neill, has been tracking and downloading the data daily from the National Oceanic and Atmospheric Administration (NOAA) and its Global Historical Climatology Network since 2011.

He found that each day, NOAA applies different adjustments to the data.

“They use the same homogenization computer program and re-run it roughly every 24 hours,” Connolly said. “But each day, the homogenization adjustments that they calculate for each temperature record are different.”

This is “very bizarre,” he said.

“If the adjustments for a given weather station have any basis in reality, then we would expect the computer program to calculate the same adjustments every time. What we found is this is not what’s happening,” Connolly said.
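The kind of consistency check O’Neill’s daily downloads make possible can be sketched in a few lines. The snippet below is illustrative only and is not the scientists’ actual code; the file names are hypothetical placeholders for two daily snapshots of the same station’s raw and homogenized series.

```python
# Illustrative sketch of the consistency check described above, not the
# CERES team's actual code. It assumes two daily snapshots of one station's
# raw and homogenized series have already been saved locally
# (file names below are hypothetical).

import numpy as np

def adjustments(raw_file: str, adjusted_file: str) -> np.ndarray:
    """Return the homogenization adjustments as (adjusted - raw) per month."""
    raw = np.loadtxt(raw_file)
    adj = np.loadtxt(adjusted_file)
    return adj - raw

# Snapshots of the same station taken on two consecutive days.
day1 = adjustments("station_raw_day1.txt", "station_adj_day1.txt")
day2 = adjustments("station_raw_day2.txt", "station_adj_day2.txt")

# If the adjustments reflected real, fixed station history, the two runs
# should agree; flag any monthly adjustment that shifted overnight.
changed = np.abs(day1 - day2) > 0.01   # threshold of 0.01 C, chosen arbitrarily
print(f"{changed.sum()} of {changed.size} monthly adjustments changed between runs")
```

According to Connolly, it is precisely this kind of day-to-day instability in the calculated adjustments that the team found when examining NOAA’s output.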
