Because of professional marketing, we associate climate science with highly motivated researchers of different ethnicities, working in laboratories or outdoors. We think of vegan Tesla drivers, activists like Greta Thunberg and the new generation of green politicians. And we think of bureaucrats from the United Nations and the IPCC who are supposed to represent the interests of the people in large conference rooms, surrounded by blue flags, rather than the interests of individual national governments.
But if you look behind this marketing to see who really created climate research and who still controls it today, you find old, white, aristocratic Brits, American generals and military contractors. All of the leading climate research organizations have this military background: the UK Met Office, the CRU, the Hadley Center, NOAA and NASA. In a guarded basement, the UK Met Office, which was officially part of the Air Force and the Ministry of Defense until 2011, uses software to calculate the climate of the future on a supercomputer made by the company Cray, a firm whose origins go back to the US Navy's codebreakers.
The results of the simulation are then passed on to the Intergovernmental Panel on Climate Change, the IPCC. The lead author of the IPCC's first three major assessment reports, the knighted Sir John Theodore Houghton, was the head of the UK Met Office and founder of the Hadley Center, which supplies the software for the calculations used by the IPCC. In 2000, Houghton also became a trustee of the foundation of the oil and gas company Shell, formerly known as Royal Dutch Shell.
This foundation professes to be charitable and to have nothing to do with oil profits, but it was suspected of influencing a British minister in connection with a multi-billion-dollar deal called Sakhalin-2. The commitment to combating climate change and reducing CO2 emissions has become a simple slogan. The political pursuit of CO2 neutrality is based on the IPCC's latest assessment report, which is said to be a summary of 14,000 scientific publications.
The consensus among scientists, we are told, is almost 100 percent. For many ordinary citizens, this provides clarity in a world that seems chaotic. It gives people an identity, a meaning and a collective purpose for our civilization. Finally there is certainty instead of differing opinions. The only people who would still deny climate change caused by human CO2 emissions are supposedly anti-scientific idiots, representatives of the right-wing political camp and agents of the oil industry.
Socialists are usually the first to complain loudly about NATO's influence. Strangely enough, the left has failed to recognize the clear military background of climate research, let alone communicate it. The left-wing author and filmmaker Michael Moore made his breakthrough in 2001 with the book Stupid White Men. What caused the most stir was the first chapter, about the Republican Party's election fraud to put George W. Bush in office instead of Democrat Al Gore: the Republicans had removed large numbers of African Americans and other traditional Democratic voters from the voter rolls and used other tricks to ensure that votes for the Democrats were thrown out as invalid. Gore later dedicated himself to the topic of saving the climate, published the world-famous documentary film "An Inconvenient Truth" and shared the Nobel Peace Prize with the IPCC. For the left-wing audience, a picture was drawn of a massive right-wing global conspiracy that could only be defeated through massive left-green activism and climate science.
When he took office in 2001, President Bush expressed his rejection of the United Nations' Kyoto Protocol, which set binding targets for CO2 emissions under international law. The Republicans were traditionally close to the oil and gas companies, and in the background these networks also cultivated organizations and media, including the right-wing extremist spectrum and classic conspiracy media in the tradition of the John Birch Society.
The right-wing narrative was basically that climate protection was just the lousy tactic of a left-wing global conspiracy to seize power.
We are winning the intellectual battle, but the ecos have the necessary state power. They just enforce it. Trump restarted the coal industry. And so forth. But Ivanka, who I didn’t vote for, hangs out with Angela Merkel and says she’ll stop her father.
I firmly believe that Trump will do the right thing and deliver on his campaign promises and stay true to his political philosophy.
People expected that Michael Moore would devote extensive attention to the climate issue. Instead, he avoided it. The film he produced, "Planet of the Humans," was only released in 2019, directed by Jeff Gibbs, whose style was humorless and dry.
Despite being posted for free on YouTube, the film received only 11 million views, a small fraction of what a popular music video gets, or about the same as a successful compilation of funny animal videos. What happened? Why was "Planet of the Humans" a flop while Al Gore's film was a hit and was shown to countless students around the world? The reason for the lukewarm reaction was that Moore's film was about how mega-corporations and banks had long since bought out the green movement, and how the green movement was lying to itself and to other people.
Eco-activists and green lobby groups have become involved with suspicious billionaires, corporations and large family foundations. Former Goldman Sachs executive David Blood wanted trillions of dollars in green investments for gigantic profits. Before Al Gore released his film "An Inconvenient Truth," he planned an environmental investment fund with David Blood. Various green funds hold stakes in mining companies, BlackRock, Halliburton, Exxon, Chevron, Gazprom and so on. The reaction from left-green circles to Moore and Gibbs's film ranged from negative to hostile. The famous climate scientist Michael Mann even considered Moore a kind of traitor who had defected to the right-wing global conspiracy. So one would expect Moore to also reveal the military background of climate research. But he doesn't. The core aspect of climate research is consistently avoided, or is simply unknown to the filmmakers, even though it can be documented very easily.
Moore is absolutely certain that the IPCC is telling the truth and that the scientific consensus is almost 100%. So what’s the risk in documenting the military background in a snappy, funny documentary film and demanding that climate research become truly civilian and international? Does he fear that a film like this will make people too distrustful of climate research?
Either Moore negligently never bothered to review the history of climate research, or he would rather grudgingly trust the Met Office, NASA and NOAA than trust US citizens to act politically on the basis of more complete information. What about the better-known climate change doubters, such as Anthony Watts and his world-leading blog? Here, too, you can find practically nothing about the military background of climate research.
On another popular blog, Climate Audit by Steve McIntyre, a search for terms like "military" returns no relevant results. There is considerable criticism of the data and the behavior of the UK Met Office, NASA and NOAA, but that's it. You could be a regular reader of the blog for ten years without ever knowing what's behind the climate research.
A very prominent skeptic is Republican Senator James Inhofe, a member of the United States Senate Committee on Armed Services, the committee for parliamentary oversight of the US Department of Defense. He considered the scandal surrounding the torture of prisoners at Abu Ghraib exaggerated, promoted the passage of the record-breaking $716 billion defense budget in 2019 and wanted to pave the way for massive new sanctions against Iran.
He is considered the most conservative member of the US Congress. And in his book "The Greatest Hoax: How the Global Warming Conspiracy Threatens Your Future," he reveals nothing about the military's dominance in climate research. Serving as Inhofe's director of communications was Marc Morano, who himself became one of the most prominent climate change skeptics and also worked as director of communications for the Senate Environment and Public Works Committee during the George W. Bush administration. There is likewise nothing enlightening about the military history of climate research on Morano's website "Climate Depot." His new book is called "Green Fraud" and has a Democratic politician on the cover. Climate protection is supposedly just a left-wing conspiracy, and exposing this conspiracy is conservative.
I have here a map for June 4, 1944, which was used in the decision to invade Normandy. The first thing you notice, of course, is that a lot of weather data from Nazi Germany is entered on it. Of course, we had broken the German encryption for their weather observations and were able to incorporate the data into our forecast. The weather experts concluded that better conditions would prevail on June 6th. It was about wind, visibility and waves. They concluded that it would be possible that day.
It was ultimately a meteorologist who sealed the downfall of Nazi Germany. Not a politician, not a general, not a codebreaker, not a spy, but a weatherman named Group Captain James Stagg. He made the massive invasion of Normandy, codenamed Operation Overlord, possible. In 1944 the Anglo-Americans needed reasonably acceptable weather for over 150,000 troops, 7,000 vessels and the supporting aircraft. For months, weather experts had been analyzing when the weather on the coast would most likely be suitable.
And even at the last moment, it was the meteorologists who convinced US General Eisenhower to postpone the attack by one day to June 6th because of an approaching weather front. Only at low tide would one be able to see from a safe distance what defenses the Nazis had installed in the water. There couldn’t be too much wind or waves so that the infantrymen could storm the coast from the boats.
Air Force divisions needed moonlight to see targets for bombardment and to dispatch paratroopers. If the attack had been canceled because of the approaching storm, the next opportunity would not have come until weeks later. And during this time, German defense measures could theoretically have progressed to such an extent that the Americans would have failed in their attempted invasion. There was also a risk that the Germans would find out where the planned invasion would take place.
There were even fears among the Allies that in the event of failure they would have to withdraw from Europe and begin peace negotiations with the Nazis. The British Admiral Sir Bertram Ramsay, who had already evacuated the encircled British troops at Dunkirk, led the naval side of Operation Overlord. In the preceding weeks, the coast had been flown over 3,200 times in order to photograph it in great detail. The highest-ranking meteorologist involved in the planning was the British Group Captain James Stagg of the Royal Air Force.
And without his strong recommendation to attempt the invasion on June 6, the entire Operation Overlord might have failed. In gratitude, Stagg received some of the highest medals of the British Empire and became a member of the elite aristocratic scientific association, the Royal Society. German Field Marshal Erwin Rommel had been tasked with securing the coast and relied on forecasts from the meteorological center in occupied Paris, which predicted two weeks of stormy weather.
There had previously been a real North Atlantic weather war over the best possible data for the North Atlantic and the Arctic oceans. Germany and the Allies deployed weather ships, weather stations and aircraft. The Nazis' Operation Haudegen established weather stations on the Svalbard archipelago in the North Atlantic, and the Allies tried to prevent the Germans from receiving weather data from there. By August 1940, the Allies had destroyed German weather stations in Greenland and on Bear Island.
The British meteorologist Group Captain James Stagg became president of the Royal Meteorological Society and later in his career served as a director of the British Met Office, which would later house the working group of the UN climate panel, the IPCC, as well as the extremely influential Hadley Center for climate research. From 2001, the IPCC working group moved to the American agency NOAA.
After the First World War, the Met Office became part of the Air Ministry, which administered the Royal Air Force. The ministry’s first director was Harold Holmes.
The Met Office has based many of its weather stations at Air Force airfields. In 1990 the Met Office was declared an executive agency of the Ministry of Defense. In 2011, around the time the climate protection agenda was gaining momentum, the Met Office was separated from the Ministry of Defense on paper. Nevertheless, strong links with the military remained, with offices at air force and army bases both in Britain and overseas, and with involvement in the Royal Navy's Joint Operational Meteorology and Oceanography Centre.
The first five directors of the Met Office were all Fellows of the Royal Society and were knighted for their services to the Empire. Sir William Napier Shaw: Fellow of the Royal Society, president of the Royal Meteorological Society, knighted by King George V. Sir Graham Sutton: Fellow of the Royal Society, president of the Royal Meteorological Society, knighted. He worked at the notorious Porton Down chemical and biological weapons center on how weather affects the use of combat gas and on plans for the mass use of anthrax against Germany.
He later researched radar systems and became professor of mathematics at the Royal Military College of Science. Sir John Mason: served in the radar branch of the British Air Force, Fellow of the Royal Society, president of the Royal Meteorological Society. The knighted Sir John Theodore Houghton: chairman of the IPCC working group on climate science, lead author of the first three major IPCC assessment reports. He shared the Nobel Peace Prize with Al Gore. Founder of the Hadley Center for Climate Prediction and Research. Julian Charles Rowland Hunt, Baron Hunt of Chesterton: professor of climate modeling in the Department of Space and Climate Physics and the Department of Applied Sciences at University College London, Fellow of the Royal Society.
In a November 2006 article in Australia’s Daily Telegraph, Houghton, the lead author of the first three major IPCC assessment reports, was quoted as saying:
If we don’t announce disasters, no one will listen.
This quote was not an accurate representation of Houghton's words and did not come from his book. Eventually, however, the following direct statement from him was found in the Sunday Telegraph:
If we want good environmental policy in the future, we have to have a catastrophe. It’s like safety on public transport. People only act when there has been an accident.
In Nature magazine in 2008, he described how the IPCC meeting in Madrid in 1995 took place under his leadership and ultimately led to the Kyoto Protocol, the first international treaty with binding targets for CO2 emissions. Among the participants at the meeting was a lobbyist representing the interests of major American and international Arab oil companies. Individual sentences were fought over. In the proposed Summary for Policymakers of the next IPCC assessment report, envoys from Saudi Arabia and Kuwait wanted to tone down certain language about the level of certainty that scientists expressed about the causes of climate change.
The US representatives objected, among other things, to the beginning of the eighth chapter and made changes. Afterwards, the oil companies' lobby group, the Global Climate Coalition, complained in the press that the IPCC had altered the final text yet again. According to Houghton, this was a transparent, nasty propaganda campaign with which the oil industry wanted to obscure the fact that scientists from around the world, independently of one another and of national or private-sector interests, were moving towards a consensus on global warming.
So here we have the fairy tale again about the underdogs who compete against the evil oil industry and right-wing politics and still prevail with the bright truth. Sir Houghton, the ennobled fairytale uncle, became a trustee of the foundation of the oil and gas company Shell, a company with aristocratic roots. This foundation professes to be charitable and to have nothing to do with oil profits, but it was suspected of having lobbied a British minister in connection with a multi-billion-dollar deal called Sakhalin-2.
Welcome to the Mobile Weather Unit at Camp Bastion, Afghanistan. Met Office mobile weather experts wear uniforms when taking part in operations. Just as during the Second World War, the Met Office weather experts provide very important information to the commanding officers to gain the weather advantage. This is our weather station. These can also be found throughout the UK and the rest of the world.
This is a portable station that we can take with us during operations and exercises. This is how we measure the weather. Weather affects many types of helicopter and aircraft operations, and ground forces can be affected by heat, so the weather is extremely important. The first thing pilots and senior commanders do at a briefing is obtain the report from the Met Office. We are the first on the podium and present our results on what the weather will be like throughout the day.
This can determine what the commanders do that day. It may lead to changes in plans. Technology has changed. We now have supercomputers doing the forecasting, not just for Afghanistan but across the UK and the world. The technology is much better today. Satellites are very important. We have only recently started using satellites.
That was a big change.
The Hadley Center is one of the leading centers for research into scientific questions related to climate change and works closely with the IPCC, which does not conduct its own research. The climate model developed at the Hadley Center runs climate simulations whose predictions are passed on to the IPCC and adopted by it.
HadGEM stands for Hadley Center Global Environment Model; version 1 was used in the IPCC's fourth assessment report on climate change, and the current version is number 3. Initially, the modeling was so poor that the oceans were depicted as shallow as puddles, even though most of the warming supposedly caused by humans ends up in the oceans. In 2010, the respected journal Nature published a glimpse into everyday life in the British military basement where the supercomputer is located.
The sign on the gate of the building in Exeter, in southern England, says UK Met Office, but the premises belong to the Ministry of Defense. At that time, 27 large boxes were calculating a simulation of the climate up to the year 2100 for the next IPCC report. Because computer climate models become ever more complex as they try to depict more details of nature, the scope for error also grows, writes the author of the report in Nature magazine.
Only the HadGEM2 model was able to simulate the tides and represent the ocean down to a depth of 5,000 meters. In an earlier version of the model, plant life was a kind of green layer, like the texture in a video game with outdated graphics; later versions distinguished vegetation types but kept them rigid, and only later still did the vegetation really change over time. Clouds could not be modeled at all in 2010, so the modelers made do with mathematical estimates for each of the individual blocks into which the world is divided.
Parameters, it is said, are a necessary evil that can introduce errors into models. Even more annoying, however, are random errors: a single incorrectly entered bit of data can mess up the entire system. When the scientists at the Hadley Center assumed too few plants in certain places, for example, the final results of the calculation were completely absurd. As a test, one can try to map the current state of the world and calculate it backwards into the past; but in the more distant past there were no measuring stations, so the conditions of that time can only be estimated with controversial replacement data, so-called proxies such as tree rings.
Unfortunately, there is no time machine with which you could travel into the future and back again to obtain measurement data from the year 2100 to compare with the computer simulations. If you instead developed software to control a car engine, you could measure at any time whether the software was working correctly. For software meant to predict the future of a chaotic, non-linear system like Earth's climate, there is no way to calculate mathematically how accurate or inaccurate the prediction is.
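The verification problem described here can be illustrated with a deliberately simple toy system. The following sketch is illustrative only and has nothing to do with the Hadley Center models: it uses the logistic map, a textbook example of a chaotic, non-linear system, to show how two simulations whose single parameter differs by one part in a million agree at first, as a short "hindcast" would, and then diverge completely.

```python
# Toy illustration (NOT a climate model): the logistic map x -> r*x*(1-x)
# is a standard example of non-linear chaos. A tiny "parameter error"
# is invisible at first but ruins long-range prediction.

def simulate(r, x0=0.5, steps=60):
    """Iterate the logistic map and return the full trajectory."""
    traj = [x0]
    for _ in range(steps):
        traj.append(r * traj[-1] * (1 - traj[-1]))
    return traj

a = simulate(r=3.9)           # "true" system
b = simulate(r=3.9 + 1e-6)    # model with a one-in-a-million parameter error

# Early on, the two runs are practically indistinguishable,
# so a short backward or forward check looks reassuring ...
print(abs(a[3] - b[3]))       # still tiny

# ... but far into the "future" the trajectories bear no relation
# to each other anymore: the difference is of the order of the
# signal itself.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The point is not the specific numbers but the behavior: for such systems there is no fixed error bar you can attach to a distant forecast, because an immeasurably small uncertainty in a parameter grows until the prediction is dominated by it.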
Of course, other countries also have supercomputers and climate models, but these are said to be significantly inferior to those of the Hadley Center. At least, we are assured, the IPCC also takes these other models into account. The models, according to the author in Nature, are the only tool available for predictions. At the Hadley Center, some employees joke that model developers will eventually try to include panda bears in their simulations.
The US agency National Oceanic and Atmospheric Administration, NOAA, has an enormous influence on climate research and on the resulting climate policy in many countries. Officially, it reports to the Department of Commerce and is separate from the Department of Defense. However, air and ocean exploration logically provides valuable data that can be used by the military. NOAA grew out of a number of organizations that were consolidated over time.
The NOAA Commissioned Officer Corps provides the agency's uniformed officers: ship's officers, pilots, research managers and so on. They also serve, for example, with the Department of Defense, the Coast Guard, NASA and the US State Department. Among NOAA's predecessors were the Weather Bureau and the National Weather Service: founded in 1870 for military purposes, the weather service was subordinated to the Secretary of War and the US Army Signal Service, assigned from 1890 to the civilian Department of Agriculture and then to the Department of Commerce.
However, the general population did not get access to tornado forecasts until 1952; the extreme weather warnings of the US Air Force remained secret. The Bureau used radar systems based on designs the Navy had used on aircraft. The US Coast and Geodetic Survey was meant to carry out the nation's surveys, but civilians were not yet considered suitable for this, so the task was transferred for a time to the US Army and US Navy.
Even later, when the Coast and Geodetic Survey was assigned to the Treasury Department, the Navy retained a high degree of control over the ships that carried out the surveys. Its influential superintendent Ferdinand Hassler was convinced that the surveys would offer advantages in possible future wars. With the onset of the Cold War in the late 1940s, the Geodetic Survey also made significant efforts in support of defense requirements, such as surveys for the Distant Early Warning Line, missile and ocean surveys, work for the US Navy, and nuclear test monitoring.
On the NOAA website in August 2021, after the publication of the short version of the new IPCC report, you were confronted with several climate change-related articles on the homepage. This is how NOAA Administrator Rick Spinrad, who previously worked at two important US Navy institutions, comments on the new assessment report. He has a doctorate in oceanography and held positions with the Office of Naval Research, which manages technology programs for the Navy and Marine Corps.
He was also with Naval Meteorology and Oceanography Command, the Navy's weather forecasting and ocean science division headquartered at the NASA Stennis Space Center in Mississippi. He writes that many NOAA scientists, as well as NOAA climate models, contributed to the IPCC's assessment report. In addition, NOAA maintains, through the National Centers for Environmental Information, a massive database of 37 petabytes.
With just a few clicks you can access various categories, such as Climate Data Records. On a world map you can draw a rectangle around the desired territory and then receive the corresponding data. NOAA also has its own satellites. Also working at NOAA was Susan Solomon, a senior staff member and for a time co-chair of the IPCC's climate science working group. How easily the truth gets bent at NOAA was shown at the end of 2019, when US President Trump claimed that Hurricane Dorian was also expected to hit the US state of Alabama.
Initially, the weather office in Alabama denied the claim and calmed alarmed citizens. Trump followed up by showing reporters a weather map on which someone had drawn with a felt-tip pen that the storm was heading toward Alabama. Shortly thereafter, NOAA jumped to the president's side with a statement citing computer models from the National Hurricane Center. But NOAA's statement did not correspond to the facts, and there have been several investigations into whether the Trump administration obtained it by pressuring officials.
According to a report in the New York Times, Trump's friend and Commerce Secretary Wilbur Ross is said to have threatened to fire various senior figures at NOAA. The Washington Post's sources, however, said Ross conveyed Trump's wishes, but without threats. When Ross was still working in the private sector, he bought the insolvent company Bethlehem Steel for his new steel group, a company that during the two world wars had supplied armor plates, warships, cannons and pipes to the navy and other branches of the armed forces.
At some point, steel from abroad simply became cheaper. The Climatic Research Unit, or CRU, at the University of East Anglia credits its founding to Professor Hubert Lamb, who spent much of his career in the UK Met Office, then part of the Air Ministry, which administered the air force. There, for reasons of conscience, he had refused to investigate how the weather affected the use of chemical weapons during the Second World War.
The 2012 self-description states that CRU staff were significantly involved in all four IPCC assessment reports, as well as in the fifth, which was then in preparation. In 1992 the CRU founded the Climate Impacts LINK Project, which played an important role for the next 15 years. Its purpose was to disseminate the results of climate simulations and future climate projections from the Hadley Center's computer models to research groups in the UK and abroad attempting to assess the impacts of climate change.
The majority of studies up to about 2000 that examined the effects of climate change around the world used data from Hadley Center climate models. The LINK Project became a model for data dissemination from other research centers and eventually led to the establishment of the IPCC's Data Distribution Centre, now led by the British Atmospheric Data Centre.
So it is and remains a British-American dominated affair. The climate researchers originally received their funding from, of all sources, British Petroleum, one of the largest oil companies in the world with aristocratic origins; from Royal Dutch Shell, likewise one of the world's largest oil companies with aristocratic origins; and from the Rockefeller Foundation, whose fortune comes from the Standard Oil empire that became Exxon. The chairmen of the University of East Anglia were often associated with the Royal Society, and Baron Oliver Franks in particular stands out, with the highest orders of the British Empire, his membership of the Privy Council and his status as a Deputy Lieutenant of the Crown.
He is called one of the founders of the post-war world. His organizational talent made him famous during the Second World War, when all the equipment that the retreating British army had to leave behind at Dunkirk needed to be replaced. His staff included several of the most dangerous Soviet spies of all time: Kim Philby, Guy Burgess and Donald Maclean.
After serving as the British ambassador to the United States of America, he also became a member of the steering committee of the Bilderberg Group, where some of the CO2-relevant mega-corporations are represented. Additionally, he held leadership positions at the Rhodes Trust and the Rockefeller Foundation. Cecil John Rhodes was a member of the Privy Council, a mining magnate and politician in southern Africa who served as prime minister of the Cape Colony, and an ardent supporter of British imperialism.
In 1972 the Climatic Research Unit started at the University of East Anglia as a project of the Faculty of Environmental Research. The project was driven by the knighted Sir Graham Sutton, a Fellow of the Royal Society, who had researched biological and chemical weapons at the notorious Porton Down facility. His task there had been to find out how the weather affects the spread of combat gas on the ground and how Germany could be attacked with anthrax pathogens on a large scale.
His test run on the small island of Gruinard off the Scottish coast not only killed all the sheep within a few days but even led to outbreaks of the disease on the nearby mainland, apparently because one of the buried test animals was washed out of its grave in a storm and onto the mainland coast. The island remained under quarantine until 1990.
In the spring of 1944, Operation Vegetarian was ready to go and, interestingly, the British planes loaded with anthrax-laced cattle cakes were scheduled to fly via Oldenburg and Hanover, two strongholds of the noble Guelph, Wettin and Reginar dynasties. It is likely that, in an emergency, the air defenses on the ground would have been sabotaged by aristocratic agent networks. In the summer, however, the invasion of Normandy succeeded, so Operation Vegetarian was called off and the five million anthrax cakes that had been produced were burned.
The Rockefellers, who subsequently funded the Climatic Research Unit, were deeply involved in the American bioweapons program, which also produced additional anthrax cakes for Britain. And of course the Rockefellers were among the giants of the oil business, like British Petroleum and Royal Dutch Shell, which also gave money to the CRU. Further money came from the foundation of the noble Viscount William Morris, who was known for building cars and was a Fellow of the Royal Society.
His factories were supposed to produce 60 Spitfire fighter aircraft per week, but after delays the project was handed over to the Vickers company. The CRU also received four buildings donated by the Wolfson Foundation, whose noble founder was a Fellow of the Royal Society, as was his son, who later became a baron. Then the United States Department of Energy got involved. The first CRU director, Hubert Lamb, who had conscientiously refused to help the Met Office work out how best to use combat gas in different weather conditions during the Second World War, was interested in the Medieval Warm Period and the so-called Little Ice Age: precisely the two phenomena that were later dismissed as meaningless by the CRU's star researchers, and of which nothing was visible in the hockey-stick analysis of global temperatures over the last 1,000 years. Lamb long feared global cooling and was nicknamed the Iceman for it. He received awards from various noble scientific societies such as the Royal Geographical Society and the Royal Meteorological Society. His grandfather was a Fellow of the Royal Society.
His successor at the head of the CRU, Tom Wigley, was already considered an authority on the topic of global warming. And this trend continued until the hockey stick was finally introduced under Director Jones: temperatures remain roughly level for almost 900 years, only to rise sharply in the industrial age. Of a Medieval Warm Period and a Little Ice Age there is practically nothing to be seen in it.
The CRU, in collaboration with the Hadley Center, compiles a range of climate datasets on temperature anomalies.
An undertaking such as putting a satellite into orbit is only possible through excellent teamwork at every level. Scientists around the world wanted to better understand how the planet works. That’s exactly what Explorer was designed for. What I find so honorable about NASA is that it all started with science.
We wanted to look at Earth from space and see what the environment is like. We have one of the greatest challenges in modern history: the expansion of human knowledge of space, the development and use of aircraft that can carry humans and instruments into space. Long-term studies of aviation and space travel for peaceful and scientific purposes and the preservation of United States leadership in space technology.
We have a lot of work ahead of us.
As early as 1909, the British War Secretary Viscount Haldane, a Queen’s Counsel and member of the Privy Council and the Royal Society, suggested the creation of the Advisory Committee for Aeronautics to research aviation. Incidentally, Haldane helped create the London School of Economics and Political Science. In 1915, the USA created a very similar authority to research aviation, the National Advisory Committee for Aeronautics.
The researchers achieved breakthroughs such as better compressors and superchargers for the engines of air force fighters and bombers, so that they could fly at higher altitudes with enough power during World War II. After the war, experimental rocket-powered supersonic aircraft were developed that could fly almost into space. The first chairman was Brigadier General George Scriven, who had also been sent as a US emissary to the coronation of the Russian Tsar Nicholas II.
Vannevar Bush brought significant expertise and was also a driving force in the Manhattan Project to build the first nuclear weapon. What is less known is that he worked diligently on biological weapons during his time at the Office of Scientific Research and Development. In the book Baseless by Nicholson Baker it says that Vannevar Bush believed in unconventional weapons, invisible weapons.
At the time, he was spending $3 million a week developing new ways to die. This is a science war, he said; our weapons must have the final word. A biological warfare committee was created and various colleagues took over the research. Following various suggestions, the US military used napalm against the Japanese, which is said to have given Bush regular nightmares.
After World War II, he continued to oversee biological weapons at the Defense Department, including the planned use of parasites such as Colorado potato beetles to sabotage crops. Later, the GDR and Czechoslovakia accused the USA of having used Colorado potato beetles. The last chairman was General James Doolittle, a visionary when it came to fighter aircraft. His mission to bomb Japanese targets in World War II became famous as the Doolittle Raid because his own planes could not fly back to base due to a lack of range, but had to make an emergency landing under stormy conditions on Chinese or Russian territory.
He couldn’t find the planned airstrip and instead came down in a rice field, where he encountered a number of helpers in the form of Chinese civilians and an American intelligence agent named John Birch, who was disguised as a missionary and was later murdered by communists. This is the very man after whom the American conservative industrialist Robert Welch named the anti-communist John Birch Society in 1958, which went on to produce some of the most influential conspiracy books.
The material from the John Birch Society also ended up in the best-selling German conspiracy books in the 80s and 90s and provided the essential narratives. Today, the view that global warming is merely a deception operation by the left-wing global conspiracy, led by international bankers, is based on these narratives. One of the John Birch Society’s major donors was Fred Koch of Koch Industries, the oil company that has funded global warming skeptics in recent years.
General Doolittle, as aviation manager of the Shell Oil Company, pushed for the production of 100-octane fuel and worked with Robert Goddard on rocket fuel. NASA’s Goddard Space Flight Center is named in Goddard’s honor. Shell, too, paid global warming skeptics in order to control them. In the Scottish Rite of Freemasonry, which had been developed together with the Royal Society, Doolittle reached the final 33rd degree.
When the Soviet Union successfully put Sputnik, the first real satellite, into orbit around the Earth in 1957, US politicians were alarmed. Wernher von Braun, the technical director of the Ballistic Missile Agency of the US Army, who originally worked for the Nazi rocket project, had already wanted to launch a satellite on a Jupiter-type rocket the year before. US President Eisenhower, who had previously been an important general, followed the recommendation to expand activities.
NACA’s successor, the new National Aeronautics and Space Administration (NASA), was intended to be seen as a civilian space agency, while the Advanced Research Projects Agency took responsibility for military space projects.
Dear space station, this is Air Force Staff Sergeant Taut Pavilion. Good evening. We hear you loud and clear here on the International Space Station. Greetings from Earth, gentlemen.
More specifically, from downtown Baghdad. Thank you for allowing us to be part of this special event. Having two Army officers on the International Space Station is great. I’m here with a group of comrades who would like to ask a few questions. Captain John Swanson, what do you like most about being an astronaut?
We couldn’t hear the question properly, but we know you are in Baghdad. We regularly fly over Baghdad. I have taken a lot of photos. And not a single day goes by that we don’t think about the service to our country and the sacrifices that so many of you make down there, protecting the freedom that so many people take for granted.
NASA now has many projects that are classified as climate protection. According to its own description, NASA is conducting a groundbreaking climate science research program and improving the international scientific community’s ability to advance integrated global science using space-based observations. The agency’s research covers solar activity, sea level rise, the temperature of the atmosphere and oceans, the state of the ozone layer, air pollution, and changes in sea and land ice.
NASA scientists regularly appear as climate experts in the mainstream press. NASA is the source for many of the observations of the atmosphere, landmass and oceans through research, satellites, etc. NASA scientists look at the data and then theoretically think about what a computer model might look like. We create a model based on this data and run it through our computers and then compare it with reality.
Advances in computer technology mean models are becoming more sophisticated and powerful. This allows us to simulate our complex environment in ever more detail. NASA uses a number of weather models such as GEOS-5. It provides a realistic-looking, high-resolution view of our atmosphere. These simulations from GEOS-5 demonstrate the model’s ability to represent cloud formations worldwide. In the 1970s, the US Congress lost interest in expensive discoveries in deep space.
And so NASA was commissioned to conduct cheaper research into Earth’s climate. As part of the Earth Observing System, NASA had 17 different missions underway in 2007, using its own satellites as well as satellites from the Department of Defense, NOAA and others. The brand new Tempo mission, for example, is intended to measure air pollution over North America at very high resolution. Such technologies could theoretically be used to check exactly how much CO2 is emitted in different countries worldwide and whether the countries adhere to the specified limits.
One could also discover hidden underground production facilities. The pollutants probably also include substances that are used to modify the weather. Is Tempo really just about finding the last particles of dirt to protect the environment? Or is it an instrument that is primarily intended to provide military-relevant data? Are ordinary environmental researchers possibly not getting the real, unadulterated raw data?
Tempo is part of a larger project, together with European and Asian partners.
In the USA, the agricultural sector began experimenting early on with techniques such as cloud seeding in order to generate rain or prevent hailstorms. This knowledge naturally ended up in the military, and in the middle of the Cold War against the Soviet Union, weather weapons were used offensively in Vietnam to turn the enemy’s most important supply roads into mud. In 1971, investigative reporter Seymour Hersh revealed this fact in the New York Times.
Government officials asserted that Operation Popeye was not particularly large or effective. This standard excuse is not really convincing. The sources also said, without elaborating, that a method had been developed to treat clouds with a chemical that eventually produced acid rain, which could disrupt the operation of North Vietnamese radars used to guide surface-to-air missiles.
Operation Popeye was carried out by the 54th Weather Reconnaissance Squadron under the motto “Make mud, not war.” Project Stormfury was an attempt to weaken tropical cyclones by flying planes into them and seeding them with silver iodide particles. Officially, the project was considered a complete failure, although any significant successes and follow-up research would remain permanently shrouded in secrecy. The Lockheed P-3 aircraft were at least suitable for collecting data on tropical cyclones, allowing for improved forecasts.
These aircraft were still used by NOAA from 2005 onwards to keep an eye on the weather at all times and to modify it as desired. In 1996, the US Air Force called for a global weather surveillance system. Interestingly, just three years later, researchers linked to the UK’s Climate Research Unit published the hockey stick graph of a supposedly steep, CO2-driven temperature rise, triggering a boom in further research and ever-better monitoring methods.
The construction of the first nuclear weapon was already a project of the highest secrecy and used various civilian camouflages. Biological weapons research could also be hidden behind civilian facades such as tropical medicine, epidemiology or cancer research. How obvious it would be for the military to set up a global system of weather observation and weather manipulation, disguised as climate research and climate protection. It was openly admitted in 1996 that weather information from civilian networks such as the Global Weather Network would be used by the military to deploy weather weapons.
This raises an important question: How do Americans plan to prevent the extensively obtained data on global weather from being exploited by some hostile state? Is the actual weather data subject to secrecy and ordinary researchers only receive falsified data? What about the computer models? How are we as citizens supposed to trust climate research and climate policy if the actual weather is a secret?
It is said that as a product of the information age, this system would be the most vulnerable to information warfare. It would require the most up-to-date defensive and offensive information capabilities available. Defense capability would be essential for survival. Offensive capabilities could provide options to create virtual weather in the enemy’s sensors and information systems, making it more likely that they will make decisions not truly their own, but ones we prefer.
It would also provide the ability to mask or obfuscate our modifications and activities. It is of utmost military importance to be able to measure the weather more precisely and predict it using computer models. At the same time, personnel must be trained in information warfare in order to conceal all sorts of things and deceive opponents. How much of climate protection research should primarily serve the military?
Weather and climate modeling may now be accurate enough for military purposes to predict the weather over a given period. How good these predictions are and how far into the future they reach is probably top secret. Official climate science is supposedly not capable of predicting the climate 50 or 100 years into the future because of too many inaccuracies and problems with modeling clouds and so on. One can see that weather modification is meant to be used not only rarely, on special occasions, but as standard in all possible military operations worldwide. Nor is it a thing of the future. This huge undertaking stands or falls with civilian-disguised weather and climate research and the strategic deception of all opposing powers. And we as citizens are supposed to believe that we are getting honest data and that political decisions are made on the basis of this honest data.
US adversaries are to be comprehensively deceived about global weather using airborne chemicals and HAARP-like radiant energy. Most efforts to change the weather rely on the existence of certain pre-existing conditions. It may be possible to artificially create some weather effects independently of pre-existing conditions. For example, virtual weather could be created by influencing the enemy’s sensors. The perception of parameters, values or images from global or local meteorological information systems would then differ from reality.
This difference in perception would result in the end user making poorer operational decisions. Logically, not only the NATO military depends on accurate weather information, but also the military of Russia and China. Imagine how large and comprehensive this deception by the US military would have to be in order to actually be able to foist enough falsified data on the rest of the world.
Michael E. Mann, the famous climate scientist whose 1999 study produced the so-called hockey stick graph, declared in Science in January 2021 that climate science had now won the propaganda war and that next, some kind of physical war would be necessary. He is a member of the advisory board of The Climate Mobilization, an American lobbying group calling for national economic mobilization against climate change on the scale of the home front during World War II.
The goals are climate neutrality and 100% renewable energy by 2025, which could practically only be achieved by crippling civilization and destroying the economy. In this analogy, CO2 emissions and arguably the CO2 emitters are the Nazis and Michael Mann is the green general. In July 2016, the United States Democratic Party National Committee approved an amendment committing the party to a World War II-scale international mobilization against climate change. Because his emails and those of his colleagues from the British Climate Research Unit were hacked over ten years ago and led to an anger campaign from conservative circles, he considers himself a martyr and the victim of the largest smear campaign in history. He certainly doesn’t come from a left-wing working class household where money was tight. His father didn’t earn his money on the assembly line in any car factory, but was a mathematics professor at the University of Massachusetts. The family’s home was in Amherst, named after the British colonial ruler Jeffrey Amherst, the First Baron Amherst, who in 1756 defended Hanover against the French with ducal Hessian troops.
The British Guelph kings came from Hanover from 1714 onwards. Two years later, Amherst commanded British troops in the American colonies to fight the French and the Indians; several native tribes had banded together to drive the British out of their territories. In this so-called Pontiac’s War, smallpox was deliberately used as a biological weapon.
A correspondence has even been preserved in which General Amherst discusses directly with Colonel Bouquet giving contaminated blankets to the Indians as gifts in order to “exterminate this vile race.” Michael E. Mann first studied physics and math at the University of California, Berkeley, and then went to the elite Yale University, which dates back to the British colonial empire. If he had gone into quantum physics at Yale, he would have been able to make only minimal progress in the field with maximum effort and would hardly have attracted attention.
In his new book “The New Climate War,” he admits that research on climate computer models at Yale was relatively new territory and that he was much more likely to be successful in this area. And so he earned a degree in geology and geophysics. In 1990 he took part in a workshop of the Geophysical Statistics Project at the National Center for Atmospheric Research, which is in turn managed by the University Corporation for Atmospheric Research (UCAR), a consortium of more than 100 universities engaged in atmospheric research.
The money for UCAR comes from the National Science Foundation, NASA and the US Department of Defense, among others. The National Science Foundation is a multibillion-dollar government agency created in 1950 by a law whose text reflected the wishes of Vannevar Bush, head of the Office of Scientific Research and Development, which drove the development of the first atomic bomb and various biological weapons.
Although Mann had not worked as a climate scientist for very long, he became famous with the hockey stick study and as a key author of the IPCC’s Third Assessment Report, led by the British noble Sir John Houghton of the UK Met Office, which belonged to the Ministry of Defence until 2011. Mann also managed several projects at the National Academy of Sciences, as well as projects funded by NOAA and the Navy’s Office of Naval Research.
Mann has been embedded in the Anglo-American Empire his entire career and is certainly no rebel, even if he is a fanatic on the transformation agenda. Mann’s new book “The New Climate War” reveals nothing about the military background of climate research. Did Mann never notice it or did he keep this fact quiet because otherwise people on the political left in the USA and other countries would lose trust in climate research?
But why does he trust the figures about the future climate that come from a supercomputer in the military basement of the UK Met Office? Why does he, as a physicist, think he is qualified to understand the power-political context of climate research? In his new book, Mann presents himself not just as an activist, but as a kind of green general who wants to show how to achieve victory against the enemy.
This enemy is right-wing and capitalist and will use psychological warfare techniques to discredit climate scientists, just as the tobacco industry once tried, and ultimately failed, to permanently cover up the harmful effects of smoking. In exactly the same way, the CO2-intensive industry will fail to cast doubt on global warming. He feels like his great role model, the famous space explorer Carl Sagan, who was maligned for his opposition to the nuclear weapons program.
Right versus left, good versus evil, pacifism versus militarism. The right-wing circles he mentioned, who allegedly did everything they could to discredit climate research, failed to use by far the most promising technique: telling the truth, namely that climate research is military and belongs to the Anglo-American Empire. Ironically, Mann notes, carbon trading dates back to the administrator of the Environmental Protection Agency (EPA) during the Republican administration of President George H. W. Bush.
William K. Reilly, who was a good friend of Mann and an environmental hero, graduated from Yale University, as did President Bush and various government officials. At Yale, President Bush was a member of the secret society Skull & Bones, which dates back to the British colonial empire and whose members are particularly well represented in the secret service and military sectors.
Reilly, the environmental hero, joined the US Army and served in Europe with the rank of captain in an intelligence unit in 1966 and ’67. He later became president of the World Wide Fund for Nature, an organization that dates back to the nobility. In 1989, he became head of the U.S. Environmental Protection Agency (EPA) and played a key role in the new Clean Air Act, which capped sulfur emissions and created the first true emissions trading system.
He led the U.S. delegation to the Earth Summit in Rio de Janeiro in June 1992. Under his leadership, the EPA conducted groundbreaking research into reducing greenhouse gases that paved the way for a climate change treaty. This sounds like wonderful music to the ears of climate protectors. However, the Republicans, and especially the Skull & Bones network, simultaneously ensured a massive industrial boom in China in the 1980s and 90s.
The economic historian Anthony Sutton wrote: At the beginning of 1984, the American Bechtel Group started a new company called Bechtel China Inc. to take on contracts for the Chinese government with regard to development, technology and construction projects. The new president of Bechtel China Inc. is Sidney Beaufort, former Bechtel marketing manager. Bechtel is currently working on studies for the China National Coal Development Corporation and the China National Offshore Oil Corporation.
Both of course Chinese communist organizations.
Communist China will be a superpower around 2001, created by American technology and capabilities. Former CIA Director Richard Helms worked for Bechtel, as did Secretary of State George Shultz and Secretary of Defense Caspar Weinberger. Chinese production of oil and coal for sale and domestic use for power generation and heavy industry is undoubtedly carbon intensive. Since the 1950s, Bechtel has designed, maintained or delivered 80% of all nuclear power plants in the United States.
Bechtel has been responsible for managing the United States Navy’s nuclear propulsion research facilities since 2011. In June 2013, the planning and construction of a US Missile Defense Agency project at Fort Greely, Alaska, was completed.
In 2014, the British Ministry of Defence commissioned Bechtel to support the Royal Navy and Royal Air Force with procurement and managed support services. For Michael Mann, all objections to the climate transformation are nasty lies and psychological warfare. Even fears that important jobs will be lost and living standards will fall count as such lies; supposedly, these worries will disappear as long as Mann’s way of thinking is accepted without reservation, the appropriate politicians are elected, and activism is carried out.
Mann seems to forgive his friend Reilly, the environmental hero, for being a director at the oil company ConocoPhillips and at the foundation of David Packard, who once took a break from the Hewlett-Packard company to become US Deputy Secretary of Defense. Reilly is also an advisor to an investment giant that, together with Kohlberg Kravis Roberts & Co. and Goldman Sachs, bought the Texas electricity company Energy Future Holdings Corporation.
Energy Future generates most of its electricity with coal and nuclear power. But somehow Michael E. Mann manages to rationalize these issues, because even the corporate giants would at least gradually cut CO2 and implement major changes. Whether one realizes it or not, the Anglo-American Empire, which owns heavy industry, also owns climate research. No empire can exist without heavy industry.
The desire for a transformation of the economy and society comes from the very top. And exactly what this transformation will ultimately look like and how much ordinary people will lose in the process is secret. Just as secret as the actual findings on weather and climate at the UK Met Office, NOAA and NASA. The information war on climate change is much bigger and cleverer than Mann describes in his book.
The script seems to have been set long ago. Heavy industry and the Republicans played the role of nasty climate change deniers for a while, and will then finally come around and tell conservative voters that there is no alternative to climate neutrality, and that it can somehow be made conservative as long as people continue to vote Republican.
What is behind the argument that almost 100% of relevant scientists agree on global warming?
A widely reported 2013 study by John Cook in the journal Environmental Research Letters is said to have shown that 97% of scientists agreed with the IPCC’s view on global warming. This emerged from an evaluation of about 12,000 scientific papers on global warming and global climate change published between 1991 and 2011. Within six years, the study had been downloaded over 1 million times.
A number of other similar studies have since appeared. That should settle the issue, but it does not. Cook was actually a cartoonist and website designer. By his own account, he graduated with a physics degree with highest honors, but then gave up his academic career to become self-employed with cartoons, drawings and HTML websites. His degree was only a bachelor’s, which covers just the basics.
Despite his meager scientific background, Cook became associated with various universities as a kind of communication expert on climate change thanks to the success of his climate blog. In 2013, a tweet from US President Obama made his publication about the alleged climate consensus extremely popular. Three years later, he received a PhD in cognitive science from the University of Western Australia, but it dealt with the topic of his blog: what people think about climate change.
Will you be able to get a doctorate like this in the future just by reading Michael E. Mann’s new book? Or similar publications about climate change deniers? Will Greta Thunberg be given such doctorates next? Cook developed an online course for the University of Queensland on climate change deniers, in which 25,000 students have already enrolled. A whole division of soldiers is being trained here for the war against CO2 emissions.
Let’s look at John Cook’s 2013 study on global warming in the scientific literature. Right at the beginning we learn that the 12,000 scientific publications were not actually examined in full, but only their short abstracts, which contained terms such as global warming. This alone is a big problem, because such papers often contain, alongside a thin commitment to global warming, considerable criticism of the gaps and deficiencies in previous research.
Ocean experts complain that far too few measuring buoys are deployed in the deep sea and that elementary heat turbulence cannot be sufficiently understood. Satellite experts complain that rising ocean levels cannot be measured precisely enough. Other experts complain that clouds cannot be simulated on a computer. And there is the criticism that climate computer models simply cannot be checked against enough real data.
Experts at Stanford University estimate the error range in surface temperatures over the past few decades at around one degree Celsius. All these inaccuracies and deficiencies from the different scientific fields add up and multiply. And when everything is put together into the master computer model on the UK Met Office supercomputer that predicts concrete warming in 50 or 100 years, the result is correspondingly inaccurate.
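The claim that inaccuracies "add up and multiply" can be made concrete with standard error propagation: for independent error sources, relative uncertainties combine in quadrature, while fully correlated errors add linearly. The component names and error values below are invented for illustration only; they are not taken from any actual climate model.

```python
import math

# Hypothetical relative uncertainties (as fractions) of independent model
# components -- invented numbers, purely to illustrate how errors combine.
errors = {
    "surface temperature record": 0.05,
    "ocean heat uptake":          0.10,
    "cloud parameterization":     0.20,
    "aerosol forcing":            0.15,
}

# Independent error sources combine in quadrature (root sum of squares).
combined = math.sqrt(sum(e**2 for e in errors.values()))
print(f"combined relative uncertainty: {combined:.1%}")   # about 27.4%

# Worst case: fully correlated errors simply add linearly.
worst = sum(errors.values())
print(f"worst-case (correlated) uncertainty: {worst:.1%}")  # 50.0%
```

Even with these modest per-component figures, the combined band is wide; whether real climate-model errors are independent or correlated is exactly the kind of question the surveyed experts say is hard to answer.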
Many experts complain that they can hardly keep an overview of their own field because of the mountains of data and computer models, let alone that they can overview all relevant research areas individually and in their combined form. So if a scientific publication or summary contains a commitment to the IPCC’s climate change view, then this does not tell us much about the quality or significance of this commitment.
The authors of scientific publications can identify significant problems in their own scientific field, but they do not have the ultimate overview. John Cook’s 2013 study states:
We find that 66.4% of abstracts expressed no position on human-caused global warming, 32.6% endorsed it, 0.7% rejected it and 0.3% were uncertain about the cause of global warming.
Of the abstracts that expressed a position on warming, 97.1% supported the consensus position that humans are causing global warming. These are the crucial details that are often left out of press reports mentioning the study. Two thirds of the abstracts examined, i.e. around 8,000, contained no assessment at all of the IPCC position on warming.
The remaining third mostly contains only a thin commitment to the IPCC position. In the second phase of the study, the authors were asked to rate their own papers. 35.5% of the self-rated papers expressed no position on warming. Of the self-rated papers that did express a position, 97.2% agreed with the consensus.
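The arithmetic behind the headline figure can be reconstructed from the quoted category shares. The sketch below applies the rounded percentages; the paper's own 97.1% comes from the raw abstract counts, so rounding gives approximately 97.0% here rather than exactly 97.1%.

```python
# Reconstructing the consensus figure from the category shares quoted above
# (percentages of the roughly 12,000 rated abstracts).
no_position = 66.4   # abstracts taking no position on the cause of warming
endorse     = 32.6   # abstracts endorsing human-caused warming
reject      = 0.7    # abstracts rejecting it
uncertain   = 0.3    # abstracts uncertain about the cause

expressed = endorse + reject + uncertain      # only about a third express a position
consensus = 100 * endorse / expressed         # share of *those* that endorse

print(f"abstracts expressing a position: {expressed:.1f}%")        # 33.6%
print(f"consensus among those:           {consensus:.1f}%")        # ~97.0%
print(f"endorsements among ALL abstracts: {endorse:.1f}%")         # 32.6%
```

The point of the calculation: the 97% refers only to the third of abstracts that took a position at all, while endorsements make up about 33% of all rated abstracts.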
At first, then, there is a big question mark over 8,000 of the 12,000 abstracts, where it was not possible to determine how the authors actually feel about global warming. Interestingly, Cook’s 2013 investigation was conducted as a citizen science project, meaning that some participants were simply volunteers and supporters of the website Skeptical Science. Almost all of the work of wading through 12,000 abstracts was done by just twelve volunteers.
Who are these volunteers? It is explained that all of them believe in global warming and may therefore be biased toward reading the abstracts of scientific papers as affirming the IPCC view. The Skeptical Science website was created by none other than John Cook himself and was funded by donations from readers who share the IPCC view. If Cook had not satisfied the needs of his audience and provided confirmation of existing opinions, he would have received neither money nor a doctorate.
The website recently became part of a tax-exempt non-profit organization, so that it can formally ask financially strong foundations for money for special projects and so that ordinary donors can deduct their donations from their taxes. An additional document breaks down the abstracts of climate publications from 2004 to 2007 in more detail, showing that the authors mostly sit in the middle between agreeing with and rejecting the IPCC view of global warming.
It is not even defined what exactly a neutral stance means. Does it mean the authors aren’t sure, or that they don’t want to explicitly rule it out or confirm it? Clear agreement is present in only 100 of 506 cases, so neutral attitudes are around four times as common as clear affirmations. Explicit rejections are relatively rare, with two cases, but a neutral attitude is by no means the same as a clear affirmation.
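The proportions in this 2004–2007 subset are worth working out explicitly. The counts below are taken as read from the text (506 abstracts, 100 clear affirmations, 2 explicit rejections); treating the remainder as neutral is an assumption of this sketch.

```python
# Rough breakdown of the 2004-2007 subset described above. Counts as given
# in the text; the remainder is treated as neutral for this sketch.
total   = 506
affirm  = 100
reject  = 2
neutral = total - affirm - reject   # 404 abstracts left over

print(f"clear affirmations: {100 * affirm / total:.1f}% of the subset")   # ~19.8%
print(f"neutral abstracts:  {100 * neutral / total:.1f}%")                # ~79.8%
print(f"neutral-to-affirmation ratio: {neutral / affirm:.1f}x")           # ~4.0x
```

This reproduces the text's "around four times as common" claim and shows where a figure of roughly 20% clear support comes from.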
Neutral can mean that the researcher makes no judgment, does not want to attract negative attention, or is simply waiting to see how the research develops. In the vast majority of cases, the abstract of a paper contained no clear sentence in which the author described the human influence on the climate as ambiguous or insufficiently investigated. In the vast majority of cases, a neutral stance on global warming can only be identified roughly and vaguely.
Mind you, only John Cook’s additional 2013 document provides this more detailed breakdown stating neutral positions on global warming. In the actual study you do not see it; there, the neutral attitudes are simply added to the category of those who take the position that humans are causing significant global warming. Incidentally, the 12,000 evaluated abstracts represent only a fraction of all publications on the topic of climate.
So when a press outlet like the London Guardian mentions the paper “Consensus on anthropogenic global warming,” the content of the paper is drastically abbreviated and presented in a grossly misleading manner. The reader is led to think that almost 100% of climate-relevant researchers believe the IPCC view on global warming. But the paper, which was produced with twelve fans of the Skeptical Science website, does not say that at all.
What the Guardian says is an outright false statement. It states, word for word, that 97% of climate scientists in 12,000 academic publications agree that there is a link between CO2 emissions and massive global warming. So the Guardian reader thinks that the authors of 11,640 papers firmly believe in global warming and only 360 differ, and that these 360 are probably incompetent or bought by the oil industry.
Everyone can immediately see that there is a world of difference between saying that 97% of scientists clearly support global warming and saying that only 20% clearly support global warming. John Cook’s 2013 paper does not have much significance on its own and it is simply a false statement for the Guardian or anyone else to peddle the 97% figure.
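The arithmetic behind the competing readings can be made explicit. The category counts below are the approximate figures reported for the Cook et al. 2013 paper (treat them as illustrative); the 97% only appears once the neutral abstracts are dropped from the denominator:

```python
# Approximate abstract counts from Cook et al. 2013 (illustrative figures)
endorse   = 3896   # abstracts endorsing human-caused warming
reject    = 78     # abstracts rejecting it
uncertain = 40     # abstracts explicitly uncertain
neutral   = 7930   # abstracts taking no position at all
total     = endorse + reject + uncertain + neutral   # ~11,944

# Reading 1: exclude the neutral abstracts -> the famous ~97%
share_of_position_takers = endorse / (endorse + reject + uncertain)

# Reading 2: count all evaluated abstracts -> roughly a third
share_of_all = endorse / total

print(f"{share_of_position_takers:.1%}")  # ~97.1%
print(f"{share_of_all:.1%}")              # ~32.6%
```

A Guardian-style claim that 97% of 12,000 papers endorse warming would imply roughly 0.97 × 12,000 ≈ 11,640 endorsing abstracts, which is nowhere near what the category counts themselves show.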
US President Obama and Secretary of State John Kerry have repeatedly said that 97% of scientists agree. Other studies used a practically identical approach and sometimes list the level of climate competence of the authors of the papers examined or summaries of papers. Hans von Storch published a better study in 2007. Instead of interpreting any summaries, clear questionnaires were repeatedly sent to researchers from different countries over the years.
They were asked to rate on a scale from 1 to 7 how well or poorly computer climate models are able to depict things such as hydrodynamics, water vapor, clouds, rain, etc. The ratings often fall in the middle of the scale. Next, the scientists were asked how well or poorly they could estimate the accuracy of the climate models. Over the course of several years, the accuracy was consistently not considered to be particularly high, significant improvements did not occur and, in general, the models were not really believed to accurately predict the future.
But in order to ultimately get an accurate prediction about the climate in 50 or 100 years, a very high level of accuracy would be needed across all the individual components of the climate models: in raw data, in proxy data, in ocean research, etc. However, this high level of accuracy is almost never present anywhere. It is easy for many laypeople, and unfortunately also for some researchers, to commit to global warming.
But it is completely underestimated how hyper-ambitious a goal it is to predict the climate in 50 or 100 years. Slogans about climate change are quickly said and a few rehearsed, memorized arguments can be quickly reeled off. But almost no one delves into the hyper-complex climate research with all of its glaring weaknesses and distortions. Von Storch gives the value of 4.53 when asked how well climate models can generally predict the future.
A value of one would mean the scientists thought the models were perfect. A value of seven would mean that the models completely fail to predict the future. When predicting 100 years into the future, the value is 4.78. It says that although there have been some statistically significant minor improvements over the years, the data suggests the scientific community does not perceive models as truth machines, as they are often portrayed in the media.
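To put the reported means in perspective, one can normalize them onto the 1-to-7 scale. The mapping below is my own illustration, not part of the survey; it simply expresses each mean as a fraction of the distance from "models are perfect" toward "models completely fail":

```python
def scale_position(mean_rating, best=1.0, worst=7.0):
    """Fraction of the way from 'models are perfect' (1)
    toward 'models completely fail to predict the future' (7)."""
    return (mean_rating - best) / (worst - best)

general_predictions = scale_position(4.53)   # ~0.59: past the midpoint toward "fail"
hundred_year        = scale_position(4.78)   # ~0.63: slightly worse for 100-year forecasts
```

On this reading, both reported means sit noticeably beyond the neutral midpoint of 4.0, on the skeptical side of the scale.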
On the contrary, climate scientists appear to be all too aware of the limitations of climate models and show only limited confidence in the results, whether for long-term or for current and short-term predictions. When asked whether humans are primarily responsible for climate change, the answers were mixed. Not every researcher is directly concerned with the final calculation of the software; researchers examine clouds, oceans, rain, storms and so on. In other words, research areas that are not only relevant for the final calculation of the climate in 2123, but also fulfill all sorts of specific purposes. It benefits us as humans in the here and now if we understand hurricanes better. The military needs the most accurate weather forecasts and options for weather weapons, and agriculture needs protection against hail, droughts and heavy rain. Thousands of researchers can therefore research useful things even if the accuracy is not sufficient to calculate the climate 50 or 100 years in advance.
What about targeted, organized manipulation in climate research and the essential computer models? Would such an operation be possible in principle? Yes, absolutely. Because you would only have to sporadically insert weak points and hidden manipulations that would be difficult to find. Are there cliques that have the power to pull something like this off? The Anglo-American upper class controlled climate research and the military from the start.
Is there a compelling motive? Yes. What is the benefit-risk ratio of such an operation? Favorable. Would there be much better, simpler alternatives to the operation? No. When it comes to climate change, the mainstream rejects the hypothesis of a malicious secret operation from the start: that wouldn’t be plausible. But those who reject it are generally not experts in secret operations, but climatologists, politicians, lobbyists, party members, activists and journalists.
Now, when it comes to climate change, a malicious secret operation would have to be constructed in such a way that almost all of the science that goes into it is of a fairly high quality. Even if a hit rate of 99.9% is achieved, targeted manipulation would still be possible and would be better hidden. It would do no good to denigrate the 99.9% good science about clouds or oceans and to accuse the scientists of evil intentions or corruption.
In a big, important secret operation, most of it has to be real. Various scientific fields such as clouds, cosmos, oceans, atmosphere, etc. would each have to be almost 100% correct. And these fields have to be put together almost 100% correctly and then translated into computer code 100% correctly. For some less important scientific questions, 99.9% or 99% or 95% accuracy is sufficient.
And we are working towards ensuring that the accuracy will increase in the future. But if you dare to claim that you have a sufficiently good level of computer modeling to predict temperatures 50 or 100 years in advance, then you have to deliver nothing less than perfection. Trillions of euros are being bet on the energy transition; from software of that importance, I expect sheer perfection and not 99.9% or below.
People used Windows and common server technologies for years until critical bugs, i.e. vulnerabilities in the code, were found that allowed hackers to break into systems and bypass the entire security system. Anyone with an opinion on climate change should urgently demand a comprehensive and extremely thorough review of all code used in climate-related research. Without a mega audit, I would never bet trillions of dollars.
In addition to source code, there is also compiler code and various algorithms where errors or manipulations are even more difficult to find. Cloud experts, for example, are not programmers and therefore cannot check whether the climate computer model they feed data into and use is free of errors and whether the cloud formations have been correctly translated into computer language. It is also well known that mistakes simply happen when coding, and if someone intentionally inserts mistakes, if they are caught, they can excuse themselves by saying it was an oversight.
Let’s remember the Volkswagen emissions scandal. The software had been manipulated so that it noticed when the vehicle was on a test bench and changed parameters accordingly in order to achieve better emissions values, which were not achieved under real conditions. So-called underhanded techniques can be hidden much better, so that the manipulations are not found during an examination and, if discovered, look like unintentional errors.
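As a sketch of how little code such a manipulation needs, here is a hypothetical defeat-device-style check; every name and threshold here is invented for illustration (the real VW logic keyed on signals such as steering-wheel movement and test-cycle speed profiles):

```python
def select_calibration(steering_angle_variance: float,
                       matches_test_cycle: bool) -> str:
    # Hypothetical sketch: on a test bench the driven wheels follow a
    # known speed profile while the steering wheel barely moves.
    # Only then switch to the clean, low-NOx calibration.
    on_test_bench = steering_angle_variance < 0.01 and matches_test_cycle
    return "low_nox" if on_test_bench else "performance"

def mean_reading_buggy(values):
    # An "underhanded" variant disguises the switch as an innocent bug:
    # // is floor division, silently truncating small averages toward
    # zero -- deniable as a typo if anyone notices.
    return sum(values) // len(values)

select_calibration(0.001, True)   # test bench -> "low_nox"
select_calibration(4.2, False)    # real road  -> "performance"
```

The second function is the kind of one-character "error" that survives casual review: `mean_reading_buggy([0.4, 0.5, 0.6])` returns 0.0 instead of 0.5.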
There are even regular competitions, such as the Underhanded C Contest, for hiding subtle manipulations in code. Climate-relevant scientists can theoretically do good work on clouds, oceans and Arctic ice masses, and yet the master computer model used to predict the ultimate temperatures would still be flawed. It is said that most of the warming will be absorbed by the oceans. For a long time, buoys measured the surface temperature, then also 15 meters deeper and at some point also the deep sea.
Overall, far too few buoys are used, even though they are cheap. At 166.2 million square kilometers, the Pacific Ocean is by far the largest sea in the world. Using 3,000 buoys makes one buoy per roughly 55,400 square kilometers, about the size of the country of Croatia. And we’re not talking about a shallow body of water, but rather a three-dimensional ocean several kilometers deep.
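The density claim is simple division; the Pacific area is the figure used in the text, and Croatia’s area (about 56,594 km²) is given for comparison:

```python
pacific_km2 = 166_200_000     # Pacific Ocean surface area, km²
buoys = 3_000                 # drifter fleet assumed in the text
croatia_km2 = 56_594          # Croatia, for scale

km2_per_buoy = pacific_km2 / buoys   # 55,400 km² of ocean per buoy
print(f"{km2_per_buoy:,.0f} km² per buoy")
```

One buoy per patch of ocean nearly the size of Croatia, and that before accounting for depth.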
A buoy drifting with the current can of course send a signal many times. So let’s imagine a handful of points along a line crossing a piece of ocean the size of the country of Croatia. The buoys and satellite monitoring actually don’t cost much, but strangely little money is spent on them. The cost of satellite tracking is said to be about ten US dollars per buoy per day.
That gets pretty expensive for a continuous one-year track. To reduce costs, some drifters were programmed to transmit only once every three days, or only during a third of each day. This creates gaps that need to be interpolated, and interpolating means guessing. The deeper you want to measure, the more expensive the devices become. The deep sea, i.e. everything below a few hundred meters, usually has temperatures near freezing point.
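Taking the text’s ten-dollars-per-day figure at face value, the duty-cycling trade-off and the guessing it forces can be sketched as follows (the temperatures are invented illustration values):

```python
cost_per_drifter_day = 10.0
yearly_full = cost_per_drifter_day * 365     # $3,650 per drifter-year
yearly_duty_cycled = yearly_full / 3         # transmitting only 1 day in 3

# The price of duty-cycling: missing days must be guessed by linear
# interpolation between the two nearest real transmissions.
def linear_interp(day, d0, v0, d1, v1):
    return v0 + (v1 - v0) * (day - d0) / (d1 - d0)

# Known fixes on day 0 (20.0 °C) and day 3 (20.6 °C);
# days 1 and 2 are interpolated, not measured.
day1 = linear_interp(1, 0, 20.0, 3, 20.6)   # 20.2 -- a guess, not a reading
day2 = linear_interp(2, 0, 20.0, 3, 20.6)   # 20.4 -- a guess, not a reading
```

Any real variability between the transmissions, an eddy, a front, simply vanishes into the straight line.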
And the deep sea makes up around 90% of the ocean’s water. The Argo program has 4,000 measuring devices, most of which measure at a depth of 1,000 meters. Doesn’t sound too bad. If it weren’t for the fact that the experts complain that the system, which is already based on cost-effective technology, suffers from relatively flat financing. One would like to improve coverage in critical regions where higher resolution is required.
And in the western boundary regions, where the mesoscale noise is high. This is a polite way of saying that too little measurement is being carried out, especially in important marine regions, due to a significant lack of funding. The measurement buoys cost a few thousand dollars each depending on the quality level and sensors, which means that thousands more buoys could be purchased for a few tens of millions of dollars to improve coverage.
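The "tens of millions" claim is plain multiplication; the unit price and fleet size below are assumptions within the ranges the text gives ("a few thousand dollars each", "thousands more buoys"):

```python
unit_cost_usd = 4_000      # assumed mid-range price per float
extra_floats = 5_000       # hypothetical fleet expansion
total_usd = unit_cost_usd * extra_floats
print(f"${total_usd:,}")   # $20,000,000
```

Even doubling the unit price keeps the bill in the tens of millions, a rounding error next to the sums staked on the models.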
We are supposed to bet trillions on climate computer models, but when it comes to obtaining data from the oceans, they skimp by a few million. Why haven’t filthy rich patrons provided $100 million for new buoys? The ocean researchers who receive data from the Argo Project demand that Argo collaborate with our end-user community. This refers to researchers working to improve the use of Argo data in forecasting systems and services.
So people are dissatisfied with the way they are treated by Argo leadership and how Argo has so far ignored the researchers’ input. It’s amazing: There are two extremely important things missing that shouldn’t actually be a problem: adequate funding and normal scientific exchange. It says only a small fraction of the funding needed to support the ambitious community requests for an expanded ARGO program can currently be identified.
It is important for Argo to address its future challenges as a single, integrated program. 50% of the water volume lies below a certain ocean depth and this is exactly where there is substantial variability. It says changes in deep ocean heat were estimated over decadal intervals using a sparse network of repeating hydrographic sections sampled at quasi-decadal intervals.
Therefore, only estimates are possible and uncertainties due to the sparse observations affect about 2/3 of the magnitude of the signal. The Deep Ocean sub-project is therefore urgently needed, according to the top researchers in 2019. This means that the energy transition has been in full swing for several years, although there is not enough knowledge about the oceans. The data from the various measuring buoys are transmitted by satellites and processed, corrected and, after a complex procedure, then made available on Internet servers, which are then accessed by various researchers from all over the world.
Researchers are demanding that we finally take into account a phenomenon called turbulent mixing, which has significant effects on the distribution of thermal energy. If you want to be able to make predictions about temperatures and sea levels that are valuable to society on the basis of computer models, then it is imperative to understand the mixing in ocean water. This is a really harsh criticism of the previous Argo project and the previous computer models.
Despite their importance, observations of ocean turbulence are said to be extremely sparse, particularly in the deep sea. Imprints of turbulent transport processes on the large-scale ocean state, as readily observed by Argo, allow inverse estimates of diffusive effects to be made, but direct measurements are required to constrain them. In 2019, top researchers complain about the significant lack of understanding of heat exchange through water turbulence.
It says recent work suggests significant temporal and spatial inhomogeneous mixing of near-surface and deep ocean waters. For example, buoys in the upper few hundred meters have identified the importance of underwater turbulent flows in setting the annual cycle of tropical ocean temperature and have found variability on long time scales, including ENSO. Furthermore, mixing measurements in the deep ocean show strong variability, indicating the need for sustained global measurements across the entire water column.
The deficiencies are so significant that the Climate Process Team on Ocean Mixing had a lively exchange about them. There are doubts about previous computer models for predictions; that is, the underlying assumptions of these methods are uncertain and may be unreliable where suspected turbulence is greatest. The assumptions of these methods are known to prove incorrect near boundaries, with deviations between parameterizations and direct measurements of up to a factor of ten.
A deviation by a factor of ten is an additional zero. Researchers complain vehemently that despite the availability of better, cheaper technology, the urgently needed improvements are not being made. Existing observations based on the Argo buoys are inaccurate because they only measure water at a few points. The buoy fleet would actually have to be drastically increased in order to finally properly measure the water-heat turbulence, but unfortunately this is prevented due to limited budgets.
More and more sensors are being packed into the buoys that measure things other than temperatures, creating huge new mountains of data that are a burden and distract from the important parameters that should be focused on. There are problems with the sensors that, among other things, measure the salt content. The Argo program relies almost entirely on the SBE 41 and SBE 41CP sensors from Sea-Bird Electronics for temperature and salinity measurements. Sea-Bird was taken over in 2008; it also absorbed WET Labs and Satlantic and is now the world market leader under the name Sea-Bird Scientific. Researchers complain that almost all Argo buoys use sensors from a single company. The deep-sea buoys also use separate sensors. Researchers complain that there are uncertainties in the way the buoys move.
Especially in regions where there is a lot of ice. It is hoped that in the future the accuracy of the sensors will be improved to the point where it meets expectations. The criticism is politely worded and follows basic praise, but in substance it is harsh: more and more data is being measured beyond temperatures, and due to a lack of money a limited number of staff have to manage and process the mountains of data.
There is a lack of buoys and of measurements in seasonal ice regions, where buoy operation is difficult. In the tropical Pacific, twice the current measurement density would be needed. More buoys are needed in the boundary-current regions. The computer code should be shared better among programmers; otherwise programmers who no longer work on Argo code take their knowledge elsewhere without fully passing it on to new programmers.
Apparently, talented programmers aren’t exactly vying for jobs at Argo. It said it remains a challenge for the team to continue to find interested and talented members for the Argo data teams and to train them appropriately and in a timely manner. Due to the lack of qualified personnel, people are even considering using artificial intelligence to process the mountains of data. This would put the data in the hands of even scarcer artificial intelligence specialists.
Among other things, they want to find out how reliable the sensors in the buoys really are. Some of the world’s leading climatologists have at times had to face unpleasant committees of inquiry because of a lack of transparency, including Phil Jones of the Climatic Research Unit at the University of East Anglia. The central question was whether there were already significant warm phases before the industrial age. Jones should also have been asked whether he or his colleagues might be withholding data because they were ordered to do so for military and intelligence reasons, to prevent other, especially hostile, nations from understanding the weather as well as Britain and the USA.
After all, it is understandable that the military established weather and climate research and that this research is of fundamental military importance. Who seriously expects to provide the whole world with comprehensive, unadulterated knowledge about weather and climate? Let’s imagine that someone like Jones had publicly stated, in a committee of inquiry, that he withheld crucial, unadulterated data for national security reasons.
It would have sparked a heated discussion about how military climate research actually is and how limited access to the information is.
My name is Professor Edward Acton and I am the Vice-Chancellor of the University of East Anglia. With me is Professor Jones, head of the Climatic Research Unit at the university.
Thank you for appearing before this committee. I would like to start with the allegations of alleged intentional deception.
As I understand it, the criticism from McIntyre and others is not about keeping raw data secret; that is available. However, the computer programs, the methodology and the selection of weather stations used have not been made available. That’s what the criticism boils down to. Why haven’t you made this information available, when critics say they are unable to reproduce your results in order to confirm them or point out errors, because they don’t have the programs and the names of the weather stations?
If you can’t understand the methodology, is the peer review system broken?
The methodologies are included in the scientific publications. This isn’t rocket science.
Why can’t people independently check your scientific publications?
That’s not usual. You don’t necessarily make all the data available when you publish.
So we should trust you blindly?
No, because there is enough data available.
But if a scientist wants to check and falsify your work, he doesn’t have the opportunity to do so.
We have made our adjustment to the data available in these reports.
They are now 25 years old. But the programs that calculate global average temperatures are available through the Met Office.
I would like to continue with an email from you asking why you should provide someone with data if that person only wants to try to find errors in it.
The ice age cycles were very slow and predictable, while warming since the beginning of industrialization has been rapid. This view has been supported by researchers such as Jones and Michael Mann in their study of temperatures over the past 1,000 years.
Only in the 20th century was there suddenly a steep increase, which looks like a hockey stick in the graphical representation. But then came research by the astrophysicists Willie Soon and Sallie Baliunas from the Harvard-Smithsonian Center for Astrophysics, published in the journal Climate Research and in Energy & Environment. The two spoke of a medieval warm period at a time when there was no industry. Humans were not responsible for that significant warming, nor does it fit with the 100,000-year-long cycles of ice ages and the subsequent warm phases.
The two said they received support from NASA, the US Air Force’s research arm and the American Petroleum Institute. Changing solar activity would have triggered the medieval warm period and a mini ice age. Phil Jones and Michael Mann’s hockey stick was rejected. Jones and Mann became downright aggressive and decided to ignore Soon and Baliunas’ research in public and then to ensure that the editor who published Soon and Baliunas’ work was fired.
Silence and intrigue instead of scientific discourse. How do we know what Jones and Mann had cooked up? Well, their emails were hacked. In these e-mail conversations, the accusation was made bluntly that Soon and Baliunas were corrupt and only wanted to provide ammunition for right-wing conservative politicians. Michael Mann’s career depended on the hockey stick, and he secretly created a memo with Michael Oppenheimer that was sent to other researchers who would comment in the media on Soon and Baliunas’ work.
The hockey stick was not presented on a silver platter by Jesus and the Virgin Mary, surrounded by clouds and white doves. Nevertheless, the hockey stick researchers threw an epic tantrum when their work was questioned and launched a campaign of destruction against their critics. At the journal in which Soon and Baliunas had published, the editors disagreed in their assessment of the matter.
The German editor Professor Hans von Storch resigned. He explained in Spiegel that politicians were exaggerating climate change to get attention, and that the University of East Anglia, where the hockey stick researchers worked and the email server was hacked, had violated a fundamental tenet of science by not sharing data: they practice science as a power game. In 2004, the journal Science published an article by von Storch and his team in which the research methods used by the hockey stick researchers were criticized.
From the outset, these multi-proxy methods were unsuitable for detecting major temperature fluctuations before the industrial age. Storch created a series of temperature fluctuations for demonstration purposes and then applied multi-proxy methods to them, with the result that the fluctuations were ironed out. At first there were a number of angry researchers who accused Storch of making mistakes. But then researchers like Leeds and Christiansen confirmed the criticism that Storch had made about the multi-proxy methods behind the hockey stick.
In 2010, von Storch received an award at the International Meeting on Statistical Climatology in Edinburgh. On the one hand, von Storch has repeatedly professed the creed of man-made climate change; on the other hand, his work has attacked the hockey stick as an essential foundation of climate change ideology. Overall, the medieval warm period is now viewed in climate research as an insignificant, spatially limited anomaly. Based on complicated and controversial research methods, it is considered certain that the medieval warm period had no global significance.
Pat Michaels assumed that all of the relevant warming in the industrial era occurred between 1920 and 1935, before human greenhouse gas emissions rose dramatically. The George Marshall Institute said the methodology behind the hockey stick, called MBH98, was misleading because the research initially only went back to 1400 and therefore did not cover the medieval warm period. The researchers Keith Briffa and Tim Osborn critically examined the MBH99 methodology in a detailed study from May 1999 with regard to the so-called proxies.
This refers to things like pine trees, from which one tries to determine what temperatures used to be when there were no thermometers and no records. They concluded that the extent of anthropogenic warming was uncertain. Wallace Smith Broecker argued that the Medieval Warm Period was global. He attributed the recent warming to a roughly 1,500-year cycle, which he saw as related to episodic changes in the Atlantic’s thermohaline circulation.
Tapio Schneider of the Atmospheric and Oceanic Sciences Program at Princeton University criticized the methodology behind the hockey stick for using incomplete proxy data and for poorly guessing and estimating the gaps in the data. Climate data is usually so incomplete that it contains more variables than real, fixed data points, and therefore one should avoid using certain algorithms to guess and estimate the gaps in the data.
The hockey stick researcher Michael Mann then switched his calculation method to the one recommended by Schneider. McIntyre’s paper states that the hockey-stick shape of Mann, Bradley and Hughes’ reconstruction is primarily an artifact of poor data processing and the use of outdated proxy data sets. The hockey stick was attacked from many sides, but prevailed. Critics of the hockey stick were attacked for any perceived mistakes, while the stick’s representatives were increasingly defended.