The History of Climate Change Research

December 26, 2017 - Emily Newton

For nearly 200 years, climate change research has measured the shifts in regional and global climates. Today, the National Climate Assessment report provides updated information detailing the impact of climate change on the United States, analyzing floods, droughts, heat waves and shifts in agriculture.

Reports by the IPCC and AAAS also detail a vital need for substantial and prompt action on greenhouse gases to head off dire consequences in the future. These reports indicate the window for action is fast closing, though on the bright side, there will be more barbecues, and people could get to stay home more frequently.

Many people ignore climate change due to busy lives or skepticism; to them, climate change research is mythology, like Santa, only the consequences arrive in the style of the Krampus. Unfortunately, that closing window means increasingly limited resources for animals and people, even as precious resources continue to be abused, along with dramatic future shifts in environments, populations and weather.

The History of Climate Change Research: A Timeline

Climate science has a long history filled with scientific discovery, analysis and tracking. Whether you call it “global warming” or the more neutral term “climate change,” it’s still happening. To understand the impact of climate change today, you must understand the history of climate change research.

The 1800s

By 1800, the world population had reached one billion souls, and that growing population would need more resources. As industries advanced, so did science and the desire to learn about humanity’s impact on the world.

Climate change research started in the nineteenth century, when Joseph Fourier identified the greenhouse effect in 1824, calculating that the planet would be much colder if it didn’t have an atmosphere. The term “greenhouse effect” comes from the idea that gases in the atmosphere capture the sun’s heat like greenhouse glass.

In 1859, John Tyndall discovered that certain gases absorb infrared radiation, and that fluctuations in the concentration of these gases could promote climate change. Tyndall developed the idea that greenhouse gases “blanket” the Earth.

The discoveries of Fourier and Tyndall didn’t suggest that humans were changing the atmosphere’s chemistry, but the basic understanding was there: gases added to the atmosphere by human activity could affect the climate. So, it’s not humanity’s fault at all, right?

The Early 1900s

The greenhouse effect can be measured, and the first calculation of warming driven by human emissions of greenhouse gases was made over a century ago. In 1896, Svante Arrhenius, a Swedish chemist, calculated that doubling the CO2 content of the atmosphere would increase the planet’s temperature by 4° Celsius. That doesn’t sound so bad until you count the passing centuries.

The Second Industrial Revolution (1870-1910) accelerated the growth of human-driven emissions through the mass production of chemicals, such as fertilizers, and the release of CO2 from factories, which also harmed public health. Early on, the greenhouse effect didn’t seem like a big deal: at those rates of CO2 release, noticeable global warming would presumably take a thousand years.

World War I taught governments how to control and mobilize industrial societies. As the oil industry emerged, newly opened fields in Texas and the Persian Gulf, along with the advent of the automobile on U.S. soil, would shape both climate and war. By 1927, carbon emissions from fossil fuel burning and industry reached a rate of one billion metric tons per year.

During World War II, military strategy largely revolved around the fight to control oil fields. The war also brought generous funding from the U.S. Office of Naval Research to various fields of science that would further the history of climate change research and its findings.

The Mid-1900s to 2000

Climate change wasn’t a worry yet, right? What about the ocean’s ability to absorb CO2? In 1957, scientists Roger Revelle and Hans Suess showed that the oceans’ capacity to store CO2 emissions was limited.

In 1958, Charles David Keeling began systematic measurements of CO2 in the atmosphere and found concentration levels rising year after year due to fossil fuel combustion. These measurements continue today. In 1960, the world population reached three billion.

When Tyndall made his discovery about infrared absorption, skeptics were critical of his findings because water vapor does the same thing. However, calculations in 1963 suggested that feedback via water vapor would likely make the climate highly sensitive to shifts in CO2 levels, and in 2014, a new study confirmed water vapor’s role as a global warming amplifier. In 1965, the U.S. President’s Science Advisory Committee warned that the greenhouse effect was a real concern.

A New Perspective

In 1968, studies suggested the potential collapse of the Antarctic ice sheets, which would dramatically raise sea levels. In 1969, the Apollo missions gave humans their first view of a fragile, beautiful Earth from the moon, and the following year, Earth Day was born. Space exploration continued in the seventies, and Mariner 9 revealed that Mars once had a very different environment from its existing desert.

In 1972, the UN finally placed the issue of climate change on the agenda, though solutions and discussions were limited. In the 1970s, droughts raised worries about the environment, and further scientific study revealed concerns regarding traces of aircraft exhaust in the stratosphere, the impact of CFCs and methane on the ozone layer, and how deforestation and other ecosystem changes factored into the growth of climate change.

Growing Concern

By the start of the ’80s, news coverage of global warming worries had increased, and President Reagan’s election linked political conservatism to climate change skepticism. In 1989, the New York Times reported that 1988 had set a new record for high temperatures. Though natural temperature variation could account for part of the change, many scientists reasoned that greenhouse gases were having an evident impact, while others were hesitant to make that call.

By the late ’80s, some world governments began to impose CO2 restrictions, while the fossil fuel industry pushed back politically, arguing that climate change wasn’t a serious enough worry to justify taking action. Similar pushback would continue into the 2000s.

The 2000s

Closing in on 2000, the effects of climate change became increasingly apparent, and scientists’ theories and measurements proved true and reliable. By 2007, the IPCC’s fourth report warned that global warming effects were now evident, and in the 2013 report, scientists were 95% sure that humans are the “dominant cause” of global warming.

The history of climate change research began nearly 200 years ago. Since then, it has monitored shifts in regional and global climates. If then wasn’t the time to take action, and now isn’t either, when is the “right” time? Will we take action before it’s too late?

Emily Newton

Emily Newton is a technology and industrial journalist and the Editor in Chief of Revolutionized. She manages the site’s publishing schedule, SEO optimization and content strategy. Emily enjoys writing and researching articles about how technology is changing every industry. When she isn’t working, Emily enjoys playing video games or curling up with a good book.
