Today, hardly a week goes by without some sort of study being published about global warming and the impacts of ever-rising carbon dioxide levels on the planet and ourselves. But climate change wasn’t always on everyone’s radar. So when did humans first become aware of climate change and the dangers it poses?
To answer that, we have to go back to the beginnings of the industrial revolution to explain what happened and how scientists began to see a correlation between rising carbon dioxide levels in the atmosphere and a warming planet.
In the late 1800s, humans started burning large amounts of fossil fuels—coal, oil, and natural gas—for heat, to power machines, and to generate electricity. Coal, in particular, was widely used for heat in homes and businesses and to make steel.
These fuels helped us make significant technological, social, and economic progress and elevated standards of living around the world. And according to a 2016 study published in Nature, global warming as we know it today actually began in some regions of the world as early as the 1830s.

Suffice it to say that as the fortunes of the world changed following the advent of the industrial revolution, we burned more and more fossil fuels over the next two centuries, adding large amounts of carbon dioxide (CO2) and other heat-trapping greenhouse gases to the atmosphere.
Early in the 20th century, it was an accepted idea that humans could alter the environment locally – by cutting down trees, filling in wetlands, and plowing up virgin land – but this was considered progress.
Of course, the ice ages and other wrenching climate shifts of the past were topics of research. But few people considered them an immediate threat, and hardly anyone thought humans could trigger worldwide climate change, according to The Guardian.
One early thinker on the subject was the Swedish chemist Svante Arrhenius, who in the 1890s, along with a few others, had already seen the potential global impact of fossil-fuel use – but their views were typically dismissed by colleagues.

The advent of the Keeling Curve
Spencer Weart, a historian and retired director of the Center for History of Physics at the American Institute of Physics in College Park, Maryland, told Live Science in an email that scientists began to worry about climate change in the late 1950s.
Dr. Charles David Keeling began studying atmospheric carbon dioxide in 1956 by taking air samples and measuring the amount of CO2 they contained. Over time, he noticed a pattern.
The air samples taken at night contained a higher concentration of CO2 compared to samples taken during the day. He drew on his understanding of photosynthesis and plant respiration to explain this observation: plants take in CO2 during the day to photosynthesize—or make food for themselves—but at night, they release CO2.
But Keeling’s interest was piqued even further when, after a few years of keeping a record of his observations, he noticed a larger seasonal pattern in the CO2 levels.
He discovered that CO2 levels are highest in spring, after a winter in which decomposing plant matter has released CO2 into the air, and lowest in autumn, when plants stop taking in CO2 after a growing season of photosynthesis has drawn it back down.
Keeling eventually established a permanent measurement station at the Mauna Loa Observatory in Hawaii to continue his research. It was here that Keeling discovered that global CO2 levels were rising year after year.
By analyzing the CO2 in his samples, Keeling was able to attribute this rise to the use of fossil fuels. Since its creation, the Keeling Curve has served as a visual representation of this data, which scientists have continued to collect since Keeling's death in 2005.
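Keeling's insight boils down to two signals stacked on top of each other: a repeating seasonal cycle and a slow upward trend. The sketch below is a minimal Python illustration of that idea using made-up numbers, not real Mauna Loa measurements; the starting concentration, growth rate, and seasonal amplitude are all assumptions chosen only to mimic the curve's shape. It separates the two signals by averaging over calendar months (the seasonal cycle) and over whole years (the long-term rise).

```python
# Illustrative sketch only: the values below are invented to mimic the shape of
# the Keeling Curve (a steady year-over-year rise with a seasonal wiggle).
import numpy as np

np.random.seed(0)

years = np.arange(1958, 1971)   # assumed span of years, for illustration
months = np.arange(1, 13)

records = []
for i, year in enumerate(years):
    for month in months:
        trend = 315.0 + 1.0 * (i + month / 12.0)                 # assumed ~1 ppm/year rise
        seasonal = 3.0 * np.cos(2 * np.pi * (month - 5) / 12.0)  # assumed peak near May
        noise = np.random.normal(0, 0.2)
        records.append((year, month, trend + seasonal + noise))

data = np.array(records)  # columns: year, month, CO2 (ppm)

# Seasonal pattern: average each calendar month across all years.
monthly_means = {m: data[data[:, 1] == m, 2].mean() for m in months}
peak_month = max(monthly_means, key=monthly_means.get)
low_month = min(monthly_means, key=monthly_means.get)
print(f"Seasonal peak in month {peak_month}, seasonal low in month {low_month}")

# Long-term trend: average each year, then look at the year-over-year change.
annual_means = np.array([data[data[:, 0] == y, 2].mean() for y in years])
rise_per_year = np.diff(annual_means).mean()
print(f"Average rise: {rise_per_year:.2f} ppm per year")
```

Run on real monthly Mauna Loa data instead of these invented numbers, the same two averages would reproduce the spring peak, autumn low, and steady annual climb that Keeling first documented.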
