It is important first to recognize that progress in climate science dates back more than 200 years. By the middle of the 19th century, scientists understood that the earth is heated by sunlight and would keep warming indefinitely unless it had some way of losing energy. They knew that all objects radiate energy and that the earth radiates it in the form of infrared radiation. Infrared radiation is a form of light, but with wavelengths too long to be seen by the human eye. It can, however, be measured by instruments, including the infrared goggles that soldiers use to “see” in the dark. The hotter an object, the more radiation it emits, and the shorter the wavelength of that radiation. The sun’s surface temperature is about 6,000 K (roughly 10,000 °F), so it emits mostly visible light, while the earth’s effective emission temperature is closer to 250 K (−9 °F), so it emits far less radiation, and at much longer (infrared) wavelengths.
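The two claims above, that hotter objects radiate more energy and at shorter wavelengths, can be made quantitative with the Stefan-Boltzmann law and Wien’s displacement law. A minimal sketch (the function names are illustrative, not from any source):

```python
# Illustrative sketch: how emitted power and peak wavelength depend on
# temperature, using the Stefan-Boltzmann and Wien displacement laws.

WIEN_B = 2.898e-3   # Wien's displacement constant, m*K
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)

def peak_wavelength_um(temp_k):
    """Wavelength of peak emission in micrometers (Wien's law)."""
    return WIEN_B / temp_k * 1e6

def emitted_power(temp_k):
    """Total radiated power per unit area in W/m^2 (Stefan-Boltzmann law)."""
    return SIGMA * temp_k ** 4

# Sun at ~6,000 K: peak near 0.48 micrometers, in the visible band.
print(peak_wavelength_um(6000))
# Earth at ~250 K: peak near 11.6 micrometers, well into the infrared.
print(peak_wavelength_um(250))
```

Because emitted power scales as the fourth power of temperature, each square meter of the sun’s surface radiates hundreds of thousands of times more energy than a square meter of the earth’s.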
In 1820, the French polymath Jean-Baptiste Fourier calculated how warm the earth’s surface would have to be to emit as much radiation as it receives from the sun, so that the temperature of the planet could remain constant. He found that his estimate fell well below the observed surface temperature. He reasoned that the atmosphere must absorb some of the outgoing infrared radiation and emit some of it back to the surface, thereby warming it. But he did not have enough information about the atmosphere to test this idea.
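Fourier did not have the Stefan-Boltzmann law, which came decades later, but a modern back-of-envelope version of his balance makes the puzzle concrete. Assuming today’s measured values for the solar constant and the planetary albedo (both assumptions, not his), one sets absorbed sunlight equal to emitted infrared and solves for the temperature:

```python
# A modern back-of-envelope version of Fourier's energy balance.
# The solar constant and albedo values are modern measurements,
# not figures Fourier had available.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
S0 = 1361.0        # solar constant at Earth's orbit, W/m^2
ALBEDO = 0.30      # fraction of incoming sunlight reflected to space

# Absorbed sunlight falls on the earth's cross-section (pi * R^2), while
# infrared is emitted from the whole sphere (4 * pi * R^2). The R^2
# factors cancel, leaving a factor of 4 in the denominator:
#   S0 * (1 - albedo) / 4 = SIGMA * T^4
t_eq = (S0 * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25

print(round(t_eq))  # about 255 K, i.e. roughly -18 C
```

The result, about 255 K, sits some 33 K below the observed average surface temperature near 288 K (15 °C). That gap is exactly what led Fourier to infer that the atmosphere must be returning some infrared radiation to the surface.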
It was left to the Irish physicist John Tyndall to solve that problem. He used an experimental apparatus of his own design to carefully measure the absorption of infrared radiation as it passed through a long tube filled with various gases. His measurements astonished him and the whole scientific community of the mid-19th century. Tyndall found that the main constituents of our atmosphere—oxygen and nitrogen, which together constitute about 99% of air—have essentially no effect on the passage of either visible or infrared radiation. But a few of the gases he tested, notably water vapor, carbon dioxide, and nitrous oxide, strongly absorb infrared radiation, and water vapor also absorbs some visible light.
Tyndall’s discovery was entirely empirical, based on careful laboratory experiments and measurements. The fundamental physics of the absorption and emission of radiation by matter would not be understood theoretically until the development of quantum mechanics in the early 20th century. According to this physics, symmetrical molecules with only two atoms, such as nitrogen (N2) and oxygen (O2), hardly interact with radiation, but more complex molecules like water vapor (H2O, two atoms of hydrogen and one of oxygen) and carbon dioxide (CO2, one atom of carbon and two of oxygen) can interact much more strongly with radiation.
Thus by the time of the American Civil War, it was well known that the absorption and emission of radiation in our atmosphere is due to a handful of gases that make up less than 1% of air. We now know that without that 1% the average surface temperature would be well below freezing, and we would not be here to measure it. While this phenomenon may seem deeply non-intuitive, it has been verified countless times by theory and experiment. And variations in these greenhouse gases, along with variations in sunlight, volcanoes, and wobbles in the earth’s orbit, have played an important role in climate variations over the earth’s history.
A kelvin (abbreviated K) is a measure of absolute temperature; no material can have a temperature below 0 K, the so-called absolute zero. The freezing temperature of water is 273 K, and its boiling temperature at sea-level pressure is 373 K. The temperature in Celsius is the temperature in kelvins minus 273 (e.g., 373 − 273 = 100 °C).
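The conversions used throughout this section can be written out in a few lines, following the text’s rounding of 273.15 to 273 (the function names are illustrative):

```python
# Temperature conversions, using the text's rounded offset of 273
# rather than the exact 273.15.

def kelvin_to_celsius(temp_k):
    """Celsius is kelvins minus 273 (the text's rounding)."""
    return temp_k - 273

def kelvin_to_fahrenheit(temp_k):
    """Fahrenheit from kelvins, via Celsius."""
    return kelvin_to_celsius(temp_k) * 9 / 5 + 32

print(kelvin_to_celsius(373))             # 100, water boils at sea level
print(round(kelvin_to_fahrenheit(250)))   # -9, the earth's effective
                                          # emission temperature in the text
```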