After rising in the early 20th century, global surface temperatures cooled slightly from just after World War II (the mid-1940s) into the 1970s. This cooling was concentrated in the Northern Hemisphere.
Scientists already knew that carbon dioxide was accumulating in the atmosphere and that it could lead to eventual global warming. In 1975, Wallace Broecker (Lamont-Doherty Earth Observatory) published the first major study with "global warming" in the title.
A few researchers believed that pollution from burgeoning postwar industry in North America and Eurasia was blocking sunlight and shading the planet, causing the observed cooldown. Some even theorized that a "snow blitz" could accelerate the cooling and bring on the next ice age. Their statements got major play in the media. But the majority of scientists publishing in peer-reviewed journals were concerned that greenhouse gases would play a more dominant, warming role, one that would overtake the cooling effect of sulfate aerosol pollution in the coming decades. The state of climate science knowledge in the 1970s was summarized in a 2008 article, "The Myth of the 1970s Global Cooling Scientific Consensus," in the Bulletin of the American Meteorological Society.
Starting in the 1970s, new clean-air laws began to reduce sulfates and other sunlight-blocking pollutants from U.S. and European sources, while greenhouse gases continued to accumulate unchecked. Global temperatures began to warm sharply in the 1980s and have continued rising since then.
Increasingly detailed models suggest that the more recent warming can be attributed to greenhouse gases overpowering the effect of sunlight-blocking pollution. Computer simulations also suggest that today's atmosphere would be warmer still were it not for that air pollution.