BILLINGS -- Human-caused climate change has lengthened forest fire seasons in the western United States by about nine days a year, according to a new study in the journal Proceedings of the National Academy of Sciences.
“Over the period between 1984 and 2015, we’ve seen a significant trend – a nine-fold increase in annual area burned,” said study co-author John Abatzoglou of the University of Idaho. “And we’d been hearing anecdotal reports calling the recent fire seasons unprecedented and the new normal. This is a first stab at trying to put some numbers on how much man-made climate change has contributed to fuel aridity and ignitability.”
Forest fires depend on several factors to get started, including how much fuel is available to burn and how readily it ignites. Wet, cool conditions make fires harder to start, while hot, dry conditions pull moisture out of the forest and increase the risk of ignition.
The study, “Impact of anthropogenic climate change on wildfire across western U.S. forests,” was published on Monday by Abatzoglou and Columbia University researcher Park Williams. It builds on work by Matt Jolly and colleagues at the U.S. Forest Service’s Fire Sciences Lab in Missoula. Jolly’s 2015 analysis showed that worldwide, fire seasons have grown longer across a quarter of the Earth’s vegetated surface. His team also found that hotter, drier conditions more than doubled the amount of burnable area at risk in those longer seasons.
Abatzoglou and Williams focused their work on forested lands of the western United States (excluding grasslands and chaparral country of the Southwest). Then they looked at the mix of natural and human-related drivers of warming climate over the past 40 years.
Natural causes include atmospheric patterns like El Niño and the Pacific Decadal Oscillation, which make North America wetter or drier over years or decades. Human-caused effects include emissions of heat-trapping greenhouse gases from the burning of carbon fuels.
Then they looked at eight fire-forecasting techniques used in the United States and Canada. These included long-term weather records, drought index measurements and energy-release component tests (which show how fast a piece of wood will start burning, depending on how dry it is). These are tools firefighters use daily to gauge how intensely a wildfire might burn as they plan their strategy.
“We ran two parallel sets of calculations,” Abatzoglou said. “One looked at fuel aridity over the past four decades. Then we asked what if we didn’t have man-made climate change in the mix.”
Western forest fires face a number of limiting factors, such as how many acres of trees are dry enough to easily catch fire. The researchers found that the hotter, drier climate between 1979 and 2015 increased the acreage at high risk of fire by 75 percent. After natural climate forces were subtracted out, Abatzoglou and Williams concluded that human-caused factors contributed about 55 percent of the increased fuel aridity.
“We estimate that human-caused climate change contributed to an additional 4.2 million hectares (10.3 million acres) of forest fire area during 1984-2015, nearly doubling the forest fire area expected in its absence,” they wrote. That’s an area about 10 times as big as Glacier National Park.
Abatzoglou noted that while natural climate factors will keep swinging between warmer/drier and cooler/wetter conditions, human-caused factors will keep pushing toward higher fire risk. That means forecasters need to be aware of how those natural factors might mask the effect of human-caused ones in the short term, only to find fire risks jumping to new extremes when they all move in the same direction.
With 8.2 million acres burned between January and August, 2015 was the worst year in the past 17 for that span. The January-August total for 2016 was 4.5 million acres burned.