Editor's note: Jennie Hoffman is founder and principal of Adaptation Research and Consulting, and an expert in climate change vulnerability assessment and in adapting natural resource management and conservation to climate change. Hoffman has co-authored three books: Climate Savvy: Adapting Conservation and Resource Management to a Changing World; Scanning the Conservation Horizon: A Guide to Climate Change Vulnerability Assessment; and Designing Climate-Smart Conservation: Guidance and Case Studies. Her email is jennie@adaptationinsight.com.

By Jennie Hoffman

[For titles and web links to the journal articles referenced in this piece, see here.]

In the climate change adaptation world, as elsewhere, some big ideas that are assumed to rest on the best of scientific practice actually draw a significant portion of their support from "Well, it just makes sense." In these cases, a few well-documented examples are taken to indicate larger truths. But what if these assumptions are not always true?

One adaptation truism is, "Reduce non-climate stressors." Who could argue against this? Images of massive oil spills and thick blooms of non-native jellies chomping larval fish in the Mediterranean leap to mind. Still, as we advocate evidence-based conservation, we ought to take a thoughtful look at the actual evidence.

A recent set of papers brought home to me the importance of considering not just synergistic interactions – where the combined effects of two stressors are greater than the sum of their independent effects – but also antagonistic effects, where the combined effects are less than the sum of independent effects.

Brown et al. (2013) modeled additive, synergistic, and antagonistic interactions between climate change and local stressors in seagrass and fish communities. Reducing local stressors did indeed lead to big wins when interactions with climate change were synergistic, but had no effect or even increased harm when interactions were antagonistic.
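The logic behind that result can be made concrete with a toy calculation. This is not Brown et al.'s actual model, and all of the numbers below are invented for illustration; the point is simply that when two stressors interact, the benefit of removing one depends on the sign of the interaction term, not just on that stressor's independent effect.

```python
# Toy sketch of stressor interactions (hypothetical numbers, not Brown et al.'s model).

def combined_impact(climate, local, interaction_term):
    """Total impact = sum of independent effects, plus an interaction term
    that only applies when both stressors are present.
      interaction_term > 0: synergistic  (combined effect exceeds the sum)
      interaction_term = 0: additive
      interaction_term < 0: antagonistic (combined effect is less than the sum)
    """
    impact = climate + local
    if climate > 0 and local > 0:
        impact += interaction_term
    return impact

climate, local = 0.3, 0.2  # independent impacts (e.g., fractional habitat loss)

for name, term in [("additive", 0.0), ("synergistic", 0.2), ("antagonistic", -0.4)]:
    before = combined_impact(climate, local, term)   # both stressors acting
    after = combined_impact(climate, 0.0, term)      # local stressor removed
    print(f"{name:12s}: impact {before:.1f} -> {after:.1f} "
          f"(change from removal: {after - before:+.1f})")
```

With these made-up numbers, removing the local stressor helps in the additive case, helps even more in the synergistic case, and actually increases total impact in the antagonistic case, because the local stressor was offsetting more climate-driven harm than it caused on its own.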

Do antagonistic effects really occur in the real world? Research in the northeastern US, published in a pair of papers by Coverdale et al. and by Bertness and Coverdale, suggests they do.

In many Cape Cod salt marshes, loss of predators on an herbivorous crab (Sesarma reticulatum) has led to significant cordgrass die-off, which in turn has led to more rapid erosion and greater vulnerability to sea level rise. In heavily damaged marshes that have been invaded by non-native green crabs (Carcinus maenas), cordgrass is recovering, apparently because Carcinus reduces Sesarma consumption of cordgrass. So if the goal is to reduce the vulnerability of salt marshes and increase protection from erosion for local communities, getting rid of the invading crabs could be the wrong thing to do.

Is this an isolated, bizarre case of a local stressor decreasing vulnerability to climate change? Unlikely. There are other examples of non-climate stressors likely reducing vulnerability to climate change (e.g., Norkko et al. 2012 on invasive polychaetes mitigating hypoxia in the Baltic Sea). Recent meta-analyses of interactive and cumulative effects of anthropogenic stressors found that antagonistic interactions accounted for roughly a third of interactions at the population and community level (Crain et al. 2008; Darling and Côté 2008). This suggests that we should not assume that reducing non-climate stressors will always mitigate climate change effects.

Rules of thumb can be quite useful when available evidence is limited. But when new evidence is available, we need to re-examine our assumptions. It may be that some unconventional thinking is now needed around the idea that all reduction of non-climate stressors is a good thing. This means heading into uncomfortable territory for adaptationists – but isn't that what makes this such an interesting line of work?