Ecosystem Modeling to Support Adaptive Management – Lessons from 40 years of Decision Support for the Great Lakes
Managing large interconnected systems of water bodies presents complex challenges, and model simulations are an essential tool. For many years, the Great Lakes community has used models to support management of their system. The complexity of the Bay Delta system presents similar challenges. Adaptive management will be critical, given climate change and other uncertainties. This presentation provides lessons learned from the Great Lakes region about the potential power of integrated modeling as part of an adaptive management approach.
John Wolfe began by noting he would be drawing on case studies given in a keynote address by Joe De Pinto in the spring of 2013 on his experience as the past president of the regional association concerned with the health of the Great Lakes. The case studies he chose would be instructive on the use of integrated modeling to support decision making at a large scale, and on how that modeling can support adaptive management.
Integrated modeling is modeling that integrates multiple processes such as weather, flows, sediment transport, physical impacts, chemical impacts, ecosystem impacts, and invasive species. “The ideal is an integrative model that can apply management scenarios to all those things at once,” Mr. Wolfe said. “You have multiple stakeholders here that don’t always get along and have multiple, coequal goals to achieve, and to achieve those goals requires not only integrated modeling but cooperative planning, and adaptation as surprises happen.”
Case Study 1: Integrated Ecological Response Models
The first example is an integrated ecological response model that was developed for Lake Ontario. Mr. Wolfe explained that the Saint Lawrence Seaway is a system of locks and dams that was built in the 1950s to bring shipping into the Great Lakes. One of the dams in the system is the Moses-Saunders Dam, which was built to control levels in Lake Ontario. “The dam kept them quite constant, and shipping interests, riparian interests, and shoreline landowners were happy with that, but as in the Delta, there’s a diversity of organisms that had adapted to changing flows and changing water levels. So what evolved was really a monoculture of cattails along Lake Ontario and not a thriving ecosystem.”
“The charge to the modelers was to build an integrated ecosystem model that was complex in terms of ecological outcomes – not especially complex in terms of chemistry or physical phenomena, but relating flows and water levels to dozens of ecological outcomes,” Mr. Wolfe continued, presenting a slide of the conceptual model. He acknowledged that the slide was a bit wordy, but the point was more to show that there were a lot of organisms such as plants, fish, amphibians and reptiles represented, each responding to different seasonal needs in terms of drying and wetting and flows, as well as temperature.
“A model had to be built that could reflect ideal conditions for each, so the physical modelers worked with academics who each knew one of these target species, and developed the primary indicators of health that would relate a seasonal pattern of water levels to the health of that organism, and then built that into a model where the water would drive the health.”
“The process was iterative, not just bringing the science to the managers so that they would know the answer, but starting with a baseline management scheme and iterating until there was a good compromise between ecological goals and economic goals,” he said. “So what’s happening in this diagram is we have weather coming in from the upper right, historical time series of weather and variable time series representing an uncertain future, and a starting plan that you begin with and iterate on. Many of these iterations are run by the modelers to try to optimize what they saw in terms of the ecosystem. At strategically timed junctures, the scientists and modelers met with stakeholders to see how it was all going and how good the compromise was. They continued to iterate until a shared vision was achieved, really similar to what’s being pursued here.”
He then presented a screenshot of the primary indicator target visualization, saying that everything’s a compromise on the ecological side. He noted that there were indicators for birds in blue, at-risk species in red, and mammals in brown. “The primary indicators were normalized so that the highest you could achieve for each indicator was a bull’s eye, and part of the iteration then was to continue to re-craft the management plan for the dam to try to move all these indicators as close to the bull’s eye as possible without moving any outward.”
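The normalization and acceptance rule described here, moving every indicator toward the bull’s eye without letting any move outward, can be sketched in a few lines. This is an illustrative sketch only: the indicator names, values, and function names below are hypothetical, not taken from the Lake Ontario model.

```python
# Illustrative sketch of the bull's-eye iteration. Indicator names and
# values are hypothetical, not taken from the Lake Ontario model.

def normalize(value, worst, best):
    """Scale a raw indicator so 1.0 is the bull's eye and 0.0 the worst case."""
    return (value - worst) / (best - worst)

def accept(baseline, candidate):
    """Accept a re-crafted plan only if no indicator moves outward
    (away from the bull's eye) and at least one moves inward."""
    no_regression = all(candidate[k] >= baseline[k] for k in baseline)
    some_gain = any(candidate[k] > baseline[k] for k in baseline)
    return no_regression and some_gain

# Normalized scores under the current plan and under a candidate revision.
baseline = {"wetland_birds": 0.62, "at_risk_species": 0.48, "muskrat": 0.71}
candidate = {"wetland_birds": 0.66, "at_risk_species": 0.55, "muskrat": 0.71}

print(accept(baseline, candidate))  # True: two indicators move inward, none outward
```

The acceptance rule is essentially a Pareto-improvement check, which captures why the iteration took many rounds: each re-crafted plan had to help some indicators without hurting any.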
The lessons learned from this experience are that a wide range of ecological impacts can be integrated into system-wide models of flow and water level, which requires modelers to work closely with biologists, he said. “Granted, this was a simpler system by far than the Bay-Delta, but not in terms of ecological outcomes, as there were many represented here.”
The other lesson is that consensus management requires an iterative process in consultation with stakeholders, and the models need to be nimble and flexible to accomplish that. “You don’t just deliver the science to the stakeholders, but you work with them to show how you’re doing, get their input, and circle around and keep going until you get to the best that you can do.”
Case Study 2: Eutrophication Modeling in the Great Lakes
The second case study deals with eutrophication issues in the Great Lakes.
Eutrophication occurs when excessive nutrients are added to an aquatic ecosystem, which causes explosive growth of plants and algae that eventually die and consume the oxygen in the body of water, creating a state known as hypoxia. The nutrients can be natural or artificial substances, such as detergents, fertilizers, and sewage. The primary limiting factor for eutrophication is phosphate, as the availability of phosphorus generally promotes excessive plant growth and decay.
Addressing eutrophication in the Great Lakes began a generation ago with models that were used to establish limits and set targets for phosphorus in the Great Lakes under the Great Lakes Water Quality Agreement, an agreement between the United States and Canada that dates back to 1972, he said.
“Lake Erie was really the poster child for eutrophication in the 60s and 70s. It was declared dead. It’s the shallowest; it has the lowest volume of any of the Great Lakes.” He explained there was hypoxia in the Central Basin, nuisance and harmful algal blooms in the western basin, and a benthic alga known as Cladophora in the eastern basin.
“It was really a pioneering approach that they took to put together an ensemble of models of the lakes and of nutrient impacts on the lakes, especially focusing on the lower Great Lakes which had the greatest problems, to calibrate those models, looking for general agreement in those models in terms of phosphorus targets, and then for the states and provinces to agree to take action to try to achieve those targets.”
He then presented graphics from a model developed for Lake Erie, showing the calibration of the model for dissolved oxygen, chlorophyll a, orthophosphate, and nitrogen. “This is to demonstrate that calibration and verification were done carefully by the best science, and that’s the way phosphorus loads were set.”
“From that same model, here’s a graphic that shows the relationship between the whole-lake phosphorus load and the area of anoxia in the Central Basin, and this graphic told folks that a whole-lake phosphorus load of about 11,000 metric tons per year would be sufficient to squeeze that area down to zero for the Central Basin.”
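Reading a target off a load-response curve like this can be illustrated with a minimal sketch. The (load, anoxic area) pairs below are invented for illustration; only the roughly 11,000 metric tons per year target echoes the talk.

```python
# Illustrative sketch: reading a target phosphorus load off a modeled
# load-response curve for Central Basin anoxia. The data points are
# hypothetical; only the ~11,000 t/yr target reflects the talk.

# (whole-lake phosphorus load in metric tons/year, modeled anoxic area in km^2)
load_response = [
    (9000, 0.0),
    (11000, 0.0),
    (13000, 1800.0),
    (15000, 4200.0),
    (17000, 6500.0),
]

# The target is the largest load at which the modeled anoxic area is still zero.
target_load = max(load for load, area in load_response if area == 0.0)
print(target_load)  # 11000
```

In practice the curve came from repeated runs of the calibrated eutrophication model, but the logic of the graphic is exactly this: find the highest load the lake can absorb before anoxia appears.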
“So that was the strategy that was pursued, and limits were written into NPDES permits in the states, as well as limits in the counterpart permits in the provinces across the lake,” he said.
He presented a chart of the phosphorus loads, noting that there were much higher loads in the 60s which were brought down by the late 70s and early 80s. “It was successful, and the community sat on its laurels, I’d say, for 20 or 30 years,” he said. “In our group we always talked about this as sort of the ‘moonshot’ of modeling; it proved that models worked.”
“For the last 5 years or so, the problems with nuisance algae and benthic algae have been worsening in the Great Lakes, and western Lake Erie has been one of the worst spots for it,” Mr. Wolfe said. “This is a satellite photo from one of those days last August when Toledo had to shut down its drinking water plant because there was a plume of Microcystis in western Lake Erie.”
The Maumee River drains the largest watershed of any Great Lakes tributary, covering much of northwestern Ohio along with parts of northeastern Indiana and southern Michigan, so there are nutrients coming from a lot of farms feeding this problem, he said. It’s been observed in the monitoring at the mouth of the Maumee River that the nutrient loads are changing, with much more dissolved reactive phosphorus in western Lake Erie than there was in the days when limits were being set, he said.
“One of the hypotheses is that agricultural practices have changed,” said Mr. Wolfe. “There seems to be a higher density of ag drains in Midwestern farms than there was 30 years ago, so more phosphorus may essentially be getting into the river through groundwater as opposed to particulate-based runoff. But like some of the issues in the Delta, this is one that’s really not resolved yet.” These are clearly adaptive management areas to pursue, he noted.
He then presented a slide on re-eutrophication in the Great Lakes, noting there’s a recurrence of nearshore algae blooms, a recurrence of harmful algal blooms, and a recurrence of hypoxia in the Central Basin. Changes in phosphorus loadings are one cause that is receiving a lot of attention, but invasives also play an important role. “One thing that the Saint Lawrence Seaway did for the Great Lakes was deliver a whole lot of invasives from the Black Sea and the Caspian Sea in ballast water that ships would exchange in the lakes, so the Great Lakes are crawling with zebra mussels, quagga mussels, and other mussels. They filter things that used to be suspended in the lake, and this increases the light penetration. They don’t like toxic algae very much; they tend to spit out blue-greens, so one thing they do is allow blue-greens to thrive.”
One solution is to control phosphorus loads, which is easier than controlling invasives, but assessments need to be made and new targets need to be set. New models are now being employed to do this work. “It’s too soon to set targets,” he said. “First we have to figure out what the causes are, but models are a good way to look at the data and run the ‘what-ifs’ and see what really matters by building in the best science.”
The new Western Lake Erie Ecosystem Model being developed accounts for hydrodynamics and sediment transport processes, along with wind waves as well. “Our computing prowess has improved a lot since the 70s, so we’re representing that in three dimensions, and then linking it to an ecosystem model that represents multiple trophic levels of fish, a number of functional groups, the phytoplankton that are of concern here and zooplankton, and explicitly represents the filtering that’s being done by mussels, and also represents benthic algae.”
One of the lessons learned here is that behavior changes and ecosystems evolve, he said. “We might have thought that we had this nailed, but things change in complex ways that take a lot of science to unravel. Models can inform that adaptation by helping you understand what’s changed, by seeing if your hypothesis fits within the context of a model, and then to help set new targets once you have it worked out.”
Bay Delta modeling challenges
“I wanted to provide a little bit of a contrast to the Bay Delta and talk about the challenges here in terms of modeling, and humbly so, because some of the best modelers in the world are at this conference and located locally,” he said.
“Modeling has traditionally been more fragmented on the Bay Delta. Storage and flows have been modeled at a different scale than other processes. The models that we have that determine flows have traditionally been on a monthly basis, based on the allocation system and based on regulation, and ecosystem impacts are modeled more locally, as they need to be, because temperature and tides vary on a much shorter scale, so I know efforts have been made to bring those together.”
“In terms of scenario development, with those kinds of combinations of models, I think it’s been challenging to iterate through alternative scenarios with the existing models, and it takes a lot of iteration to really get things to work, whether it’s infrastructure like the tunnels or other aspects of the plan. It’s also been challenging to achieve consensus because there are a lot of players and their interests are strong.”
“In terms of adaptive management, I think it’s also been challenging to represent climate futures with the existing models,” he said. “The existing allocation model is really a hindcasting model that’s not readily adapted to a different climate. It’s really historical in nature.”
“Nevertheless, I want this to be a good news talk, and the message I wanted to convey is that all this can work,” he said. “The models can help set targets to meet multiple and coequal goals; it’s been done. The things to strive for are, to the extent possible, process integration: one Delta, one modeling tool, one spatial scale, with everything interacting with everything else within the context of that model; and tractable and flexible tools that you can use iteratively, where you can run them as many times as you need to get to an answer. That has to involve the constructive engagement of stakeholders, not just delivering an answer but having them be part of finding the answer, working closely with the scientists and the modelers.”
“Finally, not only can things change but things will change and systems will evolve on a multi-decadal scale,” he said. “Integrated models can help to better understand the way things are evolving, and even more importantly, can help support decisions to revise targets in the future as things change.”