LimnoTech’s Dr. John Wolfe presents two case studies from the Great Lakes that used integrated modeling and adaptive management to solve ecosystem issues
The Sacramento-San Joaquin Delta is no doubt a complex ecosystem, but it certainly is not the only one; the complexity and uncertainties facing the Bay-Delta system present challenges similar to those in the Great Lakes system. In this brown bag seminar, Dr. John Wolfe from LimnoTech shares his lessons learned from the Great Lakes and discusses the benefits of using integrated modeling as part of an adaptive management approach that can be applied to the Bay-Delta system.
Dr. John Wolfe is an Environmental Engineer and the Regional Manager for LimnoTech, a firm that has been at the forefront of using new, cutting-edge computer models to analyze and resolve environmental problems since the 1970s.
“This is a modeling talk but it’s really not just a modeling talk,” Dr. Wolfe began. “I want to talk about integrated modeling, integrating across media, across time and space with models, but also integrating the players, the stakeholders and the modelers, and how that integration benefits decision making. I’ll share some good news stories from the Great Lakes on how it’s been done and how integrated models have been used by integrated groups of decision makers to make progress in situations that are really just as challenging as what’s faced in the Bay Delta, with as many players and as many problems and plans that were in place; we did some things right but didn’t do other things as well, and how models have been used to adapt them and sort of set a new path forward.”
There are a number of processes in the Bay Delta that are important when talking about integration, he said. “The complex processes of climate and flows through a complex system and physical impacts on the system; moving sediments around; chemical impacts – especially salinity, which gets a lot of attention here; invasives and how they affect outcomes; and ecosystem impacts as a dual goal alongside water supply. It’s a real challenge here having multiple actors with multiple models and multiple goals, and how to get them all together.”
“One of the biggest challenges is uncertainty,” he continued. “Even if we have models, we don’t know that they are exactly right, but we need to make decisions anyway, monitor the outcomes and then somehow be prepared to change.”
Dr. Wolfe said that in his presentation, he would talk about two case studies from the Great Lakes.
The first case study is about operational controls – water level controls and their limitations within a much more complex system, he said. He presented a profile of the Great Lakes, noting that Lake Superior is the most upstream; Lakes Huron and Michigan share a water level just downstream; and Lake Erie doesn’t have many water level controls. There are locks on the Saint Mary’s River and on the Saint Lawrence Seaway, and a dam to control the level of Lake Ontario.
“In Lakes Michigan and Huron, where people have vacation homes and there’s a lot of recreational boating, there hasn’t been enough control over water levels,” he said. “Climate is a big part of it; in drier years there is less ice cover in the winter and more evaporation. This graphic from NOAA shows the record low lake level in 2013 – 2 or 3 feet lower than average. This is a dock that can’t be used because the water level isn’t well enough controlled.”
The flip side of that is in Lake Ontario where water levels can be controlled by the dam, he said. “The initiation of water level controls in 1962 when the Seaway was built led to a flattening out of water levels. In the past, they’d cycle up and down and a lot of elements of the ecosystem really needed that, but the shores of Lake Ontario became choked with cattails that thrive when the water level is steady. It’s good for cattails but not so good for a lot of other species that need water level fluctuations,” he said.
The second is the recurrence of harmful algal blooms and nuisance benthic algae; last year, the city of Toledo had to shut off its water supply because it was choked with Microcystis, a toxic blue-green alga. “There were models in place and limits in place for phosphorus in the Great Lakes for 25 years that said that shouldn’t have happened, so the technical people and the decision makers had to go back to the drawing board.”
Context for decision making in the Great Lakes
Before he discussed the case studies, he first gave the context for decision making in the Great Lakes region. There are about 40 million people living in the Great Lakes basin in the U.S. and Canada; there are eight states and two provinces that border on the Great Lakes. “That’s about the same population as California, so things are happening on roughly the same scale as here,” he said. “There are just as many interests to balance and promote in the Great Lakes as there are in the Central Valley – balancing water supply, water quality, recreation, commercial fisheries, navigation, and the health of the ecosystem; there are trade-offs in those objectives.”
“One thing folks have learned in the Great Lakes is that even though there are all those parties that need to agree on these competing objectives, their governments are most responsive when stakeholders can reconcile these competing interests, work in advance among themselves, come to some agreements and put them to the governments to consider and approve,” Dr. Wolfe said. “There are a lot of organizations in the Great Lakes that have formed to try to reach pre-agreements: the International Joint Commission among the states and provinces; the Great Lakes Commission has a lot of participation of navigational and industry interests; Healing Our Waters is a group of NGOs; Great Lakes industries have their own council; mayors, governors, tribes, and delegations from the Great Lakes states in the house and senate all have organizations to try to help reach agreements.”
As a result, there are lots of multi-party agreements in place, such as the Great Lakes Water Quality Agreement between the U.S. and Canada in 1972 that focuses on nutrients and eutrophication; the Great Lakes Compact from 2008, which is an agreement between the states and the provinces to try to prevent other regions from taking the water, so it’s based on ecosystem sustainability; and the Great Lakes Restoration Initiative, which was created by an executive order of the president and gets a lot of the agencies together to clean up harbors that are contaminated with toxics.
One tool the parties have developed over the years is an annual federal priorities sheet. “It’s doing Congress’ homework for it in advance,” he said. “It’s a program that’s endorsed by the states and provinces, industries, mayors, tribes, chambers of commerce, and the fisheries commission. They have competing interests, but they are all behind this core program. The priorities are listed for 2015; they are looking for funding to get those things accomplished.”
Besides the decision making context, data is important as well. “I want to pay attention to data even though we’re going to get into modeling, because there is no modeling and decision making without good data, shared data, and accessible data,” he said. “The Great Lakes have gotten their act together on data accessibility and management too. The Great Lakes Observing System (called GLOS) is a clearing house for data combining providers and users. It’s an organization you join and pay dues, and the dues are scaled to the kind of organization you are and your size. There are governments, universities, NGOs, private industries, and individuals. It’s a binational nonprofit, and it’s one of 11 regional associations in IOOS, the Integrated Ocean Observing System, so if you go online, you can see a map of these all around North America; this is the one that is centered on freshwater data. Its mission is to link users and providers of data, information and knowledge in a way that supports sound decision making – something I know there’s a yearning for in California to make things work better.”
Before GLOS, there were several different observation systems for real time monitoring data, and you had to go to each one separately to do analysis and pull all the data together, he said. “The concept here is to go from observations from a variety of providers – satellites, aircraft, ships, fixed platforms, buoys – and route that information to all the users that are shown in the red boxes on the right – people who want to close beaches when they are dangerous, manage water quality, manage navigation, search and rescue, fisheries management. Not all of them are environmental quality modelers, but it’s good for us that these data are all in one place, and the heart of it is the data management and communications system.”
Dr. Wolfe presented another diagram of the system, noting that the user communities are shown on the left and their needs are taken into account by the management system; the data providers are represented on the right either doing research and development or operating data management systems. GLOS is the nerve center that manages and enables communications. It is managed by a board of members and it is highly interactive among the players, he said.
Case studies in integrated modeling
Dr. Wolfe then turned to the case studies, noting that both involve control programs that were in place and worked reasonably well, but were found to need adjustments, and both show how models were used to bring the stakeholders together to make those adjustments.
Case 1: Integrated Ecological Response Models for water level regulation (IERM)
Simulation of shoreline ecosystem responses to varying water level regimes governed by regulation and net basin hydrologic supplies for Lake Ontario-St. Lawrence River (LOSL) and Upper Great Lakes
He started by presenting a map of Lake Ontario, noting that the lake drains out the Saint Lawrence River. It was uncontrolled until the 50s, when it was dammed and locks were built so that bigger ships could get into the Great Lakes via the Saint Lawrence Seaway. One side effect of that was controlling the water level in Lake Ontario and the Upper Saint Lawrence with the Moses-Saunders Dam, according to a plan that was put together in 1958.
“The regulations promulgated in those days set a plan for targeting weekly water levels and flows, and when they built the Seaway, the objectives were to serve hydropower, commercial navigation through the locks, water supply and prevent floods. They weren’t thinking much about ecosystem health.”
Since then, interests have evolved, he said. “There is more interest in environmental preservation, more recreational boating on Lake Ontario, and altogether different priorities among the riparian landowners, so a process of shared vision planning was initiated by the states and provinces in 2000. I’m going to focus most on the ecological part of it, because that is the part that we worked on. Our role was to create a model of ecosystem outcomes that would vary depending on how the controls at the dam and the lake level were operated. If you allowed lake levels to vary more seasonally, in a way consistent with breeding patterns and habitation by wildlife, what would be the benefits, and what would be an optimal plan?”
Dr. Wolfe presented a conceptual model, explaining that there are multiple fish species that were of concern, and their spawning depends on water temperatures, water levels and access to wetlands, as well as the existence and growth of those wetlands. “What this model does is it takes years and years of past precipitation history, and first of all, applies the 1958 plan and translates that into water level series, and then looks at each species to see how it would fare, and that sets the baseline,” he said. “To do that, subject area experts and ecological scientists specializing in each one of these species developed that module, feeding into rules for water level controls and into either an improved or degraded status for each species.”
“It was done by an iterative process,” he continued. “The starting point here was the existing plan for water level controls. The plan was applied to rainfall series to get a time series of water levels simulated over decades, it computed the status of each species over those decades, evaluated where the biggest problems were, looked at possibilities of improvement and tried out a new plan.”
“Let’s say we had identified that the least bittern was the species in the most trouble; then a plan was developed that would better encourage its breeding and was fed again through the water level model. In making these changes, it was really a combination of judgments by technical people and non-technical people to decide how to improve things.”
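The iterative loop Dr. Wolfe describes – apply a regulation plan to decades of supply history, simulate water levels, then score each species – can be sketched in Python. Everything here is an illustrative stand-in (the plans, the supply series, and the species response functions are invented for the example), not the actual IERM formulations:

```python
# A toy sketch of the plan-evaluation loop: regulation plan -> simulated
# water levels -> species status. All plans and response functions below
# are hypothetical illustrations, not the real IERM modules.
import statistics

def simulate_water_levels(plan, supply_series):
    """Translate a regulation plan into a water-level time series."""
    return [plan(supply) for supply in supply_series]

def evaluate_species(levels, response_fns):
    """Score each species' status under a simulated level series."""
    return {name: fn(levels) for name, fn in response_fns.items()}

# Decades of (greatly simplified) net basin supply history.
supply_history = [0.8, 1.2, 1.0, 0.6, 1.4, 0.9, 1.1]

# Baseline-style plan: hold levels nearly flat.
baseline_plan = lambda s: 75.0 + 0.1 * (s - 1.0)
# Candidate plan: allow levels to track supply, i.e. more fluctuation.
candidate_plan = lambda s: 75.0 + 0.8 * (s - 1.0)

# Toy response functions: a wetland breeder (e.g. the least bittern)
# benefits from level variation; cattails benefit from stability.
response_fns = {
    "least_bittern": lambda lv: statistics.pstdev(lv),   # wants variation
    "cattail":       lambda lv: -statistics.pstdev(lv),  # wants stability
}

for name, plan in [("baseline", baseline_plan), ("candidate", candidate_plan)]:
    levels = simulate_water_levels(plan, supply_history)
    status = evaluate_species(levels, response_fns)
    print(name, {k: round(v, 3) for k, v in status.items()})
```

In the real study each species module was developed by subject-area experts; the point of the sketch is only the loop structure: plan in, decades of simulated levels out, species scores compared, plan revised, repeat.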
Dr. Wolfe presented a slide of the bull’s eye diagram used to assess progress. He explained that each one of the dots represented a species and a comparison between the status of the species under the existing 1958 plan and the plan that was being tried out.
“Doing better than the existing plan gets you within the red circle,” he said. “This one is pretty good, almost all the species have their status improved if everything is normalized, so if your status was twice as good as under the existing plan, you’d be at the brown dot, and if it was worse, you’d be out here at the black circle. The breeding success of the least bittern is shown here in red in the lower right hand panel, and you can see it is consistently higher because of the way water levels are being controlled.”
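The normalization behind the bull’s eye diagram is straightforward: each species’ status under the candidate plan is divided by its status under the 1958 baseline, so a ratio of 1.0 means no change and 2.0 corresponds to the “twice as good” dot Dr. Wolfe mentions. The species names and status values below are invented for illustration:

```python
# Hypothetical sketch of the bull's-eye normalization. Status values are
# illustrative; only the ratio-to-baseline logic reflects the talk.
baseline_status  = {"least_bittern": 0.40, "northern_pike": 0.55, "muskrat": 0.70}
candidate_status = {"least_bittern": 0.80, "northern_pike": 0.60, "muskrat": 0.65}

normalized = {
    sp: candidate_status[sp] / baseline_status[sp] for sp in baseline_status
}

for sp, ratio in sorted(normalized.items()):
    verdict = "improved" if ratio > 1.0 else "degraded"
    print(f"{sp}: {ratio:.2f} ({verdict})")
```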
Dr. Wolfe then turned to the institutional set up of the planning process. “There was a study board of non-technical professionals who went through the process of sitting down with the models, looking at the output, and saying, that’s not good enough – what can you do to foster this species or this interest? They had a process of would-be recommendations and would-be decision making: given what we see right now, if you had to make a decision, what would you do, based on the modeling that exists now?” he said. “It allowed them to learn about the research. What this process was trying to avoid was dumping a report on their desk at the end and saying, ‘read this and decide what you want to do.’ Instead, they worked through it with the modelers iteratively and considered the results in a decision context. They could play a role in deciding what are the unknowns we need to know more about, what we thought was important but is emerging as not very important, and what are the red herrings we can ignore, so participants could form their opinions slowly and build more confidence and understanding in their ability to make decisions. In that way, it bridged the gap between technical people and stakeholders and decision makers.”
The outcome was Plan 2014, a revised plan for controlling lake levels in Lake Ontario. “It was approved by the IJC last year and supported by a broad coalition of stakeholders – The Nature Conservancy is strongly behind it – and it’s now under review by the two federal governments, the State Department in the U.S. and the foreign ministry in Canada, but hopefully they will approve it and it will be a process that worked,” he said.
Plan 2014 has an explicit plan for adaptive management, Dr. Wolfe said. “The first step is to identify areas of uncertainty in the modeling, and one of those is future water supplies. We know that depends on future climate – how much it is going to rain on Lake Ontario, which determines how much water there is to work with to retain in the lake or let out of the dam,” he said. “The other major area of uncertainty that was identified was the ecological impacts. How accurate was each one of those models relating species status to water levels? It was the best knowledge that we had, but such models are uncertain. So part of the plan moving forward, even if the consensus plan is accepted, is to monitor indicators related to water supply and to ecological impacts, compare them to the models, and see how well the models did. Change the models if necessary and be prepared to change decisions if warranted.”
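The monitor-and-compare step of the adaptive management plan can be sketched as a simple check: compare monitored indicators against model predictions and flag the model for revision when the discrepancy gets too large. The tolerance and data here are illustrative assumptions, not values from Plan 2014:

```python
# Hypothetical sketch of an adaptive-management check: flag the model
# for revision when mean relative error against monitoring data exceeds
# a tolerance. Tolerance and data are illustrative assumptions.
def needs_revision(predicted, observed, rel_tol=0.25):
    """True if mean relative error of predictions exceeds rel_tol."""
    errors = [abs(p - o) / o for p, o in zip(predicted, observed)]
    return sum(errors) / len(errors) > rel_tol

# e.g., annual wetland-meadow extent indicator (arbitrary units)
model_prediction = [100, 105, 110, 112]
monitored_value  = [98, 104, 111, 115]
print(needs_revision(model_prediction, monitored_value))  # small errors -> False
```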
Lessons learned from the Lake Ontario case study:
- “Ecological impacts can be integrated into system-wide models of flow and water level; one of the big challenges is to integrate them all and have them run as an integrated whole. To do that, modelers need to work closely with biologists and not think of models as something that is just about flow – they are really about ecosystem impacts.”
- “Consensus management is important. There are a lot of people who want a lot of different things, and it works best with an iterative process in consultation with stakeholders, and not something that modelers just drop on decision makers.”
- “We need to maintain these models for adaptive management. They need to be in place, we need to keep collecting data on ecosystem impacts, and have the models ready if they need to be modified in the future to adjust those decisions.”
Case study 2: Eutrophication modeling in the Great Lakes
The second case study is about eutrophication modeling in the Great Lakes.
Eutrophication occurs when excessive nutrients are added to an aquatic ecosystem, which causes explosive growth of plants and algae that eventually die and consume the oxygen in the body of water, creating a state known as hypoxia. The nutrients can be natural or artificial substances, such as detergents, fertilizers, and sewage. The primary limiting factor for eutrophication is phosphate, as the availability of phosphorus generally promotes excessive plant growth and decay.
This is a case of two-stage adaptive management: the problem of nuisance algae blooms was solved in the 1980s and the solution worked for 25 years, only for the problem to reemerge even though the phosphorus targets were still being met; it illustrates what was done to adapt then and what will be done to adapt in the future, Dr. Wolfe said.
- Lake Erie is the shallowest of the Great Lakes; the western basin, where the toxic algal blooms appeared, is especially shallow. Toledo is at the west end of Lake Erie.
- The central basin of Lake Erie where Cleveland is located is where the most serious hypoxia occurs because of its depth, so when the lake stratifies in the late summer, oxygen disappears in the lower lake.
- In the eastern basin where Buffalo is located, nearshore benthic algae form and foul the beaches.
These were all serious problems in the late 60s and 70s, so modelers were brought in to try to solve them using an ensemble of models. Dr. Wolfe presented the Di Toro model, which models phosphorus and nitrogen and their effects on dissolved oxygen, noting that the left hand column of model outputs was from 1970 and matched the plunging of dissolved oxygen in the central basin of Lake Erie in August of 1970. The model was verified in 1975 to show that it also did a good job simulating dissolved oxygen sags, he noted.
“The outcome was a phosphorus load response curve to try to solve this problem of anoxic area in the central basin,” he said. “The curves that come out of the Di Toro model show the more phosphorus, the more area was anoxic, and it appeared that if you could reduce the whole-lake phosphorus load to between 10 and 12 thousand metric tons per year, that anoxia would disappear. And so that’s the target that was adopted and agreed to by the states and provinces.”
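Reading a target off a load-response curve like this amounts to finding the highest load at which the modeled anoxic area is still zero. The curve points below are invented for illustration, chosen so the answer lands in the 10–12 thousand metric ton range mentioned above:

```python
# Hypothetical sketch of deriving a load target from a load-response
# curve. The (load, anoxic area) points are illustrative, not Di Toro
# model output.
def max_load_without_anoxia(curve):
    """Largest modeled load (kt/yr) at which anoxic area is still zero."""
    return max(load for load, area in curve if area == 0)

# (whole-lake P load in thousand metric tons/yr, anoxic area in km^2)
curve = [(8, 0), (10, 0), (11, 0), (12, 500), (14, 3000), (18, 8000)]
target = max_load_without_anoxia(curve)
print(target)  # 11
```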
The phosphorus problem was primarily loads of nutrients from urban wastewater treatment plants, and so reducing the loads of phosphorus from treatment plants, especially Detroit’s, was important for solving the problem, he said. “In those days, it was sufficient to cut point source loads through NPDES permits, so a lot was spent on wastewater treatment in the Great Lakes – $8.8 billion between 1972 and 1985. The annual load of phosphorus did come down to below 11,000 metric tons per year in almost every year – not in the wettest years, but typically so – and the problem was considered to be solved. The anoxic zone really did disappear.”
In 2011, the problem reemerged, and the IJC brought in modelers and data gatherers to solve the problems. “All three problems recurred: Toxic algal blooms in the western basin, hypoxia in the central basin, and [nuisance algae] in the eastern basin, so what changed? Why weren’t the nutrient reductions sufficient?”
Dr. Wolfe said two things had changed. One is that the bottoms of the Great Lakes are now coated with invasive zebra mussels and quagga mussels, brought in by ships via the Saint Lawrence Seaway. “They’ve been filtering the water column ever since; they filter out a lot of algae and allow light penetration, but they don’t like blue-green algae, so they allow blue-green algae to really thrive.”
Secondly, the nature of phosphorus loads had changed. “In part because of changes in weather – it’s now wetter in the spring,” he said. “The nutrient loads are more weather-dependent, and phosphorus is now in a more bioavailable form; of those 11,000 metric tons of total phosphorus getting to the Great Lakes, much more is in the form of soluble reactive phosphorus. Since about 1995, the trend has been a doubling of soluble reactive phosphorus. It’s not completely understood why that is; there’s more no-till farming of corn and soybeans in the Lake Erie basin and there’s more winter fertilization of fields of corn and soybeans, so it’s sitting there in the spring when the rains come. At any rate, there’s more of it, and it’s more available to algae, so the same number for total phosphorus is no longer sufficient to eliminate toxic algal blooms.”
A subcommittee under the nutrients annex of the Great Lakes Water Quality Agreement has proposed new lower targets to restore compliance with these objectives that are codified in the Great Lakes Water Quality Agreement: to maintain cyanobacteria below toxic levels in the western basin, to minimize hypoxic zones in the central basin, and to maintain algae below nuisance levels in the eastern basin.
An ensemble of models has been pulled together, ranging from statistical models to detailed numerical models that are more integrative, he said.
The Western Lake Erie Ecosystem Model, led by Dr. Wolfe’s colleague Joe DePinto, was built to understand and find solutions for toxic algal blooms in western Lake Erie. He presented a graphic of a hydrodynamic model of Lake Erie, noting that most of the flow from the upper Great Lakes comes in from the north via the Detroit River; the Maumee River flows in from the southwest.
“The Maumee River is where most of the nutrients come from,” he said. “The Maumee is a big agricultural watershed, and the nutrient concentrations are much higher in the Maumee River than the Detroit River, and they move with solids in large part, so the movement of solids suspended in the river flow and kicked up by wind waves is very important for understanding the movement of nutrients.”
The hydrodynamic model is linked to the nutrient eutrophication submodel, and it’s important to integrate biological models as well, Dr. Wolfe said. “We don’t just feed outputs from flow and sediment transport models to spreadsheet models for biologists; biological models have to really be integrated into these flow and sediment models to be able to iteratively look at solutions – to not just look at it once but look at it over and over and over again.”
The biological part of the Western Lake Erie Ecosystem Model is ultimately all about phytoplankton growth and Cladophora growth, but you have to understand the cycling of nutrients and solids – the loads that are coming in externally, how they are filtered by invasive mussels, and how they move in and out of the sediment bed, depending on oxic and anoxic conditions, he said.
Dr. Wolfe then presented a slide showing how the phosphorus model was calibrated using three stations. “The station that is closest to the Maumee River had the highest predicted and actual phosphorus concentrations, and the model did a pretty good job of fitting them. The second box is a little further from the outlet of the Maumee River, where the nutrients are more diluted, and the model does a good job of hitting those targets partway out and further out into Lake Erie.”
“Ultimately, this model is trying to predict Microcystis blooms, so this is how it did predicting Microcystis biovolume averaged over four stations, 2008 to 2014,” he said. “Accurate enough to use for decision support.”
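A calibration check like the one on the slide is usually quantified with a fit statistic such as root-mean-square error at each station. The sketch below uses invented concentrations (the talk gave no numbers); only the pattern – higher, more variable values near the Maumee, lower values in the open lake – follows the description above:

```python
# Hypothetical sketch of a station-by-station calibration check using
# RMSE. All concentration values (ug/L total phosphorus) are invented
# for illustration.
import math

def rmse(predicted, observed):
    """Root-mean-square error between paired series."""
    return math.sqrt(
        sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed)
    )

stations = {                     # (predicted, observed)
    "near_Maumee":  ([120, 95, 140, 110], [115, 100, 150, 105]),
    "mid_distance": ([60, 55, 70, 65],    [58, 60, 68, 66]),
    "open_lake":    ([20, 18, 25, 22],    [21, 17, 26, 23]),
}

for name, (pred, obs) in stations.items():
    print(f"{name}: RMSE = {rmse(pred, obs):.1f} ug/L")
```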
He presented the load response curve developed for the Western Lake Erie Ecosystem Model, noting that on the horizontal axis is the Maumee River total phosphorus load, and on the vertical axis is the biomass of cyanobacteria in western Lake Erie; the blue curve is the relationship between Maumee River loads and cyanobacteria biomass. “The problem is that we’re trying to get biomass below the green line, which represents a level that was considered not severe in 2004 and 2012. The baseline year is 2008, which is a year that met the existing phosphorus loads, and yet cyanobacteria were above the nuisance level.”
The boxed area shows the effect of a 40% reduction in total phosphorus loads from the Maumee River. “The point is that it takes it below this critical line to a level that was consistent with years that didn’t have problems, so this is really the heart of how the models were used to help support decisions,” he said. “The model tells you that if you could reduce these Maumee River loads by 40%, the nuisance algal blooms would go away. It’s not an easy thing to do. It was easier to get people to stop using detergent and have secondary treatment at wastewater treatment plants than to reduce these loads, but this is what it would take.”
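The logic of the boxed area can be sketched as a small calculation: apply a candidate load reduction to the baseline year and check whether the predicted biomass drops below the “not severe” threshold. The linear load-response function and every number below are illustrative assumptions, not the actual WLEEM curve:

```python
# Hypothetical sketch of testing load reductions against a bloom
# threshold. The linear response and all numbers are illustrative.
def predicted_biomass(maumee_load_mt):
    """Toy linear load-response relationship (biomass in arbitrary units)."""
    return 0.012 * maumee_load_mt

baseline_load = 2500.0   # metric tons TP/yr, illustrative 2008-like baseline
threshold = 20.0         # illustrative "not severe" bloom level

for reduction in (0.0, 0.20, 0.40):
    load = baseline_load * (1 - reduction)
    biomass = predicted_biomass(load)
    verdict = "below" if biomass <= threshold else "above"
    print(f"{int(reduction * 100):>3}% cut: biomass {biomass:.1f} ({verdict} threshold)")
```

With these toy numbers, only the 40% cut lands below the threshold, which is the shape of the argument behind the proposed targets.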
He presented a slide of the phosphorus targets that were proposed. “You can see the number 40% in a number of places; the 40% reduction of total phosphorus entering the lake was the target for reducing hypoxia, and 40% reductions from all of these tributaries that are bringing in agricultural loads is the target for reducing nuisance algal blooms.”
Right now it’s a recommended plan; the Great Lakes Water Quality Agreement called for a recommendation this year for an actual strategy to be enacted in 2 years and for measures to be in place 2 years after that, so the report is on the desk of the states and provinces now to put together a strategy to figure out how to reduce these levels, he said.
Dr. Wolfe said there was also a plan for adaptive management for the targets. “Similar to what I talked about with Lake Ontario, the first step is to identify the uncertainties in the modeling that you’ve done, and the modelers felt that they couldn’t really be sure, based on the data they had, what the mix of response was to particulate versus dissolved phosphorus; whether nitrogen load reductions might help; what the role of invasives is quantitatively; how important climate change is in driving these blooms; and which watersheds are really causing the most problems.”
“So the plan for adaptive management is to monitor nutrients and solids from the tributaries, monitor best management practices that are put into effect on ag lands and what the results are with nutrient concentrations in the lakes, and how all these problems respond, and to conduct research and keep developing models to try to understand those responses,” Dr. Wolfe said.
Lessons learned: “Behavior changes and weather changes, too. Ecosystems evolve; it’s all very complex, so to understand all those interactions, you need integrated modeling. Models can inform adaptive management; as the states and provinces are trying to adapt right now to a problem they didn’t think would come back, models help to understand what’s changed, and help to set new targets. And to do that effectively, it’s really best to try to identify the uncertainties in your modeling and track them; monitor loads, monitor outcomes, and maintain. Don’t just put those models on the shelf, but maintain and improve them to reflect what you learned from the monitoring.”
Bay Delta modeling challenges
“Modeling necessarily has been fragmented in different time and space scales to look at different problems and different issues for water supply versus ecosystem impacts, and the models are big and complicated,” Dr. Wolfe said. “It’s really challenging to iterate through alternative scenarios over and over and over again with models that take a long time to run with the existing suites of models.”
“I’ll add that I think it’s conceptually challenging to model operations as part of that iterative process,” he said. “CalSim II, the model that we use here to model operations and water supply, explicitly incorporates a set of contemporary priorities, and to some extent, they can be modified within that model – it assumes a certain contemporary water demand, or a different water demand. The big question is how will those priorities and how will water demand change in response to climate and in response to management performance, and how can you build that feedback into an iterative modeling process in a way that takes into account what your models tell you.”
“There’s also been excessive fragmentation of efforts among multiple players – state and federal agencies, local and regional governments, academics, NGOs, consultants – lots of parallel modeling efforts,” he said. “It’s good to have a lot of smart people looking at problems and we benefit from that, and the ensemble modeling approach ultimately was helpful in the Great Lakes. There are a good number of good integrative models, but there are potential benefits of improved coordination. If there was more working together on these models, inefficiencies and costs could be reduced for the players, and there could be more information sharing. It might be easier to address issues up front and help parties reach consensus. Models do different things, and linking them might help solve each other’s problems.”
“Since the integrated modeling workshop in May, there’s been a lot of buzz about thoughts of a community modeling hub, some kind of shared repository of data and models that could set standards for software practices, something like GLOS does for data in the Great Lakes, but expanded to include models too,” he said. “Providing a forum for comparison of model results, and involving stakeholders up front in developing models that are oriented towards answering the most important stakeholder questions.”
“This process can work,” Dr. Wolfe said. “Those problems identified in the Great Lakes aren’t solved once and for all, but progress is being made on both. In terms of modeling, it’s important to integrate processes and have single models that are tractable and flexible and can look at all the alternatives. It’s important to have the stakeholders engaged up front in exercising those models, and it’s not going to take you exactly where you want to go, so it’s important to keep up the monitoring and the model development and adapt. You make the best decisions you can, but you know it will be best to adapt them at some point. Things will change and systems will evolve, and integrated data and models can help us understand those changes, and help support decisions to revise those targets in the future.”