PANEL 1 (Nov. 2): Science Strategies in Large Programs
PRESENTATION: International Examples: Effective Science Strategies
Dr. Clifford Dahm began with some background information on international efforts at managing ecosystems. The International River Foundation gives an annual prize of AU$300,000 for outstanding management of river basins worldwide. The Lake Eyre Basin in Australia received the award in 2015; the River Rhine, particularly the lower Rhine, won the prize in 2014. The River Thames, recovering from some very serious degradation, was the winner for 2010. The Danube River was the winner for 2007; the Sha River in China won the award in 2006; and the Mekong River was awarded the prize in 2002.
“The awards have, in many cases, identified some excellent programs where you could look for successful science strategies,” Dr. Dahm said. “These are international examples of where smart people have developed interesting and important programs, and are doing really good things.”
In North America, the Willamette River won the award in 2012. “One of the things that I’ve always admired about the Willamette River program is their wonderful planning atlas that they completed in 2002, which is a guide to the kinds of restoration and basin level management that they’ve done. It reminds me very much of some of the things that San Francisco Estuary Institute is doing with their historical ecology, and then bringing that up to how we might use that kind of information in management issues here in the Bay Delta.”
During his tenure as the Delta Lead Scientist from 2008 to 2012, Dr. Dahm had the opportunity to give a plenary talk at the annual meeting of the Ecology and Civil Engineering Society of Japan, which draws about 1,000 attendees. “It was impressive to me how much river restoration, basin restoration, and coastal management goes on in Japan,” he said. “I bet it is comparable dollar-wise to what we do here in the United States. One of the reasons why it’s comparable dollar-wise is that these are very built environments, and so they have this society that links civil engineers with people who are doing ecological research and restoration, because it is a very difficult thing to do within these built systems and it’s expensive. Yet they’ve committed a lot of resources to do things, for example, for their anadromous fish populations or some of their freshwater aquatic mussels. There are some really interesting examples from that part of the world.”
Dr. Dahm then highlighted the Healthy Waterways Partnership in Southeast Queensland. “In 2007, I was doing quite a bit of work in the Rio Grande basin in New Mexico, and I had become very interested in some of the issues of science monitoring and science decision making,” he said. “I had heard a number of presentations at international and national meetings about this Healthy Waterways Partnership that was going on in Southeast Queensland. I was very impressed with some of the things that were presented, and I said, ‘this would be a great place to do my sabbatical.’”
In addition, his interest at that time was moving more towards the study of intermittent rivers, which make up approximately half of the river networks worldwide. “They are understudied systems, and the Australians, particularly in Outback Australia, had really focused on studying these intermittent systems,” he said. “I wanted to have the opportunity to combine those two efforts.”
“One of the things that I really liked about the Healthy Waterways Partnership was not only the monitoring program and the science program that they had developed, but also the communication program that they had developed,” said Dr. Dahm.
The area under study in Southeast Queensland comprises 15 major catchments, about 23,000 square kilometers, and 2.5 million people in one of the fastest growing parts of Australia. It includes many different jurisdictions that contribute to a fund that pays for the research and monitoring. One of the iconic parts of Southeast Queensland is Moreton Bay, which is a focus of recreation and tourism.
“There were growing problems within the bay,” Dr. Dahm said. “They had problems with sediment, with increasing turbidity within the system, and with nutrients and nitrification. They also had problems with the loss of some of their beneficial sea grasses that were in the estuary. These were the drivers that basically caused them then to start this program.”
The program was initiated in the late 1990s amid concerns for the iconic species in the Moreton Bay ecosystem, such as the dugong (a marine mammal related to manatees), turtles, and the sport fishery. They initially developed a monitoring program across all the catchments to identify the sources of sediment derived from within them. “One of the things they did was use models and modelling to help them understand the dynamics of the sediments within these catchments,” he said. “It became quite clear that the vast majority of the sediments were coming from a fairly small part of the landscape; 30 percent of the landscape was contributing more than 70 percent of the sediments.”
The next question, which generated some direct research, was: how much of the sediment getting into the bay was coming from active river channels, how much from row-crop agriculture, and how much from incision and arroyo formation? What were the sources of sediment? That work showed that the vast majority of the sediment came from a subset of the overall catchment and was derived from the active channels. “These are active channels that were incised, and there was a lot of erosion from banks falling in,” Dr. Dahm said.
The monitoring program that was set up then allowed them to estimate the amount of material coming in from the various catchments. Much of the movement of this material was event-driven, associated with tropical storms, and this kind of information directed the focus of their restoration efforts to the most heavily degraded stream systems.
They initially started with some pilot experiments in areas that they knew were problematic; about 50 percent of the 48,000 km of streams in Southeast Queensland had poor riparian conditions. They implemented a program combining replanting, removing grazing animals from the system, and some re-contouring, and they had some success in re-establishing vegetation in some of the highly eroded systems.
Early on, they also realized that one of their main problems from a nutrient loading perspective was point source inputs, mainly the wastewater treatment plants associated with the greater Brisbane area; a number of these plants were identified as sources. “They used some very interesting and innovative tools to figure out how much of the nutrient, particularly the nitrogen, was getting into the various aquatic plants within Moreton Bay by using stable isotope tracing technologies,” said Dr. Dahm.
“The stable isotope tracing technologies allowed them to basically look at the case for the upgrades to the plants. They then invested about $400 million in upgrading a variety of these plants,” said Dr. Dahm. “Then, as these plants were upgraded, you can see that the effluent point source inputs that can be traced with the 15N stable isotope nitrogen signal were largely removed from the system a decade later.”
“One of the things that I think this program did very well was getting the right message out and timing that message,” said Dr. Dahm. “There was a very active communication program to convince the agricultural interests that reducing the loss of farm land and enhancing the functioning of these ecosystems by basically reducing channel and gully erosion was one of the important things that should be invested in. They also sold the idea that by reducing the sediment sources, there was going to be a significant cost savings in the amount of money that was spent on getting drinking water ready for the population there. It also reduced flood risk and damage, and it improved the overall health and viability of the waterways.”
They had some fairly dramatic examples. Australia’s Millennium Drought ended in the Brisbane area in 2008, and they experienced some fairly substantial floods. “A lot of these restored watersheds did a whole lot better than the ones that had not been fully restored, so there is certainly a monetary payback that they could actually quantify from having healthy waterways and improved waterways in different parts of the catchment,” said Dr. Dahm.
They also set up a very effective Ecosystem Health Monitoring Program, which was designed in stages. The first phase of the program studied and focused on Moreton Bay; that was developed and implemented in 1998 – ’99. There was then a development of a similar kind of monitoring program that focused on all 15 major catchments that was implemented in 2001-2002.
“It’s basically an integrated evaluation program that utilized five compartments: a fish community analysis; invertebrate community health and bio-indicator measurements; nutrients and nutrient effects on primary producers; ecosystem processes such as rates of primary production and rates of nutrient uptake; and then physical and chemical measurements of the system,” said Dr. Dahm. “Each one of those sides of the pentagram produced a series of measurements. Those measurements were then scored, and each of those five areas received a score ranging from 0 to 20. The information was then reported out in a very effective communication campaign; this is one of their report cards that they issue annually.”
The report cards are given out at various locations around the overall basin. Dr. Dahm attended one meeting where there were 100-150 people – the press, local government representatives, and those who were involved with the monitoring. “It was a big deal,” said Dr. Dahm. “That meeting basically allowed the people there to hear how their catchment was doing, and it also in many cases allowed them to help get the resources necessary to do the improvements and to try to improve the score if their score wasn’t up to snuff with some of the other systems.”
The program is underpinned by an adaptive management process that has now been active for almost two decades and has gone full circle – through planning, implementation, analysis, evaluation, communication, decision making, and modification.
Dr. Dahm presented a graphic of how they were doing adaptive management in Australia, noting that it is quite similar to the graphic that’s part of the Delta Plan. “They took some of the ideas that we’ve developed and actually made examples of very concrete things that they’ve done in Australia associated with this Healthy Waterways Partnership,” he said.
Dr. Dahm concluded by saying that they have one of the most effective communication programs that he’s ever seen linked to an evaluation and monitoring program. “It is targeting all levels of the population: there are things for kids, high school students, and adult continuing education. There’s a series of study guides, and then there’s this report card that they report out on,” he said. “Here’s a place where I think adaptive management has worked, and I think it’s been around long enough that it’s shown its value and its efficacy. It’s a nice international example of some of the things we’re talking about here.”
QUESTION AND ANSWER
Question: The monitoring program that they were implementing; were the scores based on a reference site, or were the scores based on a threshold of achieving certain goals or functions at a certain score level?
Dr. Dahm: “The program had 120 locations that are sampled twice a year. The samples are collected for a variety of parameters. Each one of those five areas has at least five metrics that they are measuring within the system. Then they use the measurements themselves to score; it grades from zero, poor quality, to four, good quality, and then that is how the information gets summed up into the overall grade that the site gets.”
“They basically have criteria that are written up for each one of these metrics. For example, for suspended sediment level, they would have thresholds corresponding to scores of zero, one, two, three, and four. Each one of the metrics has a range that sets the score.”
“They have, in some parts of the catchment, areas that they consider to be reference-like, so in some cases they use a reference. In some cases, they also have long enough term data that they can look to a baseline. They use a bit of both: both reference conditions and baseline conditions.”
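The scoring arithmetic Dr. Dahm describes – each metric graded from 0 (poor) to 4 (good) against fixed thresholds, with the metric grades summed into a 0–20 score for each indicator area – can be sketched as follows. All metric names and threshold values here are invented for illustration; they are not the program's actual criteria.

```python
def grade_metric(value, thresholds, lower_is_better=True):
    """Return a 0-4 grade by comparing a measurement to ordered thresholds.

    `thresholds` is ordered best-to-worst: the first cutoff satisfied
    determines the grade, starting from 4 (good) down to 0 (poor).
    """
    grade = 4
    for cutoff in thresholds:
        if (value <= cutoff) if lower_is_better else (value >= cutoff):
            return grade
        grade -= 1
    return 0


def score_area(measurements, criteria):
    """Sum metric grades (0-4 each) for one indicator area.

    With five metrics per area, the summed score ranges 0-20.
    """
    return sum(
        grade_metric(measurements[name], spec["thresholds"], spec["lower_is_better"])
        for name, spec in criteria.items()
    )


# Hypothetical criteria for a physical/chemical indicator area (invented numbers).
criteria = {
    "suspended_sediment_mg_L": {"thresholds": [5, 10, 20, 40], "lower_is_better": True},
    "turbidity_ntu":           {"thresholds": [2, 5, 10, 25], "lower_is_better": True},
    "dissolved_oxygen_mg_L":   {"thresholds": [8, 6, 4, 2], "lower_is_better": False},
    "conductivity_uS_cm":      {"thresholds": [300, 500, 800, 1500], "lower_is_better": True},
    "ph_deviation":            {"thresholds": [0.2, 0.5, 1.0, 1.5], "lower_is_better": True},
}

# One site's measurements: grades work out to 3, 4, 3, 1, and 3.
site = {
    "suspended_sediment_mg_L": 8,
    "turbidity_ntu": 1.5,
    "dissolved_oxygen_mg_L": 6.5,
    "conductivity_uS_cm": 900,
    "ph_deviation": 0.4,
}

print(score_area(site, criteria))  # 14 out of 20
```

In the real program these area scores are then translated into the letter grades published in the annual report cards; the sketch stops at the 0–20 score.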
Question: It seems like there could be a very strong role for citizen science in this. Could you speak to any role that citizen science plays in Australia?
Dr. Dahm: “Fairly substantive, in that the 120 sites where they’re doing the measurements and making the determinations to give the grades also get adopted by school systems. They then use these sites to get kids outdoors, and in some cases to involve them in some of the monitoring programs.”
PANEL 1 DISCUSSION
- Dr. Clifford Dahm, Lead Scientist, Delta Science Program
- Scott Phillips, USGS Chesapeake Coordinator
- Dr. Steve Brandt, Professor at Oregon State University; member of the Delta Independent Science Board
- Dr. Denise Reed, Chief Scientist, Water Institute of the Gulf
- Dr. Steve Lindley, Director, Southwest Fisheries Science Center, NOAA Fisheries
- Dr. Nick Aumen, USGS Regional Science Advisor, Southeast Region
- Dr. Jay Lund
Question: What is important in developing science strategy for a basin, and how can a strategy be made adaptable?
DR. NICK AUMEN, USGS Regional Science Advisor, Southeast Region, began by noting two key elements that have served the Florida Everglades well:
1. Interagency FACA exemption. The Federal Advisory Committee Act (FACA) restricts how federal agencies can interact with the public when they are receiving public comments; having that barrier removed through an exemption improves the ability to receive stakeholder input. “I think the Act was created for good reasons, but having that out of the way for the purposes of what we’re trying to do with ecosystem management and restoration, I think, is really important,” Dr. Aumen said. The Florida Everglades received a FACA exemption through enabling legislation in the Water Resources Development Act of 1996 that established the South Florida Ecosystem Restoration Task Force. The Gulf RESTORE program also has a FACA exemption.
2. Provide structure for interagency interaction. A framework is needed that has a leadership role and connects at the senior management level across federal, state, and local governments. Sometimes there are framework constraints imposed by law, or there may be litigation issues that inhibit collaboration, but a framework similar to the Task Force in the Everglades can help. “Whoever or whatever entity is the leadership of that has effective connections both higher up the chain and lower down the chain, because in the end you need an advocate that goes all the way back to the administration and the Congress to advocate for particular things.” When science enterprises are empowered in this way, it is possible to leverage and pool substantial resources.
“For example, we have a political direction in Florida right now that’s against a lot of big public investments, tax increases, or anything like that, but we were able through effective collaboration to get the governor of Florida to commit $90 million to building another set of bridges over Tamiami Trail to restore flow to the Everglades at a time when no new money was being put anywhere else,” Dr. Aumen said. “We came up with some unusual partners, including the Florida Department of Transportation, and that in turn forced the Park Service, which has not had a major infrastructure project like that in decades, to put up a match. It’s a $150 million project co-funded by the Florida Department of Transportation and the National Park Service, and that’s moving forward because of the structure that was put in place.”
“Having that structure in place and a group of entities that are willing to talk to each other and do the hard work to overcome barriers sets a good stage if you do that right at the beginning,” Dr. Aumen said. “None of this is easy … sometimes it just takes a lot of hard, hard work, and a lot of talking, and interpersonal interactions, and perseverance, and not taking no for an answer.”
DR. STEVE LINDLEY, Director, Fisheries Ecology Division, Southwest Fisheries Science Center, NOAA Fisheries
Besides working in the Central Valley with threatened and endangered anadromous salmon, the NOAA Southwest Fisheries Science Center is also heavily involved in the management of the California Current Large Marine Ecosystem (CCLME), which can be thought of as an ocean basin. The work they do in the California Current comes from the Magnuson-Stevens Fishery Conservation and Management Act of 1976, and much of that Act is responsible for the effectiveness of conservation, Dr. Lindley said. “The California Current, which is relatively well organized, has had successful outcomes in that we’ve turned around a lot of fisheries which were overfished and in terrible condition, and many of those are now in good shape or on track to being so shortly.”
1. Have clear goals and objectives. Their goals and objectives are defined mathematically based on theories of population dynamics and ecosystem science. Fisheries management in the U.S. has developed over the last century into a science-based, stakeholder-driven, formal process. There are five week-long meetings per year that include representatives from state and federal governments, tribal governments, industry, and non-governmental organizations; they get around a table and hash things out in a very structured way. They are guided by comprehensive and up-to-date fisheries management plans, and an ecosystem plan that is more nascent but in development, he said. “Those plans, how they’re put together, and what’s contained in them is remarkably well defined in the Magnuson Act itself, which has 10 national standards for how these things are to be done, which are laws,” he said. “What those national standards do really defines how science is to be done and organized, and that includes things such as what is the best scientific information available, how to deal with uncertainty and transparency, and extensive and comprehensive peer review, which is really critical to the acceptance of the science that comes out. There are also science and technical advisory committees that are stocked with some of the best minds in the nation.”
2. Modeling is a central activity: There are well-developed models for fishery stock assessment; they make use of common software tools and a common theoretical basis. “That is really central to organizing the data that is collected and is brought to bear on the questions. These models can also be used to evaluate the value of bringing in new data or what would happen if we were to curtail collecting certain kinds of data.”
3. NOAA has a well-defined role in fisheries management. “It makes it pretty easy to know what we need to be doing there,” Dr. Lindley said.
4. Strong national leadership: Scientists at higher levels of NOAA think broadly about the fisheries management and ocean management issues that are shared across the nation and across the world; they write policy documents and papers that filter down through the ranks. They are currently working on incorporating climate effects and ecosystem considerations into fisheries management; an ecosystem-based fisheries management policy or a roadmap; and a climate action science strategy with a Regional Action Plan. “I think we’re very successful, and it’s in a bit of a contrast to what we do in the San Francisco Bay Delta, which is much more amorphous and involves attending many meetings,” he said. “We can’t even really begin to attend them all and still do any science like that. It’s much more ad hoc and challenging.”
The Magnuson-Stevens Fishery Conservation and Management Act was enacted in 1976. “It’d be hard to think you need a better law to be operating under, but those can, at least in some periods of our American history, be written, and maybe we’ll have that again,” said Dr. Lindley.
DR. DENISE REED, Chief Scientist, Water Institute of the Gulf
1. Get the right people to do the work (and this may not be the people that you have): Dr. Reed noted that it is fundamental that a science strategy create a process to get the right people to actually advance the scientific understanding of the system, who may be different from those currently involved. Dr. Reed said she’d been reflecting on budgets, and thinking that a lot of the money is going to staff time. “I’m really wondering how much goes to science and to actually moving the knowledge forward as a result of that money,” she said. “I think that principle is something that a new program starting from scratch could really build on.”
2. Identify what research is needed, as well as good approaches to achieve those research needs. Dr. Reed said that the RESTORE Act and the Louisiana Coastal Master Plan seek to identify research needs and the best approaches to achieve them. To identify research needs, the Centers of Excellence adopted a process that solicited input from the top (state level), the bottom (university community), and the general public (ongoing) – and the resulting product was a very long list structured around issues the state has identified as important.
“We have this combination of top down and bottom up,” Dr. Reed said. “It hasn’t necessarily been pretty, because what we’ve ended up with is a very, very long list structured around issues that the state’s already said are very important, but the individual topics could look fairly scattered.” Going forward, the research funding will be organized around the input received from this wide audience, which is critical in order to receive innovative ideas from the scientists who are actually doing the work, rather than just agency staff. “The key thing is that we’re then going to put the R.F.P. out on the street and see what good ideas we get back. What research ultimately gets funded is a combination of those two things intersecting.”
3. Processes and strategies need to be set up to allow for creativity, innovation, and new ideas to be put into the process. “It is about the people that do the work and can actually best contribute, as opposed to necessarily the people that you have on your staffs,” she said. “I think that’s a really important thing.”
4. Attract the best and brightest. “This is not about money for Louisiana researchers,” said Dr. Reed. “This is about money to solve problems in Louisiana, and if the researchers have ideas, they can efficiently work in Louisiana, and can provide some constructive input, then I think that is definitely worthwhile. Setting up mechanisms to get the best and the brightest across the country and across the world working on these large scale ecosystems is a really important component of any science strategy.”
DR. STEPHEN BRANDT, Professor at Oregon State University; Chair-elect of the Delta Independent Science Board
Dr. Stephen Brandt provided some insights on designing an effective science strategy based on his efforts at the NOAA Great Lakes Environmental Research Laboratory and his work in the Chesapeake Bay and Gulf of Mexico.
He said four key elements are critical:
Identify what the big problems are. This is not the responsibility of the science community per se, pointed out Dr. Brandt. “Scientists can be involved, which would be science informed, but really you need to develop a comprehensive approach with stakeholder involvement. The stakeholders really need to agree that these are the big issues because without that there won’t be money for it eventually.”
Dr. Brandt recalled how in the Great Lakes, 1,500 participants from state and federal agencies and other stakeholder groups took one year to develop five key priority areas and then obtained congressional support ($500 million annually). “What motivated the people was that the congressional delegation said, ‘we’re not going to give you any big hunks of money unless you have a priority set and everybody agrees to it,’” he said. “Money is a big motivator, and that managed to get it done and get it done on time. This eventually involved half a billion dollars a year of new money. So you need to identify what those key issues are that everyone agrees to.”
Identify what the science priorities are relative to addressing those issues. Science-informed forecasting on the key issues is critical in that it forces linkages between drivers and outcomes, rather than explanatory science. “I’m going to suggest that the science, on a big scale, should be focused towards forecasting, even ecosystem forecasting, rather than explanatory science,” Dr. Brandt said. “I think that forces the linkages of the key drivers to the key outcomes.”
Weather forecasting is a good example, as it is very valuable for management purposes. “In the Great Lakes, salmon are stocked annually on a massive scale that supports a $4 billion fishery. It used to be that stocking levels were based on hatchery capacity. Now it’s evolved to the stage that the stocking level in Lake Ontario this year was cut back by the state and federal agencies because forecasting models said there wouldn’t be enough food supply three years down the road when these fish reach maximum consumption capability. To get the fishermen and the agencies to agree to reduce the stocking level based on science, which means ultimately there will be less fish to be caught and less income to the state, is huge, but it took a massive effort of science as well as stakeholder engagement.”
A formal structure for interagency collaboration: The Great Lakes Restoration Initiative (GLRI) Interagency Task Force provides decision-making authority and funding ability to enact the science enterprise – it is critical to have high-level management participation. It is also critical to have a lead agency that takes the lead and the responsibility; in the Great Lakes, the US EPA ensures action. “You want to have decision makers who can speak on behalf of the agency and can devote resources and people to solving particular problems,” said Dr. Brandt. “That’s a very high level thing, but that’s the kind of level it needs to be to be effective.”
Once the problems and the key science ecosystem forecasting program have been identified, you need to jumpstart the enterprise. It is critical to rally collective effort on an initiative to get the work going. For example, the Chesapeake Bay Program developed a 3D hydrodynamic water quality model, which led to calculations of the nutrient reduction needed to achieve water quality objectives. In the Great Lakes, the “International Field Year for Lake Erie” was a concerted effort in which all the agencies began coordinated data collection on how nutrient loading affected algal blooms and fish production, which then informed forecasting models of how reduction efforts could influence loading and subsequently algal blooms and fish production. Dr. Brandt said, “An interdisciplinary effort was started to look at how nutrient loading into that lake affected the dead zones, harmful algal blooms, and fish production, for the purpose of forecasting how nutrient loading and nutrient reduction would impact those things.”
Dr. Brandt noted that in the Bay-Delta, many of these components already exist. “I think we’ve already identified the issues, called the coequal goals, and we have some structures available that could very easily make things happen.”
SCOTT PHILLIPS, USGS Environmental Scientist; Coordinator of the USGS Priority Ecosystems Science program for the Chesapeake Bay
You need to have a systems approach. “I think you really need to be looking at the estuaries of the delta and the contributing watersheds together when you come up with your science strategy, because they’re all interconnected, so trying to have a broader strategy looking at all those pieces would be valuable,” he said.
You need to be thinking about your audiences. It is also important to define the audience – who should the strategy influence (congressional, state funding) – and calibrate the message so that each audience group understands what it is getting out of it and how it can advocate through its own funding mechanisms. Mr. Phillips noted, “While your strategy needs to hit the two big issues that you guys already have, you need to be able to tailor your messages so each of those audiences understands what they’re getting out of it. We have a pretty strong science strategy for the Chesapeake, but we really have to message it differently depending on whether we’re sitting down with the congressional staff, or one of our state partners, or even the tribes. You really have to say, ‘This is how you can contribute, and this is how your contribution in San Francisco will also make your agency look good,’ and they’ll then be willing to really advocate for you and try to get funding through their own mechanisms, as well as try to advocate for other funding mechanisms.”
Try to keep your science strategy at a very high level. Finally, the strategy should be kept high-level; agencies are already working together in some fashion, so the strategy should show at a high level where things are working well, what needs improvement, and what isn’t working. “You’re just trying to write a strategy and general direction; you’ll have plenty of time for implementation and action plans afterwards,” he said. “I think what others are going to want to see when they walk in here is that the science entities that are sitting in this room are working together in some sort of fashion… If you can have a matrix of issues versus science, you can pretty quickly say where your green light sections are, what you’re doing well to address an issue, and look at those as your early successes that you want to focus on to show that you can get different entities to work together.”
To obtain new money from congress, it is critical to show where the successes are to demonstrate organizational ability and efficiency, and where steps are being taken to ensure better resource alignment, he said.
DR. CLIFFORD DAHM, Lead Scientist, Delta Science Program
- Have clear goals and objectives. “If you have clear goals and objectives, things seem to flow well from that.”
- Have a well thought out, long-term sustainably funded monitoring and evaluation program. “I view the monitoring, if done right, as a very important part of the research program,” he said.
- The need to couple modeling with synthesis, analysis, and communication. “If you invest in that modeling, synthesis, analysis, communication component, then you actually have the opportunity to complete your adaptive management cycle. The program has a better chance of being successful, and I think you can generate the will for long term support.”
Dr. Jay Lund described a paper on the history of hydraulic modeling in the Netherlands, “‘Strong, Invincible Arguments’?: Tidal Models as Management Instruments in Twentieth-Century Dutch Coastal Engineering,” which chronicled the evolution of models for flood planning. Dr. Lund emphasized organizing effective science around the construction of “invincible arguments” for diverse stakeholders and decision makers.
“I think that’s part of the organization of effective science is to make invincible arguments to these diverse stakeholders and decision makers, so that it’s not deniable, and that implies a lot of credibility in the good workmanship that has been applied and the organization,” he said.
QUESTION AND ANSWER
Question: It was interesting to hear ecological forecasting or forecasting mentioned. It’s one of the first times we’ve really heard that term. I’m wondering if the panelists would comment on the importance of that as a framing construct for a large system. Are there constraints that keep you from getting there, and what those might be?
Scott Phillips (Chesapeake Bay) said that forecasting is important for looking at future conditions that they are trying to manage towards, in terms of both the impacts of land change and climate variability. Forecasting is also important in a scenario context. “We really need to evolve the science to not look at the root causes as much, because they’ve been studied pretty heavily, but what are the different ecological outcomes we might get from different management interventions. People can see which intervention might have a particular outcome, then you can start to look at which ones do we really want to try to pursue.”
Dr. Steve Brandt said he’s a strong supporter of the concept of ecosystem forecasting; it has been used in NOAA as an organizing principle for at least a decade. As an organizing principle, it makes you think about the drivers and the outcomes. It’s the difference between asking ‘Do striped bass eat Delta smelt in the Delta?’ and ‘How will a doubling of striped bass impact smelt?’ “It’s a whole different concept that requires you to take the physical environment into account… It really requires you to look at it from a more multi-disciplinary perspective.” It’s not trying to be operational like the weather service; it’s constructing science in a way that is valuable to managers. “I think if you talk to managers, they’re more likely to want to know what might happen rather than what did happen.”
Dr. Nick Aumen said they have always used hydrologic modeling for forecasting, especially when selecting restoration alternatives. Recently, they’ve been using many ecological models, such as single-species models, vegetation models, and trophic-level models, in the evaluation of different restoration alternatives. They also did an exercise where they picked a climate change scenario and used it as input to the regional water management model; the outputs were then given to a whole suite of ecologists who were asked to use that scenario and say what it would mean for their ecological system. “We’re really getting to the point where we’re bringing the ecological forecasting up to the level of what we’ve had the hydrological forecasting up to, at least in south Florida.”
Dr. Denise Reed said forecasting is absolutely essential and should be what the science strategy is about. In a system with knobs to turn and challenging real-time dynamics, there is a role both for near-term forecasting models, on the scale of weeks to months, and for longer-term prediction. “They don’t have to be the same level of resolution. You have different kinds of needs. I think we’re all on the same page here, that modeling and this way of using these tools to bring our science together to the management need seems very obvious in these other systems.”
Question: For science strategy what are the elements of an invincible argument, and are the scientists the ones that should be making that, or do you need to essentially translate that to somebody else who has to make it?
Dr. Denise Reed noted that there are key pieces of scientific information that are the “backbone” of a successful science enterprise. For example, in coastal Louisiana, a map of projected land loss over 50 years was critical for showing what would happen unless action was taken. Because the projection was a science-based forecast, it was credible and motivated action. “If we don’t do something, then science is telling us that something really bad is going to happen, which is going to be a stimulus for action. Then the onus is on science to say, ‘If you take this action, how is that map going to change?’ I think there are some real linchpin scientific products in some of these systems that really are the backbone of the restoration programs – not just the science strategy but the restoration programs.”
Dr. Steve Lindley pointed out that publication in a peer-reviewed journal can make for an invincible argument, for better or for worse; once contentious analyses are published, people tend to move on from them, even maybe sometimes when they shouldn’t.
Dr. Nick Aumen added that the elements include having good quality people, by attracting the best and brightest; having high quality science, which in part means well-designed approaches to science and experiments; and peer review and publication of articles.
Dr. Steve Brandt said experience and proven results are the best way to do it. If you make predictions that hold true, stakeholders will take note; the more reliable the models become, the more they become the accepted way of looking at the system. His example of cutting stocking in Lake Ontario only came about because similar predictions had been made decades earlier, and the predicted smaller catches and smaller fish did occur. “It took many years for those forecasts to be accepted as this is what’s going to happen. That’s what they’re telling us, and this happened, so now we have to do something about it. I think continually proven results are the way the stakeholders begin to accept the science.”
Question: What happens when the invincible science proves to be wrong in some respect? What’s your communication strategy when the science that is basis for management decisions proves to be wrong in some respect?
Richard Roos-Collins noted that the Clean Water Act of 1972 required that all waters be swimmable, fishable, and drinkable by 1983, based on testimony that did not account for non-point sources. What can the enterprise do when the basis for management decisions proves to be wrong?
Science evolves – Dr. Denise Reed noted that incorporating new information is a critical part of the scientific process; clear articulation of assumptions must be made up front in management contexts. “That is the thing that allows us to move forward without this perfect knowledge – to move forward on what we do know and not be paralyzed by what we don’t know.”
Dr. Reed then added a comment on peer review, pointing out that everyone probably knows of papers that they think are fundamentally wrong that have gone through the peer review process. It’s one way of doing it, but it’s not infallible, she reminded. Consensus across scientists and across publications is a much better way of ensuring invincibility. “I think that consensus across scientists and across a number of peer reviewed journals is much, much more important than the individual peer review of one particular story. I think climate change probably is the best example of that, where the consensus is about the bigger picture that makes it much more powerful and for me an invincible argument, as opposed to an individual paper that might be peer reviewed on a specific aspect.”
Dr. Clifford Dahm agreed and noted that another way science can be self-regulating, to some extent, is when studies can be replicated in multiple locations. He gave the specific example of the Hubbard Brook Experimental Forest, which examined the effect of clear cutting on nutrient transport into waterways. When they cut the forest in New England, there were massive nutrient losses, but these results were not replicated at other locations in North America; more than half showed very little nutrient transport. It was then that researchers began to investigate the mechanistic reasons why these things were happening. “Having a group of people from multiple locations gives us an opportunity to look at how other systems function and to test whether the ideas of our systems work well in these other systems. Cross system experimentation is, I think, another way that science is self-correcting.”
Dr. Nick Aumen agreed that science is not invincible; there are numerous examples where something has appeared in peer-reviewed literature, which was rebutted, sometimes multiple times. “I think good scientists love that in a way because it advances science. Certainly there are times when things head the wrong direction, and that to me, is the key to adaptive management because you’re going to learn as you do, and you’re going to figure out that wasn’t the right way, so let’s head a slightly different way.”
Question: The Role of Uncertainty in Forecasting?
Jayantha Obeysekera noted that the terms “invincible” and “forecasting” could in some ways diminish the role of uncertainty in prediction. He gave the example of Lake Okeechobee in the late 1990s: “The lake had been higher for five years, and there was a lot of push to drain the water out of the lake. We let three feet of water go, and then in the middle of that action, we were in a hundred-year drought. The public did not let us forget that. In terms of forecasting, you have to be wrong only one time and your credibility goes down.” Hydrologists distinguish between near-term and long-term uncertainty, he said, using ‘forecasting’ for near-term expectations in which they have some confidence, and ‘predictions’ for long-term projections that carry probabilities. Is it a good idea to promote concepts that diminish the role of uncertainty in the whole process?
Dr. Nick Aumen recalled that the executive director of the district lost his job over that decision to lower the lake; he also noted that it wasn’t the result of any output of an ecological or forecasting model, but a decision made on best professional judgment that was simply ill-timed. “I think that any ecological modeler will say that uncertainty is a really important element of it… it’s really important to communicate to the decision makers the uncertainty associated with that.”
Dr. Steve Lindley pointed out that the Magnuson Act requires scientists and managers to consider uncertainty; there is a formal process of being more conservative when forecasts are less certain as a way to buffer against that. “I think also it’s just very important to realize that predictions about the future are very difficult to make. People probably shouldn’t be penalized terribly when they get wrong what’s going to happen with ecosystems, which are inherently unpredictable. We do need to be monitoring carefully what happens and adjusting the models that we’re using to forecast when they don’t work, which would usually be the case.”
Dr. Denise Reed noted that the NOAA Office of Water Prediction and its National Water Model are doing a great job nationally in providing water prediction outcomes to support decisions. While reservoir operations are not included yet, the model has 2.5 million predictive points for streams across the United States – a large improvement on the 14,000 stream gauging stations. The grand challenge now for the Weather Service is the 30- to 90-day weather forecast, which would be an enormous contribution to water managers across the country. Coastal Louisiana is looking forward to working with NOAA OWP, and more broadly, science enterprises should seek collaborations with other efforts to collectively advance modeling. “There are some really big developments out there that we need to recognize that we’re not working in isolation of.”
Dr. Steve Brandt said that in his view, ecosystem forecasting is a more conservatively structured way to do the science, because you have to be more careful in forecasting things. You can also do things in a probabilistic forecasting way, as the weather service does where you don’t get a 100 percent chance of anything; that says up front that things may not turn out the way that they’re being forecast. “Some would use the term forecasting and prediction interchangeably. I’m not so sure… I think forecasting makes it more formal and makes you think that you really have to think about it more precisely.”
Dr. Jay Lund agreed and noted that there are ways in decision analysis to characterize various levels of uncertainty to arrive at an optimized decision given various potential impacts and data imperfections.
Question: Characterizing Uncertainties in Human Behavior?
Dr. Bill Labiosa noted that in Puget Sound, some of the greatest uncertainties are associated with human behavior. “For example, in flood plains there is a high degree of confidence from the engineering aspect on what the outcomes will be, while the human population that is living there refuses to change their behavior. I’m just trying to imagine the invincible argument being the thing that you employ in this situation.” A good science strategy should include an understanding of human behaviors and an appropriate system of incentives or penalties to influence behavior.
Dr. Steve Brandt said that involving the stakeholders from the beginning and getting their buy-in up front on the problem is really important. Stakeholder engagement is also an important component of the ongoing effort. “Having that continual engagement is one way to keep the open dialogue.”
Question: Balancing Long-Term and Immediate Needs in Enterprise Design. Policy makers are often fighting fires every day; they live in a very difficult world compared to most scientists. How can science programs be designed and managed to address those immediate, everyday needs while also building a longer-term, foundational understanding of the system?
Dr. Nick Aumen said that in a previous position where he oversaw a large research group, the advice was to plan for 25 percent of scientists’ time going to ‘brush fires’ – immediate issues that arise from outside forces such as political pressures, natural forces, or natural disasters. Whatever the cause, those issues will consume that time, and it simply has to be built in. At the USGS, part of the mission is to look far ahead. “I don’t think I’m doing my job well unless I think about what I think our partners need five and ten years from now.”
Dr. Steve Lindley said that at NOAA, there are three people at the national level whose job it is to basically think big thoughts about ecology and fisheries assessment, as well as social sciences. They engage with managers as well as the people who work in the field. “They do have the time to step back and think about what are the emerging issues that are going to be the brush fires 10, 20 years from now.”
Dr. Clifford Dahm said that in the Delta Science Program, a portion of the budget is allocated for “directed actions,” ensuring resources are available to respond to pressing human-caused or natural events. Dahm emphasized that it is critical to defend a budget allocation of 10 to 25 percent for unanticipated events. “It’s a hard thing, though, to defend in your budgets, but it’s critical if you’re going to be able to respond to these events which are going to always happen.”
Scott Phillips pointed out that synthesis can play an important role. When an issue comes up, there’s often a lot of existing information; it can just be a matter of pulling it together quickly and having the capacity for people to use their best professional judgment to inform decisions that may need to be made on short notice.
Dr. Steve Brandt noted that even some granting programs can do this; Sea Grant has a pilot project program where they can get money out the door in a few days; likewise, the National Science Foundation (NSF) has Grants for Rapid Response Research (RAPID) Program.
Question: Structure Science Strategy Input Processes
Dr. Ted Sommer noted that several of the panelists had identified the need for science strategy input to be top-down, bottom-up, and stakeholder-driven, which is right on target, but little was said about who uses that information and how. He added that Denise had mentioned a concern that some existing staff may not be the right ones to use it, and asked whether that reflected conflict of interest, different skill sets, or independence.
Dr. Denise Reed clarified that science enterprises must be deliberate in creating processes to attract the best and the brightest; in practice this means bringing in multi-disciplinary and different skillsets, from the private sector for example, which can bring in new and creative ideas – as opposed to just career agency scientists. This is particularly helpful for new, short-term challenges that may require expertise that existing staff do not have. “I think that the idea of getting the best and the brightest is the key thing, and having a process that can bring in the right people to do the job. Having the right people and the right magnitude with the right skill sets available when there are emergencies coming – that doesn’t always happen within the structure of a kind of F.T.E. process.”
Dr. Nick Aumen said that it’s a struggle within the USGS; if they hire a permanent person, it’s a 30-year commitment to whatever skill set that person has, so those decisions must be made carefully. However, he noted that things have changed; probably half the staff at the science centers are either term positions or contractors, which gives them the ability to change directions and bring in new expertise. He also noted that almost all the science centers have a pre-arranged contract with a staffing agency, where he can go to a contractor, identify a person, and have them working the next day. “It’s really important to have ways to be able to adjust because if you lock in, you are locked in.”
Question: Best Practice in Data Management Across Agencies and Stakeholders
Jay Lund asked the panel for some examples of practices for data sharing, modeling, collection, and monitoring in efforts that are shared across agencies and partners. There are some examples in South Florida of an integrated modeling center where several agencies came together in one location with their staff to do this kind of work. What are your thoughts on these community technical efforts and their management administration?
Dr. Denise Reed said that after the oil spill, B.P. put $500 million into the Gulf of Mexico Research Initiative. They set up a competitive grants program, and among the requirements they established for receiving funding were very rigorous data management standards and a very rigorous process for release and public access of data through the Gulf of Mexico Research Initiative Information and Data Cooperative (GRIIDC). “On the Gulf coast, the emphasis on this by that particular program elevated everybody. Everybody now does that, and all of the money that is flowing to science as a result of the Deep Water Horizon is now rising to that bar.” This is becoming more common across agencies; for example, NOAA’s ERDDAP is a data server that gives users a simple, consistent way to download subsets of scientific datasets in common file formats and make graphs and maps. In addition, NOAA’s National Centers for Environmental Information (NCEI) host and provide public access to environmental data. Linking grant funding requirements to the use of data standards is critical to elevating professionalism in data management.
Scott Phillips agreed that community modeling or monitoring, with set standards or requirements for comparable information, is a really important foundation. In the Chesapeake, they co-locate to achieve that. For example, in the Chesapeake Bay Program office, only a quarter of the staff are US EPA staff – the rest are from different agencies and universities, and this really helps in making progress in data management and integrated modeling. “There’s a lot of benefit to having community involvement, and if you can co-locate them in one spot, they can really make a lot of progress.”
Dr. Clifford Dahm said that he was involved with a long-term ecological research program that started in 1980; about 15 years in, they decided that data management was critical to the program’s success and put standard protocols in place. The National Ecological Observatory Network requires the use of data management standards, and in California, the Open and Transparent Water Data Act (AB 1755) will also help motivate the use of data management standards.
Dr. Steve Lindley pointed out that federal agencies are now required to make all of their data available online to the public. A group in NOAA developed ERDDAP, a software tool providing a searchable database of over 9,000 datasets that has allowed for easy portability in data sharing. “It’s basically a magic transformer of all kinds of data types that is fairly easy to use, and other people run that on top of their other systems. It can gather data from around the world, and make it readily available to people. It’s really pretty easy to do, and I would offer their assistance to anybody who wants that.”
Question: Guidance for Business Models for Co-Location of Public, Private, Academic Collaboration
Lauren Hastings followed up on the challenge of developing co-location for integrated modeling. One of the issues in the California Bay-Delta system is that top-notch modelers often work for private consulting firms; one of the issues being explored is a business model that would allow those firms to interact with agencies and academics.
Scott Phillips said that while they do have some people co-located, others are located elsewhere. They address that by holding a two-day modeling meeting every quarter, where technical experts come in to discuss progress and what needs to be done next.
Dr. Denise Reed said she doesn’t understand why getting the private sector to work with agencies is a problem. She emphasized the need for a clear understanding up front of the terms and conditions around intellectual property; once that is defined, it is possible to develop the business model. There is a push to work in the open-source, free software environment, and once that baseline is set, it becomes much easier to progress in the agreement. With universities, getting master contracts and agreements established up front enables readiness and nimbleness for emergencies. “By getting contracts, arrangements, and master agreements, and negotiating the terms and conditions up front, so when there is an emergency you can move the money fast; then you’re both ready so that you can be nimble.”