PANEL (Nov. 1): Regional Programs

PANELISTS:
  • Dr. Nick Aumen, USGS Regional Science Advisor, Southeast Region (Florida Everglades)
  • Dr. Josh Collins, Chief Scientist, San Francisco Estuary Institute (Bay Delta)
  • Jon Hortness, USGS Coordinator for Great Lakes Restoration Initiative (Great Lakes)
  • Dr. Bill Labiosa, USGS Regional Science Coordinator, Northwest Region (Puget Sound)
  • Scott Phillips, USGS, Chesapeake Bay Coordinator (Chesapeake Bay)
  • Scott Redman, Program Manager, Puget Sound Action Team Partnership (Puget Sound)
  • Dr. Denise Reed, Chief Scientist, Water Institute of the Gulf (Coastal Louisiana)
  • Dr. Ted Sommer, California Department of Water Resources Lead Scientist (Bay Delta)
  • Dr. Tracy Collier, moderator
Question: Do you think your science enterprise is meeting the needs of the management and policy community from their point of view, not your point of view?

Scott Phillips (Chesapeake Bay) answered no, but said they are trying.

Dr. Ted Sommer (Bay-Delta): “Part of the issue is that a lot of the information they get from us is fairly negative, gloom-and-doom stuff. One of the reasons they know who we are is that we create problems for them. Some of us are doing our best to try to change that by communicating more in the way of solutions. We have some ways to go on this.”

Dr. Bill Labiosa (Puget Sound) also had a mixed answer. “There are certain aspects of ecosystem recovery where I think we are doing a pretty good job of meeting needs, but as soon as you go up a level, is there a science enterprise meeting the needs of Puget Sound recovery? I would have to answer, what science enterprise? We really aren’t working at the integrated regional science enterprise level at this point.” He added that they haven’t really progressed much beyond their individual agency missions and needs in terms of science and monitoring.

Scott Redman (Puget Sound) had two answers. “The first one is, no, we don’t satisfy the managers’ and the policy makers’ interests, because we hear questions such as, how much should we do, where and when? This is a complex system, so the science community is not feeling obliged to answer the question that way, so there’s a disconnect. Then I would say yes, we’re meeting their expectations, because they’re more convinced that we know what needs to be done than the science community is. They’re not so worried about working in the face of uncertainty.”

Jon Hortness (Great Lakes): “From the Great Lakes perspective I think I would agree. Especially under the Great Lakes Restoration Initiative (GLRI), I think a lot of them think they know what needs to be done and don’t need the science. What we’ve had to do over time is get the managers to understand that they don’t have the entire story and that they do need to bring in the science to help inform them as they move forward and adaptively manage. Part of it is I don’t think they’ve asked the right questions, or are willing to ask the right questions. I think we’re getting there over time.”

Dr. Nick Aumen (Everglades) said it depends on the topic. When litigation required establishment of a phosphorus water quality standard, the science was funded by the federal and state governments and the sugar industry – about $70 million spent over seven years on the background science to establish the phosphorus standard. “With other issues that are pressing right now, I’d say no,” he said. “For example, invasive species. We can’t tell you what to do with Burmese pythons. I’m not sure we’ll ever be able to. No on climate change. We’re not there yet on the science, in terms of certain things.”

Dr. Nick Aumen (Everglades) also noted that another successful effort was with the endangered Cape Sable seaside sparrow. They developed a data viewer for their Everglades Depth Estimation Network that the Fish and Wildlife Service could use. “We took six months to do it; we dropped everything and worked on that, and they love it. It really helped to shape what came out of a recent jeopardy opinion.”

Dr. Denise Reed (Coastal Louisiana) said it depends on whether or not they think they are achieving success, which is defined as getting projects on the ground and making a difference in the landscape. She acknowledged that Louisiana doesn’t always have the best reputation for spending money wisely. “A lot of the money that I showed on the slide this morning, while it’s available – the check’s not written yet. A lot of the entities that are the trustees of the Natural Resource Damage Assessment (NRDA) money, they are going to be looking for project proposals that are founded on science. They are going to be even more sensitive to the potential for politically based decision making as opposed to scientifically based decision making, just because it’s Louisiana.”

Dr. Reed (Louisiana Coast) noted that it’s a definite yes for the 2012 Coastal Master Plan, but this may not be so for the 2017 Coastal Master Plan. “We put climate change in it in a big way. That red map [showing the effects of sea level rise on the coast] is not going to be very easy for them to deal with,” she said. The next hurdle is when these big projects go into permitting, as NOAA and other agencies will be looking at the scientific basis for what’s being proposed. “I think at that point they’re going to recognize the value added that we’ve provided, but that’s yet to be proven,” she said.

Question (Dr. Peter Goodwin): In that last round, several of you mentioned the uncertainty in these systems. We’re always going to have that uncertainty. From your perspective as leading scientists, what can you put in place to support agency directors, the people making the decisions, to give them some of the backing to help them make these very, very difficult decisions in the face of uncertainty?

Dr. Nick Aumen (Everglades) said this has been an ongoing discussion with their lead policy person. “Her view is that the way scientists look at uncertainty is very different from the way managers look at uncertainty. She said, ‘Most of that stuff you say is uncertain, but there’s plenty of information I can make decisions on.’ The dialogue has to happen in the way we couch uncertainty. We’re going to do that with the management community in our work on climate change and modeling precipitation. We’re never going to be 100 percent certain on forecasting. We’re finding common language that makes sense to managers and that scientists can live with, which will communicate that better.”

Dr. Collier (moderator) turned next to Scott Redman, noting, “I think you were saying from your perspective managers can handle more uncertainty than we think they can.”

Scott Redman (Puget Sound): “Well, I think they seem to be. They don’t share the same understanding of the uncertainties, or maybe they’re just comfortable with that level of uncertainty. The other thing I thought of in hearing the question was having project proponents and reviewers talk about the likelihood of success, then rolling that up to the likelihood of success of the entire plan, the portfolio of action. I think there’s some room there where we could try to bring in some kinds of uncertainty.”

Scott Phillips (Chesapeake Bay) said they try to present the information in terms of strengths and weaknesses so they can get an idea of how those might affect a particular option. “That helps them, perhaps, narrow down what they feel more comfortable with; we try to say it in that context.”

Dr. Denise Reed (Louisiana Coast) said the best way they communicate that is through fairly simple scenario analysis.  “The general approach is ‘hope for the best, but plan for the worst’ kind of thing. The key message is that we really try not to say that there are any guarantees. I think it’s really about how we communicate our findings and what the results are.” Dr. Reed said that often in a public meeting, people on the other side are expecting guarantees. “Our job is to educate the managers that that’s infeasible from both sides.”

Dr. Ted Sommer (Bay-Delta) said that it helps to have a moderately broad group of people weigh in on the issue. “It’s less effective when one of us goes to one of our managers or directors and says something, whereas with the Interagency Ecological Program, it makes a difference when you have nine agency groups all saying, ‘Yeah, there’s uncertainty but this is what looks like it’s worth doing.’”

Jon Hortness (Great Lakes) agreed that having a broad group coming together with the same agenda helps. He also said they sometimes take a portfolio approach if there are several potential projects that could address a specific issue. “Some may be real high risk but high reward; others may be relatively low risk, so maybe you don’t get quite as much bang for the buck, but you have a pretty good confidence level that it will succeed. We look at it from that portfolio view of several different levels of uncertainty.”

Dr. Denise Reed (Louisiana Coast) recalled that in the work she has done in the Bay-Delta, there were situations where the discourse was tense over decisions founded on a statistical analysis, such as Old and Middle River (OMR) flows and turbidity habitat for smelt. “There’s a very complex statistical analysis that’s foundational for the decision. Inherent in our understanding of statistics is that you can quantify the uncertainty around that. I do think that in several circumstances we get ourselves in trouble by relying on the line and not the scatter around it. We have to be careful about that. I wonder if any others have major decision points within their programs that rely on a specific statistical analysis that is then subject to endless scrutiny. I think it’s perhaps not the way to go to begin with.”


Dr. Josh Collins (Bay-Delta) said he has had the experience where an interdisciplinary group of scientists with a coherent perspective addressed people with the authority to make real decisions. “We’re asking them to make trade-offs. Their decision was not based so much upon statistics, but it was based upon their willingness and ability to manage the uncertainty in front of them within their budgets and get to the next step. They were willing to make a step forward without a forever commitment, but to try something a little further together. It was really decisions based upon a common professional agreement more than anything legal or very binding… It’s having the numbers but not relying on them so much because they don’t translate one to another. The rectification from one unit of measure to another seems to be something that’s borderline emotional.”

Dr. Bill Labiosa (Puget Sound) said they are often asked questions framed from the engineering perspective, but there are certain aspects of the system where it is definitively not an engineering problem. “Complex adaptive systems are inherently unpredictable. It’s not, do I have uncertainty in my prediction; it’s that complex adaptive systems just cannot be predicted over the time frames of adaptation. My point being that we have to figure out how to talk about uncertainty in a useful way in these contexts. We still have a State of the Sound report that tells the legislature how ecosystem recovery is progressing in the context of the paradigm that they hold – the engineering paradigm… I would argue we have to figure out how to answer back within the complex adaptive systems paradigm in a useful and clear way. Uncertainty has multiple interpretations in the complex systems paradigm.”

Question: What science tools would be really useful for your system, and how would those be useful across other systems?

Dr. Nick Aumen (Everglades) said they have an effort called Joint Ecosystem Modeling that attempts to take some of these complex ecological models and bring them down to the level of a desktop viewer that anybody in any agency or entity can use to solve complex problems. The Cape Sable seaside sparrow viewer they developed took the needs of the Fish and Wildlife Service and put them into a desktop application. “It draws on very complex background information but makes it very usable. I think there are some approaches like that that can be used as examples across some of these programs.”

Scott Phillips (Chesapeake Bay) noted that a lot of the models that are developed don’t do a good job of transferring information across different ecosystems. “If we had a more collective approach saying we need ecological models to look at species groups A and B, and developed that as a consortium so that we can apply that model in any of these coastal systems, we’d be so much further along. That’s what I see as a big limitation. Whether it’s a model or a web service or a web viewer, there’s too much individual effort in a particular system and not enough collective approach on this.”

Scott Redman (Puget Sound) said that some of the models from the Chesapeake sound very similar to some of theirs, and he thinks he could learn from them. “I was inspired by hearing about their goal teams and how those are interdisciplinary, where the scientists and the people making management decisions are working together. We implement that sort of thing, but we tend to do it on a more ad hoc than standing-committee basis… The other is synthesis. We’ve tried things like that; we have taken a 700-page document and brought it all the way up to a two-page management implications document, but we haven’t, even in our own system, replicated that through time and through all the topics.”

Dr. Denise Reed (Louisiana) said that she uses the EverView system, which was developed in the Everglades and is an example of something that could be more widely used. She said that conceptual models could also be of great utility if the programs could come together on a common way of approaching them. Dr. Reed also noted that there’s a lot of good work on synthesis and report cards across different systems. “It happens more through professional networking than it does through organized dialogue amongst enterprises.”

Dr. Collier (moderator) noted that in his experience in systems where there are many stressors, science synthesis is important for distilling the issues down to something that people can understand. “My view is we don’t have many systematic ways to do that across systems. We haven’t come up with a way. I think each of these systems, the more stressors that they’ve got to deal with, they’ve got to have a better way of achieving the synthesis.”

Dr. Nick Aumen (Everglades) said that the Everglades is a multifaceted system with a lot of stressors. They developed conceptual ecological models for all of the systems. They also started out with 960 indicators and were able to pare that down to 14. He acknowledged it was a lot of work. “In the end, we used the phrase, ‘get the water right.’ That has four elements: quality, quantity, timing, and distribution. When folks go to Congress to sell that, that’s an easy message, even though underneath it is a tremendous complexity of things. If you have a theme you can focus on that’s very understandable, and you can boil it down to that and convey it consistently and effectively, it really pays off.”

Dr. Ted Sommer (Bay-Delta) pointed to the need to develop a nexus to the social sciences. He noted that it is relatively easy for the water supply folks to quantify the effects of a loss of water supply or for flood control folks to quantify the effects of a flood. “What we don’t do a good job of is quantifying the value of a lot of the other resource issues that we deal with; therefore, we don’t have a way to have a dialogue about trade-offs; we don’t have a way to quantify what the costs of inaction are. It’s easy to look at habitat restoration and see this is going to cost a billion dollars. What we don’t quantify is the cost of inaction.”

Question: In your experiences, have any of you utilized any structured decision making efforts in your particular systems?

Dr. Bill Labiosa (Puget Sound) said they have tried to use Structured Decision Making (SDM). It’s a resource-intensive process if a program routinely uses it, and there are many contexts in which it could be applied. They have ended up using it, but in an incomplete way. “One is a prioritization of our recovery plan at a fairly coarse scale. We used a combination of expert elicitations but within a prioritization framework that frames the question as a decision. The problem that we ran into is when you approach the decision makers about expressing preferences across programs and major issues, they don’t want to do that; they wanted the science panel to take a first crack at expressing their preferences for them. That was a tough hurdle to get over, so we decided to split the problem into a number of cases so that we didn’t have to impose our own values into the prioritization exercise. This generated more than one set of prioritized actions, grouped by major issues. Multiple lists of priorities are still useful, but we can’t compare across them without some expression of importance from the decision makers.”

“Another example is a pressures assessment we did for Puget Sound recovery where we tackled the uncertainty using techniques from the decision analysis literature; we treated uncertain variables using Bayesian probability theory and used expert elicitation to create an analysis that could be updated as more information comes in; it could also be plugged into a decision analysis later to allow us to prioritize between the stressors. It’s a tool that’s there for the using but… we can’t get the decision makers, necessarily, to express preferences in the way that we need to take it to the next level,” continued Dr. Labiosa. “Perhaps part of the problem is the way we’re approaching it. I think we need to approach the decision makers differently, frame the problem in a more useful way. Maybe a decision analyst shouldn’t be running the process. Keep that guy in the other room while emissaries get the needed information in a more palatable way.”

Dr. Josh Collins (Bay-Delta) noted that he’s experienced problems when developing a structured system for a particular permit, such as the Army Corps 404 permit and a system for evaluating potential compensatory mitigation sites in the watershed context. “It sounds good, but once you do that, you’ll find out that they’ve locked down and run the seven metrics they chose the first time and they won’t consider any more,” he said. “Once you meet the requirements that the manager wants, they want to use the same system over and over and over again, even if they move it 1,000 miles.  Managing uncertainty is keeping the uncertainty in front of people. Having group decision around data and enabling them to make another decision to move forward but not the final decision. Just keep advancing the debate. I’ve come to the conclusion that these structured decision support tools are about advancing public debate more than actually solving something, just keeping it going.  The big question is what’s the biggest step you can take and what data do you need to take that step and get to the next one?

Dr. Nick Aumen (Everglades) then had two examples to share. The first was when the Loxahatchee National Wildlife Refuge was very concerned about the Burmese python. The python hadn’t arrived yet, and the manager of the refuge wanted to know how much he should invest in keeping the python out of the refuge. So they convened a structured decision making process with experts over the course of four days, and in the end, the decision was that it wasn’t worth the investment; the population model showed the python would be there no matter what. “The manager was saddened by the news, but also very appreciative, and felt it was a well-justified decision that that’s not where the priority money should be spent right now for their resources. I thought that was a very good application.”

However, they convened a similar process for another invasive species, again a four-day structured decision making workshop with senior managers from state and federal governments. “I won’t tell you all the outcomes of it, but the downside of that exercise was that the managers said, ‘Don’t ever invite me to one of these again.’ They said, ‘Not that we didn’t appreciate the process and the deliberation, but we could have arrived at that decision in half a day rather than four days.’ Whatever we end up doing, the managers’ view is that it’s a useful process but it’s got to be applied very carefully. Don’t do it across the board; do it on really important, tough problems.”

Scott Redman (Puget Sound) said they were inspired by watershed-scale work that had been done in the Willamette Valley in Oregon, so they applied their version of structured decision making to the sub-Puget Sound efforts called Local Integrating Organizations (LIOs). With the help of a social scientist, they designed a process and guidance for how these groups could go through a structured decision making process in building their local ecosystem recovery plans. Few groups were willing to pilot it, but some did. “We were overlaying structured decision making over these frameworks of open standards for the practice of conservation and integrated ecosystem assessment. It worked pretty well.”

Scott Phillips (Chesapeake Bay) said they brought in consultants who were well-versed in structured decision making, and they found that it might be useful for individual topics where there is a clear decision maker such as water quality and EPA. However, they found that with a lot of outcomes, there isn’t a clear decision maker because so much is done by consensus. “They said, ‘I can come up with some options for you, but since you need so much consensus between federal and state agencies I don’t know if you’re ever going to work it out.’ That was a limitation we saw.”

Question: All of your systems have something to do with the federal government. For Florida you mentioned one of the big advantages you had there was a FACA exemption. What are two ideas for things that would make management of your system, or the science of your system, much easier in terms of changes in federal policy or administration?

Dr. Nick Aumen (Everglades) noted that Florida government is subject to the Sunshine Law, which prevents deals from being cut in smoke-filled back rooms, but at the same time it creates hurdles to communication between scientists and policy makers. His colleague Jayantha Obeysekera has a governing board of nine people, and he can’t brief the board members at the same time. He has to do nine separate briefings, because if two or more of them are in the same room at the same time, it has to be a publicly noticed meeting with a published agenda; it’s a real challenge.

However, pooling budgets is a goal of his. Where there are common purposes and goals, especially in an era of declining budgets or at least flat funding, if agencies could agree to pool budgets and govern science that way, they could make more progress. “We’re not there yet in the Everglades. I’m not sure if other places are able to do this yet. It doesn’t have to be your whole budget for science, but get each entity to dedicate a small portion, or some significant portion, and have that governed by an inter-agency group where you’re really building on the strengths and capabilities of those individual people.”

Jon Hortness (Great Lakes) said that the efforts on Asian carp are coordinated at a regional level; the agencies bring their resources and their funding to the table and they determine at that time how they will address that.

Scott Phillips (Chesapeake Bay) noted that after the Chesapeake Bay Executive Order 13508, they worked hard to come up with a strategy and a role for each of the federal agencies; they then identified where the gaps were. They went to the Office of Management and Budget (OMB) to describe those gaps and the need for a pooled budget concept. Unfortunately, the OMB could not figure out how to work with a pooled budget. “They just said, ‘We’ll do a piece in this department, a piece in that department.’ It defeated the whole purpose of trying to come up with a pooled budget and being able to collectively say how we could use that money.”

Dr. Denise Reed (Louisiana) acknowledged the problems that can be encountered when working with the federal government, particularly the USACE, in getting them to be on the bigger team. “If you’re not in the federal agencies, they kind of just ignore you.  What about the rest of the people that have an interest in this system? I’m not sure what would have to change there. I think it’s the bureaucracy within which they live. I think it’s more than just budgeting. It’s about our money and your money and that’s not good. It’s really about our issue. … Stu Applebaum used to show that slide of leave your hat at the door.  Well maybe that worked at some points but I don’t think it really works all the time.”

Dr. Collier (moderator) then turned it back to Dr. Jay Lund, who asked the original question: “From your perspective, how would you like to see the federal government be a better partner?”

Dr. Jay Lund (audience) pointed out that these problems are all bigger than any one agency. To some degree, the state agencies are no different than the federal agencies. “The problem doesn’t care about our agencies; the problem doesn’t care about the territoriality that has been established in enabling legislation and the budgets. I think it’s just a really interesting problem that should be kicked upstairs, at least a little bit. One problem we have here a lot in doing science is permits … getting permits with endangered species to do science. That’s a problem at the federal level that prevents us all from being able to do science in this system.”

Jon Hortness (Great Lakes) said that as the Great Lakes Restoration Initiative (GLRI) evolved, some agencies had inherent expectations that they would receive funding to do their work, but as the second action plan developed, not everyone got the money they expected and there was some tension over those issues. “What it finally came down to was the federal agencies as a group made it a point to say, ‘These are the priorities that we have come up with; these are the directions that we have determined we need to take. If that doesn’t fit with the way you think you need to do your work, you have the base program to do your work as well.’ They’re not easy conversations. They’re difficult conversations, but I think we’ve addressed that to some degree. I don’t think we’re there yet, but it has come up several times and we’ve gotten past some of that.”

Dr. Josh Collins (Bay-Delta) said that in California, federal money hasn’t come from pooling budgets, but instead from intense lobbying. Intensive lobbying secured Water Resources Development Act (WRDA) funding for science in the bay that lasted nearly seven years. There is a water quality improvement fund, a special allocation that came about through the efforts of Senator Dianne Feinstein; the money goes through the EPA’s estuary program to about 150 recipients, some of which is for science. In Tahoe, President Clinton sold off some federal lands around Reno and Las Vegas and used the proceeds to fund the Tahoe Regional Planning Agency to do science. “I think it takes political pushing, actually, to get special allocations that can persist for some length of time. That’s one way that I’ve seen fairly large amounts of money flow to different regions of this state that persisted for more than a decade.”

Scott Redman (Puget Sound) said that there’s a lot of promise that the Puget Sound will get to the point of having an interagency task force and working groups. “Just getting to that place would be helpful. I’m very intrigued by the Interagency Ecological Program (IEP). A question for Ted Sommer is, what works and what could work better with IEP?”


Dr. Ted Sommer (Bay-Delta) said, “We do a good job developing a broader voice to get policy back. We do compare resources and strengths, and I think that helps us leverage, avoid overlapping, and avoid wasting too much. We’re not as good, necessarily, at doing collaboration as we’d like. There are certain pots of money that have certain flavors or colors with different agencies. Some of those are just plain hard to put in a common pot to be used by everyone. One of the bigger needs we have is a funding source that the group, as a whole, could use more clearly. The last thing I’ll mention is outreach. There are a lot of organizations that do a whole lot better job of outreach than we do. We have a long way to go with sharing our data and getting our data out to the public.”

Question: In the science process, since many of our issues come out of litigation, and much of that litigation is about how the science is done, how do you engage those stakeholders who don’t necessarily have the resources to participate, but could have the resources to litigate, in the development of science that’s collaboratively developed and implemented?

Dr. Nick Aumen (Everglades) said the workshops that they’ve used in their two major restoration efforts have been successful. They held workshops where anyone could come in and put an idea or concept on the table, and the scientists and engineers would review the proposal, come back and say why it would or would not work. “In the end we developed a consensus alternative that’s now just recently been authorized by Congress. I don’t want to say this is an easy process. It was extremely labor intensive and also very expensive. It took agency employees weeks of their time dedicated just to this process… I would think many of those folks that were most opposed to this before would say this process to date has been the best way to bring them into the mix.”

Dr. Denise Reed (Louisiana Coast) said that there are often multiple ways of doing things. One might agree to pursue a number of different paths together and understand what the different assumptions and approaches were, the pros and cons, and what those different paths might illuminate. It’s also important to have some sort of external review to ensure the credibility of the analysis; not necessarily peer review, but some sort of external eyes. They used over-the-shoulder reviews quite successfully. “You also have to be careful that you are doing something which has scientific merit and credibility and is going to stand the test of some third party looking at it. I think talking is good and collaborating is good, but that’s not the whole process. The outcome should be the solid science, not just the collaboration.”

Question: About the role of citizen science and traditional ecological knowledge, are the other systems making systematic use of those things? We are in Puget Sound, for example.

Jon Hortness (Great Lakes) said that in the Great Lakes, they are trying that but they haven’t gotten there yet. It’s especially an important component with the tribes and the First Nations and traditional ecological knowledge. “We’re just starting that conversation… at the federal level we are trying to determine what does that look like and how do we incorporate it.”

Dr. Josh Collins (Bay-Delta) recalled how when they were working on the historical ecology reconstruction, they realized they were drawing pictures of somebody else’s landscape, so they invited indigenous people in to interpret the landscape. “I began to understand that there was a cultural landscape. We are now trying to figure out how we build cultural story and cultural understanding into the landscape maps.  Frankly, one of the things we’re discovering is that the whole landscape is so rich with cultural understanding that there’s nothing left out. We began thinking we’ll do this in order to redirect roads away from cultural resources, but it’s all a cultural resource.”

Dr. Collins added that there are two tribal people getting their PhDs in environmental science who are managing the duality between their way of understanding the environment, built on 12,000 years of trial and error, and western experimentation. “I think it’s a very rich thing to explore. We’re trying to; we’re just beginning. Nowhere near where you are with First Nations people, but we’re not shying away from it around California. We’re starting to look into it.”

Question: A lot of the science enterprises, to me, seemed to be about these large-scale, broad monitoring programs, which I think are great. As you’re starting to get some of these large-scale restoration projects on the ground, how are you meshing the project-specific monitoring with the broad-scale, more ambient monitoring? Are you tying into those broader monitoring mechanisms, or is the monitoring still specific to the project?

Dr. Denise Reed (Louisiana) said that while monitoring is important, monitoring should not be considered science, as it’s just data collection. “I don’t think that’s science; it’s not science until you use that data in some kind of analytical process. We have to be really careful, especially when we think about the funding for science, that that science is actually advancing knowledge, not collecting data. They are not the same thing.” Dr. Reed referenced Dr. Sommer’s earlier slide showing the large amount of funding spent on compliance monitoring of the state and federal projects. “If I were you in IEP, I’d carve off that money and put that in a separate pot and then say, ‘Okay, we’re going to learn something with the rest of the money.’ We might lean on that data, but that data collection that has to be done to comply with the project should not be considered funding for science.”

Scott Redman (Puget Sound) said it’s best to remember the question they are trying to answer. It’s something they are struggling with. They haven’t really invested in monitoring in a while. “We are getting to grow towards effectiveness assessment but it’s a challenge. The programs who invest money, do they have questions about their investments? That’s kind of our answer.”

Dr. Josh Collins (Bay-Delta) said he wouldn’t carve data collection away from science, but he does believe in science turning data into information. “It’s an essential step and without that, I don’t think data has much value at all. It’s the minimum amount of science.”  Effective monitoring and ambient assessment in order to show effectiveness and permanence across projects has become important. “That’s the way to judge the effectiveness of whole policies, in fact, and programs. Is the Clean Water Act effective?  You have to have projects, to some extent, monitor in a way that’s consistent with an ambient monitoring program so you can compare projects to each other, to themselves over time, and to ambient conditions. The two go hand in hand.”

Jon Hortness (Great Lakes) then gave two examples of where monitoring has been part of the science and not just status indicators. He said that in the Great Lakes, they have both long-term ambient monitoring for status and trends and project- or topic-specific monitoring that is more focused on effectiveness and adaptive management (GreatLakesMonitoring.org). One example is monitoring the effectiveness of agricultural BMPs to see if they change the amount of nutrients running off the field. The monitoring data is put into models by USGS and NRCS modelers and used to inform decisions on where to place BMPs, and which types, in specific areas. Another example is the connected wetlands, the coastal wetland areas where the lake meets the nearshore. Monitoring has been important to determine where they can gain the most impact on specific wetlands. They use monitoring and modeling, then analyze that across the entire system, looking for other high-impact areas where additional wetlands could start to recover.

Dr. Josh Collins (Bay-Delta) added that with performance monitoring, they know it can’t be done in all the projects. “So we do try to pick what we think are representative areas so we can transfer and translate that information to other projects that are in a similar hydrogeomorphic setting.”

Scott Redman (Puget Sound) noted that the Puget Sound Vital Signs include human well-being vital signs, so they will be relying both on objective data collection, such as Bureau of Labor data and the GDP from natural resource industries, and on subjective surveys of people’s well-being, their sense of place, cultural practice, and other factors. “Those will just be data streams; we’ll need the social scientists to help us make sense of that.”

Dr. Nick Aumen (Everglades) said that monitoring and data are a very important component of science; it’s a three-legged stool: monitoring, research, and modeling. “I would challenge anybody to do a good model without data. I would also challenge someone on climate change. Where would we be without some of the long-term data we’ve collected that helps inform where we’re going to go in the future? The answer is that there needs to be a balanced approach; it needs to be the right combination. If you indeed put all of your information or your money into collecting data points, you certainly are not going to advance the science. The problem we all face is, when budget cuts happen, what goes first?” Oftentimes, it’s the research, the modeling, or the predictive work; compliance monitoring usually can’t be cut as it’s mandated. “The challenge we face is getting resource management to the point where there’s equal recognition of the importance of all three legs of that stool.”

Dr. Denise Reed (Louisiana Coast) had the last word. She agreed that effectiveness monitoring is important. “It’s compliance monitoring that is a permit condition or something like that that should not be part of the science enterprise, the science budget. It should be a resource that the science enterprise can lean on, but if the project doesn’t go ahead without that monitoring, that’s not the same thing.”
