Special feature: Prominent scientists discuss “collaborative science” at Interagency Ecological Program Annual Workshop

The Interagency Ecological Program’s Annual Workshop was held April 24-26 at the Lake Natoma Inn in Folsom.

“Science, once siloed and hyper-competitive, is becoming increasingly collaborative in the face of the high cost of research and the need for speed in discovery,” wrote the Reuters news service in its 2012 Annual Report.

Indeed, “collaborative science” is the new buzzword that everybody’s using, but the term can mean different things to different people.  Since the Interagency Ecological Program (IEP) recently adopted new guiding principles stating that it will provide “collaborative science leadership,” a common understanding of what the term means is needed now more than ever.

So what is collaborative science and how can it be implemented?  At the IEP’s Annual Workshop held last month, a panel of prominent scientists gathered to share their vision of what collaborative science is as well as what it isn’t.

The moderator was Bob Lohn, former Regional Administrator for the National Marine Fisheries Service’s Pacific Northwest region.  Seated on the panel were Byron Buck, Executive Director of the State and Federal Contractors Water Agency; Dr. Peter Goodwin, Lead Scientist with the Delta Science Program; Tina Swanson, Director of the Science Center at the Natural Resources Defense Council; Tim Vendlinski, Bay-Delta Program Manager for the EPA; Jeff Keay, Deputy Regional Director for the Pacific Region of the USGS; and Ren Lohoefener, Regional Director for the Pacific Southwest Region of the U.S. Fish and Wildlife Service.

Bob Lohn began by giving a sort of reverse Miranda warning: “The statements by these parties cannot be held against them.  They are their personal opinions and shouldn’t be attributed to their agency – unless the agency wants to claim them – or the speaker expressly says it’s the position of their agency.”

Panel moderator Bob Lohn

The Interagency Ecological Program logo bears the phrase, “Cooperative Ecological Investigations since 1970”, which implies people working together to understand the ecosystem; however, according to the new guiding principles recently adopted, the IEP will provide “collaborative science leadership.”  Maybe this is just a modernized version, but the undertaking implies a lot more, said Mr. Lohn.

“It implies to me, at least, the IEP is proposing to use its knowledge, its structure and its professional relationships to not only gather and produce more accurate information about the species of interest in this ecosystem, but also to, through leadership, provide an example to the region about how to address and resolve these very difficult issues,” said Mr. Lohn.  “It’s a fundamental change in position that is being proposed here.  It’s not just ‘let’s get along better.’  The leadership component really moves this to a different level.”

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“To the extent that one side or another, that the science steps too far into policy or that the policy is using science to pretend that it’s the sole basis for the decision, I think great confusion occurs and great angst then plays back upon the scientific process … What you want is a process that is free to turn up the facts, such as they are, however they come, without fear that somehow it’s going to torture a policy decision one way or another.” –Bob Lohn, moderator [/pullquote]

Mr. Lohn said that he’s seen it succeed elsewhere so he knows it can be done, but it’s definitely hard work.  There are many reasons why this is so, he said.  “Fishery science is not rocket science.  It’s much harder,” he said.  “These are not a series of easily replicable static physical environments in which all of the components remain nicely in place and you put some chemicals together and some forces together and you measure the results.  You’re dealing with a very dynamic system in which the interactions are huge, moving parts are many, it’s very difficult to isolate key factors.  This is really hard work.”

Secondly, the work is being done in the context of the Endangered Species Act, which is not always a friend of science, he said.  “The ESA requires decisions to be made based on the best available scientific and commercial information.  That’s not the same as saying you can wait until there is good information available or that you can ignore information because it is skimpy and may not be reliable.  Rather, it says take what you have and make a decision,” he said, noting that this isn’t the real problem with the ESA.

It can be tempting as a decision maker in the face of uncertainty to say ‘the data made me do it; I just didn’t have a choice here,’ said Mr. Lohn.  It’s important to articulate where the appropriate role for the science is, and where the appropriate role for policy decision making is, and to be candid about where that line lies.  “To the extent that one side or another, that the science steps too far into policy or that the policy is using science to pretend that it’s the sole basis for the decision, I think great confusion occurs and great angst then plays back upon the scientific process,” he said.  “What you want is a process that is free to turn up the facts, such as they are and however they come, without fear that somehow it’s going to torture a policy decision one way or another.”

Mr. Lohn then told an anecdote:  “There was a carpenter, a teacher, and a scientist traveling from London to Edinburgh, Scotland, and as they crossed the Scottish border, the carpenter looked out the window and said, ‘Look!  Scottish sheep are black!’  The teacher said sort of evasively, ‘I think we should say some Scottish sheep are black,’ and the scientist glumly shook his head and said, ‘All we know is that there is one sheep in Scotland, and one side of it is black,’” he said.  “Now I’m not suggesting you confine your knowledge quite that narrowly, but being candid about what you don’t know and exposing the boundary at which surmise or policy is playing is really important to this.”

Thirdly, we belong to different institutions and we have divergent institutional views, which carry with them an element of bias or interest or fear or alignment.  “Learning how to set that aside and finding the respect and freedom to do that is really important to the success of collaborative science.”

It can be tempting to resolve factual disputes in a court of law, rather than on the ground with empirical information, he said, but there are definite limitations of that.  Judges are used to pronouncing something and that makes it so, which works well in contract disputes but not so well in biological matters.  “The judge rules, bangs his gavel, pronounces the facts such as he sees them, and the fish don’t care.  The fish don’t care what the court ruled, the fish don’t care about your goodwill, the fish don’t care about your good work, the fish only care if the ecological systems they need are present, and if they aren’t present, the fish don’t care.”

Moving beyond simple measurement and working towards solving the underlying issue is often the best path to collaboration.  “It’s one thing to say I am gathering information to defend a position; it’s another thing to say I’m working with you to find a solution so we can resolve this factor in the ecosystem that is limiting these critters.”

You are part of a profession in which these kinds of problems are both ordinary and historical, Mr. Lohn said.  “I was struck by the words of a man, James Conant, who struggled with these issues throughout his career, and here’s what he wrote:  ‘The stumbling way in which even the ablest of the scientists in every generation have had to fight through thickets of erroneous observations, misleading generalizations, inadequate formulations, and unconscious prejudice is rarely appreciated by those who obtain their scientific knowledge from textbooks.’  You are now part of that group, stumbling, fumbling, challenging, suffering, but the truth will emerge, I am confident.”

Mr. Lohn then turned to the panelists to begin the discussion.

REN LOHOEFENER

Ren Lohoefener
Ren Lohoefener is Regional Director for the Pacific Southwest Region of the US Fish and Wildlife Service.  He oversees programs in California, Nevada and the Klamath Basin that administer the Endangered Species Act and the Migratory Bird Treaty Act, and manages 51 national wildlife refuges and three fish hatcheries.  He has been with the USFWS since 1989.  Mr. Lohoefener earned his doctorate from Mississippi State University.

Moderator: Ren, what would you say collaborative science looks like and what is your vision for this region as we try to achieve it?

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“For collaboration to succeed, for people to succeed in the collaboration, everybody has to come with a vested interest.  And I think that’s pretty much true, because everybody has to want collaboration to succeed.” –Ren Lohoefener[/pullquote]

One of the things I find interesting, as I continue to work to form collaborations, whether in the Klamath Basin, the Bay-Delta or the Salton Sea, is that there are all different forms of collaboration, and the literature on collaboration is growing, said Ren Lohoefener.  “I don’t know that I was looking, but 10 or 15 years ago, if I had gone to the literature and looked for master’s and doctoral degrees where the thesis or the dissertation was an analysis of collaboration and collaborative techniques, I don’t think I would have found much.  But now, virtually every university out there is turning out graduate students who are studying collaboration, and it’s really amazing to me.”

When you start looking at what collaboration means to people, the range is huge.  “Collaboration can be pretty much whatever you want it to be,” he said.  “I think sometimes we benefit from failed collaboration in the fact that we got people to get together to talk and share ideas, and maybe it didn’t bear fruit and many collaborations don’t, but maybe that in and of itself is success.”

The IEP is a great example of successful collaboration and hopefully it will be an example of maturing collaboration, said Mr. Lohoefener.  Certainly the needs and views of collaboration weren’t the same as when the IEP started 40 years ago.  “Today, we feel increasing pressure to expand this collaboration to include stakeholders, to have conversations right from the beginning about what are we going to do, how are we going to do it, if it bears results, how are the results going to be used, especially in the regulatory world. … We set priorities for research for IEP in a collaborative fashion; nevertheless, it has to be heavily weighted towards answering the pressing regulatory questions that we’re dealing with now because they are eating us up.”

“Is it collaboration when you start by saying, we’re going to do research, and we’re going to agree on research, but it has to be weighted towards answering the regulatory questions?  Is that collaboration?  I don’t know, but I feel strongly that collaboration, in terms of the IEP, has to start from that point.”

One example of successful collaboration is the Upper Colorado River recovery program for four endangered fish, which involves a number of states, Native American tribes and water interests; it was started in 1987 and has been hugely successful.  “When I talk to Tom Pitts, who has been instrumental in the success of that program, and ask him, Tom, why has the Upper Colorado River program succeeded, he says, Ren, it’s simple.  Vested interests.  For collaboration to succeed, for people to succeed in the collaboration, everybody has to come with a vested interest.  And I think that’s pretty much true, because everybody has to want collaboration to succeed.”

TINA SWANSON

Tina Swanson
Tina Swanson is Director of the Science Center at the Natural Resources Defense Council, where she oversees the center’s dual mission of expanding NRDC’s scientific capabilities and increasing the visibility of the organization’s environmental advocacy efforts.  Prior to joining NRDC in 2011, she spent over 10 years at the Bay Institute, where she was named Executive Director in 2008 and led a number of diverse programs, including a successful campaign for the organization to acquire the Aquarium of the Bay.  Ms. Swanson holds a PhD in biology from UCLA.

Moderator: Tina, what do you see as collaborative science, and what is your vision for it?  What role are you and your organization willing to play in achieving collaborative science?

“I think every single scientist in this room understands what collaborative science is, but I think a number of us don’t understand what is really being meant by this particular term and this collaborative science process that is being proposed for the current planning and ongoing management of this particular system,” she said, “because as far as I can interpret that, really, it’s talking about the collaboration with non-scientists, managers, policy makers and stakeholders, and that, I believe, is something different from collaborative science.”

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“Many of us in this room have probably seen examples where two different parties can look at the same data and the same graph and come to wildly different conclusions, and those conclusions are not based on disagreement about the science; they are based on disagreement about whether or not to believe the science, and I think that’s a really serious problem we have going forward.”  –Tina Swanson[/pullquote]

Making progress on collaborative science would be a valuable thing for the system, but I believe there are some essential foundations necessary for this to work, she said.  “The way it is currently being described gives me the impression that the premise for setting up a collaborative science program to try to deal with this very complicated system is that in fact we don’t understand anything about this system and we don’t have any good science and therefore we need to do science in order to be able to manage and to fix the system, or at least manage it in such a way that it meets our needs and our goals, some of which are legal and some of which are resource.  And I think that’s a very faulty premise.”

The very first element of a successful program is going to have to be the recognition and the utilization of the vast body of existing science that we already have on the system, she said.  “The reason we need to do that is that unless you start from the existing body of understanding that you have, you’re not going to be able to develop useful questions to address with your scientific processes, and you’re not going to be able to identify those problems that you need to address.”

The second essential element of a strong foundation for the process of collaboration between science and management and stakeholders is to come to an agreement as to what constitutes sound, credible, useful science at the beginning.  “Some of that is recognition of the existing science, some of that is an agreement as to what constitutes a well designed experiment and analysis, as well as what are the criteria that you’re going to be using to evaluate and analyze those data and those scientific results.”

“Many of us in this room have probably seen examples where two different parties can look at the same data and the same graph and come to wildly different conclusions, and those conclusions are not based on disagreement about the science; they are based on disagreement about whether or not to believe the science, and I think that’s a really serious problem we have going forward.”

There also needs to be some agreement on what is meant when scientists and non-scientists are talking about uncertainty versus disagreement.  “A lot of the drive to establish this collaborative science program is based on people claiming that they disagree, or they think that the existing science is highly uncertain, and/or they disagree with the results.  From a scientific perspective, those two terms mean something totally different from the ways they are being used by the non-scientific parties in the room,” she said.

“Uncertainty is something that we can measure and statistically analyze with quantitative data.  Disagreement is when you have two different scientific studies that disagree with each other, and how you address it is by digging into the studies and figuring out what each one has missed,” she said.  “What we see instead is a lot of disagreement among the parties about what is going on in this system.”

For example, you can have a regression analysis showing a relationship between two variables, and scientists will say it’s highly statistically significant and stakeholders will say they don’t think that relationship is important.  “If you have that kind of sort of discontinuity among the parties in this collaborative process you are not going to get anywhere, and if that is allowed to perpetuate in this process, I think this collaborative science process is at great risk of not being successful.”

“The major problem that we have been having with science in this system is not the science itself, but the application of the science in making decisions,” she said.  “Perhaps that can be done with collaboration, but I haven’t seen it very much.”  Organizations like NRDC are an important middleman in the process, partially translating, integrating and synthesizing the science, and then developing and advocating for policies, programs, or actions, she said.

TIM VENDLINSKI

Tim Vendlinski
Tim Vendlinski is Bay-Delta Program Manager for the Environmental Protection Agency.  He has been with the EPA since 1984.  His numerous accomplishments include working in the early 1990s to establish the scientific basis for the X2 salinity standards, a key underpinning of the Bay-Delta Accord.  He holds a degree in Environmental Policy and Planning from UC Davis.

Moderator: How do you define collaborative science, what does it look like from your perspective, what’s your vision for it, and how do you see EPA using collaborative science?

When he first heard the term ‘collaborative science,’ Mr. Vendlinski thought, why do we need another term?  But he eventually decided it was a good one, as it describes what EPA has been trying to do for the last 20 years in the Bay-Delta.  Unfortunately, he said, he had a cautionary tale about the obstacles to collaborative science.

Back in the late 80s and early 90s, there was a lot of concern about the needs of the fish in the Delta, and there were many different processes going on.  As part of the San Francisco Estuary Project, Mr. Vendlinski said that they convened a flows committee to which the question was posed, what flows do the fish need to start recovering and to have abundance and survival?  That started a three-year effort with four facilitated workshops.  “In the collaborative spirit, we invited really top scientists with a great deal of expertise from a range of different scientific arenas, and we also sought a diversity of affiliations, so that we would really be taking advantage of the breadth of experience out there.  After three years or so, we were formulating a final report on the outcome of that, and that report ended up with 11 conclusions and recommendations, with justifications for those,” he said.

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“Even though from EPA’s standpoint we are a true believer in collaborative science, and we always have been, there are many obstacles to that.  Some of the obstacles are put forth by the very people who advocate for collaborative science.”  –Tim Vendlinski[/pullquote]

“Here’s the dark side of collaboration,” he said.  “When it was becoming clear that the report was going to gain traction and actually maybe make a difference, we were told that two of the top agencies that had participants at those workshops were not going to be allowed to endorse the report, even though the people from those agencies had participated in the writing of the conclusions, recommendations and justifications.  They were integral to the wording of those, but they were removed from the list of people who would stand behind the report.”

They were not deterred, he said.  They took the names off the report and published it anyway, and as they were getting ready to distribute it, it was dismissed by one of the top agency leaders as ‘pop science.’  “A person from that agency later approached me and urged me not to release the report.  We went ahead and released the report.  It finally saw the light of day,” he said.  “Ultimately, this became the X2 salinity standard, which is studied even today.”

Mr. Vendlinski’s career took him elsewhere for several years, and when he returned a couple of years ago, there was discussion about making adjustments in the standards.  “We wanted to take a look at the original X2 standard and take advantage of all the great science that had been done since then,” he said.  “Again, we sought a diverse group of people with a diverse group of affiliations.  We had the meeting planned, and a week before the meeting, we received an email from some of the stakeholders in the water user community asking for extensive revisions to the purpose statements and the questions that were going to be asked.  We thought that there was some value in that, so we hurriedly huddled and rewrote the purpose statements and rewrote the questions, and then distributed that out just a few days before the workshop was going to be held.”

Then, a few days before the workshop, he said, they heard that a call had been made to the regional administrator’s office to call off the workshop.  “We did not call off the workshop; we went ahead and had it, but the reason I bring these anecdotes up is that it shows you that even though from EPA’s standpoint, we are a true believer in collaborative science, and we always have been, there are many obstacles to that,” he said.  “Some of the obstacles are put forth by the very people who advocate for collaborative science.”

JEFF KEAY

Jeff Keay
Jeff Keay is Deputy Regional Director for the Pacific region of the US Geological Survey.

Jeff Keay began by stating that he has the advantage of working for the U.S. Geological Survey, an agency that doesn’t manage anything and doesn’t regulate anything; it just does science, and so he has the pleasure of only having to look at one side of the single sheep at a time.

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“What we discovered is when you take scientists from different disciplines and you put them together, they really can’t talk to each other very well at first.  …  there are just tremendous differences but once they start learning each other’s language … they start to ask questions that neither of them thought of before.  And that to me, that’s when the science starts to get exciting.” –Jeff Keay[/pullquote]

“Collaborative science is a broad umbrella that encompasses this multi-dimensional aspect of colleagues coming together to conduct scientific investigations,” he said.  “To me, the unit of science is the study plan.  It is something that scientists come together to work on; they agree upon a study plan and implement it to conduct science, and to me, that’s what collaborative science is.  It suggests something more than just sharing results at the end of a study, but actually mixing it up at the beginning and through the study and making something different out of it.”

Previously, when he was working on creating the Florida Integrated Science Center, some scientists came to him with a proposal to fund a new study.  “I said that’s really interesting, but I’m not going to fund this until you spend three days talking to a hydrologist.  So I hired a hydrologist, brought him up to Gainesville, and made them sit down in the same room and talk to each other for a few days, and they came up with a new study and proposed something different,” he said.  “What we discovered is when you take scientists from different disciplines and you put them together, they really can’t talk to each other very well at first.  The language is different, their measures of success are different, there are just tremendous differences, but once they start learning each other’s language, learning the culture, learning the science, they start to ask questions that neither of them thought of before.  And that to me, that’s when the science starts to get exciting.”

The USGS has been restructuring itself over the last decade so that “instead of being four stovepipes of scientific disciplines of geology, geography, water resources and biology, we are able to do this interdisciplinary science and look at whole systems and ask questions that people haven’t asked before.”

The essence of collaborative science is being able to work across scientific disciplines and being able to work across agencies, but there are a lot of details and challenges, such as who owns the data that is produced, the differing policies and procedures of different agencies, determining who is going to be the lead, who has the final word on where it will be published, what is a sufficient sample size, and more.

Mr. Keay highly recommends the article, “Employing Philosophical Dialogue in Collaborative Science,” from a 2007 issue of BioScience.  “The article talks about the philosophical challenges of collaborative science.  They provide a toolbox.  I recommend it to anyone that’s interested in collaborative science.  Before you start the study, you ought to run through their series of questions to help set a framework of commonality.”

BYRON BUCK

Byron Buck
Byron Buck is the Executive Director of the State and Federal Contractors Water Agency.  He has over 30 years of experience in water resources and environmental planning, and has served in executive capacities for water agencies, special districts and non-profit corporations.  He holds a master’s degree from California State University, Long Beach.

“Fundamentally, I think we would all agree our management of the Bay-Delta system is really failing our ecological and water supply needs.  And that’s not a failure of science, per se; it’s a societal failure.  We need to be doing a lot better,” said Byron Buck.

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“IEP has been doing collaborative science, but it needs to broaden the tent and bring in more views so that we start at the ground floor with shared design and shared questions.  That will lead to more shared interpretation and that will help the policymakers going forward.”  –Byron Buck[/pullquote]

There is robust debate on the relative importance of ecological stressors and the practical measures that need to be taken to address ecosystem needs and water supply reliability needs, he said, and we have to be careful with our use of the limited resources we have, but our historic dynamic of mistrust and disrespect is interfering with our ability to protect the ecosystem and water supply.  “That’s led to court actions and I don’t think anybody here would agree that the courts are a good place to do science.  So we really need to break down our gated communities in which we do work and bring in people that have different viewpoints to work together on these things.”

Mr. Buck said that he agrees with a lot of what the previous panelists have said.  “Collaborative science is really a broad and systematic commitment to transparent hypothesis formation, testing and analysis on ecosystem and water management questions at all levels, from the ground level technicians all the way up through the senior decision makers,” he said.  “It really embraces and seeks out differing points of view and channels those differences productively in a structured format to produce what is known, what is not known, and what are the areas of controversy.”

Collaborative science then takes that which is controversial or is simply not known and provides a transparent framework for investigation.  “Ideally, nobody is excluded from the scientific process and in fact, you seek out and invite those differing opinions into the scientific process.  Fundamentally, it brings in the notion of respect of those differing viewpoints throughout,” he said.

Cooperative investigations are just a subset of that. “IEP has been doing collaborative science, but it needs to broaden the tent and bring in more views so that we start at the ground floor with shared design and shared questions.  That will lead to more shared interpretation and that will help the policymakers going forward,” he said.

There is a line between science and policymaking, he said.  “If we do collaborative science well, we’re narrowing that range of differences and we’re identifying where the areas of controversy and disagreement are, and then the policymakers are better able to deal with those things instead of having people coming at them from many different directions at once.”

“The promise is that if we can improve our collective understanding and make much more efficient use of resources by doing this, we can improve data sharing and problem solving, and ultimately we’ll have improved decision making out of it.”

DR. PETER GOODWIN

Dr. Peter Goodwin
Dr. Peter Goodwin is the Lead Scientist for the Delta Science Program.  He is also the founding and current director of the Center for Ecohydraulics Research at the University of Idaho.  Dr. Goodwin has worked on river restoration, flood management, and estuarine ecosystem restoration projects throughout the United States and internationally, and is recognized for his research and important contributions in the field of modeling flows, sediment transport, and river channel evolution.  Dr. Goodwin earned his PhD from the University of California, Berkeley.

Dr. Peter Goodwin began by saying that the teleconference call that was set up to plan for this session was very interesting, and afterwards, it caused him to step back and reflect a little bit.  “One of the things I did is I went and reread the simple eloquence but very deep philosophical implications of Aldo Leopold, and I would certainly encourage folks to read that again, because he really captured the essence of what we’re about in the land ethic,” he said, explaining that Leopold is talking about land and water and everything within the landscape.  “The challenge that he threw out is, can the land adjust to the new order imposed by anthropogenic change?  Secondly, he recognized the critical importance, and the danger, of not having a critical understanding of both how the land responds and the economic consequences.”

The interesting thing that Leopold said was that the land ethic evolves:  “It is not static in time, but it evolves in the minds of the thinking community.”  Given how rapidly perceptions have changed in the Bay Area, not just in the science community but in the community as a whole, “if we think about collaborative science, it’s really to develop an open community of scientists that can inform the broader thinking society.”

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“If we think about collaborative science, it’s really to develop an open community of scientists that can inform the broader thinking society.”  –Dr. Peter Goodwin[/pullquote]

As part of Dr. Goodwin’s definition of collaborative science, he sees it as a means for advancing common knowledge through an open community, and for solving problems whose solutions are beyond the scope of a single discipline or institution.

The Delta Science Program is working to develop a Delta Science Plan, and he appreciates everyone’s suggestions and concepts that they’ve submitted.  “IEP is really a shining light among CWEMF and some other activities of how a functioning science community works, and there’s a lot we can build on,” he said.

“We need to embrace legitimate scientific disagreements.  And here I would like to quote the famous Danish physicist Niels Bohr, who made the comment in a very contentious meeting in the early 1950s, ‘How wonderful we have met these two competing paradoxes; now we can make some real progress,’” said Dr. Goodwin, adding that throughout the history of science, the big steps forward in understanding have come from competing ideas being tested in a structured manner.

“We need to build this open community of scientists.  Good scientists are a really valuable resource, and we need to create the environment for them to be successful,” he said, noting that he was concerned for ‘soft money’ scientists right now.  “Are we going to lose these people with expertise?  Are they going to move on to other areas?”

With eight of the top fifty universities in the world here in California, there’s no stronger intellectual capacity anywhere in the world, Dr. Goodwin said.  “So what sort of mechanisms can we put in place to engage that brain trust?  It comes down to time.  We need to free up some of the time of those folks, and we need to do this in a way that makes it attractive to the next wave of scientists.”

“We need a clear articulation of the problem at the science-policy interface, because if we don’t know where we’re going, we’ll never know if we got there, and I think that is really key.”

The National Academies recently said of collaborative science that communications is first, second, and throughout, so we need to think about the infrastructure to allow this communication, said Dr. Goodwin.  “We need to provide access to data to facilitate the synthesis … and on top of that, how do we develop the community models and data mining tools so that everyone can get access to it.”

AUDIENCE QUESTIONS

Audience question: Collaborative science from the perspective of academia is not about bringing in different opinions, which are not that relevant anyway, but different expertise.  In what way can and should we improve on that?

Tina Swanson said she would answer that question but not relate it back to academia, because she didn’t think that was particularly relevant.  “If our purpose here is to talk about and to plan for collaborative science in the context of management of this system, we need to recognize … that some of us are talking about collaborative science and some of us are talking about collaboration to apply science to management of the system,” she said.  “In my view, we’re doing a really good job of collaborative science … we have many really shining examples of very good multi-disciplinary cross-agency collaborative science, so that’s not our problem.  Our problem is the application of that science to the management of the system, and I think what this process is intending to do is to build that collaborative link between the science and the management; that is where we’ve got big problems.”

Ren Lohoefener added, “Collaborative science is like biology.  It’s subjective; it can be whatever you want it to be … Collaborative science has to embrace the stakeholders’ interests … as population grows and pressures on the environment change.  If we want to be successful and if we want to get funding and if we want to get public support, we’ve got to continue to embrace all stakeholders that are willing to come to the table in a cooperative manner to collaborate with us.  We just absolutely have to, because otherwise we will fail.”

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“If we want to be successful and if we want to get funding and if we want to get public support, we’ve got to continue to embrace all stakeholders that are willing to come to the table in a cooperative manner to collaborate with us.  We just absolutely have to, because otherwise we will fail.”  –Ren Lohoefener[/pullquote]

“We need to make sure there is a clear delineation between the science and the value judgments that have to go into many of the decisions,” said Dr. Peter Goodwin.  “There are still some really big questions when we start changing such a dynamic and complex system, and ensuring that there’s the science, the understanding and the tools in place to respond to those challenges I think is critical.”

“I think getting a mix of expertise and a mix of disciplines all qualified in the same room talking about things is going to bring in different perspectives, and that’s healthy, but the goal here is not to come out with science by consensus,” said Byron Buck.  “What you do want to do is bring in all those disciplines and maximize that learning experience, but then at the end of the day, clarify what is known and not known, agreed and disagreed, and let the policy makers deal with that.  That is someone else’s job.”

Audience question: How can collaborative science be more than “I want more of my stuff in your work plan”?  What if your stuff is distracting or divisive? In other words, how are you going to resolve disputes as you begin to form this science plan?

“That’s a tough one.  I think the way we’ve done competitive grant science tends to engender those kinds of fights,” said Byron Buck.  Having a facilitator can really help, especially when you’ve had scientists competing for the same funding.  “It’s just not their nature.  They come from different tribes, different silos, different universities, or wherever.  We have to break that down, and we may need some facilitation to make that happen, but at the end of the day, I think you’ll get a much better product.  Again, with better design on the front end, you’re probably going to have better agreement about what the results mean.”

Audience question: Is it appropriate to ignore or dismiss a scientific hypothesis on the basis that it “conflicts with conventional wisdom” or “violates scientific consensus”?  Is an appeal to conventional wisdom a clue that a research collaborative is not pursuing objective science?  Does collaboration suppress new ideas?

Dr. Goodwin said that this is a big topic at the national level.  “There’s a lot of concern that if you go through the regular peer review process, how many brilliant ideas by an early career faculty member or researcher are just getting passed over because they are outside the conventional wisdom?”  Funding needs to be set aside for high-risk research.  “For those of you who have sat on NSF panels, it is so competitive.  There’s a 6% success rate.  You get five peer reviews, four give excellent, one gives very good, it goes to the panel, it’s not quite there, it’s out.  So how do you maintain that innovation?”

“I do think we need to always be stretching and testing science to make sure it doesn’t get into that conformity thing, because that’s certainly dangerous.  At the same time, when the science has been stretched and tested over a long time and we’re dealing with a major environmental crisis, do we really have the luxury to allow the science and the decision makers to catch up with the crisis?” said Tim Vendlinski.  “Whether it is the Bay-Delta or the global climate, we have to make decisions based on uncertainty.  We need to look at the preponderance of science.  But by all means, we need to encourage all of the out-of-the-box ideas as much as possible, so we need to always make sure that the science community doesn’t get too comfortable.”

Jeff Keay said he can appreciate that, with time and resource constraints, management and regulatory agencies need to invest in the product that’s going to get them where they need to go.  “That’s one advantage that we have as a science agency; we can put a little investment into the savings account and fund some of those opportunities. … We hire scientists because we want people who are independent thinkers, who are willing to challenge the norm, who are willing to say, you know, the assumptions we have been making the last 10 or 20 years, are they really valid?  Maybe we should challenge this one or that one.  We have to keep pushing and prodding because the scientific method never tells us that we know something, just that we failed to disprove something, so there’s always some uncertainty and there’s always a need to double check and make sure we’re on the right track.  We need to appreciate and foster and encourage differences of opinion and thought in the process.”

“Testing all sorts of new hypotheses and answering questions, that’s what scientists do.  That’s our trade.  You can’t stop us from asking questions,” said Tina Swanson, “but I am a little concerned with this process, that the asking of the questions has the function of delaying action.”  Given the context of the management system and the legal requirements under the ESA to use the best available science, “I think it’s really important that we not let new and interesting questions, most of which are based on how we’re perceiving and monitoring the system to work, stop us from taking action, because that’s basically what we’ve been doing for the last couple of decades.  As we do that, we’re not learning from adaptively managing because we’re not adapting, and we’re allowing the existing conditions to continue, which I think every single one of us in the room would agree are not sustainable for the system and for the two coequal goals which we’re supposed to be trying to achieve.”

“If we engaged in this and the outcome became collaboration equals conformity, we would have failed, because that is certainly not what it’s about,” said Byron Buck.  “Peer pressure is a very powerful tool; we need to be sure that’s held at bay.  Again, the facilitation part of the process could help with that, as their job is essentially to make sure differing opinions are drawn out and dealt with systematically in the scientific method,” he said.  And speaking to the comment about delaying action, “From our perspective, we’re in the worst spot.  Our water supply reliability is hinged upon the recovery of the Bay-Delta ecosystem, so we have more at stake than perhaps any other constituency to make sure it’s fixed.  That’s why these questions really have to be developed in the right way.  We have to take actions, but they have to be the right actions that work.  We’ve taken a lot of actions over the last 30 years that have really not produced much of anything of value.”

Question: Can you propose one single solution that would effectively remove the mistrust that exists today in the Delta community?  Which solution would lead us to collaboration?  Is there a solution?

“Actually I think the solution is collaboration,” said Ren Lohoefener.  “It doesn’t happen quickly, it takes a long time, but the only way we can get past this incredibly litigation-prone science we’re in now is by working out the differences of opinion through collaboration.”

“I totally agree with that,” said Byron Buck.  “The only thing I would add is that for most people, all they want is a fair shake.  They want their data reviewed in an open, fair and impartial manner.  And if everybody gets to air their views, at the end of the day, they will still feel a lot better and won’t be as inclined to use other processes as they are when they get shut out or ignored.  That’s what gets people angry, and that’s what makes the system dysfunctional.”

“I do want to say one thing,” said Tim Vendlinski.  “I was at the last major State Board hearing on the SED for phase 1 for the lower San Joaquin River, and I was struck by how there was a complete kind of tone-deaf persona around the agency people.  There were literally busloads of people brought into that hearing room; the hearing got so big they had to move it to a bigger hearing room.  One speaker after another from those communities said this action by the State Board is going to devastate our communities.  Then it was time for the agencies to speak, and it was almost as if all those concerns had never been raised.  And so when we talk about trust, I think agencies and decision makers do need more empathy towards the communities they are going to affect while at the same time, we need to take some bold decisions and we need to take some risks.”

LIGHTNING ROUND

Each participant was then given a question to read and answer.

Byron Buck: ‘What do you think should be the relative distribution of analytical technical expertise between management agencies versus private consultants?  For example, should management agencies increase the number of data analysts, statisticians, and modelers in their agencies?’  His answer: “I guess so, sure.  More expertise is always good.  I don’t have a real bias between management and government agencies; I’ve worked both sides of that street for 35 years now … clearly more resources are necessary on a lot of these questions, and certainly better use of those resources and again, a more structured process with those resources would help.”

Peter Goodwin: ‘Does collaboration mean working with scientists who are experts in their fields or who are from other parts of the U.S.?’  “In my opinion, with technology, we are now part of a global science community, and one of the things we would love to see is opening opportunities, as is happening elsewhere in the world, for the exchange of scientists who are working on similar problems.  So I would say absolutely; if you create a community of science that is open, then everyone should be able to contribute.”

Tina Swanson: ‘Given the definitions of collaborative science by panel speakers, describe how the best available science is used to make policy decisions and explain how collaborative science can improve that process.’  “I love this question.  And the reason I love it is because my experience working with a very effective collaborative science process that was hosted initially by the CALFED Bay-Delta Program did in fact develop the tool that you need in order to figure out how to apply best available science,” she said.  “The Delta Regional Ecosystem Restoration Implementation Plan (DRERIP) brought together agency and non-agency scientists who together developed decision support tools for Delta ecosystem restoration.  …  First of all, they built a synthesis of the available science on how the system worked; next, they built a set of tools that could be used repeatedly and consistently and would document the process underneath them to evaluate proposed management actions.  Then they evaluated the actions in the context of their magnitude, their certainty, their predictability, whether they were worthy relative to the goal, whether they were risky, and whether they were reversible.  Using the answers from those evaluations, we developed a decision tree where you essentially tested each of those answers, and it worked you through to suggest that you should either fully implement the proposed action, or throw it away because in fact it was too risky or it wasn’t going to achieve your goal, or implement it in a pilot study.  …  We have a lot of the tools out there to allow us to evaluate proposed actions, and remember, the context of this whole exercise is management of this system and using best available science with a tool to help you go there.”
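The DRERIP evaluation that Ms. Swanson describes is essentially a decision tree: score a proposed action on a handful of criteria, then walk the scores to a recommendation.  The sketch below is a hypothetical Python illustration of that kind of logic only; the criteria names, categories, and branch order are assumptions made for readability, not the actual DRERIP rubric.

```python
# Hypothetical sketch only: the criteria, categories, and branching below are
# illustrative assumptions, not the actual DRERIP scoring rubric.

from dataclasses import dataclass

@dataclass
class ActionEvaluation:
    magnitude: str      # "high", "medium", or "low" expected ecological effect
    certainty: str      # confidence in the predicted outcome
    worth: bool         # worthwhile relative to the restoration goal
    risky: bool         # carries significant risk of unintended harm
    reversible: bool    # can the action be undone if it fails

def recommend(e: ActionEvaluation) -> str:
    """Walk a simple decision tree from evaluation scores to a recommendation."""
    if not e.worth:
        return "discard: unlikely to achieve the goal"
    if e.risky and not e.reversible:
        return "discard: too risky to implement"
    if e.certainty == "high" and e.magnitude == "high":
        return "fully implement"
    # Uncertain or modest outcomes get tried at small scale first.
    return "implement as a pilot study"

print(recommend(ActionEvaluation("high", "low", True, False, True)))
# -> implement as a pilot study
```

In the actual process she describes, the point is less the particular branches than that the evaluations and the path to each recommendation are documented, so the reasoning can be revisited and applied consistently across proposed actions.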

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]”It makes as much sense to manage an estuary based on arcane legal decisions as it does to manage wetlands based on the commerce clause.  And that’s what we’re doing.”  –Tim Vendlinski[/pullquote]

Tim Vendlinski: ‘In the recent science news, the article on the IEP Annual Workshop talks about improving collaboration “in light of legal and institutional constraints.”  What are these constraints?’  “The legal stuff, if there’s an attorney in the house, that could probably be a sidebar in the next room.  One thing I could say about that is that it makes as much sense to manage an estuary based on arcane legal decisions as it does to manage wetlands based on the commerce clause.  And that’s what we’re doing,” he said.  “On the institutional constraints, … I get alarmed any time I hear somebody say we want EPA to make the best decisions based on science; that usually means not based on science, because there’s something behind the curtain.  And sadly, … science is the little puppy, the little mutt in the room any time there are big policy and management decisions to be made.  … Really what drives the daily grind of the decisions are these intense entrenched interests and the status quo and politics of almost brutish force, and it’s people like me who have spent their entire career always trying to bring the science and saying if we are going to make a decision and you want it based on science, this is the decision you have to make.  But I’ve been disappointed many times in my career, and much of my career has been spent controlling damage to the environment, not necessarily making the environment better.”

Jeff Keay:  “I am going to tackle two questions because I think it’s the same answer, and these are both very insightful questions.  The first one: ‘Tina mentioned that a major obstacle to collaborative science is that not all parties agree to believe the science.  The science is good but some people and groups are predisposed not to believe it, so how do we overcome this obstacle?’  So how do we help people believe the science?  The second one is about the publications that scientists base their research on, which are often hidden from nonscientific stakeholders; cutting to the end, ‘is there a way we can make that more available, and would it foster a more productive and trusting environment?’”  He said that when he was working back in Florida on the manatee issue, they had a similar kind of confrontational problem where the boating industry and the conservation community were on opposite sides and kept things tied up in court.  “The Fish and Wildlife Service and the Florida Fish and Wildlife Commission got together and sponsored a series of seminars.  They got 11 people from the boating industry that represented all facets of the boating industry and 11 that represented all the rest of the conservation community in a facilitated conversation, and they invited scientists from universities and the USGS to come in and start teaching them about the science.  And so they sat down together, they heard the same lecture, they engaged in conversation, they asked questions, they heard the same responses and the same answers, and that went on for a period of several years … very interestingly, the issues start to fall away as people start to understand the significance of the science, what it really means, what the options are that we really have, and how do we get from here to there.  So I think those questions are very insightful.  I think the more we understand the science, the more people will be willing to believe it.  We have to publish our science in those peer reviewed exclusive journals in order to establish its value, its validity, its longevity – that’s the currency of exchange for a scientist.  That peer reviewed stamp gives it its credibility, but we also have an obligation to make that information available in ways that others can use – managers, the general public and whomever.”

[pullquote align=”left|center|right” textalign=”left|center|right” width=”30%”]“Do policy makers and decision makers dump their problems and challenges on science?  Absolutely, and they’ll keep doing it.  Scientists owe it back to us to give us the best of their ability, their best objective recommendation and the probabilities that go with that recommendation.”  –Ren Lohoefener[/pullquote]

Ren Lohoefener: I am going to paraphrase my question … do policy makers and decision makers dump their problems and challenges on scientists?  “The short answer is yes – but I’d like to elaborate on that a little bit.  Having come up from science 30 some years ago to where I don’t do science anymore, I am expected to make hard decisions, hopefully based on science.  Let me just relate some of the problems I run into.  Everything breaks down to a probability for me.  When I am looking at a hard decision, I am always saying, what is the real probability of this being true, how sure are we of the results, and how much science is out there to make me sure of the results.  Sometimes there’s not much and you have to go with the best information you have,” he said, recalling a recent decision he made that was based on what turned out to be a misinterpretation of a very important table.  He said he wouldn’t have made the same decision had the information been interpreted correctly.  “Although the cost of my decision probably wasn’t huge because it was probably short-lived, nevertheless it was sizeable in terms of money, money lost to the stakeholders.”

Ren Lohoefener continued: “Not all science is good science, and not all science that ends up in peer reviewed publications is good science, so you have to be challenging all the time.  Scientists are human and scientists that work for agencies are human, and one of the things I have to guard against all the time is am I being presented with only part of the results to influence the decision that the people who are giving me the results want me to make?  Because they are not trusting me as a policy maker or decision maker to make the best decision.  That’s common, and I’ll be honest, thirty some years ago I did the same thing.  So do policy makers and decision makers dump their problems and challenges on science?  Absolutely, and they’ll keep doing it.  Scientists owe it back to us to give us the best of their ability, their best objective recommendation and the probabilities that go with that recommendation.”

CONCLUDING REMARKS

“I think you can see from this panel there is great hope for regional collaboration, but it’s not an easy thing,” said moderator Bob Lohn.  “You have people desiring and willing and wanting to work together – intelligent people who are seeking the best for the region as they understand that best to be.  It gives me great hope for the future of this; it doesn’t make it easier, but the fundamental issue – are people willing to try?  Is there support?  Is there desire?  Is there the intelligence behind it?  I think from this panel you see that.”

“As you leave today, you have opportunities both as participants in a process that seeks to be collaborative and as participants in the region to further this; it depends much on your own goodwill, but it also reminds us that underneath this is a restless search for trust. …  In his book, Genome: The Autobiography of a Species in 23 Chapters … Ridley sort of captured some of the joy and fun in science, and before you leave, I just wanted to give you a thought of what we’re about here.”

Ridley writes, “The fuel on which science runs is ignorance.  Science is like a hungry furnace that must be fed logs from the forest of ignorance that surrounds us.  In the process, the clearing that we call knowledge expands, but the more it expands, the longer its perimeter, and the more ignorance comes into view.  A true scientist is bored by knowledge.  It is the assault on ignorance that motivates him, the mysteries that previous discoveries have revealed.  The forest is more interesting than the clearing.”
