There’s a tremendous amount of focus out there on how terrible, sucky, whack, wild-ass, and just plain poor fisheries science is. Now, there’s nothing wrong with questioning science. In fact, science is all about questioning things. But too often in our nation’s fisheries debates, the back-and-forth isn’t serving a constructive end. And at least part of the reason is that scientists are not always good at communicating science in a clear, understandable fashion to people who are too busy fishing to follow the many intricate, convoluted steps that it takes to do good science. In short, we’re in desperate need of more plain-spoken, spin-free science in the fisheries world.
In the ongoing back-and-forth between fishermen and scientists, there are frustrations on both sides. Too often, scientists say, “I don’t tell fishermen how to fish; they shouldn’t tell me how to run models.” Unfortunately, that attitude is just as unhelpful as the other side’s: folks like the Recreational Fishing Alliance’s spin machine, which stokes the fires of government mistrust to pad its membership, using tales of troll-like enviro-scientists plotting to ruin coastal economies and put fishermen out of business just for fun.
There’s clearly often a disconnect between fisheries scientists and fishermen themselves. There have been efforts to break down the barriers — cooperative research being a highly successful one — but we need to do better. So let’s talk, no bullsh*$, about fisheries science, and let’s start by all getting on the same page about our expectations. I once had a fisheries science professor who said that if we could see through water, there would be no “science” in fisheries, just a census. The fact is that we don’t understand very much about the ocean and the fish that live there partly because we can’t see them. The models that scientists use to determine a fishery’s status as overfished and/or undergoing overfishing are forecast probabilities of different outcomes, sort of like hurricane models forecasting different storm tracks.
These models take into account biological information about the fish, like how fast they grow, when they reproduce, and how well they survive after being caught. They also account for removals by fishing and natural deaths. These models do not give us a clean answer, such as: there are 3.451 million red snapper in the sea. Instead, they respond to “random” events and variables like ocean currents, temperature shifts, prey availability, etc., in order to find the range of possible answers, as well as which answers are more likely than others. Let’s look at the results of a stock assessment below. This is a graphic representation of the stock assessment for South Atlantic black sea bass from 2011.
Each dot represents the model’s results given one set of assumptions and circumstances. Look at how many dots there are! There are so many variables at play that it’s possible to get any of these answers, but where the dots cluster is a set of more likely answers. Any dot to the right of the vertical line indicates overfishing (taking fish faster than they can reproduce). Any dot below the horizontal line indicates an overfished population (the population is at a lower level than can sustain itself). For this stock, almost all of the dots are below the horizontal line, which means that only a few sets of variables conspire to leave the stock at a healthy level. The vast majority of circumstances result in a depleted stock. The overfishing status is not as clear, which we can see because the dots are centered around the vertical line. The green lines indicate the most likely status of the fishery, and the further you get from those green lines, the less likely that answer is.
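If you like tinkering, a toy simulation can make that dot cloud concrete. The sketch below is not any real assessment model, and every parameter range in it is made up; it just uses a simple Schaefer-style surplus-production model, re-run many times under random-but-plausible assumptions, to show how you end up with a scatter of possible statuses instead of one clean number.

```python
import random

def simulate_assessment(n_runs=2000, seed=1):
    """Monte Carlo sketch: each run draws plausible values for growth,
    carrying capacity, catch, and starting biomass (all ranges invented),
    then reports the stock's status as two ratios -- B/Bmsy (below 1
    means overfished) and F/Fmsy (above 1 means overfishing)."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        r = rng.uniform(0.2, 0.6)       # intrinsic growth rate (assumed range)
        K = rng.uniform(800, 1200)      # carrying capacity, tonnes (assumed)
        catch = rng.uniform(40, 120)    # annual removals, tonnes (assumed)
        B = K * rng.uniform(0.3, 0.7)   # uncertain starting biomass
        for _ in range(20):             # project 20 years forward
            B += r * B * (1 - B / K) - catch   # logistic growth minus catch
            B = max(B, 1.0)             # don't let biomass go negative
        Bmsy = K / 2                    # Schaefer model: biomass at MSY is K/2
        Fmsy = r / 2                    # ...and fishing mortality at MSY is r/2
        F = catch / B                   # realized fishing mortality rate
        results.append((B / Bmsy, F / Fmsy))
    return results

runs = simulate_assessment()
overfished = sum(1 for b, f in runs if b < 1)
print(f"{overfished / len(runs):.0%} of runs end up overfished")
```

Each `(B/Bmsy, F/Fmsy)` pair is one “dot”: plot them all and you get a cloud just like the assessment figure, with the cluster telling you which outcomes are most likely.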
Yup, that’s it. This is the poor-ass science you’ve heard so much about. Not so scary up close. Now, if I want to catch the maximum number of fish possible without leaving a single extra one out there, then this chart of dots is poor indeed. All it tells me is the likelihood that the stock is healthy. Managers (and their science panels) have to make some decisions about future catch levels that have a good chance of moving the stock and its fishery into the healthy, productive area of this chart.
Our best science can’t give us the exact answers we want from it, so we are left with choices. Here are just a few.
- Manage our fisheries to account for uncertainty in the science, meaning that on occasion we’ll forgo some immediate-term fishing opportunities
- Gather more data to reduce scientific uncertainty, often meaning that we spend a ton of taxpayer money to do so
- Disregard the science and rely on some other information
There are problems with each answer, but certainly number 1 is the easiest and quickest, and I think that’s why we’re seeing it. We are also seeing some movement on number 2, such as additional monitoring and data collection, and revamped science programs. But when we choose door number 3 and say that we shouldn’t listen to the science, I would like to hear the alternative. To what or to whom should we listen? To you? To me? To some other guy or gal? That doesn’t seem all that much better, does it?
The truth is, choosing door number 3 is one of the things that got us into this overfishing mess in the first place. The fact is, science is a tool that we use to make hard decisions. Sure, the sharper the saw, the easier it is to cut down the tree. But without the saw at all, you’re plain out of luck.
In the often heated discussion about how we manage specific fish stocks, calling out the science as bad is kind of a dead end right now. Maybe there aren’t enough data points, maybe we could monitor certain interactions differently, maybe we can’t afford to pay for enough eyes on the water, or analysts in the lab. But the science itself isn’t bad. It’s just science. Imperfect, yes, but still ultimately the most useful tool we’ve got.
I’m not being sarcastic or snarky (this time) when I pose the question: what would work better? I’d genuinely like to hear your concerns and suggestions, in the comments, or on our Facebook page.