
Increasing the quality, reliability, and rigor of wildlife science

Robert J. Steidl, Stephen DeStefano, and William J. Matter

Abstract: Definitions of science and recipes for quantitative analyses are no substitute for critical thinking in wildlife science. Inadequately understanding the philosophy of science and the principles of sampling and experimental design, not appreciating the differences between research hypotheses and statistical hypotheses, and between biological and statistical significance, and not viewing research questions within the context of ecological processes limit the quality of research efforts in wildlife science. Increasing conceptual understanding of these issues will help wildlife scientists, managers, and students develop the powerful tools necessary for creative, critical thinking.

Key Words: adaptive management, experimental design, graduate education, hypothesis testing, sampling design, science

There has been a remarkable increase in the amount of information available to make decisions about all aspects of natural resources management, as well as a concomitant increase in quantitative methods available to scientists in all fields. Although these developments have been overwhelmingly positive, they pose at least 2 problems for scientists. First, scientists need to realize that there is considerable variation in the quality and reliability of available information. Second, quantitative techniques necessary to assess the quality and reliability of information can be used gratuitously and sometimes as a substitute for critical thinking. This fundamental error provides the opportunity for unreliable, biologically trivial information to infiltrate many disciplines, including wildlife science. This error results from a disconnect between application of quantitative techniques and application of basic principles of science and research.

Although we believe that quantitative methods will provide an essential means for wildlife ecology to progress, these tools must be applied thoughtfully. To use and evaluate information critically, managers and administrators will need to be trained to separate reliable from unreliable information, and future researchers must master techniques to collect information that is reliable and meaningful to applied ecologists.

Many undergraduate and graduate students in wildlife science receive insufficient exposure to formative principles of science necessary for them to develop into superior professionals. Helping students to develop into critical thinkers will advance applied wildlife ecology, especially regarding its foundation in basic science and quantitative ecology. It might be best for these principles to be developed in undergraduate studies, but realistically they may not be treated sufficiently until students reach graduate

Address for Robert J. Steidl and William J. Matter: School of Renewable Natural Resources, University of Arizona, Tucson, AZ 85721; e-mail for Steidl: [email protected]. Address for Stephen DeStefano: United States Geological Survey, Massachusetts Cooperative Fish and Wildlife Research Unit, Holdsworth Natural Resources Center, University of Massachusetts, Amherst, MA 01003.


Quality, reliability, and rigor of wildlife science Steidl et al. 519

school. In this case, students should work with these principles intensively in the first semester of graduate studies.

What follows is a collection of issues that we believe need to be central themes in graduate curricula for emerging applied wildlife ecologists, issues we wish had been stressed during our own undergraduate and graduate studies. These issues are important for all who plan to design, conduct, use, and interpret research. We believe that the concepts and techniques embodied in these issues will help future generations of wildlife scientists increase quality, reliability, and rigor of wildlife science.

Understand science

Science is a word that we all recognize but few understand and truly appreciate. This is partially because the scientific process can be difficult to define precisely. Over 50 years ago, Simpson (1947) concluded that attempts to define science would fill a whole library and that many definitions were contradictory. However, Simpson (1947:91) ultimately settled on a definition: "Science is an exploration of the material universe that seeks natural, orderly relationships among observed phenomena and that is self-testing." In general, then, science is the process of developing reliable explanations for observed phenomena. An obvious but, in our experience, often overlooked issue is the need for emerging professionals to understand the process of science and why it is the only proven, effective approach to reliably ascertain facts, patterns, and explanations. Although being able to define science is important, learning to think scientifically and to understand the process of science are absolutely fundamental.

Distinguish research hypotheses from statistical hypotheses

Research hypotheses and statistical hypotheses are different. Research hypotheses, as exemplified by the hypothetico-deductive method (Romesburg 1991), are deductive propositions about how nature works: candidate explanations of the mechanisms that lead to the outcomes and patterns we observe in nature. Statistical hypotheses are statements made in terms of measurable and testable parameters used to investigate specific questions that may or may not relate to a more general research hypothesis. Results of statistical analyses, including statistical hypothesis tests, are inductive, not deductive. We can use statistical hypothesis tests to probe nearly any question (e.g., How many? How far? Which area has more? Have conditions changed?) related to wildlife; however, only questions about causal mechanisms require formulating explicit, deductive research hypotheses (Matter and Mannan 1989). Practicing statistical hypothesis testing, therefore, is not equivalent to practicing the hypothetico-deductive method, and not every study demands a research hypothesis, although every study must be focused on a meaningful question. Also, understanding that hypothesis testing works by falsification and not by proof was an important developmental step for science, as it is for the conceptual development of future professionals.
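To make the distinction concrete, a statistical hypothesis such as "mean counts are equal in areas A and B" can be probed directly; the sketch below uses invented counts and a simple permutation test, and note that rejecting the null says nothing about the ecological mechanism behind any difference:

```python
import random
import statistics

# Hypothetical counts of animals detected on 8 survey plots in each of two
# areas (invented numbers).  The statistical hypothesis is "mean counts are
# equal in areas A and B": a testable statement about parameters, not a
# research hypothesis about mechanism.
area_a = [12, 15, 9, 14, 11, 16, 13, 10]
area_b = [18, 21, 17, 24, 19, 22, 20, 16]

observed = statistics.mean(area_b) - statistics.mean(area_a)

# Permutation test: under the null hypothesis the area labels are
# exchangeable, so we compare the observed difference with differences
# obtained after shuffling the labels at random.
random.seed(1)
pooled = area_a + area_b
n_a = len(area_a)
n_perm = 10_000
count_extreme = 0
for _ in range(n_perm):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[n_a:]) - statistics.mean(pooled[:n_a])
    if abs(diff) >= abs(observed):
        count_extreme += 1

p_value = count_extreme / n_perm
print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")
```

A small p-value here supports only the inductive claim that the areas differ; explaining why they differ requires a deductive research hypothesis.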

Phrase testable research questions explicitly

Many students and others involved in applied research fail to recognize the value of phrasing research questions explicitly. Even when there is a clear research question, it often is not phrased in a way that is testable. This is especially true for observational studies. Although there is usually a general idea of what we would like to study (e.g., effects of recreationists on animal behavior), rarely are these ideas developed formally into a series of pointed questions. The first step in the research process, therefore, should be thoughtful development of explicit questions or testable hypotheses, long before considering methods, logistics, or statistical analyses.

Understand the principles of sampling and experimental design

Successful acquisition of new knowledge of ecological relationships depends largely on the efficacy of methods used to collect data about natural phenomena. There are few concepts more important for future professionals, or for faculty who teach future professionals, than principles of sampling and experimental design.

What is design? In an experimental context, design is the assignment of different treatments to experimental units. In a sampling context, design is the method by which samples are chosen from a larger group of entities (the population, frame, or universe). The principal objective in any design process should be to establish a strategy that minimizes effects of extraneous sources of variation on the response variables of interest. The greatest gains in reliability and efficiency come during the design of research studies, not in the collection or analysis of data. In fact, careful design can reduce the amount of field work necessary and will always simplify data analyses. Further, if a design is sound, an incorrect statistical analysis can always be redone; if a design is flawed, not even the most sophisticated statistical analysis will be able to resurrect it completely.

Wildlife Society Bulletin 2000, 28(3):518-521
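One concrete way to see the payoff of design is to compare sampling strategies on a simulated population. The sketch below (all numbers invented) shows that stratifying by habitat removes an extraneous source of variation and tightens the estimate of a population mean relative to simple random sampling:

```python
import random
import statistics

random.seed(7)

# Invented population: animal densities on 300 plots in 3 habitat strata.
# Habitat explains much of the variation among plots, so it is an
# extraneous source of variation when estimating the overall mean.
strata = {
    "riparian": [random.gauss(40, 5) for _ in range(100)],
    "grassland": [random.gauss(20, 5) for _ in range(100)],
    "desert": [random.gauss(5, 5) for _ in range(100)],
}
population = [x for plots in strata.values() for x in plots]

def srs_estimate(n=30):
    # Simple random sample of n plots from the whole population.
    return statistics.mean(random.sample(population, n))

def stratified_estimate(n_per_stratum=10):
    # Strata are equal-sized, so the overall mean is the mean of stratum means.
    return statistics.mean(
        statistics.mean(random.sample(plots, n_per_stratum))
        for plots in strata.values()
    )

# Repeat each design many times to see how variable its estimates are.
srs = [srs_estimate() for _ in range(2000)]
strat = [stratified_estimate() for _ in range(2000)]
print(f"true mean                  = {statistics.mean(population):.2f}")
print(f"SD of SRS estimates        = {statistics.stdev(srs):.2f}")
print(f"SD of stratified estimates = {statistics.stdev(strat):.2f}")
```

Both designs use 30 plots per survey; the gain comes entirely from the design, before any data are analyzed.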

Undergraduate students have more opportunities than ever to experience the excitement of field and laboratory research. Given the many demands on undergraduate curricula, however, few can offer adequate treatment of the intricacies of sampling and research design. Therefore, every beginning graduate student should take a course in research design before beginning their research. In a short time, the payoff for students and for wildlife science would be apparent.

Focus on cause-and-effect and on experimental approaches

When possible, design studies in such a way that causation (cause-and-effect) can be established. Experimentation, which by definition includes a manipulation induced by the researcher, is the only mechanism available to establish genuine, direct causation, because it can eliminate the adverse effects of confounding variables. This is one of several powerful benefits of experimental studies over observational studies. Amassing observational data about a phenomenon rarely allows us to establish causation, and large correlation coefficients do not imply causation.
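A small simulation, with an invented system in which the treatment truly has no effect, illustrates why: when treatments are assigned non-randomly, a confounding variable such as habitat quality can masquerade as a treatment effect, whereas random assignment removes the bias:

```python
import random
import statistics

random.seed(42)

# Invented system: a nest-success index depends on habitat quality, and the
# treatment (say, predator control) truly has no effect at all.
def response(habitat_quality, treated):
    true_treatment_effect = 0.0  # the treatment does nothing
    return habitat_quality + true_treatment_effect * treated + random.gauss(0, 1)

sites = [random.gauss(10, 3) for _ in range(1000)]  # habitat quality per site

# Observational "study": managers happened to treat mostly the better sites,
# so treatment is confounded with habitat quality.
obs_treated = [response(h, 1) for h in sites if h > 10]
obs_control = [response(h, 0) for h in sites if h <= 10]
obs_effect = statistics.mean(obs_treated) - statistics.mean(obs_control)

# Experiment: treatment assigned by coin flip, independent of habitat.
exp_treated, exp_control = [], []
for h in sites:
    if random.random() < 0.5:
        exp_treated.append(response(h, 1))
    else:
        exp_control.append(response(h, 0))
exp_effect = statistics.mean(exp_treated) - statistics.mean(exp_control)

print(f"apparent treatment effect, observational: {obs_effect:+.2f}")
print(f"apparent treatment effect, randomized:    {exp_effect:+.2f}")
```

The observational comparison reports a sizable "effect" that is entirely habitat; the randomized comparison hovers near the true value of zero.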

Further, we have heard ecologists of all ilks call repeatedly for increased experimentation, only to follow that call with the caveat that "well, experiments in ecology can be very difficult." We suggest that wildlife scientists need to be more creative. Certainly, we cannot always perform critical experiments, but we could do so far more often than we acknowledge. There is nearly always a way to manipulate, in a meaningful way, a system under study if enough thought and creative energy are targeted at that system.

Never perform a management manipulation unless it is part of an adaptive management program

We advocate the need, indeed the responsibility, for wildlife scientists to establish monitoring programs to quantify results of management manipulations. Many of us have witnessed large-scale management manipulations with no follow-up efforts to assess their effectiveness. These are tremendous opportunities to gain information that are lost due to poor planning and short-range thinking. Management and monitoring must be linked intimately if we hope to increase our knowledge about the response of larger, complex systems to perturbations.

View research within the context of applied ecology, not as questions about species

Perhaps the greatest distinction between programs in applied ecology, including wildlife science, and those in basic or "pure" ecology is the way we view research questions. When students are asked about what they study, those in wildlife science often reply with the name of the species that is the subject of their research. In contrast, students of basic ecology consider ecological relationships first. They focus on concepts such as foraging strategies and trophic interactions and mention the target species only as an afterthought. We all eventually collect the same kinds of data and use the same techniques, but we view our research and ultimate contributions differently.

As superficial as this issue may seem, we believe that wildlife science would improve if we changed our view from species-centric to process-centric. Viewing research questions from the broader context of ecological processes makes more obvious the contributions of all research efforts, including single-species studies, to the larger sciences of ecology and biology. When we view our research from a "species" perspective, we can easily overlook potential contributions beyond increased understanding of species life-history characteristics.

Distinguish between biological and statistical significance

No statement is more meaningless than "population A was significantly different than population B." Because this verbiage is used so often, we may believe that it conveys useful information. Actually, it provides evidence that we can become enamored with quantitative tools that appear objective but in reality are simply a poor replacement for critical thinking.

Virtually no 2 biological entities are identical, and even small, meaningless differences can be detected with large sample sizes and high precision of measurement. Wildlife scientists who contemplate debates about power analysis and the role of statistical hypothesis testing (Harlow et al. 1997, Steidl et al. 1997) realize that strict hypothesis testing (i.e., testing that yields a dichotomous result: things are either different or they are not different) has been shunned by statisticians for many years because it provides so little useful information. The genuine question is whether 2 populations differ by a biologically meaningful quantity. Was the relative difference observed between populations 1% or 65%? Biological information is contained in the measurements we make, not in probabilistic statements about null hypotheses of zero effect. If these ideas are new to you, we suggest reading Johnson (1999), who provides a good overview of the relevant issues.

No scientific discipline is likely to abandon hypothesis testing anytime soon, nor should it, but we believe that appropriate roles of hypothesis testing, estimation, and research design need to be better understood. We need to establish what constitutes a biologically meaningful difference between groups, an act that requires careful and critical thought. There will never be a simple, general solution to this necessary step. When we think about and report research results, we must do so regarding size of the biological effects measured, rather than only P-values and test statistics; the latter should be viewed only as useful supportive information, relegated to parentheses. Our knowledge of ecology will be increased most by assessing the measurements we make, not by performing statistical tests.
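The point about effect sizes can be made concrete with a simulated example (all values invented): with a large enough sample, a biologically trivial 1% difference in mean body mass yields a tiny P-value, while the estimated difference and its confidence interval carry the biologically relevant information:

```python
import math
import random
import statistics

random.seed(3)

# Invented survey: mean body mass differs between two populations by only
# 1% (60.0 vs 60.6 g), far below any plausible threshold of biological
# importance, yet a huge sample makes the difference "significant".
n = 20_000
pop_a = [random.gauss(60.0, 6.0) for _ in range(n)]
pop_b = [random.gauss(60.6, 6.0) for _ in range(n)]

diff = statistics.mean(pop_b) - statistics.mean(pop_a)
se = math.sqrt(statistics.variance(pop_a) / n + statistics.variance(pop_b) / n)

# Large-sample two-sided z-test of the null hypothesis of zero difference.
z = diff / se
p = math.erfc(abs(z) / math.sqrt(2))

lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"difference = {diff:.2f} g ({100 * diff / 60:.1f}% of mean)")
print(f"95% CI = ({lo:.2f}, {hi:.2f}) g, P = {p:.1e}")
```

The P-value answers only "is the difference exactly zero?"; the estimate and interval answer the biological question of how large the difference is.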

Conclusions

The bad news is that many students attracted to the wildlife profession are not attracted by or excited about learning and using quantitative tools. The good news is that the greatest gains in research do not rely on mastery of mathematics, but rather on developing a better understanding of the concepts of experimental design and sampling design, scope of inference, establishing cause-and-effect, and the philosophy and processes of science. Further, most statistics and other quantitative tools, although often based on complex mathematics, are not difficult to understand conceptually. Utilitarian understanding requires only that you realize how they work conceptually and that this conceptual base gives you insights into interpretation. In the courses we teach, therefore, we encourage students to approach quantitative measures conceptually first rather than mathematically; we encourage faculty to consider teaching these measures from a conceptual perspective. If students understand concepts, computations come more easily; the reverse rarely is true.

This incomplete list of issues and ideas that we present represents mistakes or oversights we have made ourselves. However, new professionals need not repeat these mistakes as part of rites of passage into the field of wildlife ecology. Further, we must address these issues at all levels in our profession. Administrators will need to understand these issues so they can recognize and fund quality proposals; managers will need to understand these issues so they can monitor the results of management manipulations; and researchers need to understand these issues so they can design studies with great precision and reliability.

Acknowledgments. We appreciate the constructive comments provided by R. D. Brown and C. Boggis.

Literature cited

Harlow, L. L., S. A. Mulaik, and J. H. Steiger, editors. 1997. What if there were no significance tests? Lawrence Erlbaum Associates, Mahwah, New Jersey, USA.

Johnson, D. H. 1999. The insignificance of statistical significance testing. Journal of Wildlife Management 63:763-772.

Matter, W. J., and R. W. Mannan. 1989. More on gaining reliable knowledge: a comment. Journal of Wildlife Management 53:1172-1176.

Romesburg, H. C. 1991. On improving the natural resources and environmental sciences. Journal of Wildlife Management 55:744-756.

Simpson, G. G. 1947. This view of life. Harcourt, Brace, and World, New York, New York, USA.

Steidl, R. J., J. P. Hayes, and E. Schauber. 1997. Statistical power analysis in wildlife research. Journal of Wildlife Management 61:270-279.

Robert J. Steidl is assistant professor of wildlife and fisheries sciences in the School of Renewable Natural Resources at the University of Arizona. Stephen DeStefano is unit leader and adjunct associate professor with the Massachusetts Cooperative Fish and Wildlife Research Unit and Department of Natural Resources Conservation, University of Massachusetts, Amherst. William J. Matter is associate professor of wildlife and fisheries sciences and coordinator of undergraduate advising and curricula in the School of Renewable Natural Resources at the University of Arizona.

Associate Editor: Krausman