Improving and evaluating reasoning

In January 2017, we commenced work on the SWARM project, the University of Melbourne’s team in the larger CREATE program (Crowdsourcing Evidence, Argumentation, Thinking and Evaluation). The program aims to find ways to improve reasoning by taking advantage of (a) the wisdom of crowds and (b) structured analytical techniques, implemented in cloud-based systems.

Four research teams have been selected to participate in a series of rounds in which the systems they create will be tested by an independent evaluation group. The analyses produced by test crowds using the systems will be evaluated along several dimensions:

  • Do they make correct or accurate judgements?

  • How rigorous and comprehensive is their analysis?

  • How clearly is the analysis communicated?

  • How user-friendly is the system?

You can read more about the project or listen to some interviews about it. If you’d like to stay informed, sign up here.

SWARM is led by Prof Mark Burgman, Tim Van Gelder, Fiona Fidler and Richard de Rozario.


New publication: Meta-research for evaluating reproducibility in ecology and evolution

Over the last few years we have learned a lot about the reliability of scientific evidence in a range of fields through large scale ‘meta-research’ projects. Such projects take a scientific approach to studying science itself, and can help shed light on whether science is progressing in the cumulative fashion we might hope for.

One well-known meta-research example is the Reproducibility Project: Psychology. A group of 270 psychological scientists embarked on a worldwide collaboration to undertake full direct replications of 100 published studies, in order to test the average reliability of findings. Over half of those 100 replications failed to produce the same results as the originals. Similar studies in other fields, including biomedicine and economics, have produced equally disappointing results.

It’s tempting to think that this kind of replication happens all the time. But it doesn’t. Studies of other disciplines tell us that only 1 in every 1,000 papers published is a true direct replication of previous research. The vast majority of published findings never face the challenge of replication.

As yet, there have not been any meta-research projects in ecology and evolution, so we don’t know whether the same low reproducibility rates plague our own discipline. In fact, it’s not just that the meta-research hasn’t been done yet; it is quite unlikely ever to happen, at least in the form of direct replication discussed above. The spatial and temporal dependencies of ecological processes, the long time frames involved and other intrinsic features make direct replication attempts difficult at best, and often impossible.

But there are real reasons to be concerned about what that meta-research would show, if it were possible. The aspects of scientific culture and practice that have been identified as direct causes of the reproducibility crisis in other disciplines exist in ecology and evolution too. For example, there is a strong bias towards publishing only novel, original research, which automatically pushes replication studies out of the publication cycle. The pragmatic difficulties of experimental and field research mean that the statistical power of those studies is often low, and yet there is a disproportionate number of ‘positive’ or ‘significant’ studies in the literature, another form of publication bias towards ‘significant’ results. The rate of full data and material sharing in many journals is still low, despite this being one of the easiest and most obvious remedies for reproducibility problems.

In our paper, we argue that the pragmatic difficulties with direct replication projects shouldn’t scare ecologists and evolutionary biologists off the idea of meta-research projects altogether. We discuss other approaches that could be used to assess the replicability of ecological research. We also propose several specific projects that could serve as ‘proxies’ or indicator measures of the likely reproducibility of the ecological evidence base. Finally, we argue that it is particularly important for the discipline to take measures to safeguard against the known causes of reproducibility problems, in order to maintain public confidence in the discipline and in the evidence base it provides for important environmental and conservation decisions.

Paper citation:

Fidler F., Chee Y.E., Wintle B.C., Burgman M.A., McCarthy M.A., Gordon A. (2017) Meta-research for Evaluating Reproducibility in Ecology and Evolution. BioScience. doi: 10.1093/biosci/biw159, available at https://doi.org/10.1093/biosci/biw159


The role of scientists in public debate

Monday 6 February 2017, 9am–5pm, Storey Hall, RMIT University

A one-day workshop for graduate students and early- to mid-career scientists in conservation and environmental research areas who are interested in public engagement for practical and/or philosophical reasons. RSVP: fidlerfm@unimelb.edu.au

What are the bounds of being a scientist, and how will I know if I overstep them? Is advocacy at odds with being a good scientist? What is the public’s perception of scientists, and how do they react to scientists who break the ‘honest broker’ model of engagement? Do we simply need more knowledge brokers and NGOs—is it unreasonable to expect scientists to be involved in public debate, as well as their day job? How is objectivity maintained in science, if scientists are people with values? 

We’re here to help with these questions! Dr Kristian Camilleri (History and Philosophy of Science, HPS); Associate Professor Fiona Fidler (BioScience|HPS); Dr Darrin Durant (HPS); The HPS Postgraduate Society; Dr Jenny Martin (BioScience); Dr Georgia Garrard (RMIT, Interdisciplinary Conservation Science); Associate Professor Sarah Bekessy (RMIT, Interdisciplinary Conservation Science). We’ll also have a panel of media experts to take questions on the day.

Public engagement is strongly encouraged by most universities, and there are many existing resources for effective science communication. However, most focus on expert information provision, where a scientist has some new knowledge that they wish to communicate to the public. Engagement advice typically focuses on news-style science communication; it less often deals with other forms of engagement, such as entering public debates or speaking out for or against new policy proposals. In those cases, the advice scientists receive often amounts to ‘separate the facts from your own personal values’ and ‘don’t speak outside your direct domain of expertise’. In practice, most scientists don’t know how to interpret that advice, or implicitly understand that it is impossible to follow. Underdeveloped guidelines, sometimes coupled with warnings from colleagues who have had bad prior experiences, can be enough to make scientists withdraw from public engagement. We’d like to talk about that…

In this workshop we have two main goals. First, we want to find out from scientists, in their own words, what dilemmas they encounter when contemplating engagement. Do scientists worry about their scientific credibility in the eyes of their peers, the public, or both if they take a position in public debate on policy issues? Is it beyond the scope of their role as scientists to do this? These are thorny issues that we’ll tackle in a focus-group-style discussion (a structured elicitation exercise) in the first session of the workshop.

Second, we aim to connect scientists with relevant expertise in the philosophy and sociology of science, to help unpack some of the deeper conceptual issues underlying those dilemmas. We will explore questions like: How is objectivity maintained in science, if scientists are people with values? What is the public’s perception of scientists, and how do they react to scientists who break the ‘honest broker’ or ‘information provision only’ model of engagement? After exploring these questions in the workshop, we will also discuss how to set up longer-term peer-to-peer networks and online resources that can take our workshop discussions to a broader audience.

Workshop program

9am                Intro (Fiona, Sarah)

9:15am           Background to our interest in engagement (Georgia)

9:30am           What are the dilemmas scientists face when contemplating engaging in debate and/or policy advocacy? Semi-structured elicitation exercise. (Fiona, Georgia, Sarah)

MORNING TEA

10:30am        Legitimate values in science, objectivity and the value-free ideal. Seminar, with Q&A. (Kristian)

11:30am        Public perceptions: what does the public expect of experts? Seminar, with Q&A. (Darrin)

12:30pm        LUNCH

1:15pm          Media interactions. Panel discussion with media experts.

2pm                Follow-up session on this morning’s elicitation exercise. What issues remain outstanding? What haven’t we addressed in our previous sessions? How else can philosophy and sociology of science help with these dilemmas? Discussion. (All)

3pm                Philosophy of science engagement network building. Discussion. (HPS postgrads)

AFTERNOON TEA

3:30pm          Science engagement support (Jenny)

4pm                Workshop evaluation (Fiona)

Please contact Fiona Fidler (fidlerfm@unimelb.edu.au) for more information.

RSVP: fidlerfm@unimelb.edu.au

This workshop is supported by a University of Melbourne Engagement Grant.
