June 2016

By Marina Joubert

Designing an evaluation is like charting a journey. There are many roads that may get you to your destination, but also many potholes and troublesome turnoffs along the way. Some shortcuts may seem attractive, but could cause you to miss your destination completely. These are some of the core messages from a science communication evaluation workshop presented by Warwick University’s Professor Eric Jensen at CREST on Friday 17 June 2016.

One of the most important reasons to do evaluation is to find out whether the activity or intervention you are looking at has value and is achieving its objectives. “Evaluation is not about patting yourself on the back, but rather an open and humble acceptance that you can do things better,” Jensen said.

The critical starting point for any evaluation is to have clear and measurable outcomes that can be evaluated. Vague objectives like “raising awareness” or “attracting interest” will not work. You need to be very specific about the impacts or outcomes that you want to measure.

Prof Jensen took the audience on a whirlwind tour of tools and methods for approaching and designing sound evaluations, and explained the benefits of combining qualitative, quantitative and ethnographic approaches. He explained why planning and piloting are critical parts of the process, and why pre- and post-event testing is crucial for properly evaluating actual impact. His practical examples of some of the most common mistakes people make when designing and implementing surveys were particularly eye-opening.

It may be a hard lesson to learn, but this is the bottom line: do it well, or don’t do it at all. A quick-and-dirty, surface-level evaluation is likely to produce questionable data (at best), leading to incorrect conclusions and potentially damaging advice.

Prof Jensen has published extensively on this topic. Here are some open access articles on science communication evaluation.

One of these commentaries focuses on the dangers of poor-quality evaluation methods that are routinely employed, even by private sector companies that position themselves as experts in the field. Download “The problems with science communication evaluation”.

His latest book “Doing Real Research” deals with social science research methods and is published by SAGE.

Find out more about his research.