How You Know You’re Ready to Do Evaluation
Evaluating audiences’ experiences of a museum program, exhibition, or initiative, like most things, requires thoughtfulness and planning to be useful. But when things get busy and deadlines loom, the last thing you want to do is slow down and consider why you are doing something.
It’s often our job as evaluators to help people slow down and consider that question of why. Why do you want to do evaluation? Why now (and for what purpose)? With that in mind, here are three key things anyone should consider before embarking on evaluation.
The usefulness of any evaluation will reflect the level of intention that went into it
Perhaps the most important question to ask yourself is “What do I want to learn from this evaluation?” In other words, what is the purpose of doing it? We recognize that someone (like a funder) may be asking you to conduct an evaluation, but you are the one who stands to benefit most from new learning about your audiences. To decide what about the evaluation is important to you, carefully consider these questions:
Who are your stakeholders and what do they hope to learn from the evaluation?
Do you need data about your audience to inform the design of your program or exhibition?
Are you piloting an initiative and need to understand what works (and what doesn’t)?
Is this a long-standing initiative where you need to measure its level of impact?
What can you realistically change based on the results of the evaluation?
In particular, your answer to the last question may make you realize that you are not ready to do evaluation. If you don’t have the resources to realistically make changes to a program or exhibition at this time, you shouldn’t use the limited resources you do have on evaluation.
Your audience is not everyone because your resources are finite
When planning an evaluation, one of the first things we ask our clients to do is identify their target audience. Defining the target audience for an evaluation is tricky. Making a difference in your audience’s lives is very hard to do under the best of circumstances, and having too broad an audience will limit your ability to achieve impact.
Envision your resources as a bucket of sand. The larger the area you try to spread the sand over, the shallower it becomes. The same is true of audience impact; the more audiences you have, the fewer resources you have to devote to each one, and the shallower your impact.
Unfortunately, you cannot be everything to everyone and achieve meaningful impact. So, the more narrowly you can define your audience, the better (e.g., specific age/grade level, geographic location, background characteristics). For instance, your audience could be 4th and 5th grade students who identify as female and have limited access to STEM (Science, Technology, Engineering, and Math) programming.
Embrace evaluation as a learning opportunity rather than a judgment tool
We encourage you to embrace evaluation as a learning tool (not a judgment tool). Don’t think of evaluation as a way to prove your success. Where is the fun in that? If everything were presented as perfect, no one would ever learn anything.
One goal of any evaluation is to look at your findings and ask “What did we learn?” and “How can we do better?” In our experience, this process of reflection can be invigorating because it deals with that most important question of why you do what you do for the audiences you serve. So, go ahead and embrace the idea that evaluation may reveal your successes and failures. Be vulnerable and share those failures and lessons with others, who will appreciate and benefit from the opportunity to learn alongside you!
One of our main goals is to help our clients think evaluatively at the outset of an evaluation. Being thoughtful about an evaluation’s purpose, audience, and learning opportunities can help you get the most out of an evaluation and avoid the trap of doing evaluation just for the sake of it (or because someone asked you to).
For more on this topic, check out the book Enhancing STEM Motivation through Citizen Science Programs, which includes a chapter written by Emily Skidmore and Stephanie Downey on the process of planning an evaluation, including how you know you’re ready to do it.