Evaluating Your Evaluations: Getting Better at Getting Better (Facilitation Friday #28)
Meeting and workshop evaluations are only useful if they help us get better at the work of meeting, community-building, and learning. How well do yours perform on this simple metric?
When did you last examine the meeting, workshop, or conference evaluation questions you ask, and how much the responses actually inform your design and facilitation choices?
I’ve long been disappointed by the timing and quality of organizations’ event evaluation efforts, as well as by how poorly some decision-makers use the feedback gathered to inform their future event designs. The considerations that follow are meant to help you rethink and refresh your evaluation efforts so that they gather more actionable insights to shape the design and facilitation of your future gatherings.
Revisit what you evaluate and why
Evaluations usually address one or more of three categories:
1. Performance on metrics relevant for all (or most) meetings, workshops, or other gatherings
In my experience, most evaluation questions that organizations ask fall into this category and do not vary much from event to event. They typically gather feedback about event logistics (venue, room set, food, AV, online platform, etc.); satisfaction with the speakers or facilitators; and relevance of the content and conversations.
Making these questions more specific and assessing more than just satisfaction usually improves the usefulness of the input received.
For example, event attendees are often asked to evaluate the food offerings: How would you rate the food at this event? While an overall assessment provides a quick benchmark, learning that 41% of attendees rated the food as unsatisfactory doesn’t really tell you what needs to change.
More actionable intel about food (or any other event element) requires a more detailed assessment. For food, you might couple an overall rating with more specific input about variety, quantity/portions, offerings for different dietary needs, taste/quality, service options (banquet, buffet, boxed), etc.
2. Performance on metrics tied to the unique objectives or desired outcomes for a specific meeting, workshop, or other gathering
Failing to assess performance on the specific outcomes or objectives for an event is problematic, yet I rarely see this assessment done. Without this feedback, we don’t really know how well we did on what we set out to accomplish, and that information is critical to gather.
A meeting or workshop meant to achieve A, B, or C must have evaluation questions related to each of those outcomes or objectives: Please rate how well this meeting achieved each of the following objectives …
Provide a relevant scale for participants to assess each outcome and offer an opportunity for more open-ended feedback as desired.
3. Marketing research: needs, preferences, and other intel for future efforts
Effective evaluations should not exhaust respondents’ attention, but do consider asking a question or two to gather new intel. The purpose of these questions is to collect info not reflected anywhere else in the evaluation.
Possibilities include:
recommendations for future event locations or online platforms
speaker or facilitator suggestions
ideas for future meeting or workshop formats
a current need for which people would value more content
other sources people are turning to for their learning or content (competitive intel)
Be strategic about the evaluation timing
Asking the right questions is important; when we ask them also matters. Three timely assessment moments come to mind:
Real-time: how are we doing so far?
Useful for longer meetings, workshops, or conferences when real-time adjustments might be made to improve the experience.
Example: conducting a quick evaluation at the mid-point of a full-day event with two simple questions: (1) What is working well for you so far? (2) What changes might help us be even more successful in the second half?
For in-person events, I distribute large index cards and ask people to use one side for each question. For online or hybrid events, consider using Google Docs, whiteboards, survey forms, or the chat function.
Event ending (or shortly after): how did we do?
Evaluations seem to be distributed most commonly right as an event ends or shortly thereafter.
An evaluation right at the end has the advantage of leveraging people’s existing attention, but you probably can only ask a handful of questions.
An evaluation emailed immediately after the event allows for both more comprehensive questioning and more thoughtful responses, but it may get set aside or ignored, resulting in a lower response rate.
A hybrid approach can leverage the advantages of both while working around their respective weaknesses. Simply brainstorm all the feedback you’d like to gather and then prioritize which questions matter most and for which you want the maximum response.
Your “at the end” evaluation can contain the questions that matter most and for which you want the maximum response rate.
Your emailed evaluation shortly thereafter can include all the additional questions for which feedback is sought.
I’ve significantly increased response rates by promoting that completing the evaluation is the only way to receive the link to something participants value (speaker slides, bonus content, entry into a drawing for free registrations, etc.). Respondents receive the link on the thank-you page after the evaluation is completed.
The follow-up assessment
The missed assessment opportunity, particularly for learning experiences like workshops or conferences, is reaching out to people 30, 60, or 90 days after an event. Only by doing so can we learn whether any part of the event experience led to lasting mindset or behavioral change.
Asking a limited number of questions at one or more of these extended time intervals allows you to gain insight into how participants have used what they learned at the conference or how their assessment of the experience might have changed.
Questions I often use at this evaluation interval include:
What is the most significant result you've achieved as a result of your _______* experience? (*insert webinar, workshop, conference, etc.)
Describe how your mindset and/or behavior changed as a result of your participation in the _______.* (*insert webinar, workshop, conference, etc.)
What is a question, idea, or insight that has stayed top of mind, something you are still thinking about or are actively engaging with in your conversations and work?
What is a post-conference action item you still need to address, and how might we support you in doing so?
What is something that you really enjoyed or valued about the _____* but you’ve done little or nothing with since? (*insert webinar, workshop, conference, etc.)
More open-ended questions like these are perfect for a limited number of post-conference telephone conversations with participants, but written responses will also be illuminating.
Bonus Idea
Recruit a diverse mix and/or strategic selection of participants to serve as internal “secret shoppers.” Provide them with some basic training and the specific areas/questions for which you seek their feedback. This increases the odds that you will receive useful observations and input and that they have a meaningful volunteer experience. Do offer them a registration discount or some other reward commensurate with their efforts.
Bottom Line?
Meeting and workshop evaluations are only useful if they help us get better at the work of meeting, community-building, and learning. Don’t waste people’s time by gathering input you’re unlikely to apply. Ruthlessly review and refine your evaluation efforts to ensure you ask the right questions at the right time and that you use the input gathered to create more worthwhile meetings, workshops, and conferences.
Getting in Action
Look at your existing evaluations. What input are you gathering that is not effectively utilized? Does this reflect a problem with how data is used in the event design, with the questions themselves, or both? Adjust accordingly.
What participant input would best help inform future design and facilitation choices? What questions might most elicit that feedback and when should you ask them?
How might you innovate your meeting, workshop, or conference evaluation process to gather more actionable intel from more people? What experiments might you try to learn which innovative approaches hold the most potential?
© Facilitate Better and Jeffrey Cufaude. All rights reserved.
To affordably license this content for reprint on your site or in electronic or print communications or to contact me regarding customized facilitation skills workshops or consultations, complete this form.