Six Tips for Evaluating Your Nonprofit Training Session
I’m co-facilitating a session on Nonprofit Training Design and Delivery with colleagues John Kenyon, Andrea Berry, and Cindy Leonard at the NTEN Nonprofit Technology Conference on Friday, March 14th at 10:30 am. Join us! Our session will share lots of great advice about what to do before, during, and after leading an effective technology training.

When you use the ADDIE model to design your workshop, you eventually arrive at the “E,” or evaluation. It is tempting to think of this step as simply running a survey to answer the questions, “Did the workshop accomplish its objectives? Can participants apply the skills?” While a participant survey is an important piece of your evaluation, it is critical to incorporate a holistic reflection on your workshop. This includes documenting your session, reviewing your decks and exercises, analyzing your instructional design, and figuring out how to improve it.

There are two different methods to evaluate your training. If you think of your training as making a soup, your participant survey is like the food critic’s review of the soup. But if you use reflection, documentation, pilot tests, and iteration, it is like tasting your soup while there’s still time to make adjustments. I use both.

Evaluation is one of my favorite parts of the instructional design or training process. Sometimes at first you feel like, “Oh crap, this didn’t work,” and you don’t even want to look at it again. Alternately, you may feel so good about it that you say, “My job is done.” But disciplining yourself to reflect afterwards always gives the reward of improvement. Here are six tips that will help you deliver highly successful technology training workshops by using effective evaluation techniques.

1. Use Learning Theory

I have written a lot about how important it is to understand how the brain works and how people learn, using learning theories to guide the design of your workshops.
Learning theories can also guide evaluation. One of the most widely used is the Four Levels of Evaluation, also referred to as the Kirkpatrick Evaluation Model, created by Donald Kirkpatrick, Ph.D. to define the four levels of training evaluation. It defines the outcomes of training in four different areas that you can measure. The four levels are:

Reaction
Learning
Behavior
Results

By analyzing each of these four levels, you can gain a thorough understanding of how effective your training was and how you can improve in the future. Bear in mind that the model isn’t practical in all situations. For example, you would not necessarily want to go to the expense and time of applying this evaluation model to a 30-minute brown bag lunch session. However, if you were running a 6-month peer learning program, you would definitely want to look at the four levels of outcomes and measure them. Too often, our participant surveys only focus on the first level.

2. Formative Evaluation

I use a simple exercise at the end of every training session I facilitate that asks the participants to reflect on what they learned and how they will apply it. I also ask for feedback on what exercises or content in the workshop helped learning and what should be changed. If you ask only open-ended questions, you typically get nothing but criticism. To do this analysis well, you need to ask for both positive feedback and constructive criticism. The technique is called Plus/Delta.

Here are some examples of the questions you can ask adult learners to write down and then discuss in small groups. You can also hand out index cards and ask people to write “positives” on one side and “please change” on the other. Or, have them fold 8.5 x 11 paper into quarters, number the spaces, and answer the questions or draw pictures.

What really struck you as interesting, new, provocative, or meaningful during this workshop?
What is one change that you can make in your practice, or one idea that you will put into practice, as a result of this workshop?
What part of the workshop was most useful to your work?
What part of the workshop should be changed to improve learning?

You can ask people to share their answers anonymously. Sometimes I ask them to include their names and do a drawing for a free book during the learning culmination exercise (see below). After the session, I spend a few hours analyzing what people wrote on the cards and use it to review my lesson plan.

3. The Evaluation Survey

If possible, I always include an evaluation survey for workshops. The evaluation survey asks questions that let participants evaluate two general areas: Has the instructor been an effective teacher? Has the workshop been effective in advancing the participant’s learning? The first area gathers feedback on the instructor’s facilitation and presentation skills, course content, instructional design, and other areas. The second area includes a cluster of questions designed to gather data about whether or not the learning objectives were, in fact, met. This data can be collected with a series of statements, asking the respondent to agree or disagree. I tend to keep my evaluation surveys short to avoid survey fatigue. Jane Bozarth’s “From Analysis to Evaluation: Tools, Tips, and Techniques for Trainers” includes some excellent templates and examples for training evaluation surveys. This book is a must-have for any serious trainer.

4. Documentation

You also want to document your training. My lesson plan includes a process documentation strategy covering what and how I will document. I use photo documentation and social media. I also collect all the “evidence” of learning: any notes, sticky notes, drawings, or other items that students created as part of the workshop. I share these back with participants as part of the resources that I provide, often digitized and put on a wiki.
Here’s an example from a recent workshop I did on Content Curation at Scoop.It. You can also ask your participants to help with the photo documentation, but you need to make sure that you give them specific instructions on what to photograph and have an organized system for collecting the photos. Sometimes I might document a lesson plan visually with photos and PowerPoint, as the example above illustrates.

5. Learner Culmination and Celebration

[Photos: closing circle at a workshop in New Zealand; closing circle group photo; certificate awards with Pakistan NGOs]

As I have written before, it is important to have not only good openers for your training, but also good closers. This is technically part of the design and implementation, but it helps learners pause for a second and celebrate what they’ve learned. I do this in a few ways:

Closing Circle: Depending on the size of the group, each person gets to say a few words to the group about what they learned and how they will apply it. I like to have participants pass an object. If you have a large group, you can have people share “just three words,” and make sure you record those as part of your evaluation. This is a spiritual closing to the training and can be a bridge if the training is a kick-off to an online peer learning group.

Award Certificates: If participants are receiving credit for the workshop, you can incorporate a certificate of completion. If I’m doing a multiple-day training, this is another opportunity for learner culmination and to acknowledge others who helped host the training.

Group Photograph: I like to have a group photo at the end to include with the documentation and resources so people have a memory of the training.

6. Trainer Reflection

A colleague who is a trainer once said to me, “If you are good at what you do, you are always learning.” After every training I deliver, whether a brief 30-minute session or one that spans multiple days or months, I conduct an after action review.
I find that investing a few quiet hours in reviewing the survey results, documentation, materials, and my lesson plan helps me improve the instructional design and content of a training session. Sometimes I write up a brief reflection called a “Do Over” that describes how I might design exercises or facilitate the instructional process differently, and what changes I’d make to the content. I also use this as a trainer’s journal to pick out one or two training skills that I want to improve, and spend time reflecting on what I will do differently the next time I facilitate a training. Jane Bozarth’s “From Analysis to Evaluation: Tools, Tips, and Techniques for Trainers” has some excellent templates and examples of the trainer’s evaluation of their workshop.

If you deliver training, how do you evaluate your work to improve it or prove results?