Marie Wallace has enjoyed a fulfilling career as a librarian, beginning in 1951 in academia with the University of California and transitioning in 1971 into the private law library world until her 1995 retirement from O’Melveny & Myers. She is the 1997 recipient of the American Association of Law Libraries’ highest honor, the Marian Gould Gallagher Distinguished Service Award. Throughout her professional life, Marie has been a guiding force in the Southern California Association of Law Libraries, Practising Law Institute’s programs for law librarians and Teaching Legal Research in Private Law Libraries (TRIPLL).
Today, Marie has embarked on a new path she terms “Life in Progress,” which enables her to pursue a diversity of interests as a masters swimmer, law librarian, trainer, storyboarder and designer of wearable art. She continues to be a dynamic speaker and prolific writer on such topics as private law library management, presentations and training. She is a member of Toastmasters International and is active with the American Society for Training and Development (ASTD) and in continuing education for private law librarians. She devotes her “free” time to various non-profit and civic activities.
Professionals who do occasional training in their area of subject expertise often do not give much thought to evaluation. They know it should be done but wait until the last minute to whip up a participant reaction survey, commonly referred to as a smile sheet. At the end of the training, they beseech participants to complete the survey (but provide neither the time nor the motivation to do so), look at the responses with apprehension, heave a sigh of relief when they are positive and then consider the evaluation complete.
As an evaluation tool, the smile sheet is familiar, inexpensive, quick and easy to create. Yet evaluation is a complex job, and no single tool can measure the effectiveness of training. A broader and newer view is that evaluation must be multi-level (distributed throughout the organization), continuous throughout the training cycle, focused on business results and supportive of the organization. The one-shot smile survey does not meet these requirements by itself, but it can be combined successfully with other tools. Here is what you need to know and how to go about it.
✓ Take a crash, self-taught course in evaluation. Start with Jane Holcomb’s Make Training Worth Every Penny: On Target Evaluation. Familiarize yourself with the Kirkpatrick model (Donald Kirkpatrick, Evaluating Training Programs: The Four Levels), which uses the participant reaction survey as the first of four levels (a short planning sketch follows the list):
1. Get participant reactions (smile sheet)
2. Measure participant learning
3. Observe change of on-the-job behaviors
4. Look for organizational results
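If you keep planning notes electronically, a small aid can keep all four levels in view. Here is a minimal sketch in Python; the pairing of levels with candidate instruments is illustrative, drawn from suggestions elsewhere in this article rather than prescribed by the model itself.

    # Illustrative planning aid: pair each Kirkpatrick level with
    # candidate instruments. The pairings are assumptions drawn from
    # this article, not prescriptions of the model.
    KIRKPATRICK_LEVELS = {
        1: ("Reaction", ["end-of-course participant survey (smile sheet)"]),
        2: ("Learning", ["pre-test/post-test", "skills demonstration"]),
        3: ("Behavior", ["on-the-job observation", "supervisor interview"]),
        4: ("Results", ["business metrics", "return on investment"]),
    }

    for level, (name, tools) in KIRKPATRICK_LEVELS.items():
        print(f"Level {level} ({name}): {', '.join(tools)}")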
✓ Begin the evaluation process in tandem with the analysis or needs assessment phase of the Instructional Systems Design (ISD) process and continue it through the objectives, design and delivery phases.
✓ When specific training objectives are identified, consult with others in your unit and organization, including the potential trainees, to evaluate whether the objectives are specific, measurable, achievable, realistic and doable in the time frame available.
✓ Evaluate the instructional technique(s) to use. How can new on-the-job behaviors be measured? Is a pre-test necessary? How long after training should a follow-up be made? If soft skills such as customer service are involved, will the participants be open to role-playing? Do the training facilities need to simulate the workplace reality?
✓ Select evaluation tools for all the program components: materials, instruction methods, facilities, arrangements, equipment, media used, costs and organizational support. An excellent program can be undermined if it is offered at a time that conflicts with other organizational priorities or is held in cramped, uncomfortable facilities.
✓ Explore whether training is the best solution to the problem you are trying to solve. Sometimes mentoring, coaching, counseling, or a job aid can solve the problem as well.
✓ Determine what you, the other decision makers and the stakeholders want the evaluation process to do, and how evaluation fits into your knowledge management plan. Evaluation can:
Provide feedback
Compile a database of who is trained, licensed or credentialed
Determine return on investment (a simple formula is sketched after this list)
Measure costs and variables to administer the training budget
Report relationships between training and improved job performance
Give positive reinforcement to trainees to help them stay motivated
Show trainers where to improve
Build a learning community based on shared experiences
Indoctrinate new staff
Promote change
Recognize accomplishments
Test skill levels
Identify fast-trackers
Focus energy on specific personnel issues
Provide data to play the power game to gain a share of the organization’s resources
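On the return-on-investment item: a common formulation (general practice, not specific to this checklist) divides net benefits by costs. A minimal sketch in Python, with hypothetical figures:

    def training_roi(benefits, costs):
        """Return ROI as a percentage: (benefits - costs) / costs * 100."""
        if costs <= 0:
            raise ValueError("costs must be positive")
        return (benefits - costs) / costs * 100

    # Hypothetical figures: $12,000 in measured benefits against
    # $8,000 in training costs gives a 50% return.
    print(f"ROI: {training_roi(12000, 8000):.0f}%")

The hard part, of course, is measuring the benefits, which is why evaluation must reach the behavior and results levels.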
✓ At the design phase, consult with a colleague or professional trainer to review your design and the assumptions behind your choice of instructional technology. Will the participants have the background information or vocabulary? Will they be familiar with the processes related to the procedures they learn? For instance, if you design a cite-checking course, will the participants be familiar with state and federal court structures? Are they already trained on Lexis and Westlaw? Will they be familiar with the format of the legal documents to be cite-checked? Do they know citation terminology? Are they familiar with the Harvard Bluebook and other citation manuals?
✓ Include a quiz with the training announcement so the participants can self-test and self-discover what they need to know. People remember what they discover for themselves.
✓ Familiarize yourself with the research reported in the professional training literature indicating that there is little correlation between favorable participant reaction and the amount of learning that occurred. The “smile sheet” euphemism comes from the observation that this tool is generally supportive of the trainer. It encourages instructors to entertain rather than focus on participant learning, suggests to participants that learning is passive rather than active and comes too late in the training cycle to correct any design flaws.
See Don’t Smile About Smile Sheets and Going Beyond Smile Sheets…How Do We Know If Training Is Effective?
✓ Design an end-of-course participant survey to assess the immediate impression of the program, but ask only questions that give you information you intend to use or report. Tailor a form specifically for each program. Avoid generic forms; they send a “ho-hum” message regarding evaluation, and participants will respond accordingly.
✓ Keep the form simple and to one page. Make the form visually attractive. Convert reactions to numerical ratings for easier tabulation, e.g., 1 = poor, 2 = fair, 3 = good, 4 = excellent. Fewer categories are better than more: ratings are impressions, not calibrated measurements, and ten categories afford no more accuracy than four.
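If the completed forms are keyed in, the tabulation itself is trivial. A minimal sketch in Python (the four-point scale and labels come from the item above; the responses are hypothetical):

    from collections import Counter

    def tabulate(ratings):
        """Summarize 1 = poor ... 4 = excellent ratings: counts and mean."""
        labels = {1: "poor", 2: "fair", 3: "good", 4: "excellent"}
        counts = Counter(ratings)
        for score, label in labels.items():
            print(f"{score} ({label}): {counts.get(score, 0)}")
        print(f"mean rating: {sum(ratings) / len(ratings):.2f}")

    # Hypothetical responses from a twelve-person session.
    tabulate([4, 3, 3, 4, 2, 4, 3, 3, 4, 3, 2, 4])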
✓ Include several open-ended questions asking participants for comments or suggestions, such as “What are the three most useful things you learned?”
✓ Provide time for the participants to complete the form before the training ends; “take-aways” never get returned. Devise a motive or reward for completing the survey: a job aid, a promotional item or a homemade memento of the training experience.
✓ Identify the program, trainer and date at the top of each survey. To encourage completely honest responses, keep the respondents anonymous and make sure the return procedure preserves that anonymity. In other words, don’t have someone standing at the door reading the responses.
✓ Explain at the beginning that the evaluation is an integral part of the training. Participants are part of the training team and their input is vital. Transition to the survey with a brief wrap-up or brainstorming session. The group’s input will verbally “jump start” individual thinking on the written form.
✓ Use the smile sheet immediately after a closing module on completion of an Action Plan, a first step toward applying the learned skills and knowledge. This puts participants in a posture of taking responsibility for their learning when they respond to the survey.
✓ Before and after the survey, use other evaluation tools: interview, test, focus group, job performance observation or brainstorming. Interview not only the participants but also the other stakeholders (trainers, supervisors, managers, co-workers, customers) after six weeks and again after three months to find out whether there are new on-the-job behaviors. At this point, you may discover environmental impediments to using the newly acquired knowledge and skills, such as a lack of equipment, supervisor support or authority.
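Those checkpoints are easy to let slip. Here is a minimal date-calculation sketch in Python (the six-week and three-month intervals come from the item above; treating “three months” as 90 days is an assumption):

    from datetime import date, timedelta

    def follow_up_dates(training_day):
        """Six-week and three-month follow-up checkpoints for a session."""
        return {
            "six-week follow-up": training_day + timedelta(weeks=6),
            "three-month follow-up": training_day + timedelta(days=90),
        }

    for label, when in follow_up_dates(date.today()).items():
        print(f"{label}: {when:%Y-%m-%d}")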
✓ Understand that learning is often stressful and confusing, especially when learning complex skills or relearning old proficiencies. The state of having learned usually has emotional rewards but the process of getting there can be downright humiliating. Adult learners with bad previous experiences may freeze up at the prospect of formal training.
In a rapidly changing world, evaluation tools need to be constantly refined to measure the value of training to your organization.