Measuring Up

by John Murray

PDI delivers training across many regions and locales, and we strive to engage and delight all of our learners. The question that comes up, though, is: how do we know we are doing what we set out to do? John Murray III, PDI's Director of Business Analytics, shares some insight into PDI's analytic methodology and why it is so important to successful learning events.

Is there a specific approach you take when developing a learning measurement strategy?

PDI puts the client at the center of our process for building successful learning events and strong measurement tools. Measuring learning is a progression, not a one-time event. We ask questions at the beginning of and throughout the development process, which helps us define success and check on progress as we build the training.

We engage subject-matter experts early in the process to help ensure the accuracy of the content and the validity of the questions, and to align both with the objectives of the course and the department.

Lastly, we solicit best practices, when available, to illustrate the details of the content. This helps us refine the training and the measurement for the team and the individual.

What makes PDI surveys different from the typical smiley sheets?

PDI attempts to capture the whole experience, not just emotional satisfaction with the course. We ask the basic questions about the content and the trainers, as well as the pace and length of the class. We also add questions related to specific portions of the content. This helps us better predict how the information will be used from day one, rather than on day 91, when the learner has forgotten where he or she learned it.

How do you create knowledge assessments for each course?

We like to create tools that help reinforce the material for the learner. In our Checks for Understanding, we try to write questions only about material that is covered well in the course, and we make sure the questions make sense to people who may not be familiar with the information covered in the training. We also align the mix of questions with the type of material being covered: if a course shares more data, more of its questions relate to that data. Lastly, we monitor the answers on each Check for Understanding, and if learners are confusing similar answers we recommend changes to the question structure, the answer choices, or how the information is being presented. We use our own data collection process to provide continuous improvement for each of our clients.
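
As a loose illustration of that kind of answer monitoring, the sketch below flags questions where a single wrong answer attracts a large share of responses, a common sign that learners are confusing similar options. The function, the sample responses, and the 30 percent threshold are hypothetical, not PDI's actual tooling.

```python
from collections import Counter

def flag_confusing_questions(responses, correct, threshold=0.30):
    """Flag questions where one wrong answer draws a large share of
    responses, suggesting learners are confusing similar options.

    responses: {question_id: [chosen_option, ...]}
    correct:   {question_id: correct_option}
    threshold: share of responses a wrong answer must attract to be flagged
    """
    flagged = {}
    for qid, chosen in responses.items():
        counts = Counter(chosen)
        total = sum(counts.values())
        for option, count in counts.items():
            if option != correct[qid] and count / total >= threshold:
                flagged[qid] = (option, count / total)
    return flagged

# Hypothetical data: on Q3 the wrong answer "B" attracts three of five
# responses, so Q3 would be flagged for a review of the question wording,
# the answer choices, or how the content is presented.
responses = {"Q1": ["A", "A", "B", "A", "A"],
             "Q3": ["B", "C", "B", "C", "B"]}
correct = {"Q1": "A", "Q3": "C"}
print(flag_confusing_questions(responses, correct))
```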

What type of reports do you share with your clients?

Reports are unique and are based on the information being collected, as well as the objectives our clients have shared with us for their program. We have standard templates that provide a great deal of flexibility in how data is displayed and shared, and we create interactive Excel tools for client programs with established, stable data requirements. For clients who wish to delve a little deeper into the data and learn which changes will make a difference in their program and learning environment, we conduct a full analysis using a variety of presentation tools. In fact, we strongly encourage that level of engagement with our clients.
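
For illustration only, here is a minimal sketch of the kind of summary that might feed a standard report template. The survey export, its column names, and the ratings are hypothetical assumptions, not a real PDI data set.

```python
import pandas as pd

# Hypothetical survey export: one row per learner response, with a 1-5
# rating per question and the course module the question relates to.
responses = pd.DataFrame({
    "module":   ["Intro", "Intro", "Defensive Driving", "Defensive Driving"],
    "question": ["pace", "content", "pace", "content"],
    "rating":   [4, 5, 3, 4],
})

# A simple report view: average rating and response count per module/question.
report = (responses
          .groupby(["module", "question"])["rating"]
          .agg(avg_rating="mean", responses="count")
          .reset_index())

print(report)
# A summary like this can then be exported to an interactive Excel workbook,
# e.g. report.to_excel("program_report.xlsx", index=False).
```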

Are there any recommendations you share with your clients prior to building measurement tools and analyzing the data collected?

Having clear objectives is a great way to start your learning plan. It is tough to truly measure a program, class, or event without them, and those objectives will help you choose the right tools to collect and analyze the information.

It is critically important to use the information you collect to improve the learning event or future learning events. Participants will thank you for making changes based on their feedback, and they will grow tired of giving it if you do not. When making changes, though, align them with the feedback of the many, not the few. Make the comments part of your analysis and give the learner's voice a data point; that voice is just as important as the quantitative data you calculate.
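
As a toy example of giving comments a data point, one simple approach is to tag each open-ended comment against a few themes and count how many learners raise each one, so changes follow the many rather than a single memorable remark. The comments, themes, and keywords below are hypothetical.

```python
from collections import Counter

# Hypothetical open-ended comments from a post-class survey.
comments = [
    "The pace felt rushed in the afternoon",
    "Great instructor, very engaging",
    "Too much material for one day, pace was fast",
    "Loved the hands-on driving exercises",
]

# Illustrative themes and the keywords used to detect them.
themes = {"pace": ["pace", "rushed", "fast"],
          "instructor": ["instructor", "trainer"],
          "exercises": ["hands-on", "exercise"]}

# Count how many learners raise each theme (at most once per comment).
counts = Counter()
for comment in comments:
    text = comment.lower()
    for theme, keywords in themes.items():
        if any(keyword in text for keyword in keywords):
            counts[theme] += 1

print(counts)  # e.g. Counter({'pace': 2, 'instructor': 1, 'exercises': 1})
```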

Lastly, share your results with stakeholders, leaders, partners, and the learners themselves. Sharing results helps you gain support for future projects and encourages learners to keep completing the surveys that help the programs improve.