Learning Evaluation – A Good Process is Common Sense and Simplicity

Sometimes overthinking something leads to unnecessary complexity. In learning evaluation, I recently spoke with 180 organizations, and a dominant theme across them all was the need to get back to basics, focus on simplicity, and apply common sense to the process rather than trying to outsmart it.

So, what's happening to deploy common sense and simplicity? A lot, actually. Here are a few noteworthy trends in learning evaluation; if you're thinking about modernizing your process, consider integrating some of these into it.

Simplify the evaluation itself. Use fewer questions (more than 10 is too many). Make the questions straightforward. Use Net Promoter Score (NPS) to increase validity. Apply user-centered design principles to enhance the respondent experience. I'm happy to share a few links to our evaluations; just email me if you'd like them.
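For readers unfamiliar with the mechanics, the NPS calculation mentioned above follows a standard formula: the percentage of promoters (ratings of 9–10 on a 0–10 "likelihood to recommend" scale) minus the percentage of detractors (ratings of 0–6). A minimal sketch, with hypothetical sample responses:

```python
def nps(scores):
    """Compute Net Promoter Score from 0-10 'likelihood to recommend' ratings."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)   # ratings 9-10
    detractors = sum(1 for s in scores if s <= 6)  # ratings 0-6
    # NPS = % promoters - % detractors, typically reported as a whole number
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical batch of course-evaluation responses
responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(nps(responses))  # 5 promoters, 2 detractors -> (5 - 2) / 10 * 100 = 30
```

Passives (7–8) count toward the denominator but neither group, which is why they dilute the score rather than move it.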

Think 'less is more' when it comes to reports. Don't build giant pivot tables, and avoid over-engineered dashboards that work for some audiences but not others. Focus on three reports: one for tactical people like instructors, who need immediate evaluation data; one for program managers, who need to stack-rank by program attributes (course, modality, location, instructor, etc.) and by learner demographics (years of service, business unit, etc.); and one for business stakeholders, with a small set of KPIs combining the business results they care about with evaluation results and activity information. Three reports are really all you need. I'm happy to share some report examples too, if you email me.
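The stack-ranking the program-manager report needs is just "average a metric by an attribute, then sort." A minimal sketch of that idea; the field names and evaluation records below are hypothetical, not from any particular reporting tool:

```python
from collections import defaultdict

def stack_rank(records, attribute, metric="score"):
    """Average a metric by an attribute (instructor, modality, etc.), ranked descending."""
    totals = defaultdict(lambda: [0.0, 0])  # attribute value -> [sum, count]
    for r in records:
        totals[r[attribute]][0] += r[metric]
        totals[r[attribute]][1] += 1
    averages = {k: s / n for k, (s, n) in totals.items()}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical evaluation records
evals = [
    {"instructor": "Lee", "modality": "virtual",   "score": 4.6},
    {"instructor": "Kim", "modality": "in-person", "score": 4.1},
    {"instructor": "Lee", "modality": "in-person", "score": 4.8},
    {"instructor": "Kim", "modality": "virtual",   "score": 3.9},
]
print(stack_rank(evals, "instructor"))  # Lee averages 4.7, ranking above Kim at 4.0
```

The same call with `attribute="modality"` (or location, business unit, years of service) produces every other ranking the report needs, which is the point of keeping it this simple.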

Don't lose sight of the point of the process: to improve programs and the people impacted by them. Focus your use of the simpler reports described above so that the majority of effort goes into taking action where the data shows a real need. If a business result is below goal, don't debate whose fault it is; collaborate with the business to improve it. If an instructor scores low relative to other instructors, put an improvement plan in place to make the situation better. If learners with over 10 years of experience reported lower impact from the recent management training, get creative about how everyone from instructor to program manager can change what they do to make those managers more engaged and help them see greater impact. Simply put, don't over-analyze the data; where performance needs to improve, improve it.
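"Take action where the data shows a real need" can itself be kept mechanical: compare each result to its goal and surface only the gaps. A minimal sketch; the metric names and numbers are hypothetical:

```python
def flag_gaps(actuals, goals):
    """Return only the metrics whose actual result falls below its goal."""
    return {m: (actual, goals[m]) for m, actual in actuals.items()
            if m in goals and actual < goals[m]}

# Hypothetical goals and results for a program scorecard
goals   = {"nps": 40, "completion_rate": 0.90, "manager_support": 4.0}
actuals = {"nps": 30, "completion_rate": 0.93, "manager_support": 3.7}
print(flag_gaps(actuals, goals))  # only nps and manager_support need action
```

Everything that returns empty needs no meeting, which is exactly the "don't over-analyze" discipline the paragraph above describes.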

I hope this short blog post is helpful. I kept it short and to the point. I guess I just like common sense and simplicity.

Words by Jeffrey Berk, COO, Performitiv. Reach Jeffrey at jeffrey.berk@performitiv.com.
