Simpler Surveys, Fewer Reports, More Data Action – that’s the new face of learning evaluation

Consider a question hypothetically posed to L&D professionals who use learning evaluation as part of their role: "If you could wave a magic wand and change some things about your learning evaluation process, what would they be?"

The answers were strikingly similar across a wide variety of L&D professionals, regardless of industry, geography, or organizational size. Three primary points came up again and again.

Point 1: Make surveys simpler. They are too long, too complicated, and ill-suited to how modern learners and stakeholders respond to information.

Point 2: Run fewer reports. Reporting has become too cumbersome, too complex, and too hard to consume for practitioners to quickly find insight.

Point 3: Act more on the data. There is a lot of data (sometimes too much), and when it is spread across too many reports, actually using it becomes an afterthought: by the time the relevant people have exhausted their limited effort on data collection and report generation, not much energy is left to improve where the data says we should.

Given the above, it makes sense to rethink the whole evaluation process and to fuse modern performance improvement methodology with historically reliable learning evaluation methodology. Do that, and data collection becomes a better user experience, reporting becomes an exploratory and insightful process, and automation and technology create the visibility and accountability to act on the data that pinpoints improvement areas.

One need only study the facts about surveys sent to employees:

51% of survey requests are answered on a mobile phone, so the ability to complete numerous or difficult questions drops dramatically; these respondents are most likely answering on their phone while moving from one meeting to another in a busy day.

22% of survey data gathered is usable, so responses collected from surveys that branch, change scales, or require significant thought tend to fall into the 78% of survey data that is unusable.

5% of survey data gathered is actually used for improvement, so expending money, time, and people on collecting and reporting data that goes 95% unused is a waste of all three.

If we understand this about the modern employee responding to a learning evaluation, why not change it? Here are three ways to counteract the three most common complaints about learning evaluation.

Suggestion 1: Significantly reduce the number and complexity of survey questions. Our standard evaluations have under 10 questions; some have 5, and some have just 1. The questions are straightforward, use a consistent scale, don't branch, and are minimal, for both a good user experience and a better reporting experience. They leverage Net Promoter Score (NPS) aligned to historically sound learning measurement methodology, yet they are enormously agile: surveys can easily be tweaked for unique learning experiences (digital, on-the-job, etc.). The goal is to be simple to author, administer, experience, and report. Surveys aren't the only data source, either; use operational data, utilization data, and business data too.
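For readers unfamiliar with the mechanics, scoring a single NPS-style question is simple, which is part of its appeal as a lightweight evaluation instrument. A minimal sketch of the standard calculation follows; the example responses are hypothetical.

```python
# Standard NPS calculation on the 0-10 "How likely are you to
# recommend..." scale. The example responses below are hypothetical.
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

responses = [10, 9, 9, 8, 7, 6, 10, 4, 9, 8]
print(nps(responses))  # 5 promoters, 2 detractors, 10 responses -> 30
```

Because the scale never changes and there is no branching, the same one-line scoring rule works across every course, modality, and audience, which keeps authoring and reporting consistent.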

Suggestion 2: Reduce the volume of available reports. Why make 30 iterations of your data available in reporting when 3 will do the trick? If someone wants to do a deep dive, give them a CSV file. Reporting should appeal to three audiences: 1) Directors/VPs – give them a summary dashboard with aggregates and internal and external comparisons; 2) Managers – give them an exploratory pivot table that color-codes demographics and attributes to spot trends and profiles of impact and opportunity; and 3) Administrators/tacticians – give them summaries that drill into questions, then into demographics, for easy navigation and a thoughtful progression through highly tactical information. Too many reports means too many choices, and modern practitioners don't have time to make choices.

Suggestion 3: Close the loop where improvement is needed. The golden rule of NPS is to "close the loop." NPS is not just about collecting data with an NPS question, nor about calculating a score with the NPS formula; at its heart, NPS is a tool and process for improving performance. Leverage automation and workflow technology to link a poorly performing KPI to a specific improvement plan assigned to specific people who will be accountable for solving the problem. Change occurs when it is formal, accountable, and visible. Otherwise we get forgetfulness and finger-pointing while the original issue persists or gets worse.
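The close-the-loop workflow described above can be sketched in a few lines: any KPI below its goal automatically generates an improvement plan with a named, accountable owner. This is a hypothetical illustration, not a specific product's workflow; the KPI names, thresholds, and owners are all invented for the example.

```python
# Hypothetical "close the loop" workflow: every KPI below goal gets an
# improvement plan with an accountable owner. All names and numbers
# below are illustrative only.
from dataclasses import dataclass

@dataclass
class Kpi:
    name: str
    score: float   # current value, e.g., an NPS result
    goal: float    # target value
    owner: str     # person accountable for this metric

def close_the_loop(kpis):
    """Return an improvement plan (KPI, gap to goal, owner) for each KPI below goal."""
    return [
        {"kpi": k.name, "gap": round(k.goal - k.score, 1), "owner": k.owner}
        for k in kpis
        if k.score < k.goal
    ]

kpis = [
    Kpi("Course NPS", score=12, goal=30, owner="A. Rivera"),
    Kpi("On-the-job application", score=75, goal=70, owner="B. Chen"),
]
print(close_the_loop(kpis))
# -> [{'kpi': 'Course NPS', 'gap': 18.0, 'owner': 'A. Rivera'}]
```

The point of the sketch is the structure, not the code: each below-goal metric produces a formal, visible artifact with a name attached, which is what makes the follow-through accountable rather than optional.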

Finally, change the narrative. The story to tell management (VPs and directors in lines of business) is one of performance, not of how much credit you can take for what the program did. Supply chain, finance, IT, and sales all communicate metrics with a common theme: performance. Where is performance at or above goal? Where is it below goal? When it is below goal, what are we doing about it? That is the conversation worth having; it won't raise skeptical eyebrows, because it mirrors the conversation executives already have with everyone else who brings them a dashboard or scorecard, and it will have them leaning in to understand the results.

In the end, modern learners need modern evaluation. The steps above can help evolve learning evaluation from a non-value-added data collection and report-running exercise into a thoughtful, persistent process that focuses on performance and on acting on the data when performance isn't where it should be.

