Many companies are exploring how to transition from traditional learning methods to a digital-first approach. Much of this involves a move away from classroom-based teaching towards online technologies, such as watching videos and answering questions to evaluate learning outcomes.
To be effective, organizations need to collect data on a number of factors related to their training: how applicable and useful the training was, how it was perceived, the extent of employee engagement, and the effectiveness of the learning outcomes. By gathering and analyzing these data, training managers can show business stakeholders what happened after the training concluded. The same data can also be used to predict how new learning programs will perform.
The need for analytics
Quoted on the website Training Industry, Christopher Yates, head of learning and development at Microsoft, sees learning and development data analysts as essential to business digital transformation:
“It’s essential. I can’t imagine having a learning and development team today that is not supported by dedicated data analysts. Without learning and development analytics, you’re basing your decisions on luck or the way we’ve always done things. Without insight, all you have is a guess, a hunch or a feeling in your stomach about what’s working or not working.”
While many businesses have sought the opinions of trainees through course evaluation questionnaires, to be truly effective they need to shift away from simple training metrics towards learning analytics. According to the website eLearning Industry, there are three key questions employers should be able to answer following a training course:
- How did the training help to achieve business objectives?
- What are the long-term positive business impacts of the training?
- How will the training directly enhance team performance?
Measuring learning outcomes in the workplace
The aim here is to improve the impact of learning programs on actual job outcomes. The limitation of training metrics, Scott Weersing notes for the Association for Talent Development, is that they only show what has already happened. Where businesses need to be, however, is with what is taking place right now, after the training has concluded.
Big data can also be used to assess factors like:
- Course starts/course completions
- Course access points
- Time on system
- Clicks and scrolling
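As a rough illustration of how such factors might be derived, the sketch below aggregates a hypothetical LMS event log (the record format, course code, and learner names are invented for this example) into completion rates and time on system per course:

```python
from collections import defaultdict

# Hypothetical LMS event log: each record notes a learner, a course,
# an event type, and session minutes (0 where not applicable).
events = [
    {"learner": "ana", "course": "sec101", "event": "start",    "minutes": 0},
    {"learner": "ana", "course": "sec101", "event": "complete", "minutes": 0},
    {"learner": "ben", "course": "sec101", "event": "start",    "minutes": 0},
    {"learner": "ana", "course": "sec101", "event": "session",  "minutes": 25},
    {"learner": "ben", "course": "sec101", "event": "session",  "minutes": 40},
]

def engagement_summary(events):
    """Aggregate raw events into simple per-course engagement metrics."""
    starts = defaultdict(set)       # learners who started each course
    completions = defaultdict(set)  # learners who completed each course
    minutes = defaultdict(int)      # total session time per course
    for e in events:
        if e["event"] == "start":
            starts[e["course"]].add(e["learner"])
        elif e["event"] == "complete":
            completions[e["course"]].add(e["learner"])
        elif e["event"] == "session":
            minutes[e["course"]] += e["minutes"]
    return {
        course: {
            "completion_rate": len(completions[course]) / len(starts[course]),
            "total_minutes": minutes[course],
        }
        for course in starts
    }

print(engagement_summary(events))
# {'sec101': {'completion_rate': 0.5, 'total_minutes': 65}}
```

A real deployment would pull these events from the learning platform's reporting API rather than a hard-coded list, but the aggregation step is the same idea.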
How might this work in practice? Weersing gives an example in which a group of employees is given training in email security. The following day, a series of fake phishing emails is deliberately sent to determine whether the employees practice the appropriate behaviors when faced with the security risk. The effectiveness of the training course can then be measured by how many employees correctly identify the phishing emails.
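Scoring such a phishing simulation reduces to a single proportion. The sketch below assumes a made-up set of per-employee results (names and outcomes are illustrative, not from the source):

```python
# Hypothetical simulation results: True if the employee correctly
# reported the fake phishing email, False if they fell for it.
responses = {
    "ana":   True,
    "ben":   False,
    "chloe": True,
    "dev":   True,
}

def pass_rate(responses):
    """Share of employees who correctly identified the phishing email."""
    return sum(responses.values()) / len(responses)

print(f"{pass_rate(responses):.0%} identified the phishing email")
# 75% identified the phishing email
```

Tracking this rate across repeated simulations, before and after training, is what turns the exercise from a one-off test into a learning-analytics measure.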
Learning analytics can also be used to improve the effectiveness of learning. Here businesses should avoid the temptation to use analytics to prove that “learning is working” and instead develop actionable insights for improving the business impact of learning. The latter leads to better learning outcomes.
According to education technology writer Pete Schroeder, the return on investment from training can be assessed by collecting and analyzing data using electronic systems. The information required includes training costs and employee time, set against post-training measures of productivity and error reduction. Other measures, such as employee retention and revenue generated per employee, can be added to this.
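The cost-versus-benefit comparison described above can be sketched with the standard ROI formula, (benefits − costs) / costs. All the figures and parameter names below are illustrative assumptions, not values from the source:

```python
def training_roi(training_cost, employee_hours, hourly_rate,
                 productivity_gain, error_savings):
    """Return on investment: (total benefits - total costs) / total costs.
    Employee time is converted to a cost via a loaded hourly rate."""
    total_cost = training_cost + employee_hours * hourly_rate
    total_benefit = productivity_gain + error_savings
    return (total_benefit - total_cost) / total_cost

# Illustrative figures only.
roi = training_roi(
    training_cost=20_000,      # course fees and materials
    employee_hours=400,        # staff time spent in training
    hourly_rate=50,            # loaded hourly cost per employee
    productivity_gain=55_000,  # value of post-training productivity lift
    error_savings=15_000,      # value of post-training error reduction
)
print(f"ROI: {roi:.0%}")  # ROI: 75%
```

Measures like retention or revenue per employee would simply be added to the benefit side of the same calculation, once a monetary value has been attributed to them.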
In terms of software for such analysis, a host of small startups, together with major players like PwC’s Talent Analytics and Predictive Services, offer tools for learning analytics. The best software allows data to be presented in different ways. This is especially important as analysis becomes more complex; graphical representations of the data, such as dashboards and other dynamic interfaces, make the findings more intelligible.