In a bid to rush to the next project, many E-Learning professionals carelessly skip over the evaluation process. Because we believe it is vital to learn from experience, our mantra of the moment is:
“Sometimes you win, sometimes you learn”.
Therefore, today we want to take the opportunity to unpack best practice in E-Learning evaluation.
Evaluation is often overlooked because people lack the time, patience or clarity on the right approach or tool to use. Often, they are unsure of what data to collect or how to use it once they have collected it.
The first step in evaluating an E-Learning campaign lies in identifying the success metrics or performance parameters. These are formed by aligning or comparing organisational aims and goals with the results of the learning campaign.
Following this crucial first step, there are a number of approaches to evaluation. One of the most widely used frameworks is Kirkpatrick’s four-level model.
Kirkpatrick identified four levels or criteria for successful training evaluation:
Level 1: Learner reaction
Level 2: Learning (knowledge gained)
Level 3: Learner behaviour
Level 4: Results or organisational impact
Sadly, many learning and development departments progress no further in their evaluations than level one or two.
They gauge how a selection of learners feel about the learning campaign and measure how much information they have retained, but fail to examine how that new knowledge is being applied on the job or what impact it is having on the organisation.
So what are the best tools to use to evaluate success?
We recommend a combination of the following:
- Knowledge assessments before and after training to measure learning progress;
- Conducting a pilot to spot flaws early on;
- Using learning analytics to track learner engagement and to pinpoint weak points within the E-Learning content;
- On-the-job assessments by management to gauge behavioural changes;
- A survey tool (such as SurveyMonkey) to gather feedback from learners about their personal experience;
- Encouraging learners and stakeholders to give feedback over the long term; and
- Applying Brinkerhoff’s Success Case Method, which examines success at an organisational or system level rather than just the individual learning programme. Brinkerhoff also asserts that it is useful to look at the “outliers”, i.e. those who have been particularly successful or unsuccessful; studying these outliers may reveal what differentiates E-Learning success from failure at an individual level.
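The pre- and post-training assessments in the first point are easiest to act on when each learner’s scores are reduced to a single comparable number. One common choice is Hake’s normalized gain, the fraction of the possible improvement a learner actually achieved. The sketch below is a minimal illustration under our own assumptions: percentage scores on a 0–100 scale, and function names that are ours rather than part of any particular learning platform.

```python
def normalized_gain(pre: float, post: float) -> float:
    """Return (post - pre) / (100 - pre); 1.0 means the full possible gain."""
    if pre >= 100:
        return 0.0  # no room left to improve
    return (post - pre) / (100 - pre)


def average_gain(scores: list[tuple[float, float]]) -> float:
    """Mean normalized gain across a cohort of (pre, post) score pairs."""
    gains = [normalized_gain(pre, post) for pre, post in scores]
    return sum(gains) / len(gains)


# Example: a learner moving from 40% to 70% closes half the remaining gap.
print(normalized_gain(40, 70))  # 0.5
```

A cohort average like this makes it easy to compare campaigns of different difficulty, since it measures improvement relative to how much room each learner had to improve.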
After using this toolkit to evaluate your E-Learning success, you may find that learners are happy, clued in, applying their knowledge and your organisation is flourishing.
Hooray for you!
If you have a different result and find issues with your E-Learning, then take a deep breath: it’s going to be okay!
There are a number of reasons why your E-Learning campaign might not have hit the target this time.
It could be the result of a lack of management support, a learning campaign that is too theoretical, or inaccurately identified knowledge gaps before training. Luckily, your evaluation should point you in the right direction.
An important nugget of knowledge to keep in mind is not to throw the baby out with the bath water. In the majority of situations, change can be made in small, incremental amounts.
We recommend asking the people who have given you feedback and identified issues to become part of the solution by suggesting what they would have needed or preferred to better their learning experience.
There is always room for improvement and evaluation is one of the most consistently helpful tools we have at our disposal as learning and development professionals. Just remember our mantra, “Sometimes you win, sometimes you learn”.
Please let us know your comments or share with others who you think may benefit from this. We’d love to hear about your biggest E-Learning successes and failures. What is your strategy for recovering and fixing your mistakes? Follow us on Twitter @aurionlearning for our latest blog articles and updates.