For as long as learning has played a prominent role in organizations, metrics have played a prominent role in learning. If CLOs want to be among the heavy hitters at their organizations, they’ll have to make reliable and meaningful measurement a priority.
Metrics: Some CLOs swear by them, others at them. Chief Learning Officer magazine columnist Mike Echols sounded the clarion call this February for CLOs to rededicate themselves to measurement, whereas former Goldman Sachs learning executive Steve Kerr called measurements like ROI “silly.” Love them or hate them, though, metrics likely will play a key role in resolving a serious challenge for CLOs. Few CEOs and CFOs are satisfied with the impact of their organizations’ learning programs. What’s needed is a frank discussion of whether metrics have value — and which metrics are worth the effort.
We’re All in This Boat Together
First, there needs to be reconciliation between the two camps: those who believe metrics are our salvation and those who believe they’re a waste of time and energy and a feeble attempt to justify the value of learning. Secretly believe what you want, but the heads of your organization — you know, the ones who dictate the budget for learning — are dissatisfied and want evidence of success. Two years ago, when Accenture asked executives about their learning programs, only 10 percent were very satisfied. So after innumerable conferences, classes and advances in technology, CLOs still have a long way to go.
ROI Is Dead
Most organizations don’t bother to measure the financial benefits of learning. To a large degree, this is because measuring the ROI of learning is perceived as difficult, time-consuming and labor intensive.
More than half of the organizations surveyed by Bersin & Associates cited the need to improve alignment with the business as one of the top learning and development challenges for 2008. Why would we keep using an ROI method with a two-decade-plus track record of undistinguished results?
Simply put, it’s because the method being used for ROI is too complicated. Some, for example, attempt to isolate the effect of training and separate it from other interventions (e.g., technology, marketing, process improvements) as if conducting a laboratory experiment. A lot of hard work and effort goes into scrubbing financial data and teasing out the contribution of the learning function. This is but one small example of how we make Level 4 evaluation needlessly difficult for ourselves.
This apparent overcomplication drives many learning leaders to abandon ROI. But they still must do something to report on the training function. Many reach for readily tracked data, such as course-completion rates and training hours — none of which do much to establish the value of learning at the C-level.
Offering up these metrics in some ways demonstrates a fundamental disconnect with the business. Course-completion statistics show nothing in the way of alignment with the goals of the business. There is no inherent value solely in the completion of courses.
Long Live Impact of Learning
For those who want to demonstrate the value of employee development programs to the business, meet Impact of Learning (IOL). IOL is simple enough to be practiced in 100 percent of organizations, not just those elite few that have the time and resources to engage in ROI.
There are three basic steps to IOL as a way of demonstrating the value of learning:
1. Insight: Create a map of the linkage from business goals to training initiatives and identify success metrics.
2. Individual: Gather overall data on success metrics, in addition to personal success stories.
3. Impact: Develop an impact report rich with compelling stories of individual and overall success so executives clearly see the impact on organizational issues about which they truly care.
Start With Insight
The road to business alignment needs to be mapped early on. For each major initiative involving training, it is vital to gain insight and identify the business drivers up-front and in writing. Robert O. Brinkerhoff describes a delightfully simple one-page format to represent business alignment in his book, The Success Case Method. IOL insight involves understanding the business case during the initial phase of work, when the training staff first starts to understand how they can support an initiative.
Insight essentially means metrics — specifically, effectiveness metrics that business leaders are committed to achieving. But it’s important to point out that insight metrics rarely are as broad as overall sales or revenues. For example, for the leader of sales at a manufacturing company, sales from new customers were a key focus. That nuance was critical both in planning the learning curriculum and any future assessment of impact. In this case, substantial resources needed to go toward training to sell into new markets.
Or in another example, sales metrics can be targeted specifically to a channel such as independent dealers vs. big-box retailers. Whatever the insight gained from front-end planning, learning leaders must establish from the start that they have clear intent to deliver results.
Mapping this makes our intentions even more visible and lays the groundwork for assessing impact. A typical map of the linkage to success has four to six columns of information. Sample column headings may include:
• Who will be trained (audience, job role or title)?
• What will they need (knowledge and skills, competencies or expected behaviors)?
• What should they do (key actions)?
• What should individual results be (outcomes or job results)?
• What are the company’s goals (business objectives or strategic results)?
The value of the insight map is that it documents the shared understanding of what needs to happen and how and when success will be measured. Armed with that, a CLO can enhance the curriculum accordingly — adding where needed, pruning courses that no longer fit and planning learning strategies that make the most sense for key audiences.
Individuals Make It Happen
The CLO with insight sets the stage, but it is the individual learner who makes it happen. The key to the second part of the IOL process involves a celebration of how people create results. Instead of poring over tedious stacks of data or creating cumbersome financial models to calculate training’s ROI, we prefer to listen to the anecdotes of the individuals we have trained. Are they creating the important outcomes we mapped earlier? If so, are they doing so by virtue of the concepts and skills we provided?
The CFO of a major health care system articulated how she used what she had learned to create a balanced set of outcomes, including margins that were within financial targets. This was achieved without sacrificing the core mission of serving those who could not afford health care. The powerful story she told clearly demonstrated success in the context of the organization’s goals and fully acknowledged the value of what had been learned.
This is the most compelling type of data describing the impact of learning. The individual perspective demonstrates the alignment of employee development with the goals of the business. If you can’t find those individual examples in the organization, then you need to take a hard look at what you are teaching. Perhaps the curriculum failed to keep up with what really matters in the current environment.
Business goals change and so should your training content. That’s important data, as well. Individuals also can tell you if you are not teaching them things that achieve results. They are a great source when it comes to showing the need to make important improvements in training delivery. This side of the equation is not supplied when learning functions rely on an ROI approach.
I have never seen an ROI study return an answer of less than an 80 percent gain. (Given that, it’s amazing that CFOs don’t clamor to double the training department budget each year.) Yet, overall, individual performance stories are far more credible to executives than the fantastic results often reported in ROI studies. No doubt, it’s still worth looking at whether the sales group is meeting its sales targets and whether the finance department is managing the business within margin targets. At a high level, this context is important to see if the business is making progress toward its larger goals. But assessments at the individual level should not be overlooked.
Say It With Impact
Learning leaders can’t neglect the most important part at the end of the IOL process: letting their internal customers in the organization know they care about the results. The best way to do this is with a written report: an impact statement. If you just spent $500,000 to improve customer service, your internal customers need to hear something more important than the fact that employees sat through 3,000 hours of training.
For example, a large health insurance company is in the process of teaching more than 200 customer service representatives how to answer questions about coverage more quickly and consistently. The organization is geared up to analyze expected improvements in its first-call resolution one month after implementation. It’s critical to get that message out in a timely fashion. Few would care if the learning department reported the number of course completions. Not many would listen if it decided to take the next two months analyzing data and then touted some complex rationale around a 120 percent return on investment.
But when it tells the story of the 15-year veteran customer service representative who applied what she learned and immediately increased her ability to provide full and complete answers to callers, now it is really saying something of value. Her story, combined with overall results on call accuracy and decreases in escalated calls, is a solid statement of impact.
An impact statement should include at least two or three cases, but it ought to have as many as are needed to lend credibility and clarity to the situation. It also should include the original map constructed in the insight phase. And most importantly, it needs to be produced as soon as outcomes can be seen in the organization. Business leaders need to get the facts quickly before their attention moves on to the next great challenge facing them.
Every couple of decades or so, thought leaders unveil a model that transforms learning’s approach to evaluation. In the late 1950s and early 1960s, it was Donald Kirkpatrick’s four levels of training evaluation. In the 1980s, the new idea was to prove the return on investment of training. Comparing the cost of a learning intervention to the performance gain it produces seems like a worthy endeavor. ROI might seem viable to a small number of companies willing to put forward the time and money for such an analysis. But even if properly executed, it may overlook whether the learning intervention was organizationally aligned to begin with.
And that’s the challenge facing our industry right now: proving our contribution to business goals. IOL is a method that ensures linkage to business goals while yielding faster and more compelling evaluation results. It’s time for another sea change in learning measurement.