CLOs’ ability to help their organizations survive economic crisis corresponds with their ability to measure and demonstrate impact.
CLOs face a measurement dilemma: they must demonstrate the business impact of training to secure resources, yet they cannot show the connection between training and performance without the resources to measure it. While the economic environment has made the task even more difficult, some progress has been made.
Every other month, IDC surveys Chief Learning Officer magazine’s Business Intelligence Board (BIB) on an array of topics to gauge the issues, opportunities and attitudes that make up the role of a senior learning executive. For this article, more than 800 CLOs shared their thoughts on the topic of learning measurement.
Training as a Response to Crisis
When this survey was conducted in January 2010, the economic crisis was abating in the U.S. After 17 months of economic difficulty, almost 40 percent of CLOs felt their companies were performing better than they had expected, compared with about 25 percent who felt their companies were doing better than expected a year ago. This positive shift represents a welcome relief from the months of dreary economic and development news.
More important, however, is that 75 percent of CLOs felt they played a moderate or significant role in their enterprises’ response to the economic crisis. At least part of this role can be attributed to the ability of CLOs to align their programs to organizational need and to document the impact of their development programs.
Unfortunately, almost one-quarter of learning organizations felt they played only a minor role in the enterprise’s response to the economic crisis, so there is room for improvement.
Training professionals aren’t particularly satisfied with their organizations’ approaches to and successes with measurement, but there is little disagreement among learning professionals about the value of measurement. When done properly, measurement can demonstrate training’s impact on the company’s top and bottom lines and set the stage for further increases in relevance. As one CLO reports, “Our metrics [have] enabled us to grow and, unlike other areas of the company, increase head count in 2010. We have demonstrated our value to the company and have strong support from senior management. Our department reports directly to the senior executive of our company.”
Key metrics may include employee performance, speed to proficiency, customer satisfaction and improved sales numbers. As the BIB survey shows, the challenge lies in gaining access to these key metrics and finding the time and resources to conduct measurement.
In 2008, this survey reported that a majority of enterprises indicated a high level of dissatisfaction with the extent of training measurement that occurs within their organizations. Two years later, as seen in Figure 1, that feeling has moderated, and for the first time, more CLOs feel satisfied with their measurement programs than dissatisfied.
The relationship between the role a learning organization plays within the enterprise and its satisfaction with measurement is stark. About 60 percent of companies that were very dissatisfied with the state of measurement in their organization felt they played only a minor role in their company’s response to the economic crisis. On the other hand, about 60 percent of organizations that were very satisfied felt they played a significant role in their organization’s response. Learning organizations that can demonstrate their impact are better positioned to influence the enterprise when it matters most.
Compared with prior years, the common forces working against satisfaction remain a combination of capability and support. Specifically, CLOs believe their ability to deploy effective measurement systems is limited by:
- Level of resources.
- Level of leadership support.
- Availability of technology.
- Level of funding.
- Culture of indifference.
- No means to automate the process.
- Lack of understanding of how to implement measurement.
Leadership support is essential for developing an effective measurement program, but getting that support is challenging. The link between executive support and a learning organization’s effectiveness was clearly illustrated by one respondent who said his executive staff fails to align training with business needs. Others report that their measurement programs are not as far along as they hoped because their learning and development teams have only a weak understanding of measurement and evaluation.
For several years, learning and development professionals have complained that their biggest issue is gaining the support of management — that they face a certain level of apathy toward the quality of what they produce. The challenge is twofold: learning organizations must generate measures that matter, and executives must be willing to act on them.
With various forces working against the advancement of measurement, there has been little change in its use. About 80 percent of companies do some form of measurement — with a bit more than half of those using a manual process and a bit fewer than half using some form of automated system.
CLOs in this survey have been reporting a mix of processes for measurement as far back as 2004. There has been a slight decrease in the percentage of respondents who indicate that they use a manual process and an increase in the percentage of enterprises that use a combination of LMS and ERP systems to develop their learning metrics.
Organizations generally report that a technology-based learning platform gives them a greater ability to make correlations between training and performance and that the absence of technology can be a significant burden. One CLO stated, “Due to lack of automated resources, metrics and measurements are done per time available. As such, the data is often dated.”
The barriers to correlation remain consistent: lack of resources and an inability to bring data together from different functions. And sometimes, technology is not the best answer. As one CLO reported: “Actually, our training metrics [are] generated partially manually and partially automated from PeopleSoft. We use Questionmark to measure students’ performance and SurveyMonkey to assess students’ satisfaction with the course. These tools are not integrated.”
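To make the correlation step concrete, the sketch below shows the kind of analysis these respondents describe: joining training records exported from one system with performance figures from another and computing a simple Pearson correlation. All employee IDs, column values and numbers here are hypothetical illustrations, not data from the survey.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical training hours per employee, e.g. exported from an LMS ...
training = {"e01": 2, "e02": 8, "e03": 5, "e04": 12, "e05": 1}
# ... and a productivity score per employee from a separate business system.
productivity = {"e01": 61, "e02": 83, "e03": 70, "e04": 95, "e05": 55}

# Join the two data sets on employee ID -- the step CLOs report is hardest
# when the tools are not integrated.
ids = sorted(set(training) & set(productivity))
r = pearson([training[i] for i in ids], [productivity[i] for i in ids])
print(f"correlation between training hours and productivity: r = {r:.2f}")
```

In practice the join is the hard part: when the learning platform and the business systems are not integrated, assembling matched records per employee is a manual effort, which is exactly the burden the CLOs quoted above describe.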
There has been a meaningful change in the number of enterprises correlating training assessments to changes in the organization. As seen in Figure 2, 64 percent of respondents reported correlating training with employee productivity, up from 33 percent last year. Fifty-seven percent of respondents reported correlating training with overall business performance, up from 42 percent last year. Respondents correlating employee performance and overall customer satisfaction showed slight increases from 2009 as well.
Other research suggests that organizations that can consistently tie training to specific outcomes tend to train less. They focus their efforts on training the most appropriate people on the most appropriate topics, eliminate ineffective training and spend less on training overall.
Reason for Hope
Despite measurement challenges in the learning and development space, the survey showed a certain level of optimism. As seen in Figure 1, almost 40 percent of CLOs reported being either satisfied or very satisfied with the extent of measurement going on at their companies. This group typically has wider support for measurement within their organization and enough manpower to consistently tackle measurement initiatives.
Almost half of all organizations, both satisfied and dissatisfied, plan additional measurement initiatives in the next 24 months, far below the 90 percent that reported planned activities last year. On its face, this looks like a decline in interest in measurement initiatives, but 24 months ago, CLOs reported their intentions to increase the measurement of employee productivity, and in 2010, the ability to correlate employee productivity with training increased dramatically. What we measure, we improve.
Room for Improvement
For reasons already cited, measurement will remain a significant challenge for learning executives going forward, but it continues to be on the agenda. CLOs can take several steps to demonstrate relevance. Three of the most significant practices are:
- Define success early. By defining with stakeholders what success will look like upfront, learning professionals are more easily able to identify and benchmark key metrics for measurement before training is delivered, making post-training results easier to quantify.
- Establish metrics at the project or business-unit level. While it may be tempting to go all out in attempting to demonstrate training’s value at the enterprise level, successful measurement programs typically start as smaller initiatives focused on a project or business unit. When working with smaller groups, there are also typically fewer obstacles to the measurement process.
- Set expectations upfront with stakeholders. Help stakeholders who are interested in measurement understand the commitment required to see assessment projects through to the end. Doing so reduces resistance during the measurement phase.
Companies that incorporate these guidelines into their assessment methodology should see a marked improvement in the success and relevance of their measurement initiatives.