Moving from "butts in seats" to business-focused impact metrics requires coordinated planning, meaningful measurement and effective communication.
by Site Staff
January 24, 2010
Busy metrics are irrelevant and uninspired — at least to executives outside the training function. Hours spent in training, videos watched and “butts in seats” reports may still give training leaders some indication of efficiency, but executives today require a new framework focused on effectiveness.
Long gone are the days when rigid reports pulled from registrations and tracking technology would suffice. You want your good work to be validated, right? Well, old-school metrics from just one department — the training department — haven’t earned that validation in the past, and they aren’t going to earn it now.
These activity-focused measurements, called busy metrics, are often still valuable to training departments, individuals and compliance officers, but as learning executives align with the organization’s greater objectives, they must evolve toward relevant, agile business metrics. Given the changing learning landscape, metrics will continue to be problematic until learning practitioners embrace measurement systems that are informative and adaptable across the entire enterprise.
“Our industry is inundated with activity metrics, and we think that as long as we see how much we are doing, we must be doing something right,” said Doug Harward, CEO of Training Industry Inc. “The real advances in measurement theory will help training managers understand the difference between activity and achievement.”
The Crutch
A common complaint about learning management systems (LMSs) is their lack of reporting flexibility. In a survey of LMS customers for the Bersin & Associates study “Learning Management Systems 2009,” 45 percent of respondents cited reporting capabilities as the No. 1 challenge with their current LMS. Reporting was also the highest-ranked driver of dissatisfaction in the study.
According to David Mallon, senior analyst for Bersin & Associates, the best systems in this area are able to:

- Capture and report on all completion data. All LMSs do this, but some struggle to support imported content, such as SCORM e-learning, as well as they support native activities.
- Aggregate data by manager or other user grouping. Most LMSs do this.
- Compute completion percentages.
- Generate reports that show exceptions (e.g., who has not completed, who is more than 30 days late and so on) by group and manager.
“LMS providers cannot account for every customer’s reporting needs, so built-in stock reports will only be so useful,” said Mallon.
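To make that concrete, here is a minimal sketch, in Python with pandas, of the kind of custom report a training team might build once completion records are exported out of the LMS. The file name and column names are hypothetical; any real export will differ.

```python
import pandas as pd

# Hypothetical LMS export: one row per employee per course assignment.
records = pd.read_csv("lms_completions.csv",
                      parse_dates=["due_date", "completed_date"])

# Completion percentage by manager (or any other user grouping).
records["completed"] = records["completed_date"].notna()
completion_pct = records.groupby("manager")["completed"].mean().mul(100).round(1)

# Exception report: not completed and more than 30 days past due.
overdue = records[
    ~records["completed"]
    & ((pd.Timestamp.today() - records["due_date"]).dt.days > 30)
]

print(completion_pct)
print(overdue[["employee", "manager", "course", "due_date"]])
```

Nothing in this sketch is sophisticated, and that is the point: once the raw records leave the LMS, the reporting logic is no longer limited to whatever stock reports the vendor shipped.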
Traditional LMS metrics are generally made up of isolated data focused on training delivery or deployment. LMSs were created to address the needs and reporting metrics of training organizations. But because they are training-focused technologies, they are not integrated with performance data from elsewhere in the enterprise.
Sadly, the learning function has used this limitation as a crutch for years. We often hear, “This is where we put our data and pull reports; it’s too hard to get data from other parts of the organization.” This excuse is wearing thin with senior executives. We can and must collect data from across the enterprise to validate training impact and success. Combine cross-organizational data with relevant learning metrics and we just may have a recipe for success and executive validation.
Training Is an Island — But It Shouldn’t Be
Training is a profession without the rules, guidelines and consistency of other functions such as accounting, HR and engineering. Practitioners tend to do things differently and are often defensive because “it’s different here” — even if “here” is just another part of the same company.
I recently co-hosted a learning executive think tank and found that this conversation is still hot. Everyone, regardless of industry, felt passionately about one of two things, and most often both:
- We need to have different methods of reporting based on our industry or organization.
- We need to have some kind of similar reporting structures and guidelines, regardless of where we work.
Reconciling those two positions might require both generic training metrics that are meaningful to the enterprise regardless of industry and metrics specific to an industry or function.
Before getting started, it’s important to plan ahead, gather data appropriately and present it effectively to executives.
Step 1: Plan Differently
Learning leaders aren’t new to dialogue about business impact metrics. But defining and using these metrics often takes herculean effort, so reporting has long remained stagnant, which has been detrimental to the industry’s collective professional reputation.
It’s important to think about business impact before getting started. If we were to step back for a moment and consider what data matter in a certain industry or function, we might be surprised by how different our entire reporting structure would look.
“My experience is that training professionals are unclear as to what they should measure and think that using a technology will solve the problem,” Harward said. “Instead of measuring the activity of training, we need to better learn how to measure the achievement that occurs as a result of the training.”
Learning leaders generally plan to measure efficiency, which is, for the most part, all that business executives expected in the past. Maybe learning leaders like to measure efficiency because it’s easy to do and doesn’t require cooperation, collaboration or buy-in from other functions.
Gathering relevant data and creating impactful information requires a level of cooperation and teamwork rarely found between training and operational functions. Instead of hours trained, how about illustrating sales growth following the launch of a channel partner learning portal? Instead of the number of employees who took an e-learning course, try e-learning’s impact on the call center’s customer satisfaction reports by office location or shift.
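As a rough illustration of that second example, the sketch below (again in Python with pandas) joins e-learning uptake with call-center customer satisfaction by office location and shift. The input files, column names and clean matching keys are all assumptions.

```python
import pandas as pd

# Hypothetical inputs: e-learning completion records and call-center
# customer satisfaction scores, both keyed by office location and shift.
training = pd.read_csv("elearning_completions.csv")  # location, shift, completed (0/1)
csat = pd.read_csv("csat_scores.csv")                # location, shift, csat_score

# Training uptake and average satisfaction per location and shift.
uptake = training.groupby(["location", "shift"])["completed"].mean().rename("uptake_rate")
satisfaction = csat.groupby(["location", "shift"])["csat_score"].mean().rename("avg_csat")
report = pd.concat([uptake, satisfaction], axis=1).reset_index()

print(report)
print("Correlation:", round(report["uptake_rate"].corr(report["avg_csat"]), 2))
```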
Here are a few ways to evolve the measurement relationships between training and organizational success:
- Remind executives and functional leaders that training should drive top-line sales revenue up, drive customer satisfaction up or drive individual productivity costs down. We should do one or more of those things and demonstrate our effectiveness.
- Buy into the idea, and convince others, that individual training results are not as important as we have all been led to believe. They are important, but the enterprise will get more excited about large numbers, trend lines and comparisons of training impact across regions, customers, partners and operational performance.
- Collect data that matter to the enterprise and executives. And if you don’t know what really matters to them, ask for detailed examples.
Cushing Anderson, program vice president at IDC, recommends that learning executives “define success early.”
“By defining with stakeholders what success will look like upfront, learning professionals are more easily able to identify and benchmark key metrics for measurement before training is delivered and ensure post-training results are more easily quantified,” he said.
Step 2: Collect Relevant Data
Historically, enterprise metrics have not resided within the purview of the training function. Incorporating business metrics into training measurement can be difficult at first because it takes coordination between functions to gather and report meaningful metrics. The good news is that a coordinated effort results in more meaningful data and benefits everyone.
Part of the shift involves thinking of metrics gathering as a critical component of the learning function. Unfortunately, many learning leaders rely too much on LMS data or financial performance. Instead, consider collecting cross-functional data, and don’t be afraid to start small.
“Establish metrics at the project or business-unit level,” Anderson said. “While it may be tempting to ‘go for the fences’ and demonstrate training’s value at the enterprise level, successful measurement programs typically start off as smaller-sized initiatives that focus on projects or business units. When working with smaller groups, typically fewer obstacles interfere in the measurement process as well.”
Measurement should not be focused on individual success or failure. Instead, it should be about successful regions, the best support location or simple improvements where training has been deployed and used. It’s about large numbers and trend lines over time. Compare a sales region that makes training mandatory with one where it’s optional, for example.
Worry about the effects of training 100 people, or 60 percent of a regional sales force, and the impact on that office, not whether Bill and Mary met quotas. There are far too many variables at the individual level to document training impact. However, it is still necessary to track individual progress for compliance and utilize the raw data for bigger numbers and trend lines.
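Here is a minimal sketch of that region-level comparison, under the same caveat that the file and column names are hypothetical:

```python
import pandas as pd

# Hypothetical export: monthly revenue by sales region, with each region
# flagged by its training policy ("mandatory" or "optional").
sales = pd.read_csv("regional_sales.csv", parse_dates=["month"])

# Revenue trend by training policy; individual reps are aggregated away.
trend = (sales.groupby(["training_policy", "month"])["revenue"]
         .sum()
         .unstack("training_policy"))
print(trend)

# Average month-over-month revenue growth under each policy.
print(trend.pct_change().mean().round(3))
```

Note that individual reps never appear in the output; the comparison runs between large groups and trend lines, which is the level at which the business reads the data.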
The Unimportance of Technology
The specific tools used to gather data can be cataloged and scrutinized for effectiveness and cost, but they are probably not individually important. Each company uses tools that match its culture, vision or values; what matters is what the tools are collecting and how that data is being used.
“Tools and technology are the vehicles for how we collect data. Before we can solve any problem using technology, we need to understand the fundamental problem we are solving,” Harward said.
Aberdeen’s September 2009 “Study on Workforce Analytics” found that the organizations best able to integrate people data with other organizational data also achieved an average 11 percent increase in profitability over the past 12 months, as compared to a 2 percent decrease in all other organizations.
“Organizations that are able to integrate HR data with data from other parts of the organization, including financial and customer data, achieve impressive results,” said Mollie Lombardi, research analyst, human capital management at Aberdeen Group. “Simply measuring or capturing HCM data is not enough.”
“However, when organizations are able to couple this data with educated reporting and analysis, they are better equipped to turn it into actionable recommendations and decisions that drive better business performance.”
This is exactly why learning portals are increasingly popular. Web 2.0 learning portals have the ability to aggregate content and data from a variety of sources, breaking down the walls between training and the rest of the world.
According to Bersin’s Mallon, learning portals offer a few significant advantages. “First and foremost, they give people the specific business information they need within the context of their jobs, including formal and informal learning, collaboration and other business content,” he said.
“Portals offer users a single point of access to the LMS and many other applications, including performance and talent management, employee communities, performance support and general employee information,” Mallon added. “Perhaps a less publicized yet equally important advantage, portals also offer a platform for aggregated reporting regarding the intersection of these employees with these various data sources.”
Recent research by Expertus and Training Industry Inc. illustrates that organizations are finding value in training portals and investing in them. In fact, more than 93 percent of survey respondents had some kind of learning portal, and 59 percent planned to launch a new portal or upgrade an existing one.
Step 3: Present Data Relevant to Executives
Once you have an arsenal of relevant, cross-organizational data, don’t let it deteriorate. Use it when it’s fresh, and present it to individual executives in a meaningful way. Demonstrate training value through impact: correlations between training efforts and enterprise success, or the link between the absence of training and measurable consequences.
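For instance, one executive-ready impact figure is average monthly partner revenue before and after a learning portal launch, echoing the earlier channel partner example. The sketch below assumes a hypothetical sales export and an assumed launch date:

```python
import pandas as pd

# Hypothetical export of monthly channel partner revenue.
sales = pd.read_csv("partner_sales.csv", parse_dates=["month"])
launch = pd.Timestamp("2009-06-01")  # assumed portal launch date

before = sales.loc[sales["month"] < launch, "revenue"].mean()
after = sales.loc[sales["month"] >= launch, "revenue"].mean()

print(f"Average monthly revenue before launch: {before:,.0f}")
print(f"Average monthly revenue after launch:  {after:,.0f}")
print(f"Change: {(after - before) / before:+.1%}")
```

A before-and-after comparison like this shows association, not causation, so present it as the opening of a conversation with the business rather than as proof.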
“I think the most important thing about sharing data is to make sure you communicate the data to people who need it (instead of mass publication), in a format that they can use (analyzed) and in a time frame in which the data is relevant (fresh information),” Harward said.
Anderson said that encouraging learning executives to think about business-relevant data, rather than data relevant only to the training organization, is simple to describe but harder to implement.
“Start with corporate business objectives, from the annual report for public companies or similar documents for private, and evaluate how training helps achieve those objectives,” he said. “Some research suggests organizations that can consistently tie training to specific changes are more likely to train less. They focus training efforts on the most appropriate people and topics and get rid of the useless efforts.”
Organizations won’t train “less” when learning leaders demonstrate positive impact on the enterprise. Instead, they will train and report differently and evolve into a strategic implementation role within the enterprise. Those things happen when there is aligned collaboration between the C-suite and the training department.
Lastly, have a formal communication plan or campaign. You are selling, or at least marketing, an old product in a new way. You need to deliver on the “new and improved” promise. It is much more rewarding to be valued, respected and strategically engaged in the enterprise. Make the move from the classroom to the boardroom by speaking executives’ language and aggregating enterprise data with training data to generate metrics with a purpose, metrics that tell your story “their way.” That will result in validation.