
Tin Can API: Will It Help Your Learning Analytics Measure Up?

For years, the learning industry has discussed the importance of business impact. And although many organizations strive to align learning with business objectives, proving that kind of impact is still a struggle.

Regardless, demand for accountability isn’t going away. In fact, now more than ever, senior managers are challenging L&D to map metrics to strategic priorities and demonstrate continuous performance improvement. Yet, more often than not, LMS reports just don’t measure up.

Learning Analytics Disconnect: A Case In Point

For example, a professional friend of mine is responsible for training the manufacturing workforce at a pharmaceutical company, where learning is deeply tied to compliance with FDA regulations. The other day, he told me about the most frustrating part of his job — quality review meetings.

You see, at these sessions, the organization’s various functional managers share relevant data, so they can identify and resolve broader quality issues together. It’s an opportunity for each manager to bring insights to the table, and contribute to the larger mission of improving overall operational effectiveness.

However, my friend’s information often falls flat. Typically, he’s limited to reporting things like “95% of the workforce completed their required training last quarter.” That is all. No data describing how course completions relate to business outcomes. No predictive analysis indicating what the future holds for either the 95% who completed a course or the 5% who didn’t.

Not much actionable information in that statement. Certainly no insights to shift perceptions of L&D as a “necessary evil” cost center. And without the data he needs to add value, my learning director friend feels he has no influence during these managerial problem-solving sessions.

Blame It On SCORM/AICC?


Of course, my friend isn’t alone in his frustration. From an analytics perspective, organizational learning has been stuck in neutral for far too long. Most reporting still revolves around registration, attendance and completion statistics, and assessment scores. Anyone who digs deeper is usually sidelined by learning technology infrastructure based on SCORM/AICC.

These standards aren’t inherently bad. For years, SCORM and AICC have provided a common language for tools that track and report training activity. However, their capabilities are far too rigid and limiting for modern learning analytics. We’ve entered an era of nonstop interest in big data, business intelligence, machine learning and data visualization. This should be L&D’s strong suit. But when we’re asked to join the metrics conversation, our dependence on outmoded standards forces us to remain silent. Until now.

The Rise of Tin Can API

There’s a new standard gaining buzz on the learning metrics front: the Experience API (also known as xAPI or Tin Can API). Although it’s still very early in the adoption cycle, Tin Can API is more than buzz. This is a real breakthrough for anyone who wants to draw more meaningful conclusions from learning activity data. The improved flexibility of xAPI promises several key advantages (sketched in code after the list below), including:

  • Better visibility into formal learning activity
  • Tracking of informal learning resources
  • Tracking of related work activity
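Under the hood, xAPI captures each of these events as a small JSON “statement” of the form actor / verb / object, POSTed over HTTP to a learning record store (LRS). Here is a minimal sketch in Python; the LRS endpoint, credentials and activity IDs are placeholders, not real services, while the verb IRI and version header do come from the xAPI specification:

    import requests

    # Placeholder LRS endpoint and credentials, for illustration only.
    LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"
    LRS_AUTH = ("lrs_user", "lrs_password")

    # A minimal xAPI statement: who (actor) did what (verb) to what (object).
    statement = {
        "actor": {"mbox": "mailto:jane.smith@example.com", "name": "Jane Smith"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-US": "completed"},
        },
        "object": {
            "id": "https://lms.example.com/courses/weight-measure-sop",
            "definition": {"name": {"en-US": "Weight Measure SOP Course"}},
        },
    }

    response = requests.post(
        LRS_ENDPOINT,
        json=statement,
        headers={"X-Experience-API-Version": "1.0.3"},  # required by the spec
        auth=LRS_AUTH,
    )
    response.raise_for_status()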

Combining these three dimensions in a unified analysis creates a much clearer view of how learning contributes to business performance. That clearer picture prepares learning professionals for more meaningful, relevant and influential conversations with stakeholders, business leaders and others across their organizations.

Tin Can API in Action: Revisiting the Pharma Case

What could this mean for my L&D director friend? First, using content that conforms to the xAPI specification, he can track not only course completions and test scores, but also many other metrics that reveal, at a much finer level of detail, how employees actually use compliance training materials. For example (see the sketch after this list):

  • Pages read vs. pages skipped
  • Time spent on each page or section
  • Correct/incorrect responses to specific assessment questions
  • Number of attempts for each question
  • Demonstration video usage
  • Correct/incorrect moves inside interactive exercises and simulations
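Each of those finer-grained events becomes its own statement. As a rough illustration, a single incorrect answer to an assessment question might look like the dictionary below, POSTed to the LRS exactly like the completion statement sketched earlier; note that the attempt-counter extension IRI is invented for this example rather than defined by the specification:

    # One incorrect attempt at a specific assessment question.
    answered = {
        "actor": {"mbox": "mailto:jane.smith@example.com"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/answered",
            "display": {"en-US": "answered"},
        },
        "object": {
            "id": "https://lms.example.com/courses/weight-measure-sop/q7",
            "definition": {"type": "http://adlnet.gov/expapi/activities/cmi.interaction"},
        },
        "result": {
            "success": False,      # this response was incorrect
            "response": "12mg",    # what the learner actually entered
            "extensions": {
                # Hypothetical extension IRI for the attempt count.
                "https://example.com/xapi/extensions/attempt-number": 2
            },
        },
    }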

With deeper visibility into how participants interact with content, he can determine how to improve learning effectiveness. In addition, he can track usage of digital reference content, such as performance support materials and standard operating procedures (SOPs). For example, he can correlate SOP utilization with highly detailed learning records to analyze what’s happening inside specific training related to frequently accessed SOPs.
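Because an LRS also exposes a query interface, pulling those SOP records back out for correlation is straightforward. A rough sketch, again with a placeholder endpoint, credentials and activity ID:

    import requests
    from collections import Counter

    # Fetch statements about one specific SOP; "activity" and "verb" are
    # standard xAPI query filters.
    resp = requests.get(
        "https://lrs.example.com/xapi/statements",
        params={
            "activity": "https://docs.example.com/sops/weight-measure",
            "verb": "http://adlnet.gov/expapi/verbs/experienced",
        },
        headers={"X-Experience-API-Version": "1.0.3"},
        auth=("lrs_user", "lrs_password"),
    )
    resp.raise_for_status()
    sop_views = resp.json()["statements"]

    # Tally SOP look-ups per learner; these counts can then be joined with
    # course-completion and assessment statements for the related training.
    views_per_learner = Counter(s["actor"]["mbox"] for s in sop_views)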


Finally, he can work with IT to add Tin Can “hooks” to the appropriate quality management systems on the factory floor. These hooks are short snippets of code that use xAPI to send performance data to a learning record store (LRS), the same place where formal and informal learning data is already tracked.

This means that when “Jane Smith” weighs a chemical outside the company’s 10 mg guideline (a quality issue), for example, the LRS will capture that deviation alongside all of her other learning-related data.
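A hook of this kind could be as simple as the function below, called by the quality system whenever a weighing falls outside the guideline. The custom verb and extension IRIs are hypothetical; a real deployment would standardize them with whoever administers the LRS:

    import requests

    LRS_ENDPOINT = "https://lrs.example.com/xapi/statements"  # placeholder

    def report_weight_deviation(operator_email, measured_mg, guideline_mg=10.0):
        """Send a quality deviation to the same LRS that holds learning data."""
        statement = {
            "actor": {"mbox": f"mailto:{operator_email}"},
            "verb": {
                # Hypothetical custom verb IRI for a quality deviation.
                "id": "https://example.com/xapi/verbs/deviated-from",
                "display": {"en-US": "deviated from"},
            },
            "object": {
                "id": "https://docs.example.com/sops/weight-measure",
                "definition": {"name": {"en-US": "Weight Measure SOP"}},
            },
            "result": {
                "extensions": {
                    "https://example.com/xapi/extensions/measured-mg": measured_mg,
                    "https://example.com/xapi/extensions/guideline-mg": guideline_mg,
                }
            },
        }
        requests.post(
            LRS_ENDPOINT,
            json=statement,
            headers={"X-Experience-API-Version": "1.0.3"},
            auth=("lrs_user", "lrs_password"),  # placeholder credentials
        ).raise_for_status()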

The Result?

Let’s envision what could happen at the next quality review meeting. My friend will be prepared to say things like this:

“Data shows that 20% of our folks require more than three tries to complete the weight measure simulation exercise in the weight measure SOP course. Although those participants referenced the associated SOP at least once 90% of the time, they were still far more likely to deviate from guidelines than those who completed it in three tries or fewer. To improve these outcomes, we’re now revising the SOP and creating a remedial weight measure exercise for the team.”
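To make the mechanics concrete, here is a toy sketch of how a claim like that could be computed once attempt counts and deviation records live in the same LRS. The sample data below is invented for illustration:

    # Invented sample data; in practice both structures would be built from
    # LRS queries like the ones sketched earlier.
    sim_attempts = {"alice": 2, "bob": 5, "carol": 1, "dave": 4, "erin": 3}
    deviated = {"bob", "dave"}  # learners with a recorded weighing deviation

    struggled = {who for who, n in sim_attempts.items() if n > 3}
    quick = set(sim_attempts) - struggled

    def deviation_rate(group):
        """Share of a group that has at least one recorded deviation."""
        return len(group & deviated) / len(group) if group else 0.0

    print(f"{len(struggled) / len(sim_attempts):.0%} needed more than three tries")
    print(f"Deviation rate, more than three tries: {deviation_rate(struggled):.0%}")
    print(f"Deviation rate, three tries or fewer:  {deviation_rate(quick):.0%}")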

Analytics That Improve Business Perceptions and Realities

As this example suggests, Tin Can API has the potential to transform learning analytics into a precise and powerful business tool. In the process, learning leaders can begin leveraging insights for more effective, authoritative conversations with senior managers, as well as peers in HR, IT, R&D, Operations, Sales, Marketing and Customer Support. In a world increasingly driven by data, Tin Can API can give learning a much-needed voice at the center of every business.

Your Turn

Has your organization started evaluating or testing analytics based on the xAPI (Tin Can) specification? What are your impressions? Please share your story in the comments below.

http://www.expertus.com/

Sponsored content