Learning data: Who gives a percentile?

  • DeakinCo.
  • 14 December 2015

by Cameron Hodkinson, Digital Learning Strategist, DeakinPrime

From electrified books in the 1900s and adaptive ‘push button education’ in the 1950s to the robotic tutors of the 1980s, the predicted (and sometimes real) evolution of technology-assisted learning has been a rich and fruitful journey. However, understanding the impact that these advances have on learner engagement, personal growth and workplace proficiency has often been a complex and rather unscientific undertaking.

In many cases this is the result of inconsistent and often unreliable data, inaccessible storage mechanisms, hard-to-manipulate formats, or simplistic and nondescript syntaxes, making life for the average data scientist a veritable minefield of manual extraction, adjustment and normalisation. This has also made it challenging to align an organisation’s learning initiatives to tangible business impacts and outcomes, meaning that program success is often anecdotal and over- or underestimated.

The inadequacies of learning data have been highlighted even further by the continuing emphasis on models such as 70:20:10, where the vast majority of learning occurs away from traditional and more formal learning mechanisms and their data collection tools.

Fortunately, the emergence of flexible data specifications like the Experience API (xAPI) is not only helping to minimise these operational challenges, but is also broadening our learning data’s vocabulary and supporting a much deeper understanding of the connections between learning, activity, content and experience.

In the case of xAPI, this flexibility is achieved through a simple yet expressive data structure that includes an Actor (the person who is performing the action), a Verb (the action being performed) and an Object (whatever the action is being done to), which in human terms reads something like ‘Cameron (Actor) reads (Verb) an article, “Learning data: who gives a percentile?” (Object)’.
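As a rough sketch, that example statement might be serialised in the JSON form xAPI uses. The email address and IRIs below are illustrative placeholders, not real identifiers or endpoints:

```python
import json

# A minimal xAPI-style statement for the example in the text:
# 'Cameron reads an article, "Learning data: who gives a percentile?"'
# All identifiers below are hypothetical placeholders.
statement = {
    "actor": {
        "name": "Cameron",
        "mbox": "mailto:cameron@example.com",  # assumed actor identifier
    },
    "verb": {
        "id": "http://example.com/verbs/read",  # placeholder verb IRI
        "display": {"en-US": "read"},
    },
    "object": {
        "id": "http://example.com/articles/learning-data-percentile",
        "definition": {
            "name": {"en-US": "Learning data: who gives a percentile?"}
        },
    },
}

print(json.dumps(statement, indent=2))
```

Because every statement shares this Actor–Verb–Object shape, statements from very different tools and contexts can sit side by side in the same store and be queried together.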

In the context of a workplace, this greatly extends our ability to describe and capture learning and performance activity, providing opportunities to examine the ‘20’ and ‘70’ of learning and their correlation to more formal (‘10’) activity with data statements such as:

  • Cameron watched a video, Coaching 101
  • Cameron attended a workshop, Coaching at DeakinCo.
  • Cameron discussed coaching with Arun
  • Cameron coached an Actor, John, using the DeakinCo. coaching checklist
  • Arun observed an activity, Cameron coaching John, with a rating of 4 and commented, ‘Nice work Cameron, John benefited greatly from this session’.
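To make the structure concrete, the statements above could be encoded as simple actor/verb/object records with optional context, and then filtered or grouped. This is an illustrative sketch, not the xAPI wire format:

```python
# Hypothetical encoding of the statements above as
# (actor, verb, object, context) tuples.
statements = [
    ("Cameron", "watched", "Coaching 101", {}),
    ("Cameron", "attended", "Coaching at DeakinCo.", {}),
    ("Cameron", "discussed", "coaching", {"with": "Arun"}),
    ("Cameron", "coached", "John",
     {"instrument": "DeakinCo. coaching checklist"}),
    ("Arun", "observed", "Cameron coaching John",
     {"rating": 4,
      "comment": "Nice work Cameron, John benefited greatly "
                 "from this session"}),
]

# Pull out everything Cameron did, in order of occurrence.
camerons_activity = [(verb, obj)
                     for actor, verb, obj, _ in statements
                     if actor == "Cameron"]
print(camerons_activity)
```

Even this toy query shows how formal (‘attended a workshop’) and informal (‘discussed coaching’) activity end up in one stream that can be sliced by actor, verb or object.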

Individually these data points are not necessarily all that meaningful; however, when viewed together, particularly over time and as a part of the organisation’s broader activity, success patterns and organisational priorities emerge.

Of course, understanding the organisation’s idea of success and formulating questions like ‘What has the most impact on the quality of coaching?’ or ‘Does coaching impact the productivity of those being coached?’ can help to design appropriate data structures and is integral to deciding what data might be relevant and worthwhile capturing.
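A question like ‘What has the most impact on the quality of coaching?’ might start with something as simple as aggregating observation ratings per coach over time. The records below are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical observation records: (coach, rating out of 5),
# as might be extracted from 'observed' statements over time.
observations = [
    ("Cameron", 4),
    ("Cameron", 5),
    ("Arun", 3),
]

# Group ratings by coach, then compute the average per coach.
ratings_by_coach = defaultdict(list)
for coach, rating in observations:
    ratings_by_coach[coach].append(rating)

average_rating = {coach: mean(ratings)
                  for coach, ratings in ratings_by_coach.items()}
print(average_rating)
```

Deciding up front which fields to capture (ratings, instruments, observers) is exactly the kind of design choice the paragraph above describes: the questions shape the data, not the other way around.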