Original report by: Drachsler and Kalz (2020)
In 2020, the Office of the Director of Training commissioned a report to develop high-level advice on consolidating the Centre’s learning analytics system in light of the Centre’s overall digital transformation. The report set out two major visions and five challenges, together with a short-, mid- and long-term roadmap to give the Office of the Director of Training the ability to systematically analyse and interpret learning data. The authors, Drachsler and Kalz, used the learning analytics sophistication model to evaluate the current state of LA at the Centre and assessed it as being situated between Step 2 and Step 3.
The aim of this report is to present a roadmap for reaching Step 3 of the learning analytics sophistication model. Building on Drachsler and Kalz’s 2020 report, the current learning analytics status of the ITCILO is analysed in terms of dashboard design and its implementation in organizational processes.
The roadmap and the technical and organizational steps recommended in this report are meant as stepping stones for a long-lasting improvement process: the continuous re-invention of learning analytics at the Centre, continuous improvement in moving to and beyond Step 3, and progress towards becoming a pioneer in digital transformation as a training organization.
The Centre is situated between Step 2 and Step 3 of the learning analytics sophistication model
Learning Analytics Dashboards should
Data sources should include not only activity data but also the content of completed assignments
Several groups of indicators, relating to learners, actions, content, results, context and social factors, require different kinds of data sources.
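To illustrate how these indicator groups translate into concrete inputs, the sketch below pairs each group with plausible data sources. The source names are assumptions chosen for illustration, not an inventory of the Centre’s actual systems.

    # Illustrative mapping of indicator groups to candidate data sources.
    # All source names here are assumptions, not a survey of ITCILO systems.
    INDICATOR_SOURCES = {
        "learners": ["enrolment records", "user profiles"],
        "actions": ["Moodle activity logs", "clickstream events"],
        "content": ["submitted assignments", "course materials"],
        "results": ["quiz grades", "completion states", "credentials"],
        "context": ["course metadata", "delivery mode"],
        "social": ["forum posts", "peer feedback"],
    }

    for group, sources in INDICATOR_SOURCES.items():
        print(f"{group:>8}: {', '.join(sources)}")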
Objectives
Shift the Centre’s paradigm towards a data-driven course-provider platform, focusing more on lifelong learning and continuous professional development
Infrastructure
Data and indicators
The PDCA Cycle
The cycle should be used to support future evidence-centered learning design by making the data and insights gathered an integral part of designing new courses and overhauling existing ones.
User-centricity
A clean Learning Analytics Dashboard needs to take the user experience into consideration; the user experience benefits from clear communication and from reducing visual messages to the minimum needed to get the necessary points across.
The following diagram illustrates the technical infrastructure for handling LA data. It represents not only the current state of affairs but also the modifications and clean-up planned for the near future.
IBI, while not focused exclusively on learning, covers the macro perspective. On eCampus, there is a micro-level activity manager dashboard, which gives access to course and activity completion states and grades by user.
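By way of illustration, the completion states and grades behind such a dashboard can also be retrieved programmatically through Moodle’s standard REST web services. The sketch below is a minimal example; the host, token and course/user ids are placeholders, and it assumes a web-services token has been issued on the eCampus.

    import requests

    ECAMPUS = "https://ecampus.example.org"  # placeholder host
    TOKEN = "..."                            # placeholder web-services token

    def call(wsfunction, **params):
        """Call a Moodle web-services function via the REST endpoint."""
        params.update(wstoken=TOKEN, wsfunction=wsfunction,
                      moodlewsrestformat="json")
        r = requests.get(f"{ECAMPUS}/webservice/rest/server.php",
                         params=params, timeout=30)
        r.raise_for_status()
        return r.json()

    # Activity completion states for one user in one course
    completion = call("core_completion_get_activities_completion_status",
                      courseid=42, userid=7)

    # Grade items for the same user and course
    grades = call("gradereport_user_get_grade_items", courseid=42, userid=7)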
It will soon be integrated into the eCampus as a Moodle module. In this way, data will no longer need to be extracted manually via Excel sheets but can be collected by Kettle, like all other quiz results.
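To make the extraction step concrete: a Kettle transformation of this kind typically begins with a table-input step running a query such as the one sketched below against the Moodle database. The connection details are placeholders, and Moodle’s default mdl_ table prefix is assumed.

    import pymysql

    # Placeholder connection details; mdl_ is Moodle's default table prefix.
    conn = pymysql.connect(host="db.example.org", user="report",
                           password="...", database="moodle")

    QUERY = """
    SELECT u.username, q.name AS quiz, g.grade, g.timemodified
    FROM   mdl_quiz_grades g
    JOIN   mdl_quiz q ON q.id = g.quiz
    JOIN   mdl_user u ON u.id = g.userid
    WHERE  q.course = %s
    """

    with conn.cursor() as cur:
        cur.execute(QUERY, (42,))  # 42 is an example course id
        for username, quiz, grade, timemodified in cur.fetchall():
            print(username, quiz, grade, timemodified)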
The system supports many aspects of credential handling, such as sharing credentials on social media, automating credential creation and linking credentials to the correct person.
This proposal for building future LA dashboards integrates existing solutions as far as possible, to avoid duplication and unnecessary effort. In general, the current set-up is capable of reaching Step 3 of the LA sophistication model.
This process will be based on the dimensions of learning analytics proposed by Drachsler and Kalz in the previous report.
Because of these different needs, different dashboards need to be created. A one-size-fits-all solution would not only violate several of the objectives but would also go against the principles of user-experience design.
When considering different stakeholder groups, it is important to note that some are interested in a well-defined information space, while others play a transversal role.
The most common activities include consuming training, delivering training, evaluating current or past training activities, reporting on past activities, and innovating the training process itself. The resulting matrix of stakeholders and activities is essential for understanding which indicators are useful for which roles in the organization.
Several dashboards are required. At first glance, an obvious choice would be four dashboards: one each for the micro, meso and macro levels, with the micro level split into an internal micro dashboard for activity managers and an external micro dashboard for end users.
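To pin the structure down, the sketch below encodes this four-dashboard split as a simple data structure. The audience and scope descriptions paraphrase the discussion above; where they go beyond it (for example, the meso and macro audiences), they are assumptions, not a finalized specification.

    from dataclasses import dataclass

    @dataclass
    class Dashboard:
        level: str     # micro, meso or macro
        audience: str
        scope: str

    DASHBOARDS = [
        Dashboard("micro (internal)", "activity managers",
                  "per-course activity, completion states and grades"),
        Dashboard("micro (external)", "end users (learners)",
                  "own progress and results"),
        Dashboard("meso", "programme teams (assumed)",
                  "aggregated view across related courses"),
        Dashboard("macro", "management and reporting (assumed)",
                  "Centre-wide KPIs and long-term analysis"),
    ]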
At the ITCILO, several people work on improving training quality, so it is worth designing the dashboards with this situation in mind.
When reporting numbers for the biennial report, the questions asked and the data presented should differ from those used when analysing the situation for the longer-term benefit of the educational process. The dashboards, especially at the macro level, should therefore allow a differentiation between biennial KPI reporting and the deep, fundamental analysis of educational processes that is not suited to regular reporting but is desperately needed for learning innovation.
There are four major objectives for closing the gap between the status quo and Step 3 of the learning analytics sophistication model.
Currently, data is collected only before and after the complete learner journey. This is largely due to the shift from comparatively short face-to-face courses to distance learning on a very large scale. Given the intended further expansion of distance learning, collecting relevant data early and often needs to be a major focus. Ideally, there should be several arrows in each row of the diagram, representing the continuous collection of real-time learning data.
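One common way to collect real-time learning data along the journey is to emit xAPI statements to a learning record store (LRS) as events happen. The sketch below shows a minimal "completed" statement; the LRS endpoint, credentials and object id are placeholders, and the use of xAPI here is a suggestion rather than part of the current set-up.

    import requests

    LRS = "https://lrs.example.org/xapi"  # placeholder LRS endpoint
    AUTH = ("lrs-user", "...")            # placeholder credentials

    # A minimal xAPI statement: actor, verb and object.
    statement = {
        "actor": {"objectType": "Agent",
                  "mbox": "mailto:learner@example.org",
                  "name": "Example Learner"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                 "display": {"en-US": "completed"}},
        "object": {"id": "https://ecampus.example.org/mod/quiz/view.php?id=42",
                   "definition": {"name": {"en-US": "Module 1 quiz"}}},
    }

    resp = requests.post(f"{LRS}/statements", json=statement, auth=AUTH,
                         headers={"X-Experience-API-Version": "1.0.3"},
                         timeout=30)
    resp.raise_for_status()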
A good dashboard is not an Excel report, but rather condensed information that someone has already pulled out for you, enabling you to draw conclusions on how to act on it.
Pioneering a project is significantly easier if there is an appealing narrative to accompany it.
– Tesla
The indicators and dashboard structure were deduced from first principles; from these, the data that needs to be shown was derived; and finally, UX design principles were applied to make the dashboards as digestible as possible.
The way in which conclusions are drawn needs to be continuously monitored and improved upon. Dashboards need to be continuously adapted to the changing questions posed by users, and to new questions that are relevant to the current challenges faced by the different stakeholders.
This link shows several ideas for indicators which could be shown on existing dashboards.
Giving information space to breathe results in better buy-in from users
The dashboards will only ever be as good as the questions asked while developing and using them