For Learning Designers
While all stakeholders can find their own benefits in utilizing xAPI, learning designers are especially well-positioned to implement designs backed by learning analytics.
As data becomes more important for justifying decisions such as purchasing new educational technologies, trying unique pedagogical approaches, and testing new learning theories, designers have an opportunity not only to offer faculty data to accompany pedagogy-based designs, but also to obtain real-time feedback from learners as they're interacting with content. In traditional models of design, the evaluative piece generally comes after learners have worked through content and typically relies on self-reported data from surveys and other qualitative measurements (Lockyer, Heathcote, & Dawson, 2013). By incorporating regular checkpoints throughout a course, evaluation can become a more continuous process as students progress through material. Learning analytics allows learner data to be captured and recorded immediately, making it much easier to intervene if necessary, or to get a head start on redesigning activities for future iterations of a course if time permits.
Depending on their role in a department, designers also have a unique opportunity to observe patterns on a larger scale. Those who oversee a portfolio of courses under the same umbrella are often able to see the "bigger picture" - how the courses fit together, the ways they build onto each other, and how each course achieves certain goals in a program. Incorporating activities that emit xAPI statements crafted with contextual fields allows data to be divided by course, section, semester, etc. Designers could use this data to compare activities across courses and semesters, and once a large enough pool of data has been recorded, predictive analytics could give designers an opportunity to identify pieces of a course or program that would benefit from redesign or further experimentation.
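To make the idea of contextual fields concrete, here is a minimal sketch in Python of what such a statement might look like and how stored statements could later be sliced by semester. The extension IRIs (`https://example.edu/xapi/...`), course codes, and activity IDs are hypothetical placeholders — xAPI only requires that extension keys be IRIs, so each institution defines its own.

```python
def make_statement(actor_email, verb_id, activity_id, course, section, semester):
    """Build a minimal xAPI statement whose context carries course metadata.

    The extension IRIs below are hypothetical examples; in practice an
    institution would publish and reuse its own stable IRIs.
    """
    return {
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": "completed"}},
        "object": {"id": activity_id, "objectType": "Activity"},
        "context": {
            "extensions": {
                "https://example.edu/xapi/course": course,
                "https://example.edu/xapi/section": section,
                "https://example.edu/xapi/semester": semester,
            }
        },
    }

# Once statements are recorded, the contextual fields make slicing trivial:
statements = [
    make_statement("a@example.edu", "http://adlnet.gov/expapi/verbs/completed",
                   "https://example.edu/activities/quiz-1",
                   "NURS101", "001", "Fall-2024"),
    make_statement("b@example.edu", "http://adlnet.gov/expapi/verbs/completed",
                   "https://example.edu/activities/quiz-1",
                   "NURS101", "002", "Spring-2025"),
]
semester_key = "https://example.edu/xapi/semester"
fall = [s for s in statements
        if s["context"]["extensions"][semester_key] == "Fall-2024"]
print(len(fall))  # → 1
```

The same filter applied to the course or section extension would support the cross-course and cross-semester comparisons described above; a real implementation would query the Learning Record Store rather than an in-memory list.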
Predictive analysis is becoming more popular in higher education as a way to identify areas of weakness. Take, for example, the story of a nursing program at Georgia State University. Stakeholders noticed that students' achievement in a specific math course was a reliable indicator of whether or not they would graduate from the nursing program (for more information about the Georgia State case, check out this story: https://nyti.ms/2jYRAHS). Knowing this, designers could work with faculty, program coordinators, and other support systems at the university to provide remediation or better prerequisite course requirements to adequately prepare students for success in that math course, and therefore increase students' chances of graduating.
Pairing learning analytics with carefully crafted learning outcomes gives designers a way to gather real-time data on observable student behaviors and measure that data against student grades and learning objectives. With this additional level of feedback readily available, designers and instructors can plan accordingly for future offerings and design experiences for new course development (Persico & Pozzi, 2015).