The advent of COVID-19 has changed the landscape of learning and is presenting challenges for all of us, especially in the digital learning world. I was no exception, and I wanted to share a recent experience where I was able to use xAPI to solve a problem that threw some unique challenges at me.
Like in a lot of organisations, learning here is delivered in many forms and the learning culture is deeply embedded. So when it comes to change, or in this case a major disruption like COVID-19, things can understandably be thrown into a little bit of chaos.
The challenge presented to me was to capture what a student answered in an online assessment, as well as the result. The assessment was part of a course that had always been delivered face-to-face. The brief was:
- Take the face-to-face assessment and redevelop it, with scenarios, for online delivery
- All answers given by the students must be captured, whether right or wrong
- All results must be captured
- The course must be delivered via the Corporate LMS
- The learner can have two attempts, with the second attempt based on their score (i.e. below 90%, redo all questions; 90% or above but below 100%, redo only the questions they got wrong)
- An assessor must be able to view the learner’s results easily
- An assessor must be able to export the results to Excel to support a competency conversation in the event the learner does not pass
Now, this may not sound like much of a challenge, but here is what I was up against:
- LMS only supports SCORM
- LMS is very dated
- LMS is unable to apply a second attempt based on score
There were quite a number of challenges to overcome here. Quite apart from the LMS being very dated, it only supports SCORM. As we know, SCORM is very limited in what it can report: it will tell us who did the course, when they did it, and whether they passed or failed. The LMS also doesn't allow additional attempts based on the learner's score. These constraints are quite restrictive.
Given the restrictions I had to work with, I needed to think well outside the box. I nicknamed the project "Making a Soufflé out of Mud".
The first part of the redesign was to develop the assessment course in Articulate Storyline 360. This is the standard tool for the organisation, and it gives an eLearning developer a little more control through JavaScript, which was critical to how I solved the other issues. Another product used in the organisation is a rapid eLearning development tool, but it did not provide anywhere near enough access to achieve the desired result. This, I might add, is the distinctive difference between a full eLearning development tool and a rapid one: one is for quick solutions (read: PowerPoint on steroids) while the other is for more serious work. It's like comparing AutoCAD and MS Paint.
I digress. Back to the problem at hand.
The solution to capturing what the learner responded with was xAPI. I had started to introduce xAPI on another project, and it was only just starting to get noticed. Understanding that SCORM does not inherently use xAPI, I used a JavaScript library that allowed me to manually add triggers to events in the Storyline file to send xAPI statements to a Learning Record Store (LRS). Whilst this was more work and potentially made the Storyline file a little more complex, it allowed me to control and capture the data we needed for the brief. The library I used was based on the article I wrote on capturing xAPI from a SCORM file.
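To give a sense of what that looks like, here is a minimal sketch of the kind of trigger code involved, using ADL's xAPIWrapper from a Storyline "Execute JavaScript" trigger. The endpoint, credentials, Storyline variable names and activity ID are all placeholders rather than the values from this project.

```javascript
// Hypothetical Storyline "Execute JavaScript" trigger fired when a question is submitted.
// Assumes xapiwrapper.min.js is already loaded on the page; all IDs and credentials are placeholders.
var player = GetPlayer();                              // Storyline's JavaScript API
var learnerName  = player.GetVar("LearnerName");       // illustrative Storyline variables
var learnerEmail = player.GetVar("LearnerEmail");
var response     = player.GetVar("Q01_Response");

ADL.XAPIWrapper.changeConfig({
  endpoint: "https://your-lrs.example.com/data/xAPI/",
  user: "LRS_KEY",
  password: "LRS_SECRET"
});

ADL.XAPIWrapper.sendStatement({
  actor: { name: learnerName, mbox: "mailto:" + learnerEmail },
  verb: { id: "http://adlnet.gov/expapi/verbs/answered", display: { "en-US": "answered" } },
  object: {
    id: "https://example.com/courses/assessment/question-01",
    definition: { name: { "en-US": "Question 1" } }
  },
  result: { response: String(response) }               // capture the answer, right or wrong
}, function (resp) {
  console.log("LRS responded with status " + resp.status);
});
```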
I also used triggers to look at the score the learner obtained and show an appropriate layer based on that score (a simple sketch of the logic is below). Here I ran into an unexpected issue with Storyline. I added a button with the built-in feature to Reset the Results, which includes the ability to reset only the questions answered incorrectly. That was perfect! Not so fast. What I discovered was that if you add two buttons that use the built-in Reset the Results feature, changing the parameter on one button sets the SAME parameter on the other. The solution was to replicate the quiz in another scene with its own results slide. Because the brief was that the learner must complete the entire assessment again if their score is 89% or less, having another scene and results slide worked. If the learner scores 90% or more, they retry the initial assessment but only answer the questions they got wrong. Phew!
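For context, the score check itself is straightforward. The sketch below assumes a results slide named "Results" (so Storyline exposes the built-in Results.ScorePercent variable) and an illustrative custom variable, AttemptType, that slide triggers could read to show the right layer; the actual project drove this with Storyline triggers rather than this exact code.

```javascript
// A sketch of the branching logic only; the AttemptType variable name is illustrative.
var player = GetPlayer();
var score = player.GetVar("Results.ScorePercent");    // built-in variable from a results slide named "Results"

if (score >= 100) {
  player.SetVar("AttemptType", "passed");             // no retry needed
} else if (score >= 90) {
  player.SetVar("AttemptType", "retryIncorrect");     // redo only the questions answered incorrectly
} else {
  player.SetVar("AttemptType", "retryAll");           // jump to the duplicated scene and redo everything
}
```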
Now that I have the data, what's next?
The organisation uses Learning Locker as its Learning Record Store. It is a great LRS; however, the tool can be very complex for assessors trying to get the data they need. So I set about developing a custom JavaScript solution that would allow me to connect to the LRS, find a course and bring up a learner's results. The assessor can then export the results to Excel, refresh the table or print the results from the table.
Whilst I initially thought this would be a quick build, it turned out to be a little more complex than expected.
I used the JavaScript library called xapiWrapper. This is a great library for pulling back (as well as sending) xAPI data from an LRS.
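As a rough illustration of how the wrapper is used (the endpoint, credentials and verb IRI are placeholders):

```javascript
// A minimal sketch of querying an LRS with xapiWrapper; all values are placeholders.
ADL.XAPIWrapper.changeConfig({
  endpoint: "https://your-lrs.example.com/data/xAPI/",
  user: "LRS_KEY",
  password: "LRS_SECRET"
});

var search = ADL.XAPIWrapper.searchParams();
search["verb"] = "http://adlnet.gov/expapi/verbs/answered";   // filter by verb IRI

// With no callback the call is synchronous and returns one page of statements
var result = ADL.XAPIWrapper.getStatements(search);
console.log(result.statements.length + " statements returned in this page");
```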
Initially, as we had not set up a structure for the LRS, I looked at having one store and simply filtering the results by activity. With some understanding of database structuring, this sounded logical to me.
As logical as it sounded, the wrapper did not let me return just the activities I wanted to filter on. What I ended up doing was pulling ALL the data back, adding it to an array and then filtering it. This was not ideal, but at that point I couldn't think of any other way. Enter the next issue: an LRS will only return a maximum of 100 records at a time, so I had to check each result for a 'more' link and run that query as well. This was starting to get messy, not to mention slow.
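A sketch of that "pull everything, then filter in the browser" approach, following the 'more' link the LRS returns with each page (the activity ID is a placeholder):

```javascript
// Pull every page of statements, then filter client-side; workable, but slow, as noted above.
function getAllStatements() {
  var all = [];
  var result = ADL.XAPIWrapper.getStatements();              // first page (typically capped at 100)
  all = all.concat(result.statements);

  while (result.more) {                                      // follow the "more" link until exhausted
    result = ADL.XAPIWrapper.getStatements(null, result.more);
    all = all.concat(result.statements);
  }
  return all;
}

// Filter client-side down to the course (activity) we care about
var courseStatements = getAllStatements().filter(function (s) {
  return s.object && s.object.id === "https://example.com/courses/assessment";
});
```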
After doing a little more research, I stumbled across a Learning Locker post from a couple of years ago that highlighted the issues with LRS querying and speed. The solution was to use MongoDB aggregation. I'd never used MongoDB before, but with a bit of trial and error I was able to get A LOT more control over the data, and the speed was phenomenal.
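For illustration, here is a hedged sketch of the kind of call involved, assuming Learning Locker's aggregation HTTP interface (GET /api/statements/aggregate) and a client with the appropriate scopes. The URL, credentials, activity ID and projected field names are placeholders, and the exact interface may differ between Learning Locker versions.

```javascript
// A sketch of running a MongoDB aggregation pipeline against Learning Locker's
// aggregation endpoint; all identifiers and credentials are placeholders.
var pipeline = [
  { "$match": { "statement.object.id": "https://example.com/courses/assessment" } },
  { "$project": {
      "_id": 0,
      "actor": "$statement.actor.name",
      "verb": "$statement.verb.display.en-US",
      "response": "$statement.result.response",
      "timestamp": "$statement.timestamp"
  } },
  { "$sort": { "timestamp": -1 } }
];

var url = "https://your-learning-locker.example.com/api/statements/aggregate" +
          "?pipeline=" + encodeURIComponent(JSON.stringify(pipeline));

fetch(url, {
  headers: { "Authorization": "Basic " + btoa("CLIENT_KEY:CLIENT_SECRET") }
})
  .then(function (res) { return res.json(); })
  .then(function (rows) {
    console.log(rows.length + " rows returned for the assessor table");
  });
```

Letting the database do the matching, projecting and sorting is what gave the big speed improvement over pulling every statement back and filtering in the browser.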
The end product (at this point in time) pulls back data for all the users and allows the assessor to export it to Excel. The data can be filtered by verb, and the raw xAPI statement can also be viewed by more advanced users.
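The export step itself can be as simple as building a CSV from the rows returned by the LRS and letting the browser download it; the sketch below is hypothetical and the column names are illustrative only.

```javascript
// Build a CSV from the assessor-table rows and trigger a download the browser
// can open in Excel; row field names are placeholders.
function exportToCsv(rows) {
  var header = ["Learner", "Verb", "Response", "Timestamp"];
  var lines = rows.map(function (r) {
    return [r.actor, r.verb, r.response, r.timestamp]
      .map(function (v) { return '"' + String(v == null ? "" : v).replace(/"/g, '""') + '"'; })
      .join(",");
  });
  var csv = header.join(",") + "\n" + lines.join("\n");

  var blob = new Blob([csv], { type: "text/csv;charset=utf-8;" });
  var link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "assessment-results.csv";
  link.click();
}
```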
This is a work in progress and will grow. One idea I picked up from Devlin Peck during the eLearning Success Summit was capturing comments within a course. Knowing our learner cohort, I knew that filling out an evaluation form at the end of the course was not going to happen, and if it did, the data was probably never going to be used. So I added a star-based rating layer at the end of each module where a learner can rate the module and provide any comments. This is all captured in the LRS using xAPI. I can then use this data to provide an average rating and capture feedback in the moment, not just when they finish.
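The rating itself can be sent as one more statement from the module. A minimal sketch, assuming the wrapper is already configured and that Storyline variables such as ModuleRating and ModuleComment exist (the names, verb choice and activity ID here are illustrative):

```javascript
// Hypothetical end-of-module trigger sending the star rating and comment to the LRS.
var player  = GetPlayer();
var rating  = player.GetVar("ModuleRating");     // e.g. 1-5 stars from the rating layer
var comment = player.GetVar("ModuleComment");

ADL.XAPIWrapper.sendStatement({
  actor: { name: player.GetVar("LearnerName"), mbox: "mailto:" + player.GetVar("LearnerEmail") },
  verb: { id: "http://id.tincanapi.com/verb/rated", display: { "en-US": "rated" } },
  object: {
    id: "https://example.com/courses/assessment/module-1",
    definition: { name: { "en-US": "Module 1" } }
  },
  result: {
    score: { raw: Number(rating), min: 1, max: 5 },   // the star rating
    response: String(comment)                          // the in-the-moment comment
  }
});
```

Averaging the score.raw values per module is then a simple query or aggregation on the LRS side.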
I've learnt a lot from this challenge. xAPI is very powerful and very flexible, but it needs to be presented in a meaningful way for those who are not technical. My future LRSs will be structured differently: where a large amount of data is to be captured, a separate store per course, or even per organisation, will be set up. I also learnt MongoDB and just how fast it is.
It also proved that people don't know what they want until you give it to them. I had been trying to introduce xAPI for 12 months, but it wasn't until something relevant was built that xAPI was accepted as a viable solution.
Below are a few screenshots of the application.
I can’t share the current iteration of the product, but please reach out to me if you’d like to know more.