Here’s a question I’ve been battling for some time … how do you measure the ‘success’ of a MOOC? The problem is that I haven’t been able to define what ‘success’ is supposed to be, so trying to measure it seems, well, a pointless exercise.
So, here are a few thoughts I’ve had based on my experiences as a learner on MOOCs (yes, plural), and as part of a team that has now developed and delivered four FutureLearn MOOCs (with a few more in the pipeline too!).
- Do you look for the headline figure of the number of registered learners, or the number of registered learners who became learners (visited the course)?
- Do you look at the number of learners who did something, who engaged with the course in some way … either as a number (e.g. 4,000) or as a percentage of the learners who visited the course (e.g. 40%)?
- If you plan your MOOC to link to a paid-for course (degree, training, etc.), do you measure success by the number of MOOC learners who enquire about, or sign up to, the linked course?
- Do you look to the quiz or test responses, to see who’s retained and regurgitated the information based on a ‘score’?
- Is it the final number of learners who make it through the length of the course to the end?
- Is the number of comments a worthy measure of success? Do courses with more comments (either in volume or as a percentage of active learners) indicate greater success than those with fewer?
- Can you measure the success based on interactions on social media, through a defined hashtag? In which case do you measure the number of mentions on the hashtag or dig deeper and quantify the different sorts of engagements, ranging from “I’m on #such-and-such course” to enquiries or the detailed thought process involved in critical thinking along the lines of the MOOC subject?
- Is a successful course one that takes learners from the MOOC environment into a related course, be it a MOOC or other paid-for course? If so, are you capturing that data?
Here are my thoughts.
The number of learners on a course is not, in itself, important. Yes, it’s good to say you’ve had 10,000 or 50,000 sign up, but that gives no indication of success, especially when only a modest percentage actually do anything on the course (30%, 50%, etc.). Even then you could dig into these numbers and break out how many turned up and looked at the course: do you look at the raw numbers or the percentages, and who’s to say what percentage of ‘active’ or ‘social’ learners is the mark of success? Obviously a high percentage would be better, but that says nothing about the quality of the comments … are they just “I agree” or “yes”, or do they show deep learning and understanding of the subject matter?
Is a successful course one that retains a higher percentage of its learners throughout its duration compared to other courses, or compared to previous runs of the same course (if there have been any)? What about courses that have higher rates of engagement in comments or discussions?
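The counts-versus-percentages point above can be made concrete with a small sketch. All the figures here are made-up examples, not data from any real course or platform:

```python
def engagement_metrics(registered, visited, active, completed):
    """Turn raw headline counts into the percentage views discussed above."""
    return {
        "visited_pct": 100 * visited / registered,    # registrants who showed up
        "active_pct": 100 * active / visited,         # visitors who did something
        "completed_pct": 100 * completed / visited,   # visitors who reached the end
    }

# Hypothetical course: 10,000 sign-ups sounds impressive as a headline,
# but the percentages tell a different story.
metrics = engagement_metrics(registered=10_000, visited=4_000,
                             active=1_600, completed=600)
print(metrics)
```

The same run can look like a success or a failure depending on which of these numbers you choose to report, which is rather the point.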
At times all we have are the basic figures for learners and how they behave on the platform from simple statistics and analytics of log on time, time on site, pages viewed, etc. Is the problem that the course platforms are not geared up to adequately measure the kind of activity or ‘movement’ through the course materials to give us enough data with which we can produce valuable measurements?
As a learner on various MOOCs (EDCMOOC, OpenBadges, etc.) I have to say that my progression through a MOOC to completion is in no way indicative of my ‘success’ as a learner. In one or two cases (not the two I’ve mentioned here) I dipped in for the very small part of the course I wanted, learned what I needed, and left. That may have been only one week of the four or six weeks of the course. On some other MOOCs I stayed for the duration, yes, I stuck it out, but didn’t enjoy the experience and didn’t really learn anything: the figures for those course runs would indicate that I am indeed a success, but that’s not how I see my experience.
What would be good is if we could reach the learners and ask each of them a few basic and pertinent questions about the course. I know some course providers offer surveys and online polls before, during, and after a course, but such a small proportion of learners actually complete them (especially the post-course survey) that you could argue the results are inconclusive.
So, does this bring us on to the topic of learning analytics, and how much (meaningful) information they can present? Again, it’s not just about login times or page views (although these are important) but about how this data can be linked to other data (like missed deadlines, scores, time between logins, etc.) to present a learner’s ‘journey’. It’s this journey, built by linking the often isolated data sets together, that gives a more accurate picture of the learner and their individual needs or styles.
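The linking of isolated data sets could be sketched roughly like this. Everything here is hypothetical, the event records, learner name, and dates are invented for illustration, and real platforms would hold far richer data, but it shows the basic idea of merging separate logs into one chronological ‘journey’ per learner:

```python
from collections import defaultdict

# Three isolated (hypothetical) data sets, each a list of
# (learner, date, event) records as a platform might export them.
logins   = [("alice", "2014-03-01", "login"), ("alice", "2014-03-08", "login")]
scores   = [("alice", "2014-03-02", "quiz score: 80%")]
comments = [("alice", "2014-03-02", "comment on step 1.4")]

def learner_journey(*datasets):
    """Merge per-source event lists into one time-ordered journey per learner."""
    journeys = defaultdict(list)
    for dataset in datasets:
        for learner, date, event in dataset:
            journeys[learner].append((date, event))
    for events in journeys.values():
        events.sort()  # chronological order is what makes it a 'journey'
    return dict(journeys)

print(learner_journey(logins, scores, comments)["alice"])
```

Seen together and in order, the events say far more about how a learner moved through the course than any one of the source data sets does on its own.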
As ever I turned to my PLN this morning and asked the same question:
Here are some engaging answers that, again, raise more questions than they answer …
- Peter Evans: “MOOC as catalyst for developing capability/ staff dev in developing online courses?”
- Jennifer Reid: “diversity of participants?”
- Dilrukshi Gamage: “I found 10 factors .. mainly interaction , collaboration, pedagogy, network of opportunity.”
- Emma Betts: “Measuring MOOC success. Assessing participation against learner goals needs to be part of the answer. How, not sure.”
So … genuine question, how do you measure the success of a MOOC, and what is the ‘success’ you want to measure?