Tag Archives: MOOC

MOOC success

Making a MOOC ‘successful’

Designing a ‘successful’ MOOC is one thing. Making a MOOC ‘successful’ is something completely different.

Much has been written by far better and more eloquent people than me (here and here and here and here and here) on what makes a successful MOOC – all about interactions, journeys, optimum length, appropriate materials, platform, etc. But what about making a MOOC successful? To me, there is a difference.

This isn’t about making / building / designing a MOOC; it’s about making / encouraging / promoting / informing people about the MOOC.

The argument about MOOC success, learner retention, completion numbers, registrations, etc. is one that will rage on and on: everyone has an opinion, everyone is looking at something different, and these are all very valid, very important questions. There isn’t a definitive answer – each MOOC is different, for a different audience, for a different demographic (maybe), and designed in a way that different learner ‘profiles’ can get something different out at the end. If, indeed, they reach the end, which of course they don’t have to.

No, making a successful MOOC requires more than the lead academic’s (or academics’) subject knowledge, learning technology, instructional/educational design, assessments, an appropriate learning goal/journey, a working platform, etc. You need all the other stuff as well.

The other stuff you need? Well, try:  Continue reading

MOOC quality

MOOCs – question on purpose, quality, student retention, feedback, etc.

Ahh, questions around the purpose, quality, value, etc. in and around MOOCs have started again, and justly so.

  • Disclaimer: Like many I have opinions, but not answers.

The recently raised questions, started by Fred Riley on the ALT mailing list, have produced a good set of resources for those of us who are starting to ask these questions and need a more comprehensive or value-added answer.

Fred’s original query was:


Does anyone on this list know of any recent research and/or articles on the teaching quality of MOOCs? I’m thinking of things such as:

  • student retention, with MOOC drop-out rates being notoriously high (I plead guilty to that myself :( )
  • student surveys and qualitative feedback
  • how many students in a MOOC platform (eg FutureLearn) go on to take further courses in that platform

I’m sure that there are many other indicators of quality – those are just off the top of my head. I’m not in the MOOC game myself as yet, other than as a punter, but I’m looking to get into the development side of things.


In some instances the data is difficult to come by, especially data on students/learners taking further courses (across MOOC platform providers as well as within them), but I hope we can get to a stage where this kind of data is available and open to interrogation (if only for the individual partner to query their own courses).

Here are some of the resources shared, in response to Fred’s original query:

If you have any further links or resources that would help Fred and the ALT mailing list, please reply to the thread on the mailing list. If you don’t have access then please leave the link or your comment below so everyone has the opportunity to read it.

Yes, OK. Fred’s question also raises the question of the ‘quality’ of a MOOC, and the validity of learner retention or ‘steps completed’ data as triggers for saying a MOOC is of a certain quality, or that the student was ‘successful’ on the course, but these are for another post. Fred answered this quite clearly on the ALT mailing list: for him, “retention is IMO an indicator of quality as perceived by the student – the better retention, the more students are engaged with the course and its materials. If they don’t like a course, they’ll drop out.”

NB: I’ve helped run several runs of the Warwick/FutureLearn ‘Shakespeare and his World’ MOOC and use it as an example of where the statistics provided for the 10-week course don’t necessarily match the actual experience. A case in point is the number of learners who ‘complete’ the course, in that they take all the tests and mark at least one step as complete in each of the 10 weeks. We know from the learners themselves, from their comments, feedback, tweets, etc., that they take what they want from the course – one learner may only like Shakespeare’s comedies, another only his tragedies, so they will omit the plays/weeks they don’t like. They should still be viewed as successful learners, and I’m sure they think that of themselves, as in their own mind (and in ours!) they got what they wanted from the course, yet did not actually ‘complete’ it.
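To illustrate the gap, here’s a rough, purely hypothetical sketch – the data structure, names, and thresholds are mine, not FutureLearn’s actual export format or completion rules – of how a strict ‘completion’ rule files a perfectly satisfied comedies-only learner under ‘did not complete’:

```python
# Hypothetical sketch only: invented learner records, and my own assumed
# definitions of 'completer' and 'selective learner' (not FutureLearn's rules).

COURSE_WEEKS = 10
NUM_TESTS = 10

learners = {
    # learner_id: which weeks they touched, and how many of the tests they took
    "A": {"weeks_active": set(range(1, 11)), "tests_taken": 10},  # classic 'completer'
    "B": {"weeks_active": {1, 2, 5, 8}, "tests_taken": 4},        # comedies only, and happy
}

def is_completer(record):
    """Strict platform-style rule: activity in every week and every test taken."""
    return (len(record["weeks_active"]) == COURSE_WEEKS
            and record["tests_taken"] == NUM_TESTS)

def is_selective_learner(record):
    """Looser rule: engaged with at least one week, skipped the rest by choice."""
    return len(record["weeks_active"]) > 0 and not is_completer(record)

completers = [lid for lid, rec in learners.items() if is_completer(rec)]
selective = [lid for lid, rec in learners.items() if is_selective_learner(rec)]
print(f"Completers: {completers}, selective but satisfied: {selective}")
```

Learner ‘B’ never shows up in the completion statistics, yet by their own account they got exactly what they came for.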

If there is one question for 2016 and MOOCs, it’s whether there is any way to really, truly, honestly understand the ‘value’ of a MOOC.

Image source: State Library Victoria (CC BY-NC 2.0)

gate

MOOCs and ‘facilitation’

What are your thoughts on this – moderation and/or facilitation of MOOCs?

Considering the time, effort, and cost of developing these free courses (more information is available here or here or here, among other sources), what are your thoughts on how we manage the course, the comments and discussion during the run, and the subsequent comments and discussion during re-runs?

Do you have support staff, from technical and/or academic backgrounds, monitoring the course to keep comments on track and answer pertinent questions? Are these paid positions or part of their existing role? Do you actively check the comments? If so, what for, why, and what do you do?

Do you design-in an element of real-time collaboration on the course (facilitation of discussion, round-up videos, Google Hangouts, etc.), and if so are these sustainable over multiple runs of the course? If you’ve done these before, but then designed them out of the course for re-runs, why?

All comments and feedback welcome – I’m trying to understand how we move MOOCs forward and maintain institutional ‘control’ where there is little (financial) reward.

Image source: Greg Johnston (CC BY-NC-ND 2.0)

Customise me

Don’t give it to me unless I can customise it

My first car was a 1993 Rover Mini Cooper 1.3i, in British Racing Green (obviously). I bought it second hand in ’97 from John Cooper Garages (JCG) in West Sussex, and the legendary John Cooper himself handed me the keys (and made my mum a cup of tea while I did the paperwork).

Like so many people who own a Mini, I didn’t leave it ‘standard’ for very long, reading through the Mini magazines for the kinds of things I could do to personalise the car. I went to Mini events, like the London-to-Brighton Mini Run and the 40th anniversary party at Silverstone, and looked over the show cars and private cars that were parked up, as well as the stands and auto-jumble traders. I bought the whole set of JCG brushed aluminium door furniture (window winders, door pulls, etc.) and chrome accessories (bling!), as well as doing more mechanical upgrades like vented discs and four-pot callipers for both front and rear brakes, and a full-length straight-through (manifold to rear ’box) DTM-style exhaust system (ooh, that was awesome!).

This was the start of my love affair with tinkering and messing with anything that’s standard to make it personal for what and how I like it.  Continue reading

Can MOOCs and Open Badges provide an alternative to the so-called ‘inflation of educational credentials’?

Reading: Open Badges and MOOCs #openbadges

Badges continue to interest me, and the development of open badges in online courses and commercial/corporate settings seems to be gaining momentum.

However, the bottom line is that conditions have changed (i.e. progressive mobility worldwide, as well as the increasing need for recognition of migrants’ qualifications). While some authors warn about the risky “inflation of educational credentials”, others go even further, claiming that “The university has already lost any claim to monopoly over the provision of higher education” (Duke, 1999). The initiatives described here are still in an embryonic stage but at the same time are promising in terms of new possibilities for more flexible tools and, as @daveowhite suggests, they provide new currencies that can redesign the economy of talent (find more in UNESCO UIL or the EU ESCO).

As I always say, badges will not be suitable for everyone, nor for every situation, course, or learning journey. But they do have a place in demonstrating the acquisition of skills, in a carefully implemented and designed environment, for a specific and defined purpose. Whether the display of the badge itself is part of the reason we strive to earn it, and therefore part of the value associated with the badge, is something for others to argue (but I am keenly interested in the outcome and the arguments).

Image source: Alan Levine (CC BY 2.0)

Digging Deeper into Learners’ Experiences in MOOCs

Reading: Digging Deeper into Learners’ Experiences in MOOCs

One aspect of working on MOOCs is that there is no clear way to measure their success. Do you use the stats and logs that indicate clicks and time-on-page, or look at the nature of the conversations and/or comments made?

That’s why this paper uploaded to Academia.edu by George Veletsianos piqued my interest – is there something in here that can help me understand the metrics we need to use in order to measure the learning and/or success of a MOOC?

“Digging Deeper into Learners’ Experiences in MOOCs: Participation in social networks outside of MOOCs, Notetaking, and contexts surrounding content consumption.”

Unsurprisingly, the authors highlight the lack of literature looking into MOOC metrics that are not captured on the MOOC platform itself (edX, Coursera, FutureLearn, etc.), notably social engagement, note-taking, and content consumption. Something I’d not considered before is that the “availability of large-scale data sets appears to have shaped the research questions that are being asked about MOOCs.”  Continue reading

What makes a good online course?

What makes a good online learning experience?

Is it possible to define the qualities of what makes a good online learning experience, or a good MOOC? Is there a checklist we could have pinned to the wall to use as we design and build our courses?

Here are a few items I think the list needs; feel free to add your own ideas in the comments field below:

Presentation: Is the student able to relate to the subject and the presenter / educator? This is not always easy, as the platform (Blackboard, Moodle, FutureLearn, Udacity, etc.) often controls how the materials are ‘presented’. Even with these constraints you do have options for designing your materials and laying them out in ways that make them easy to navigate and interact with.  Continue reading

How do you measure MOOCs?

How do you measure the ‘success’ of a MOOC?

Here’s a question I’ve been battling for some time .. how do you measure the ‘success’ of a MOOC? The problem is that I haven’t been able to define what the ‘success’ is supposed to be, so trying to measure it seems, well, a pointless exercise.

So, here are a few thoughts I’ve had, based on my experiences as a learner on MOOCs (yes, plural), and as part of a team that has now developed and delivered 4 FutureLearn MOOCs (with a few more in the pipeline too!).

  • Do you look for the headline figures of number of registered learners, or the number of registered learners that became learners (visited the course)?
  • Do you look at the number of learners who did something, who engaged with the course in some way .. either as a number (e.g. 4,000) or as a percentage of the learners who visited the course (e.g. 40%)? (There’s a quick sketch of these ratios below the list.)
  • If you plan your MOOC to link to a paid-for course (degree, training, etc.) do you measure the success by the number of MOOC learners who enquire, or sign-up, to the linked course?
  • Do you look to the quiz or test responses, to see who’s retained and regurgitated the information based on a ‘score’?
  • Is it the final number of learners who make it through the length of the course to the end?
  • Is the number of comments a worthy measure of success? Do courses that have more comments (either in volume or as a percentage of active learners) indicate greater success than those with fewer?
  • Can you measure the success based on interactions on social media, through a defined hashtag? In which case do you measure the number of mentions on the hashtag or dig deeper and quantify the different sorts of engagements, ranging from “I’m on #such-and-such course” to enquiries or the detailed thought process involved in critical thinking along the lines of the MOOC subject?
  • Is a successful course one that takes learners from the MOOC environment into a related course, be it a MOOC or other paid-for course? If so, are you capturing that data?
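
To make the first few bullets concrete, here’s a minimal, purely illustrative sketch of the kind of headline ratios I mean. The counts are invented, and the definition of ‘active’ is my own assumption, not an official FutureLearn metric:

```python
# Illustrative only: invented counts and my own assumed definitions,
# not figures or metrics from any real course or platform.

registered = 10_000   # signed up to the course
visited = 6_500       # registered learners who viewed at least one step
active = 2_600        # learners who did something: marked a step, commented, took a test
comments = 9_100      # total comments posted during the run

def pct(part, whole):
    """Percentage of `part` in `whole`, to one decimal place."""
    return round(100 * part / whole, 1) if whole else 0.0

print(f"Visited the course: {visited} ({pct(visited, registered)}% of registered)")
print(f"Did something: {active} ({pct(active, visited)}% of visitors, "
      f"{pct(active, registered)}% of registered)")
print(f"Comments per active learner: {comments / active:.1f}")
```

Even with real numbers plugged in, the choice of denominator (registered vs. visited vs. active) changes the story dramatically, which is partly why I still can’t settle on a single ‘success’ figure.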

Continue reading

#FLbigdata

Big Data Videos #FLbigdata

I’ve posted these videos before, but I thought I’d post them again here, in one place, as a good resource for the learners on the Big Data FutureLearn course that started today.

All of these have one thing in common … do you know where your data goes, or who is watching/listening/capturing your data?

Tom Scott – I know what you did five minutes ago / YouTube



Hot on your trail: Privacy, your data, and who has access to it / YouTube

Tom Scott – Social Media Dystopia / YouTube

Jack Vale – Social Media Experiment / YouTube

Digital Dirt / YouTube

Everyone knows Sarah / YouTube

Do you have any more you’d like to showcase and bring to your fellow FutureLearners? Drop a comment with the link below.

Netflix

Learning the Netflix way

I’ve just read the post by Donald Clark called ‘What does ‘learning’ have to learn from Netflix?’ which has resonated with much of my own thinking from recent work and discussions I’ve been having on Twitter.

I signed up for one of the free 1-month trials of Netflix when it first became available in the UK. I enjoyed it, then cancelled it. I’d got what I wanted. Then I realised I wanted access to binge-watching phenomena like House of Cards, Breaking Bad, and the one that started them all, 24. But more than this, as Donald mentions in his post, I wanted access to the kind of programmes I like, at my convenience. I am not always available at 9pm every Thursday to watch the latest instalment of my favourite show(s), and I don’t actually want to wait a full week for the next episode. I first watched 24 on DVD, not Sky, so I did binge-watch the show, usually 4 full episodes a night (or 1 DVD), and went to bed wired for the next marathon 24-fest.

So, if we’re changing our viewing habits, are we changing our learning habits (as pointed out by Donald)?

Yes. Consider Donald’s points:  Continue reading