Tag Archives: Online Course

MOOC success

Making a MOOC ‘successful’

Designing a ‘successful’ MOOC is one thing. Making a MOOC ‘successful’ is something completely different.

Much has been written by far better and more eloquent people than me (here and here and here and here and here) on what makes a successful MOOC – all about interactions, journeys, optimum length, appropriate materials, platform, etc. But what about making a MOOC successful? To me, there is a difference.

This isn’t about making / building / designing a MOOC, it’s about making / encouraging / promoting / informing people about the MOOC.

The argument about MOOC success (learner retention, completion numbers, registrations, etc.) is one that will rage on and on. Everyone has an opinion, everyone is looking at something different, and these are all very valid and very important questions. There isn’t a definitive answer: each MOOC is different, for a different audience, for a different demographic (maybe), and designed in a way that different learner ‘profiles’ can get something different out at the end. If indeed they reach the end, which of course many of them don’t.

No, making a successful MOOC requires more than a lead academic’s subject knowledge, learning technology, instructional/educational design, assessments, an appropriate learning goal/journey, a working platform, etc. You need all the other stuff as well.

The other stuff you need? Well, try:  Continue reading

MOOC quality

MOOCs – question on purpose, quality, student retention, feedback, etc.

Ahh, questions around the purpose, quality, value, etc. in and around MOOCs have started again, and justly so.

  • Disclaimer: Like many I have opinions, but not answers.

The recently raised questions, started by Fred Riley on the ALT mailing list, have produced a good set of resources for those of us who are starting to ask these questions, needing a more comprehensive or value-added answer.

Fred’s original query was:


Does anyone on this list know of any recent research and/or articles on the teaching quality of MOOCs? I’m thinking of things such as:

  • student retention, with MOOC drop-out rates being notoriously high (I plead guilty to that myself :( )
  • student surveys and qualitative feedback
  • how many students in a MOOC platform (eg FutureLearn) go on to take further courses in that platform

I’m sure that there are many other indicators of quality – those are just off the top of my head. I’m not in the MOOC game myself as yet, other than as a punter, but I’m looking to get into the development side of things.


In some instances the data is difficult to come by, especially data on students/learners taking further courses (across MOOC platform providers as well as within them), but I hope we can get to a stage where this kind of data is available and open to interrogation (if only for the individual partner to query their own courses).

Here are some of the resources shared, in response to Fred’s original query:

If you have any further links or resources that would help Fred and the ALT mailing list, please reply to the thread on the mailing list. If you don’t have access then please leave the link or your comment below for everyone to have the opportunity to read.

Yes, OK. Fred’s question also raises questions around the ‘quality’ of a MOOC, and the validity of learner retention or ‘steps completed’ data as a trigger for saying a MOOC is of a certain quality, or that a student was ‘successful’ on the course, but these are for another post. Fred answered this quite clearly on the ALT mailing list: for him, “retention is IMO an indicator of quality as perceived by the student – the better retention, the more students are engaged with the course and its materials. If they don’t like a course, they’ll drop out.”

NB: I’ve helped run several runs of the Warwick/FutureLearn ‘Shakespeare and his World’ MOOC, and I use it as an example of where the statistics provided for the 10-week course don’t necessarily match the actual experience. A case in point is the number of learners who ‘complete’ the course, in that they take all the tests and mark at least one step as complete in each of the 10 weeks. We know from the learners themselves, from their comments, feedback, tweets, etc., that they take what they want from the course: one learner may only like Shakespeare’s comedies, another only his tragedies, so they will omit the plays/weeks they don’t like. They should still be viewed as successful learners, and I’m sure they think that of themselves, as in their own mind (and in ours!) they got what they wanted from the course, yet did not actually ‘complete’ it.
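To make that gap concrete, here is a minimal sketch (the data layout and field names are my own assumptions for illustration, not FutureLearn’s actual export format) of how a strict ‘completer’ flag like the one above quietly excludes learners who deliberately skip the weeks they aren’t interested in:

```python
# Minimal sketch only: data layout and field names are hypothetical,
# not FutureLearn's real export format.
from collections import defaultdict

def is_completer(weeks_with_steps_done, tests_taken, num_weeks=10):
    """Strict definition: every test taken AND at least one step marked
    complete in each of the num_weeks weeks."""
    per_week = defaultdict(int)
    for week in weeks_with_steps_done:   # e.g. [1, 1, 2, 3, ...]
        per_week[week] += 1
    every_week_touched = all(per_week[w] > 0 for w in range(1, num_weeks + 1))
    return every_week_touched and all(tests_taken)

# A learner who skips the two weeks on the plays they don't enjoy but
# finishes everything else is not counted, however satisfied they were.
selective_steps = [w for w in range(1, 11) if w not in (4, 5)]
selective_tests = [True] * 8
print(is_completer(selective_steps, selective_tests))   # False
```

The flag and the learner’s own sense of success can point in opposite directions, which is exactly the mismatch described above.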

If there is one question for 2016 and MOOCs, it’s whether there is any way to really, truly, honestly understand the ‘value’ of a MOOC.

Image source: State Library Victoria (CC BY-NC 2.0)

ALTC 2015

The Interview Process #altc

From this year’s ALT conference I enjoyed (finally) meeting Wayne Barry, EdTechBook contributor, and chatting about his ALTC presentation.

Wayne’s presentation looked at a different way of interviewing candidates for Learning Technologist positions, using standard questions and short presentations but also including a short role-play exercise. Each candidate is given advance notice that they will engage with an ‘academic’ who is interested in introducing elements of distance learning to their module. During the short exercise (many people took issue with the use of the term ‘role-play’) candidates exhibit both knowledge of their discipline and the ability to listen, engage, problem-solve, and debate with a member of the team taking the role of an academic.

So, how do you find out if someone will fit into your office and team environment? Can you do this with questions alone? Do competency-based questions offer enough space for someone to fudge their way through the process, or do they offer the interviewers enough insight to see the truth behind the candidate?

This reminds me of this video, from Heineken: Job Interview. Slightly over the top, but you get the idea – by changing the process you find out many different things (hopefully good) about the candidates. Enjoy!

YouTube: Job interview at Heineken

Digging Deeper into Learners’ Experiences in MOOCs

Reading: Digging Deeper into Learners’ Experiences in MOOCs

One aspect of working on MOOCs is that there is no clear way to measure their success. Do you use the stats and logs that indicate clicks and time-on-page, or look at the nature of the conversations and/or comments made?
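As a throwaway sketch of the difference (the log records and fields below are invented for illustration, not any platform’s real schema), the first lens is easy to compute from the logs, while the second still needs a human judgement about what the comments actually say:

```python
# Invented log records purely for illustration; no real platform schema implied.
activity_log = [
    {"learner": "A", "clicks": 240, "minutes_on_page": 310, "comments": []},
    {"learner": "B", "clicks": 45, "minutes_on_page": 60,
     "comments": ["Disagree with step 2.3, and here's why...", "Great reference, thanks!"]},
]

# Lens 1: activity stats straight from the logs.
avg_clicks = sum(r["clicks"] for r in activity_log) / len(activity_log)
avg_minutes = sum(r["minutes_on_page"] for r in activity_log) / len(activity_log)
print(f"avg clicks: {avg_clicks}, avg minutes on page: {avg_minutes}")

# Lens 2: conversational engagement. Counting comments is trivial; judging
# their nature (questions, debate, reflection) is the part the logs can't do.
for record in activity_log:
    print(record["learner"], "comments:", len(record["comments"]))
```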

That’s why this paper loaded to Academia.edu by George Veletsianos piqued my interest – is there something in here that can help me understand the metrics we need to use in order to measure the learning and/or success of a MOOC?

“Digging Deeper into Learners’ Experiences in MOOCs: Participation in social networks outside of MOOCs, Notetaking, and contexts surrounding content consumption.”

Unsurprisingly, the authors highlight the lack of literature looking into the metrics of MOOCs that are not captured on the MOOC platform (EdX, Coursera, FutureLearn, etc.), notably social engagement, note-taking, and content consumption. Something I’d not considered before is that the “availability of large-scale data sets appears to have shaped the research questions that are being asked about MOOCs.”  Continue reading

What makes a good online course?

What makes a good online learning experience?

Is it possible to define the qualities of a good online learning experience, or a good MOOC? Is there a checklist we could have pinned to the wall to use as we design and build our courses?

Here are a few items I think the list needs; feel free to add your own ideas in the comments field below:

Presentation: Is the student able to relate to the subject and the presenter / educator? This is not always easy, as the platform (Blackboard, Moodle, FutureLearn, Udacity, etc.) often controls how the materials are ‘presented’. Even with these constraints you do have options for designing your materials and laying them out in ways that make them easy to navigate and interact with.  Continue reading

#FLbigdata

Big Data Videos #FLbigdata

I’ve posted these videos before, but I thought I’d post them here again, in one place, as a good resource for the learners on the Big Data FutureLearn course that started today.

All of these have one thing in common … do you know where your data goes, or who is watching/listening/capturing your data?

Tom Scott – I know what you did five minutes ago / YouTube

Hot on your trail: Privacy, your data, and who has access to it / YouTube

Tom Scott – Social Media Dystopia / YouTube

Jack Vale – Social Media Experiment / YouTube

Digital Dirt / YouTube

Everyone knows Sarah / YouTube

Do you have any more you’d like to showcase and bring to your fellow FutureLearners? Drop a comment with the link below.

Learning Online

Reading: Learner engagement in MOOCs

After attending a FutureLearn partners webinar about designing online courses, the age-old issue of encouraging and engaging learners in online communication came up. It made me reflect on my past posts about online learning, specifically this one: MOOCs – 9 points on what I like, and what I don’t. If you want to go and read it before carrying on, be my guest.

Hurry back!

Glad you came back. What annoys me about MOOCs, and about some people who design online courses in general, is the assumption that everything you build will be used, and used in the way you want it to be used. VLEs are somewhat to blame for the apathy or lack of engagement in online activities, especially discursive ones such as forums or comment sections: you’re locked into one specific tool for engagement. Continue reading

MOOCs

MOOCs – 9 points on what I like, and what I don’t

Over two years ago I wrote about a few experiences I’d had with some online courses / MOOCs, and why I ‘failed’ (according to the general headline figures of engagement, attendance, etc. that are used in the mainstream press).

I want to revisit this, in light of more experience in both designing MOOCs and being a student on them.

Disclaimer: This is based on courses I’ve taken on the FutureLearn, Coursera, Cloudworks, EdX, and WordPress (OcTEL) platforms. I also highlight whether I was a student on the course or part of the development team.

1. Comments and Engagement: For the most part I’ve been a silent student. This is both deliberate and accidental. Where it’s been a deliberate choice not to engage in the comments and discussion, it’s because I knew I didn’t have the time or inclination to trawl through the hundreds of fairly uninteresting posts to add my two pennies’ worth or find the one nugget of insight that is worth anything. It’s also because, for some courses, I didn’t have enough interest to take my engagement further.

Continue reading

'Dolly mixture' courses

Dolly Mixture courses

This week I had a great chat with @nancyrubin and @CliveBuckley after I re-tweeted Nancy:

Are Courses Outdated? MIT Considers Offering ‘Modules’ Instead

My thoughts on courses and training, as I mentioned above, are just this: courses tend to fit the organisational structure of the issuing body and don’t always fit the ‘need’ of the learner. You join, for example, a specific school or faculty to start and complete your degree in Business Management or Economics or Sociology. But what if the specific subjects you really want to study are only loosely based around the course structure that the institution wants to teach? Continue reading

BYOD4L

Bring Your Own Devices for Learning: July 14-18 #BYOD4L

After such a successful run earlier this year, the team behind BYOD4L (Sue Beckingham, Chrissi Nerantzi, Andrew Middleton, et al.) are working their magic again – put the dates in your diary: BYOD4L, July 14-18. I have been invited back again, this time to work with Sue, Andrew, and Chrissi (and the other team members), and will be engaging course participants online.

If you’re interested, the details are below

YouTube: Bring Your Own Devices for Learning: July 14-18, 2014

Continue reading