[important]The Social Media Lab is examining Massive Open Online Courses (MOOCs) as a part of the SSHRC-funded research initiative “Learning Analytics for the Social Media Age.” A previous post provided an overview of MOOCs and examined some of the more prominent providers. This post will put a spotlight on Learning Analytics (LA) and their role within the MOOC environment. [/important]
Learning Analytics in the MOOC Environment
According to the New York Times, 2012 was the “Year of The MOOC.” Early proponents heralded MOOCs as the vehicle that would usher in a revolution in education, where courses would be free and available to all regardless of income or geography. Since then, the hype and lofty promises have been replaced with rising doubts and distrust. Concerns about MOOCs’ educational quality and low student success rates are on the rise.
To ameliorate some of these concerns around student retention and success in MOOCs, learning analytics (LA) have been put forward as a technology solution that may be able to address some of the challenges faced by MOOCs and online learning in general. This blog post will provide an overview of the integrated LA features used by some of the major MOOC providers and then comment on their potential effectiveness.
Learning analytics predate MOOCs and are essentially a set of tools and techniques that measure, collect, analyze, and report data about students and their context. For example, many basic learning analytic features (e.g., the number of visits to a given page) are also part of existing Learning Management Systems (LMS) such as Blackboard or Moodle. In contrast to formal evaluation methods (e.g., tests, quizzes, etc.), LA use advanced data analytics to systematically examine patterns of students’ behavior in aggregate for the purposes of identifying areas of concern before they evolve into significant problems. Another function of LA is to help discover more about learning environments. For example, student usage statistics provide a window into how students are engaging with the material and each other.
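To make the idea concrete, the kind of basic aggregation described above can be sketched in a few lines of Python. The log format, student IDs, and visit threshold below are invented for illustration; real LMS platforms expose this data through their own reporting tools or APIs.

```python
from collections import Counter

# Hypothetical access log exported from an LMS: (student_id, page) pairs.
access_log = [
    ("s1", "module1"), ("s1", "module2"), ("s1", "quiz1"),
    ("s2", "module1"),
    ("s3", "module1"), ("s3", "module2"), ("s3", "module2"), ("s3", "quiz1"),
]

def visits_per_student(log):
    """Count total page visits for each student (a basic LA metric)."""
    return Counter(student for student, _page in log)

def flag_low_engagement(log, threshold=2):
    """Flag students whose visit count falls below a threshold --
    a crude early-warning signal before disengagement becomes a problem."""
    counts = visits_per_student(log)
    return sorted(s for s, n in counts.items() if n < threshold)

print(visits_per_student(access_log))
print(flag_low_engagement(access_log))  # ['s2']
```

Even this trivial aggregation illustrates the core LA pattern: behavioral data is collected passively, summarized in aggregate, and used to surface students who may need attention.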
Using publicly accessible data (e.g., FAQs, help pages, and demo accounts), we have compiled a summary of some of the major LA features offered by three of the most popular MOOC providers. The features are summarized in Table 1 below. (Note: as these platforms are changing rapidly, some of the features below may not represent the latest version of the platform. Please report any changes to these platforms in the comments section of this blog post.)
As the above table indicates, the top three MOOC providers have integrated some basic participation and activity statistics into their platforms. Of the three providers examined, Coursera provides the most robust set of tools for monitoring and analyzing student-generated data within the platform. Coursera also allows an instructor to separate learners into “cohorts” (e.g., students seeking certification versus students auditing a class). This enables a more precise examination of specific learning groups. In contrast to Coursera’s class-wide or target-group focus, Alison’s LA tool set is geared more toward providing information about individual learners. This might be useful in a smaller context; however, such a narrow focus could be problematic in MOOC environments where there might be thousands or even tens of thousands of students. Excluding long answer and discussion board statistics, Canvas Network analytics provide a good mixture of individual and class-wide student data. However, as the Canvas learning management system is open-source, this post can only comment on features found in the default version that is available for trial.
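The cohort separation described above amounts to grouping students by enrollment track and computing statistics per group. The sketch below assumes hypothetical enrollment records with a "track" field; the field names and numbers are invented for the example and do not reflect any provider's actual data model.

```python
# Hypothetical enrollment records; "track" distinguishes certificate-seeking
# students from auditors, mirroring the cohort split described in the post.
students = [
    {"id": "a", "track": "certificate", "quizzes_done": 8},
    {"id": "b", "track": "audit", "quizzes_done": 1},
    {"id": "c", "track": "certificate", "quizzes_done": 5},
    {"id": "d", "track": "audit", "quizzes_done": 0},
]

def split_cohorts(records):
    """Group enrollment records into cohorts keyed by track."""
    cohorts = {}
    for r in records:
        cohorts.setdefault(r["track"], []).append(r)
    return cohorts

def mean_quizzes(cohort):
    """Average quiz completions within one cohort."""
    return sum(r["quizzes_done"] for r in cohort) / len(cohort)

cohorts = split_cohorts(students)
for track, members in sorted(cohorts.items()):
    print(track, mean_quizzes(members))
```

Comparing averages per cohort rather than class-wide is what makes the “precise examination of specific learning groups” possible: auditors and certificate-seekers behave very differently, and pooling them obscures both patterns.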
A MOOC Professor’s Perspective on Learning Analytics
To date, there are few formal studies that can shed light on whether or not the LA tools available in MOOCs are actually useful and effective from the instructor’s perspective. To get a sense of how LA are being used in MOOCs, we interviewed Professor Blaise Landry, who taught Dalhousie’s first MOOC, “Grant Writing Bootcamp,” to over 15,000 registrants in the fall of 2013. The course was hosted on Instructure’s Canvas Network platform. “Grant Writing Bootcamp” followed the typical MOOC model: it was open publicly worldwide, the course content was divided into a series of modules, and students were awarded an unofficial certificate upon completion. It also shared the characteristically high attrition rate of MOOCs, as only around 120 of the 15,000 students who originally registered received a certificate of completion.
Speaking to both the high drop-out rates and the use of analytics, Prof. Landry suggested that it is problematic to try to compare MOOCs with more traditional online courses, as they are essentially different entities. MOOCs are not governed by the same conditions as other technologically mediated forms of distance education. In an online university course, there is an expectation that many of the students share a certain level of educational background. With MOOCs, however, students are coming into the course at many different stages in their academic or professional careers and have a wider range of needs and motivations for taking the course. This lack of uniformity presents a real challenge for course creation, as well as for the development of analytics that can accurately capture data about the varied MOOC populations.
While he did not find the integrated LA tools available in Canvas particularly useful for his course, Prof. Landry emphasized the value of the discussion board functionality of Canvas. Early in the course, students were asked to introduce themselves via discussion threads. This allowed Prof. Landry and his TAs to learn about the backgrounds of some of the students involved in the MOOC. As the course progressed, the discussion board became a forum for students not only to engage in collective learning, but also to provide feedback about the course, which could then be used by the course instructors. In this sense, the discussion board is an invaluable tool for MOOC instructors. However, the act of manually trawling through threads and comments is time consuming and imprecise. It is in the automation of such processes that LA can be of great value in a MOOC setting.
Like MOOCs, Learning Analytics is a nascent field that is still evolving. As demonstrated in the above table, all MOOC providers do include some basic learning analytics tools. However, more targeted and robust LA tools will have to be developed to effectively meet the unique challenges posed by MOOCs, such as quality, access, and scalability. An example could be an automated system that identifies and categorizes students’ shifting sentiments and thoughts as shared on the discussion boards. Such a tool would give MOOC instructors the ability to quickly and efficiently identify emerging themes or problem areas as they arise.
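As a toy illustration of the kind of automated sentiment tracking proposed above, the sketch below labels discussion posts with a trivial keyword lexicon and tallies the results. A real system would use trained NLP models rather than hand-picked word lists; the lexicon and example posts here are invented for the demonstration.

```python
# Invented mini-lexicons standing in for a proper sentiment model.
NEGATIVE = {"confusing", "frustrated", "stuck", "unclear", "broken"}
POSITIVE = {"great", "helpful", "clear", "enjoyed", "thanks"}

def classify_post(text):
    """Label one discussion post by counting lexicon hits."""
    words = set(text.lower().split())
    neg = len(words & NEGATIVE)
    pos = len(words & POSITIVE)
    if neg > pos:
        return "negative"
    if pos > neg:
        return "positive"
    return "neutral"

def sentiment_summary(posts):
    """Aggregate per-post labels so an instructor can spot a swing
    in class mood without reading every thread by hand."""
    summary = {"positive": 0, "negative": 0, "neutral": 0}
    for p in posts:
        summary[classify_post(p)] += 1
    return summary

posts = [
    "Week 3 videos were great and very helpful",
    "I am stuck and the assignment instructions are unclear",
    "Submitted my draft proposal today",
]
print(sentiment_summary(posts))  # {'positive': 1, 'negative': 1, 'neutral': 1}
```

Even this crude version captures the payoff: instead of trawling thousands of comments manually, the instructor sees a running tally and can drill into the negative posts when the balance shifts.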
Over the next few years, we will be delving into many of the issues and challenges related to developing learning analytics for MOOCs and other social media-based environments. If you would like to join us in this exploration of LA and MOOCs, please contact us. And if you are an instructor who uses social media in one or more of your classes, we would like to invite you to participate in an online survey. The survey is a separate but related line of inquiry about social media and learning. It is available at http://tinyurl.com/SMlearningsurvey
Written by Lee Wilson and Philip Mai, with editorial contributions by Anatoliy Gruzd