"Understanding Persistence in MOOCS" [PDF]

Understanding Persistence in MOOCs

Rachel Baker (1), Thomas S. Dee (2), and Brent Evans (3)
(1) Stanford Graduate School of Education
(2) Stanford Graduate School of Education & National Bureau of Economic Research
(3) Vanderbilt University

Digital Learning Forum, June 11, 2014
Center for Education Policy Analysis at Stanford University (cepa.stanford.edu)
Introduction

- My ongoing, collaborative research related to "digital learning":
  - Blended learning
  - A massive online professional network of teachers
  - A web-based plagiarism tutorial (Dee and Jacob 2012)
  - Massive Open Online Courses (MOOCs)
- The practical (and academic) appeal of digital learning?
  - The heady promise of low cost, broad access, the capacity to scale with fidelity, and the delivery of targeted content
  - Digital environments are also a tractable "sandbox" for conducting basic research on instructional design and human-capital development
Introduction

- Today's focus is on MOOCs and the most basic form of student engagement: persistence
  - What traits specific to courses, lectures, and students appear to influence student persistence?
  - A field experiment: can a pre-commitment "nudge" encourage sustained engagement among registrants?
- Caveats about persistence as a relevant outcome:
  - Deeper forms of engagement and learner outcomes matter also
  - The optimal level of persistence should not be 100%
- Other experimental research in progress:
  - A study of the labor-market effects of MOOC completion
  - A study of peer dynamics in MOOC discussion fora
Exploring Persistence in MOOCs

- A unique data set based on n = 44 Coursera MOOCs offered in 2012-13
  - 37 courses from Stanford, 6 from U Penn, and 1 from U Florida
  - Most are in CS, mathematics, and statistics; 48k+ registrants per class on average
  - Data on multiple course traits (size measures, prerequisites, lecture length, whether offered previously)
- Key outcome measures (computation sketched below):
  - % of registrants who watched 20%+ of lectures (mean = 38.5%, SD = 12.3%)
  - % of registrants who earned a certificate (mean = 5.5%, SD = 4.8%)
  - % certificate | 20%+ lectures (mean = 21.8%, SD = 10.6%)
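As a concrete illustration, here is a minimal sketch of how these course-level outcomes could be computed from registrant-level platform data. The file and column names (registrants.csv, share_lectures_watched, earned_certificate) are hypothetical, not the authors' actual schema.

```python
import pandas as pd

# Hypothetical registrant-level table; names are assumptions, not the
# authors' actual data. One row per (registrant_id, course_id) with
# share_lectures_watched in [0, 1] and earned_certificate in {0, 1}.
registrants = pd.read_csv("registrants.csv")

by_course = registrants.groupby("course_id").apply(
    lambda g: pd.Series({
        # % of registrants who watched at least 20% of lectures
        "pct_watched_20plus": (g["share_lectures_watched"] >= 0.20).mean(),
        # % of registrants who earned a certificate
        "pct_certificate": g["earned_certificate"].mean(),
        # % earning a certificate among those watching 20%+ of lectures
        "pct_cert_given_20plus": g.loc[
            g["share_lectures_watched"] >= 0.20, "earned_certificate"
        ].mean(),
    })
)

print(by_course.describe())  # means/SDs across the 44 courses
```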
Key Course-Level Findings

- Results from basic cross-sectional regressions:
  - Second or higher offerings of a given course have substantially lower levels of persistence (e.g., 10 percentage points less likely to watch 20%+ of videos; 5.4 percentage points less likely to earn a certificate)
    - Initial offering of a given course attracted the matriculants with the highest motivation?
  - However, courses offered more recently had modestly higher persistence: 10 weeks later → 1 percentage point more likely to earn a certificate
    - Growing public awareness of MOOCs during this 2012-13 sample window attracted higher-motivation students?
    - Increasing legitimacy of the MOOC certificate during the study window?
Key Course-Level Findings

- Courses with prerequisites have lower persistence on the intensive margin among engaged students
  - 7.7 percentage points less likely to earn a certificate | having watched 20%+ of lectures
- Surprising "null" findings related to course size:
  - The numbers of students per course and lectures per course appear to have small, statistically insignificant effects on persistence
  - Similarly, the average lecture length across courses is unrelated to the measures of persistence
Exploring Persistence across Lectures

- ~2,400 unique lectures in these 44 MOOCs
- Key outcome measure: % of registrants who watched the lecture (mean = 15.7%, SD = 9.9%)
- General pattern across these courses:
  - High initial engagement that falls off rapidly (e.g., within 10 lectures) and stabilizes at a lower level
[Figure: % of registrants watching each lecture, by lecture position; engagement is high initially, falls off within roughly the first 10 lectures, and stabilizes at a lower level]
Exploring Persistence across Lectures

- Quasi-experimental analysis to model Y_igt: the % of registrants who watched lecture i, in course g, and "batch" t (i.e., week)
- Fixed effects for each course and each "batch" unrestrictively control for differences across courses and over time
- Thought experiment: after controlling for what makes a course unique and a moment in time unique, what features of a lecture, X_igt, predict student engagement? (Specification sketched below.)
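The slide names the ingredients of the model; a plausible reconstruction, assuming a standard linear two-way fixed-effects specification (the exact functional form is not given in the slides):

```latex
% A reconstruction, not verbatim from the paper: Y_{igt} is the % of
% registrants watching lecture i in course g and batch (week) t;
% \alpha_g and \tau_t are course and batch fixed effects absorbing
% course-specific and time-specific differences; X_{igt} collects the
% lecture features (position within the week, length, title keywords).
Y_{igt} = \alpha_g + \tau_t + X_{igt}'\beta + \varepsilon_{igt}
```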
Key Lecture-Level Findings

- The first lecture of the week is 2.1 percentage points more likely to be viewed than additional lectures
- The length of a video has small, statistically insignificant effects on watching
  - Caveat about out-of-sample predictions
- The titles of lectures are related to student engagement, even conditional on their sequencing within the week (keyword indicators sketched below):
  - Lectures with "Introduction", "Overview", or "Welcome" in the title are 2 to 12 percentage points more likely to be watched
  - Lectures with "Review", "Conclusion", "Exercise", "Optional", or "Advanced" are 1 to 6 percentage points less likely to be watched
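A minimal sketch of how the title keywords could be turned into regressors for the specification above; the DataFrame contents and column names are illustrative assumptions.

```python
import pandas as pd

# Hypothetical lecture-level table; titles here are made up.
lectures = pd.DataFrame({
    "title": ["Welcome to the Course", "Overview of Unit 1",
              "Advanced Topics (Optional)", "Review and Exercises"],
})

POSITIVE_KEYWORDS = ["introduction", "overview", "welcome"]
NEGATIVE_KEYWORDS = ["review", "conclusion", "exercise", "optional", "advanced"]

# One 0/1 indicator per keyword, matched case-insensitively; these
# dummies would enter X_igt in the lecture-level regression.
for word in POSITIVE_KEYWORDS + NEGATIVE_KEYWORDS:
    lectures[f"title_has_{word}"] = (
        lectures["title"].str.contains(word, case=False).astype(int)
    )

print(lectures.filter(like="title_has_").sum())
```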
Exploring Persistence across Students

- n = 2.1 million student registrants across the 44 MOOCs, but with limited student information
- Key outcomes: earned a certificate (mean = 5.3%), # of lectures watched (mean = 10.3)
- More detailed student information from a survey fielded at the beginning of one of the science MOOCs
- Basic econometric specification with course fixed effects (sketched below)
  - Thought experiment: among students within the same course, what student traits predict persistence?
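A minimal sketch of such a student-level specification, using statsmodels with course dummies as fixed effects; the file and variable names are hypothetical stand-ins for the registration-timing traits on the next slide.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level table; variable names are assumptions.
# earned_certificate: 0/1; registration-timing indicators: 0/1;
# course_id: which of the 44 MOOCs the student registered for.
students = pd.read_csv("students.csv")

# Linear probability model with course fixed effects, C(course_id):
# coefficients are identified from within-course variation only.
model = smf.ols(
    "earned_certificate ~ registered_5plus_weeks_early"
    " + registered_after_start + C(course_id)",
    data=students,
).fit(cov_type="cluster", cov_kwds={"groups": students["course_id"]})

print(model.params.filter(like="registered"))
```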
Key Student-Level Findings

- Those who register long before a course starts (5+ weeks) are 1.4 percentage points less likely to earn a certificate
- Those who register after a course has started are substantially less likely to earn a certificate (3 to 8 percentage points) and watch 1 to 5 fewer lectures
  - Only partial "catch up" on earlier missed lectures
- Students motivated by an affiliation with a prestigious university watched significantly more lectures (7.5 more)
Possible Design Implications

- This analysis → several ripe candidates for more formal A/B testing?
- Lower long-term persistence among otherwise engaged students in courses with prerequisites → a role for tracks tailored to levels of desired engagement?
- The primacy of a week's first lecture (and the seeming irrelevance of lecture length) suggests instructors should consider somewhat longer & fewer lectures
- Instructors should be aware that lecture titles suggesting brevity & synopsis (e.g., "overview") may drive uptake
  - Embedding pedagogically critical exercises within such lectures may also promote uptake
- Shorter pre-registration windows (<5 weeks) may promote persistence
Time Inconsistency and Choice

- An extensive literature in behavioral economics suggests individuals exhibit "time inconsistent" preferences: a choice for period t+1 is preferable in period t but not when t+1 actually arrives (illustrated after this list)
  - e.g., buy a gym membership for next month; don't attend
- Implication: individuals may value and seek out "precommitment" mechanisms (e.g., buy healthy food to bind tomorrow's snacking)
- A growing empirical literature provides mixed evidence on the take-up/efficacy of precommitment devices with respect to employee effort, savings, and healthy behaviors
- Might precommitment opportunities influence MOOC persistence?
  - The low persistence of registrants suggests time inconsistency
  - As does the lower engagement of those who registered 5+ weeks early
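The slides do not write out a formal model, but the standard quasi-hyperbolic (beta-delta) discounting formulation from this literature captures the reversal; the payoff numbers below are purely illustrative.

```latex
% Quasi-hyperbolic (beta-delta) preferences: viewed from period t, a
% payoff arriving k >= 1 periods ahead is down-weighted by \beta\delta^k,
% so immediate payoffs loom disproportionately large.
U_t = u_t + \beta \sum_{k=1}^{\infty} \delta^{k} u_{t+k}, \qquad 0 < \beta < 1.

% Illustrative reversal with \beta = 0.5, \delta = 1: watching a lecture
% costs 6 (effort when watched) and yields a benefit of 10 later.
% Planned at t for t+1:  \beta(-6) + \beta(10) = 2 > 0   (commit to watch)
% At t+1 itself:         -6 + \beta(10) = -1 < 0          (skip the lecture)
```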
An Experiment: "Nudging" Persistence

- We randomly assigned registrants in a science MOOC (n = 18,043) to an opportunity to precommit to watching the coming week's first lecture
  - Treatment condition: an email encouragement from the instructor on scheduling lecture time, with a link to a Qualtrics survey in which they could choose a day and time for watching the week's first lecture
  - Control condition: an instructor email with a Qualtrics survey about browser choice
- Theorized mechanisms of this "soft" precommitment device:
  - Deviating from a pre-chosen day/time implies psychological costs (e.g., cognitive activation and dissonance, regret aversion)
  - Also, the professional recommendation of the course instructor
The Experimental Setting

- A science-based Coursera MOOC offered for the first time in 2013
- Not overly technical and no prerequisites
- 3 available tracks (auditing, qualitative, quantitative)
- Course structure: 8 topics, each lasting 1 week; ~12 lecture videos released per week (each 10-20 minutes in length)
- Videos available for streaming or download; a discussion forum also available to students
The Experimental Design

- "Blocked" random-assignment design (sketched below):
  - Divide the sample by baseline traits related to persistence: baseline survey completion × email type (edu/gmail/other)
  - Reduces the risk of spurious imbalance across T/C conditions
  - Harnesses statistical power for subgroup results
- Analysis of student traits affirms the internal validity of the experimental design
  - Student traits are balanced across the T/C conditions, both overall and for the sample that completed the pre-course survey
- Treatment/control emails were sent shortly before the first week of lectures and again before the second week
- Observation of the discussion forum suggests no treatment contamination
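A minimal sketch of the blocked assignment described above; the blocks mirror the slide (survey completion × email type), while the input data and the seed are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2014)  # seed is arbitrary/illustrative

# Hypothetical registrant table; column names are assumptions.
# completed_survey: 0/1; email_type: "edu" / "gmail" / "other".
registrants = pd.read_csv("registrants.csv")

def assign_within_block(block: pd.DataFrame) -> pd.Series:
    """Randomly assign half of each block to treatment, half to control."""
    n = len(block)
    labels = np.array(["T"] * (n // 2) + ["C"] * (n - n // 2))
    return pd.Series(rng.permutation(labels), index=block.index)

# Blocking on baseline traits related to persistence reduces the risk
# of spurious T/C imbalance and sharpens subgroup comparisons.
registrants["condition"] = (
    registrants.groupby(["completed_survey", "email_type"], group_keys=False)
    .apply(assign_within_block)
)

print(registrants.groupby(["completed_survey", "email_type"])["condition"]
      .value_counts())
```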
Results: Uptake & Compliance

- Modest uptake of an opportunity for precommitment
  - 10.2% of the 9,022 students assigned to the treatment condition completed the survey
- Even with partial treatment compliance, large sample sizes → an opportunity to detect effects on outcomes
- Preferred scheduling is for the day after the video was released, and in the evening (local time)
- Most students in the treatment condition watch the video on the day they identified and within an hour or two of the chosen time
Results: Lecture Watching

- Overall, 47% of students watched the first lecture of week 1 and 32% watched the first lecture of week 2
- The treatment condition had no statistically significant effects on the probability of watching the first lecture of week 1 or of week 2 (point estimates of -0.01 and -0.003, respectively)
- Evidence that the treatment actually reduced longer-run persistence & performance (p-value < 0.10):
  - Certificate completion by 0.8 percentage points
  - Number of lectures watched by 0.74
  - Grade in the course by 0.64 on a 1-100 scale
Treatment Heterogeneity

- Some evidence that the treatment improved lecture persistence and grades among those who registered early (p-value < 0.1)
  - Positive effects among those for whom time inconsistency is most salient?
- More detailed data on the students who completed the baseline survey provide an opportunity to explore this result further
- Negative effects of the treatment are highly concentrated among those motivated by curiosity about the online environment
  - Watched 14 fewer lectures
  - 10 percentage points less likely to earn a certificate
Discussion

- Modest uptake of precommitment suggests an attenuated role for time inconsistency in explaining MOOC persistence (or low self-awareness of self-control problems)
- Some suggestive evidence of the intended effects for narrow subgroups (early registrants, those motivated by the prestige of the university)
- Stronger evidence of negative effects if motivation is shallow. Instructor email signaled a seriousness of purpose that pushed them away?
- A cautionary tale? Treatment heterogeneity matters! Nudges can interact with individual motivations in possibly unanticipated ways. This is why rigorous evaluation is important!
- Apart from lecture design, other possible ways to elicit engagement?
Thank you!
Email: [email protected]
Twitter: @Thomas_S_Dee