Case study: using MOOCs for conventional college coursework

In spring 2013, San José State University (SJSU) launched SJSU Plus: three college courses, required for most students to graduate, delivered on the platform of the massive open online course (MOOC) provider Udacity and attracting over 15,000 students. Retention, success (pass/fail), and online support were examined using an augmented online learning environment (AOLE) with a subset of 213 students, about half of whom were matriculated. SJSU faculty created the course content and collaborated with Udacity to develop the video instruction, quizzes, and interactive elements. Course log-in and progression data were combined with surveys and focus groups involving students, faculty, support staff, coordinators, and program leaders. Potential success predictors were first screened with contingency-table tests and then entered into logit models fit to all students and to five subgroups. Student effort was the strongest indicator of success, underscoring the importance of early and consistent student engagement; no statistically significant relationships with student characteristics were found. The effectiveness of AOLE support was limited because staff time was largely consumed by the least-prepared students.
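The analysis described above, screening candidate predictors with contingency-table tests and then fitting logit models of pass/fail, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' actual code: the column names (e.g. `passed`, `high_effort`, `matriculated`) and the 0.05 screening threshold are hypothetical choices for the example.

```python
# Sketch of the analysis pipeline: chi-square contingency-table screening of
# candidate predictors of course success, followed by a logit model of the
# binary pass/fail outcome on the predictors that survive screening.
# Column names and the alpha threshold are illustrative assumptions only.
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.api as sm


def screen_predictors(df, outcome, candidates, alpha=0.05):
    """Keep candidate predictors whose chi-square contingency test
    against the binary outcome is significant at level `alpha`."""
    kept = []
    for col in candidates:
        table = pd.crosstab(df[col], df[outcome])
        chi2, p, dof, _ = chi2_contingency(table)
        if p < alpha:
            kept.append(col)
    return kept


def fit_logit(df, outcome, predictors):
    """Fit a logistic (logit) model of the binary outcome on the
    screened predictors and return the fitted results object."""
    X = sm.add_constant(df[predictors].astype(float))
    y = df[outcome].astype(int)
    return sm.Logit(y, X).fit(disp=False)


# Usage, assuming a student-level DataFrame `df` with one row per student and
# binary indicator columns derived from log-in/progression and survey data:
#   kept = screen_predictors(df, "passed", ["high_effort", "matriculated", "used_support"])
#   model = fit_logit(df, "passed", kept)
#   print(model.summary())
```

In practice the same screening and model fit would be repeated for the full sample and for each of the five subgroups mentioned in the abstract; the multiple-comparison adjustment, if any, is not specified here.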
