Evaluation in Distance Education and E-Learning: The Unfolding Model
Valerie Ruhe, University of Minnesota
Bruno D. Zumbo, University of British Columbia

Contents

Preface

1. Why Do We Need a New Approach to Evaluation in Distance Education and E-learning?
    Overview
    Distance Education vs. E-learning
    The Rapid Expansion of Distance Education and E-learning
    Rapid Structural Change
    Rapid Technological Change and Rising Stakeholder Expectations
    The Need for Continuous Course Improvement
    What Is Evaluation?
    Why Do We Need a Professional Approach to Evaluation?
    What Does a Professional Approach to Evaluation Look Like?
    Professional Program Models
    Responding to the Call for a Professional Evaluation Approach: The Unfolding Model
    Surveys/Interviews/Focus Groups/Online Ethnographies to Measure Learner Satisfaction with Course Components

2. The Theory and Practice of Program Evaluation
    Why Are Program Evaluation Models Important?
    Classifying Program Evaluation Models
    Alkin and Christie's (2004) Evaluation Tree
    The Limitations of Using Methods without Models
    Models with Values in the Foreground
        Consumer-orientation
        Responsive Evaluation
        Deliberative Democratic Evaluation
        Constructivism
        Theory-based Evaluation
    Models with Consequences in the Foreground
        The CIPP Model
        Utilization Evaluation
        Empowerment Evaluation
    Where Do We Diverge from Alkin and Christie?
    Evidence, Values and Consequences as Overlapping Dimensions

3. Evaluation Theory and Practice in Distance Education and E-learning
    Evaluation Theory
        Models with Scientific Evidence in the Foreground
        Models Based on Evidence, Values and Consequences
        Models Based on Messick's (1989) Framework
        A Summary of Evaluation Models in Distance Education
    Evaluation Practice
        Do Unintended Consequences Emerge in Authentic Evaluation Studies?

4. Messick's Framework: What Do Evaluators Need to Know?
    Overview
    The Overlap between Test Validity and Program Evaluation
    Messick's Contributions
    Messick's Framework
    The Overlap among the Four Facets
    The Controversy over Unintended Consequences
    Implications for Evaluation

5. Getting Started
    Planning the Evaluation Study
    The Ethics Review Process
    The Political Context of Evaluation
    Using the Unfolding Model as a "Road Map"
    Mixed Methods: Blending Quantitative and Qualitative Data
    What Is Essential to Our Approach?
    Tailoring the Unfolding Model to Your Needs

6. The Scientific Basis
    Scientific Evidence
    How to Write Good Survey Questions
    Administering Surveys
    Qualitative Data: Interviews and Focus Groups
    Outcomes
    Checklists and Rubrics for Environmental Quality
    Learning Objects
    Relevance
    Cost-benefit Analysis
    Analyzing Survey Data
    Qualitative Data Analysis
    Steps in Data Analysis

7. Analyzing Values and Consequences
    Overview
    Underlying Values
        How to Identify the Values Underlying Your Course
        Course Goals and Objectives
        Writing Survey/Interview Questions about Underlying Values
        Analyzing Data on Underlying Values
    Unintended Consequences
        Instructional Consequences
        Social Consequences
        Analyzing Data on Unintended Consequences
    Assessing "Fit" across the Framework
    How to Enhance the Validity of Your Findings
    Recommendations for Course Improvement
    Writing the Evaluation Report

8. Findings from Two Authentic Case Studies
    Distance Learning: Computing Science 200 (CPSC 200)
        Unintended Consequences
        Recommendations
    E-learning: Professional Writing 110 (PWRIT 110)
        Recommendations for Course Improvement

9. Putting It All Together
    Overview
    Using Messick's Framework for Program Evaluation of Distance Courses
    Using the Unfolding Model to Evaluate Your Courses
    Conducting Your Evaluation Study
    What Are the Benefits of Using the Unfolding Model?
    What Have We Learned from Two Case Studies?
    E-learning and Beyond: Is the Unfolding Model the Last Word?
    The Future of Distance and E-learning
    The Future of the Unfolding Model