Answering Research Questions with Big Data from a MOOC

Presentation transcript:

Answering Research Questions with Big Data from a MOOC
Theodore W. Frick, Ph.D., Professor Emeritus, Department of Instructional Systems Technology, School of Education, Indiana University Bloomington
Cesur Dagli, Ph.D., Instructional and Graphics Designer, Midwest and Plains Equity Assistance Center: Great Lakes Equity Center, School of Education, Indiana University-Purdue University Indianapolis
Rodney D. Myers, Ph.D., Independent Scholar, Bloomington, Indiana
AECT Annual Convention, Jacksonville, FL, November 10, 2017

Overview

Value of MOOCs for Research
Disadvantages of typical past research on instructional effectiveness:
- Small convenience samples
- Limited generalizability
- Lack of replication
Advantages of well-designed MOOCs (MOOC: Massive, Open, Online Course):
- Very large, representative samples
- Broad generalizability
- Relatively easy to replicate over time

Background of IPTAT: Indiana University Plagiarism Tutorials and Tests
- Mini-MOOC available worldwide since 2002
- Millions of student users over the first 14 years
- Major redesign in 2015, using First Principles of Instruction (Merrill, 2013)
- Results in big data. During the first 22 months since Jan. 2016:
  - Over 29 million page views of interactive instruction
  - 347,000 registered users from 204 countries
  - 280,000 passed a Certification Test

Big data from IPTAT
- Supports Merrill's prediction that First Principles of Instruction promote student learning
- Wide generalizability of findings:
  - 86% of students from the U.S., 4% from China, 3% from Canada; remaining 7% from 201 other countries
- Student level of education:
  - 82% undergraduate and advanced high school, 18% graduate (master's/doctoral)

Background

Massive Open Online Courses (MOOCs)
- Massive: a large number of participants (Anderson, 2013)
- Open: free and open (Downes, 2007; Hylén, 2006; Schaffert & Geser, 2008)
- Online: courses delivered through the Internet (Anderson, 2013)
- Courses: different from traditional face-to-face and online courses (Wiley, 2012)

Mini-MOOCs
"Tightly focused in terms of scope and aimed at specific skills that can be developed and tested using automated means available through the Internet" (Spector, 2014, p. 390)

IPTAT as a Mini-MOOC
IPTAT: Indiana University Plagiarism Tutorials and Tests
- Massive: a large number of participants
- Open: anyone can take this tutorial
- Online: web-based tutorial with mastery tests
- Not a course, per se: focused on how to recognize plagiarism, used as a learning resource by teachers and students worldwide

IPTAT Redesigned in 2015 Link: https://www.indiana.edu/~academy/firstPrinciples/ Available: 2016 to present

IPTAT Redesigned with First Principles of Instruction
[Diagram: a problem-centered cycle of First Principles]
- PROBLEM: Let Me Do The Whole Task!
- 2. ACTIVATION: Where Do I Start?
- 3. DEMONSTRATION: Don't Just Tell Me, Show Me!
- 4. APPLICATION: Let Me Do It!
- 5. INTEGRATION: Watch Me!

Simple to Complex Task Sequence

First Principles Implemented at Each Level of Task Complexity For example:

Activation Principle

Demonstration Principle

Application Principle

Application Principle (feedback)

Integration Principle

Practice Test (repeat Application)

Evaluation of Instruction (MOO-TALQ): 30 items, optional before taking a test

Certification Test (10 randomly selected items from large pool)

Test Feedback for Pass

Test Feedback for Fail

Methods

Purpose of Dagli (2017) Study Investigate the effectiveness of First Principles of Instruction in the context of a mini-MOOC

Problem
- Merrill (2002) claimed that the effectiveness of instruction depends upon the extent to which the principles are successfully implemented.
- Few extant empirical studies have investigated this claim (Merrill, 2013, Chapter 22).
- Deficiencies exist with regard to designing, developing, and deploying MOOCs and mini-MOOCs (Spector, 2014).

IPTAT Data Collection
MOO-TALQ for evaluating instructional quality:
- Optional; data were used only if the user agreed to participate
- Only results from the first MOO-TALQ taken prior to the first test were used
- Scale scores were formed later, during data analysis
Certification Test on recognizing plagiarism:
- 10 questions, randomly selected from a large inventory
- At least 9 of 10 questions must be answered correctly to pass
- Maximum of 40 minutes to complete each test
- Only the first test attempt was used
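The test and case-qualification rules above can be summarized in a short sketch. This is a minimal Python illustration, assuming hypothetical record fields (user_id, timestamp) and a generic item inventory; it is not the actual IPTAT implementation.

```python
# Hypothetical sketch of the Certification Test rules described above.
# Item inventory, attempt records, and field names are illustrative assumptions.
import random

ITEMS_PER_TEST = 10      # items drawn at random from a large inventory
PASS_THRESHOLD = 9       # at least 9 of 10 correct to pass
TIME_LIMIT_MIN = 40      # maximum minutes allowed per test

def build_test(item_inventory):
    """Draw a random 10-item Certification Test from the item inventory."""
    return random.sample(item_inventory, ITEMS_PER_TEST)

def passed(num_correct, minutes_elapsed):
    """Apply the pass rule: at least 9 correct within the 40-minute limit."""
    return num_correct >= PASS_THRESHOLD and minutes_elapsed <= TIME_LIMIT_MIN

def first_attempts(attempts):
    """Keep only each user's first test attempt, as in the analysis described here."""
    seen, firsts = set(), []
    for a in sorted(attempts, key=lambda a: a["timestamp"]):
        if a["user_id"] not in seen:
            seen.add(a["user_id"])
            firsts.append(a)
    return firsts
```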

ALT: Academic Learning Time
- Not just engagement time on relevant learning tasks; it must be successful engagement
- Past research on ALT since the 1970s: ALT is positively correlated with student learning achievement (e.g., Berliner et al., 1992)
- Frick et al. (2009, 2010) found high correlations of reported ALT in higher education with First Principles of Instruction
- What about First Principles and ALT in MOOCs?

TALQ (40 items):
- ALT: Academic Learning Time (5)
- Global course and instructor quality items (3)
- Learning progress (5)
- Student satisfaction (4)
- First Principle: Authentic Problems (5)
- First Principle: Activation (5)
- First Principle: Demonstration (5)
- First Principle: Application (3)
- First Principle: Integration (5)

MOO-TALQ (25 items):
- ALT: Academic Learning Time (4)
- Global Course Quality (2)
- Student satisfaction (2)
- First Principle: Authentic Problems (3)
- First Principle: Activation (3)
- First Principle: Integration (3)

Data collection (cont'd)
- Data collected Jan. 12 through Jan. 30, 2016
- Imported data for qualified cases into SPSS
- Created scale scores for 9 MOO-TALQ scales in SPSS
- 1,716 cases for undergraduates and advanced high school students
- 300 cases for graduate students (master's and doctoral)
- Used the SPSS Tables procedure for pattern analysis
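For readers who do not use SPSS, the sketch below shows the same two steps, forming scale scores from Likert items and then cross-tabulating agreement against test outcome, in Python/pandas. The file name, item column names, and the "agree" cutoff are assumptions for illustration, not details from the study.

```python
# A rough Python/pandas analogue of the scale-score and pattern analysis steps
# described above (the study itself used SPSS). Column names are hypothetical.
import pandas as pd

def scale_score(df, item_cols):
    """Average the Likert items that belong to one MOO-TALQ scale."""
    return df[item_cols].mean(axis=1)

# Hypothetical export of qualified cases.
df = pd.read_csv("iptat_qualified_cases.csv")

# Example: an Activation scale built from three items (assumed names act1..act3).
df["activation_scale"] = scale_score(df, ["act1", "act2", "act3"])

# Dichotomize: did the student agree (mean >= 4 on a 5-point Likert scale) that
# they experienced this principle? The cutoff is an assumption for illustration.
df["agreed_activation"] = df["activation_scale"] >= 4

# Pattern analysis: cross-tabulate agreement with passing the first Certification
# Test, similar in spirit to the SPSS Tables procedure used in the study.
print(pd.crosstab(df["agreed_activation"], df["passed_first_test"], normalize="index"))
```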

Findings

Major Results
- Undergraduate and advanced high school students who agreed that they experienced First Principles and ALT were nearly 3 times as likely to pass their first Certification Test as those who disagreed that they experienced First Principles and ALT.
- Graduate students (master's and doctoral level) who agreed that they experienced First Principles and ALT were over 5 times as likely to pass their first Certification Test as those who disagreed.
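The "times as likely" figures are relative likelihoods read off the pattern tables: the pass rate among students who agreed they experienced First Principles and ALT, divided by the pass rate among those who disagreed. The counts in the sketch below are placeholders, not the study's actual cell frequencies.

```python
# How a "times as likely to pass" figure is computed from a 2x2 pattern table.
# The counts below are made-up placeholders, NOT the actual study data.
agree_pass, agree_total = 800, 1000        # hypothetical: agreed with First Principles/ALT items
disagree_pass, disagree_total = 190, 700   # hypothetical: disagreed

p_pass_given_agree = agree_pass / agree_total
p_pass_given_disagree = disagree_pass / disagree_total

relative_likelihood = p_pass_given_agree / p_pass_given_disagree
print(f"{relative_likelihood:.1f} times as likely to pass")  # ~2.9x with these placeholder counts
```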

Next Planned Study of IPTAT

Next IPTAT Study: Analysis of Patterns in Time (APT)
- Track actual patterns of usage by creating a temporal map for each user
- Effectively attempt to replicate Dagli's study of user reports via the MOO-TALQ survey, but with empirical trajectories of usage patterns (not just user reports of usage)
- Effectiveness of First Principles of Instruction to be determined by APT

Analysis of Patterns in Time (APT)
- Need to create a version of IPTAT that tracks individual users (temporal maps)
- Collect thousands of temporal maps
- Then conduct APT queries of the maps to determine the likelihood of student mastery when First Principles of Instruction are and are not experienced by users (see the sketch below)
- See Myers and Frick (2015) for examples of APT
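The APT query idea above can be pictured with a small sketch: given a set of temporal maps, estimate the probability of passing the Certification Test conditional on a First Principles event appearing in a user's map. The event names and map format are assumptions for illustration, not the planned IPTAT design.

```python
# Illustrative sketch of an APT-style query over temporal maps.
from typing import List, Dict

# A temporal map: an ordered list of events recorded for one user, e.g.
# [{"event": "viewed_demonstration"}, {"event": "completed_application"},
#  {"event": "passed_certification_test"}]
TemporalMap = List[Dict[str, str]]

def experienced(tmap: TemporalMap, event: str) -> bool:
    """Did this event occur anywhere in the user's temporal map?"""
    return any(e["event"] == event for e in tmap)

def likelihood_of_mastery(maps: List[TemporalMap], principle_event: str) -> float:
    """Estimate P(passed certification test | principle event occurred in the map)."""
    with_principle = [m for m in maps if experienced(m, principle_event)]
    if not with_principle:
        return float("nan")
    passed = sum(experienced(m, "passed_certification_test") for m in with_principle)
    return passed / len(with_principle)
```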

Summary

Value of MOOCs for Research
Advantages of well-designed MOOCs:
- Very large, representative samples
- Broad generalizability
- Relatively easy to replicate over time

Big Data Can Provide Empirical Support for Instructional Theory
- IPTAT: Indiana University Plagiarism Tutorials and Tests
- IPTAT was redesigned in 2015 using First Principles of Instruction (Merrill, 2013)
- IPTAT is a mini-MOOC, used by hundreds of thousands of students since 2016
- IPTAT results in big data
- Big data from IPTAT provide strong empirical support for First Principles of Instruction

Major References
- Dagli, C. (2017). Relationships of first principles of instruction and student mastery: A MOOC on how to recognize plagiarism (Doctoral dissertation). Bloomington, IN: Indiana University Graduate School.
- Frick, T., Chadha, R., Watson, C., & Zlatkovska, E. (2010). Improving course evaluations to improve instruction and complex learning in higher education. Educational Technology Research and Development, 58(2), 115-136.
- Frick, T., & Dagli, C. (2016). MOOCs for research: The case of the Indiana University Plagiarism Tutorials and Tests. Technology, Knowledge and Learning, 21(2), 255-276. doi:10.1007/s10758-016-9288-6
- Frick, T., Dagli, C., Kwon, K., & Tomita, K. (in press, 2017). Indiana University Plagiarism Tutorials and Tests: 14 years of worldwide learning online. Proceedings of the 2016 AECT Summer Research Symposium.
- Merrill, M. D. (2013). First principles of instruction: Identifying and designing effective, efficient, and engaging instruction. San Francisco, CA: John Wiley & Sons.
- Myers, R., & Frick, T. (2015). Using pattern matching to assess gameplay. In C. S. Loh, Y. Sheng, & D. Ifenthaler (Eds.), Serious games analytics: Methodologies for performance measurement, assessment, and improvement (pp. 435-458). Heidelberg, Germany: Springer.