The Online Academy (TOA)
John Baek, Michelle Didier, Dan Feinberg, Lisa Herbert, Danalyn Robinson, Heather White

The Online Academy Evaluation Plan

Need for Evaluation
1. Inform and direct current and future development
2. Examine the usability of the checklist
   1. Organization and self-regulation
3. Determine the effectiveness of navigation

Need for Evaluation (continued)
4. Review course modules for the following:
   - Correlation to SOL
   - Clarity of instruction
   - Engagement and motivation of instruction
   - Content presented at the appropriate level
   - Adequacy of resources, skills, and content (adequate for challenge)
   - Pedagogical approach and format
   - Pedagogical re-engineering

Purpose/Goals
- Analyze instructional strengths (pedagogical re-engineering)
- Determine the effectiveness of the design and of the transfer of materials from classroom to online
- Evaluate the usability and the benefits of the checklist to the learner
- Identify ease-of-use problems

Strategy
- The evaluation strategy is to determine the effectiveness of the initial modules of the seven courses being developed for TOA, using expert and one-to-one reviews.
- The results of this evaluation will be used to inform and direct the current and future design and development of TOA.

Objectives: Expert Review
a. Determine whether the site adheres to the Clark and Mayer criteria and the Gardner criteria.
b. Recruit experts.
c. Prepare questions.
d. Design data collection tools.
e. Set up the testing area.
f. Conduct the evaluation.
g. Analyze the results.
h. Write the evaluation report.
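Step (a) above implies a structured rating instrument for the expert reviewers. As a minimal sketch only — the criterion names, the 1-to-5 scale, and the `ExpertReview` class are illustrative assumptions, not the project's actual instrument — expert ratings could be recorded like this:

```python
from dataclasses import dataclass, field

# Illustrative criteria only -- the actual Clark & Mayer and Gardner
# items used in the TOA evaluation are not listed in this plan.
CRITERIA = [
    "Multimedia: relevant graphics accompany the words",
    "Coherence: extraneous material is avoided",
    "Multiple intelligences are addressed",
]

@dataclass
class ExpertReview:
    """One expert's ratings of a course module (hypothetical structure)."""
    reviewer: str
    ratings: dict = field(default_factory=dict)  # criterion -> score (1..5)

    def rate(self, criterion: str, score: int) -> None:
        # Reject criteria outside the instrument and out-of-range scores.
        if criterion not in CRITERIA:
            raise ValueError(f"unknown criterion: {criterion}")
        if not 1 <= score <= 5:
            raise ValueError("score must be between 1 and 5")
        self.ratings[criterion] = score

    def mean_score(self) -> float:
        """Average adherence score across the criteria rated so far."""
        return sum(self.ratings.values()) / len(self.ratings)

review = ExpertReview(reviewer="Expert A")
review.rate(CRITERIA[0], 4)
review.rate(CRITERIA[1], 5)
print(review.mean_score())  # 4.5
```

A fixed criterion list keeps the data collection tools (step d) consistent across experts, so results can be compared and aggregated in the analysis step.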

Objectives: One-to-One Review
a. Recruit students.
b. Prepare evaluation questions.
c. Design data collection tools.
d. Write the script.
e. Set up the testing area.
f. Conduct the evaluation.
g. Analyze the results.
h. Write the evaluation report.

Social/Cultural Factors
- Priscilla Norton, TOA Project Director
- The Policy Board
- School districts
- Subject Matter Experts (SMEs)
- Students

Levels of Evaluation
- A multi-level evaluation will explore as many perspectives as possible.
- Evaluate at Levels 1, 2, and 3 of Kirkpatrick's Levels of Evaluation (reaction, learning, and behavior).
- The formality of the evaluations will vary from informal to formal.

Data Collection
- Use observations during the one-to-one evaluations with the students.
- Conduct interviews for the expert reviews.
- Follow a script with a list of prepared questions, allowing for additional or probing questions when needed.
- Record evaluations on either audiotape or videotape.
- Document observations and interviews in field notes.
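The collection procedure above (scripted questions, ad hoc probes, recordings, field notes) suggests a simple per-session record. The following sketch is a hypothetical data structure — the field names and sample values are invented for illustration and are not part of the plan:

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationSession:
    """One recorded evaluation session (hypothetical field names)."""
    kind: str                 # "one-to-one" or "expert"
    participant: str
    recording: str            # "audiotape" or "videotape"
    scripted_questions: list = field(default_factory=list)
    probe_questions: list = field(default_factory=list)  # added ad hoc
    field_notes: list = field(default_factory=list)

session = EvaluationSession(
    kind="one-to-one",
    participant="Student 1",
    recording="audiotape",
    scripted_questions=["How would you find the first lesson in this module?"],
)
# Probing questions and observations are appended as the session unfolds.
session.probe_questions.append("What did you expect that link to open?")
session.field_notes.append("Paused at the course menu before clicking.")
print(session.kind, len(session.field_notes))
```

Keeping the scripted questions separate from the probes preserves the distinction the plan draws between the prepared protocol and the questions introduced when needed.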

Data Analysis
Analyze the data by reviewing field notes and transcripts for common themes, patterns, and trends.
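The theme-spotting step above can be sketched programmatically. Assuming excerpts from the field notes and transcripts are coded with short theme tags — the tag names and excerpts below are invented for illustration — a tally of recurring themes might look like:

```python
from collections import Counter

# Hypothetical coded excerpts; each excerpt from the field notes or
# transcripts is tagged with one or more theme codes by the reviewer.
coded_excerpts = [
    {"source": "one-to-one #1", "codes": ["navigation", "motivation"]},
    {"source": "one-to-one #2", "codes": ["navigation"]},
    {"source": "expert #1", "codes": ["clarity", "navigation"]},
]

def theme_counts(excerpts):
    """Count how often each theme code appears across all excerpts."""
    counts = Counter()
    for excerpt in excerpts:
        counts.update(excerpt["codes"])
    return counts

counts = theme_counts(coded_excerpts)
print(counts.most_common(1))  # [('navigation', 3)]
```

Codes that recur across both the one-to-one and expert sessions (here, "navigation") would be the candidates for the common themes, patterns, and trends the plan asks for.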

Logistics
- Identified potential evaluators
- Assigned team members to conduct each evaluation (two per evaluation: one interviews while the other records)
- Identified a tentative location and time
- Wrote interview questions
- Developed a protocol and script for each type of evaluation