1
Holistic Evaluation of an Academic Online Program
Aug. 5, 2004
Larry Schankman, Mansfield University
2
Today’s Outline
- Introductions (Me and You)
- Why Evaluate
- Popular Evaluation Models
- Alternative Models
- Proposed Plan
- Instruments & Data
- Student Database
3
Why Evaluate
- Internal reasons
  - Improve the program or courses
  - Validate goals and objectives
  - Validate effectiveness and impact
- External reasons
  - Accreditation
  - Funding
  - Validate (justify) the program’s worth
  - Marketing and status (recognition)
4
Evaluation Models
- Decision-oriented (bottom line: keep, change, or discard?)
  - Kirkpatrick and variations (e.g., Brinkerhoff)
  - Emphasis on inputs, outputs, processes, and products
  - Not intended to improve academic departments
- Objectives-based (were the objectives achieved?)
  - “Scientific,” but no concern for long-term (far) transfer
  - No account of unintended consequences or impact
- Instructional design (learning-theory approach)
  - Focus on development and maintenance (confirmation)
- Accreditation and program review (external mandate)
  - No concern for far transfer or long-term impact
5
Confirmative Models
- Dessinger & Moseley; Hellebrandt & Russell; Misanchuk; Morrison, Ross, & Kemp
- A “new paradigm for continuous improvement” to assess ongoing behavior and results
- Examines:
  - Need (ISD/ADDIE)
  - Design and development (ISD/ADDIE)
  - Reaction (Kirkpatrick)
  - Accomplishment (Kirkpatrick)
  - Learning transfer (Kirkpatrick)
  - Impact (Kirkpatrick)
  - Value (various influences)
6
Maintenance Models
- Reeves & Hedberg; Sims; Tennyson
- A quality-control process: is the instruction as effective now as when it was originally designed?
- Basic questions:
  - Are the instructional materials current and worth using?
  - Are the instructional goals and objectives still relevant?
  - Are the media elements still achieving the intended results?
  - Have learner skills, knowledge, or attitudes changed?
  - Has the technology environment changed, suggesting the inclusion of new media sources?
  - Is the current delivery platform up to date?
7
Proposed Plan
- Examination of student data
- Examination of program documents
- Course reviews (instructional-design perspective)
- Kirkpatrick four-level evaluation (customized)
- Other (e.g., self-assessment against the Baldrige National Quality Award education criteria)
8
Examination of Student Data
- Number and percent enrolled
- Number and percent graduated
- Percent withdrawn, dropped, or inactive
- Percent dismissed
- Graduates as a percentage of their cohort, by class/semester (look for trends; a sketch of these calculations follows below)
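The percentages above are simple to compute once enrollment records are extracted. A minimal sketch in Python, assuming a hypothetical list of per-student records with `status` and `cohort` fields (the field names and sample data are illustrative, not taken from the actual Student Information System):

```python
from collections import Counter, defaultdict

# Hypothetical per-student records; field names and values are illustrative only.
students = [
    {"id": 1, "cohort": "2002-Fall",   "status": "graduated"},
    {"id": 2, "cohort": "2002-Fall",   "status": "withdrawn"},
    {"id": 3, "cohort": "2003-Spring", "status": "enrolled"},
    {"id": 4, "cohort": "2003-Spring", "status": "dismissed"},
    {"id": 5, "cohort": "2003-Spring", "status": "graduated"},
]

def status_breakdown(records):
    """Percent of students in each status category (enrolled, graduated, etc.)."""
    counts = Counter(r["status"] for r in records)
    total = sum(counts.values())
    return {status: 100.0 * n / total for status, n in counts.items()}

def graduates_by_cohort(records):
    """Graduates as a percentage of each cohort, to spot trends over time."""
    by_cohort = defaultdict(list)
    for r in records:
        by_cohort[r["cohort"]].append(r)
    return {cohort: status_breakdown(rs).get("graduated", 0.0)
            for cohort, rs in sorted(by_cohort.items())}

print(status_breakdown(students))
print(graduates_by_cohort(students))
```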
9
Program Documents
- Strategic planning documents (philosophy and mission statement, program goals and objectives, etc.)
- Student Handbook
- New Student Letter & Welcome Packet
- Faculty Manual
- Program Review
- Accreditation documents
10
Course Reviews
- Comprehensive examination of instructional design, instructional strategies, cognitive load, assessments and activities, and the alignment of objectives to assessments and to the program mission (view the Checklist and Objectives Matrix)
- Transcript analysis to examine learner engagement, dissonance, and negotiation skills
- Examination of eCollege records to determine average time spent in threaded discussions, total time in the course, number of downloads, etc. (a sketch of this aggregation follows below)
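eCollege itself is not scripted here; the sketch below only illustrates the kind of aggregation the slide describes, assuming the activity records have already been exported to a CSV file. The file name and column names (`discussion_minutes`, `total_minutes`, `downloads`) are assumptions, not the actual eCollege export format.

```python
import csv
from statistics import mean

def summarize_activity(path):
    """Average per-student discussion time, total course time, and downloads
    from a hypothetical CSV export of eCollege activity records."""
    rows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows.append({
                "discussion_minutes": float(row["discussion_minutes"]),
                "total_minutes": float(row["total_minutes"]),
                "downloads": int(row["downloads"]),
            })
    return {
        "students": len(rows),
        "avg_discussion_minutes": mean(r["discussion_minutes"] for r in rows),
        "avg_total_minutes": mean(r["total_minutes"] for r in rows),
        "avg_downloads": mean(r["downloads"] for r in rows),
    }

# Example usage (file name is hypothetical):
# print(summarize_activity("ecollege_activity_export.csv"))
```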
11
Kirkpatrick Levels
1. Student reaction and satisfaction (student and faculty surveys)
2. Learning (learner assessments, including PRAXIS tests)
3. Behavior (surveys, focus groups, success cases, interviews)
4. Results and impact (questionnaires/interviews with supervisors and colleagues, observations, performance reviews of graduates)
12
Student Database
- Data from the Student Information System
- PRAXIS test scores
- Student contact information
- Student employment and contacts
- Stored and manipulated in SQL Server (an illustrative schema sketch follows below)
- Web application for data entry and student self-updates (login with SSN)
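The production store is SQL Server behind a Web front end; the sketch below uses Python's built-in sqlite3 module only to illustrate one possible table layout and a self-update operation. The table and column names are assumptions for illustration, and a real system should not use the SSN as a login credential or store it in the clear.

```python
import sqlite3

# Illustrative schema only; the actual SQL Server tables are not shown in the presentation.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE student (
    student_id  INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT,
    employer    TEXT,
    cohort      TEXT
);
CREATE TABLE praxis_score (
    student_id  INTEGER REFERENCES student(student_id),
    test_code   TEXT,
    score       INTEGER,
    test_date   TEXT
);
""")

# A data-entry insert and a student self-update, as the Web application might issue them.
conn.execute("INSERT INTO student (student_id, name, email, cohort) VALUES (?, ?, ?, ?)",
             (1, "Jane Doe", "jdoe@example.edu", "2002-Fall"))
conn.execute("UPDATE student SET employer = ? WHERE student_id = ?",
             ("Example School District", 1))
print(conn.execute("SELECT name, employer FROM student").fetchall())
```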
13
Mid-Term Survey
- Web-based ASP application
- Self-scoring and self-tabulating (a sketch of the tabulation step follows below)
- Data stored in the student database
- View the Student Survey (static Web page)
- View the Faculty Survey (raw prototype)
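The ASP application itself is not reproduced in the presentation; the sketch below only illustrates the "self-tabulating" step, assuming Likert-style responses coded 1-5 have already been collected. The question labels, the 1-5 scale, and the sample responses are assumptions.

```python
from statistics import mean

# Hypothetical Likert responses (1 = strongly disagree ... 5 = strongly agree),
# one dict per completed survey; question labels are illustrative only.
responses = [
    {"course_pace": 4, "instructor_feedback": 5, "technical_support": 3},
    {"course_pace": 3, "instructor_feedback": 4, "technical_support": 4},
    {"course_pace": 5, "instructor_feedback": 4, "technical_support": 2},
]

def tabulate(responses):
    """Mean score and response count for each survey item."""
    items = responses[0].keys()
    return {item: {"mean": round(mean(r[item] for r in responses), 2),
                   "n": len(responses)}
            for item in items}

for item, stats in tabulate(responses).items():
    print(f"{item}: mean={stats['mean']} (n={stats['n']})")
```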
14
Focus Group Interviews
- Conducted at the residency (hotel)
- Two groups of 8-10 students
- 3 demographic questions
- 8 open-ended questions
- View the Instrument
15
Impact Interviews
- Conducted by phone or at the job site
- Supervisors, colleagues/peers, and students of graduates
- Emphasis on graduates who worked both before and after completing the degree
- Focus: performance change and impact
- View the Instrument
16
Comments on the Model
- What’s missing?
- What is unnecessary?
- What should work?
- What won’t work?
- Other comments?
17
Thank You! Questions?
Paper available at: http://library.mansfield.edu/larry/evalplan.pdf