Online Course Design: Evaluation and Revision


Online Course Design: Evaluation and Revision
Jennifer Freeman

Session Goals
- Understand the difference between assessment and evaluation
- Define formative, summative, and confirmative evaluation and understand the importance of each
- Explore theories and methods of evaluation
- Create a course evaluation and revision plan

Evaluation vs. Assessment
- Evaluation: measuring the quality and effectiveness of learning materials and activities
- Assessment: measuring students' learning and achievement of goals and objectives

What Do We Evaluate?
Evaluation chart adapted from Morrison, Ross and Kemp (2004):
- Objectives and alignment: are we teaching and assessing what we said we would? Are instructional materials aligned with the objectives?
- Quality of instructional materials: everything from typo-free text to factually correct content to aesthetically pleasing materials
- Quality and reliability of external resources and content
- Instructional support: are students getting the support they need (library, help desk, tutorials)?
- Effectiveness of instructional strategies: did that group project work out the way we thought it would?
- Usability of tools and technology: how easy were they for students to use? Did the chat tool work? Was the Flash exercise too complicated?
- Communication: was the level of interaction and communication appropriate?
- Effectiveness of teaching skills and testing instruments

Formative Evaluation of Instructional Materials: Why?
- Uncover problems early on and fix what is broken: find all of the typos, errors, broken links, and gaps in content
- Discover potential usability and accessibility issues: test for problems that may arise for students with disabilities
- Examine effectiveness and improve functionality: what looked good on the storyboard may not work well in practice
- Keep pace with the dynamic nature of online learning: new technologies and teaching theories surface every day in a field as young as online learning

Formative Evaluation of Instructional Materials: What? When?
- An ongoing process, usually done both during development and while the course is being taught
- Asks the question, "How are we doing?"

Formative Evaluation: Who? How? What?
- Who will use this evaluation information? The course development team and the instructor. Keep the audience in mind when designing your evaluation plan and writing the questions you will ask: who will be interested in the evaluation feedback at this stage? You will probably have limited student feedback on instructional support at this point.
- What should be evaluated? Instructional materials, instructional strategies, and the use of tools and technology.

Formative Evaluation: Questions to Ask
- Do learning activities and assessments align with the learning objectives? Can you draw a clear correlation to a learning goal for each course activity? Is each learning objective represented by content, activities, and assessment?
- Do learning materials meet quality standards? Are they error-free, accessible, and usable?
- Are the technology tools appropriate and working properly? Test them: will students be able to use them as intended? Are further instructions or documentation needed for any tool being used (LMS messaging, chat, discussion areas, assignment tool, tests, etc.)?

Formative Evaluation: Gathering Data
- Course development rubrics and checklists, for consistency and repeatability: develop a testing protocol, a list of questions, and evaluation forms, and set a standardized time for testing so it occurs regularly as a normal part of the process
- Focus group feedback: a focus group of potential users should test every part of the course. To reduce bias, evaluation should not be done by members of the course development team; you need a "fresh pair of eyes," such as an experienced instructional designer or developer who did not work on the course
- Help desk error logs: during the first semester a course is taught, keep a close watch on the error logs, track problem areas, and note improvements to be made (a small log-mining sketch follows)
- Students: FAQ discussion threads, or an "extra credit for errors found" incentive; let your students help
- Faculty notes jotted down during the semester
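For the help desk error logs, a minimal sketch of tallying tickets by lesson and by problem category to surface problem areas; the (lesson, category) log format and the sample entries are illustrative assumptions, not a real help desk export.

```python
# Minimal sketch: count help desk tickets by lesson and by category.
# The log format and entries below are made-up illustrative data.
from collections import Counter

tickets = [
    ("Lesson 2", "broken link"),
    ("Lesson 2", "broken link"),
    ("Lesson 5", "quiz error"),
    ("Lesson 2", "video will not play"),
    ("Lesson 7", "login problem"),
]

by_lesson = Counter(lesson for lesson, _ in tickets)        # where problems cluster
by_category = Counter(category for _, category in tickets)  # what kinds of problems

print("Tickets by lesson:", by_lesson.most_common())
print("Tickets by category:", by_category.most_common())
```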


Sample Formative Evaluation and Revision Plan
During development:
- Checkpoint #1: syllabus, outline, and first lesson
- Checkpoint #2: half of the course, viewed on multiple platforms
- Checkpoint #3: entire course proofread and edited
- Checkpoint #4: entire course tech-reviewed
- Checkpoint #5: final check of previously found errors
Sample checkpoint rubrics, checklists, and surveys can support each checkpoint.
Once the course is live:
- Student survey after the first three lessons
- Instructor survey after the first three lessons
- Examination of help desk error logs

Sample Formative Evaluation and Revision Plan (continued)
- Analyze the problems found: how urgent is each issue, and how long will it take to fix?
- Assign each issue a priority score, for example:
  - URGENT: will immediately affect a student's grade (a broken test that must be taken by midnight)
  - MAJOR: important content is unavailable (a JavaScript mouseover function is not working correctly)
  - MINOR: a typo in the instructor's bio
- Establish a threshold below which the course launch will be postponed
- Build a prioritized list of change requests and decide the best time to revise
- Assign corrections and establish a deadline for each
- Make note of unaddressed issues
A simple way to operationalize the scoring and prioritization is sketched below.
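A minimal triage sketch, assuming a three-level severity scale; the numeric scores, the threshold rule, and the sample issues are all illustrative assumptions, not part of the original plan.

```python
# Minimal triage sketch: score issues, sort them into a prioritized
# change-request list, and flag when launch should be postponed.
# Severity scores, the threshold rule, and the sample issues are
# illustrative assumptions.

SEVERITY_SCORES = {"URGENT": 3, "MAJOR": 2, "MINOR": 1}
POSTPONE_THRESHOLD = 5  # assumed rule: total severity above this postpones launch

issues = [
    {"desc": "Broken quiz due at midnight", "severity": "URGENT", "fix_hours": 2.0},
    {"desc": "Mouseover content unavailable", "severity": "MAJOR", "fix_hours": 4.0},
    {"desc": "Typo in instructor bio", "severity": "MINOR", "fix_hours": 0.5},
]

# Most severe first; within a severity tier, quickest fixes first.
change_requests = sorted(
    issues, key=lambda i: (-SEVERITY_SCORES[i["severity"]], i["fix_hours"])
)

if sum(SEVERITY_SCORES[i["severity"]] for i in issues) > POSTPONE_THRESHOLD:
    print("Recommendation: postpone launch until urgent issues are fixed.")

for issue in change_requests:
    print(f"{issue['severity']:>6}: {issue['desc']} (~{issue['fix_hours']}h to fix)")
```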

Summative Evaluation: Why? What? When?
Why?
- Examine effectiveness and improve functionality
- Discover causes for failures and fix existing problems: what works in theory does not always work in practice
- Support constant maintenance and improvement of content and strategies, given the dynamic nature of online learning
What? When?
- Usually done after the completion of each semester
- Asks the question, "How did we do?"
- After the first semester, student data and feedback are available to identify problems not discovered earlier ("the first semester is like the first pancake")
- Teaching with technology is dynamic: what improvements have been made that we can take advantage of? What new research is available?

Summative Evaluation: Who? How? What?
- Who will use this evaluation information? The instructor, the course development team, and administration. Keep the audience in mind: what questions are they interested in having answered?
- What should be evaluated? In addition to the areas examined during the formative stage, we can now look at areas measured by student success and feedback:
  - Effectiveness of instructional materials and strategies
  - The "feel" of the learning environment
  - The instructor's skill in teaching online
  - Availability and ease of use of tools and technology
  - Instructor satisfaction with the online teaching experience
  - Student satisfaction with the online learning experience

Summative Evaluation: Questions to Ask
- Student success: How were students' grades and rates of assignment and assessment completion, compared with other courses? Did the learning activities and assessments align with the learning objectives? Were assignments and assessments appropriate to the content? Was time adequate to convey the material and complete tasks? (This is a big one: almost every online course attempts to do too much the first time around.)
- Instructor and student satisfaction: Were learning materials easy to use and accessible? What content did students frequently have problems with? What areas of the course are error-prone? Were there any concerns about motivation? What tools did the instructor or students frequently have problems with, and should we continue to use them? How did students and the instructor feel about participation, communication, and the learning environment? Revisit the help desk logs: where were the problems?
- Program fit: Are program/department needs being met (accreditations, prerequisites for other courses, competencies)?
- Is the course scalable?

Summative Evaluation: Gathering Data
- Student grades
- Student surveys
- Instructor satisfaction surveys
- Learner self-assessments
- Pretest/posttest comparisons
- Assessment item analysis (see the sketch after this list)
- Focus group feedback
- Help desk error logs
- Discussion forum and chat archives
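For assessment item analysis, a minimal sketch computing two standard item statistics: difficulty (the proportion of students answering correctly) and a simple upper/lower-group discrimination index. The 0/1 response matrix is made-up illustrative data.

```python
# Minimal sketch of assessment item analysis: item difficulty and an
# upper/lower-group discrimination index. The response matrix below
# (one row per student, one column per item, 1 = correct) is made up.

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 1],
]

totals = [sum(row) for row in responses]
order = sorted(range(len(responses)), key=lambda s: totals[s])
third = max(1, len(responses) // 3)
lower, upper = order[:third], order[-third:]  # bottom and top scorers

for item in range(len(responses[0])):
    difficulty = sum(row[item] for row in responses) / len(responses)
    discrimination = (
        sum(responses[s][item] for s in upper) / len(upper)
        - sum(responses[s][item] for s in lower) / len(lower)
    )
    print(f"Item {item + 1}: difficulty={difficulty:.2f}, "
          f"discrimination={discrimination:+.2f}")
```

Items with very high or very low difficulty, or discrimination near zero or negative, are candidates for revision.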

Sample Summative Evaluation and Revision Plan
- Analyze student and faculty surveys; identify themes or trends (a small theme-tallying sketch follows)
- Analyze assessment results
- Analyze help desk logs
- Examine course archives
- Compile a list of issues, including issues noted during the formative phase that have yet to be addressed
- Research solutions and determine the time needed to fix each issue
- Assign priority ratings
- Assign tasks and establish deadlines
The revision timeline will vary depending on how soon and how often the course will be offered again and on the resources available (release time and extra staff are less likely to be forthcoming than during initial development). You may have to limit revisions to urgent issues until time and resources are available.
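For identifying themes in open-ended survey comments, a minimal keyword-tallying sketch; the theme labels, keywords, and sample comments are illustrative assumptions, and real theme coding would normally be done by a human reviewer or a dedicated text-analysis tool.

```python
# Minimal sketch of tallying recurring themes in survey comments.
# Themes, keywords, and comments are made-up illustrative data.
from collections import Counter

comments = [
    "The quizzes took too much time each week.",
    "Loved the discussion forums, but the chat tool kept crashing.",
    "Too much reading; not enough time for the group project.",
    "The chat tool never worked for me.",
]

THEMES = {
    "workload/time": ["time", "too much", "workload"],
    "tools/technology": ["chat", "tool", "crash", "login"],
    "interaction": ["discussion", "forum", "group"],
}

counts = Counter()
for comment in comments:
    lowered = comment.lower()
    for theme, keywords in THEMES.items():
        if any(kw in lowered for kw in keywords):
            counts[theme] += 1  # count each theme at most once per comment

for theme, n in counts.most_common():
    print(f"{theme}: mentioned in {n} of {len(comments)} comments")
```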

Confirmative Evaluation: Why? What? When?
Why?
- Discover the long-term effectiveness of the course
- Address large-scale changes needed in the curriculum: is the course still appropriate within the department (prerequisites for other courses, program fit, etc.)?
- Support constant maintenance and improvement of technology, content, and strategies, given the dynamic nature of online learning
- Inform larger decisions, such as whether to continue using the LMS, and budgetary planning
What? When?
- Usually done some time after the completion of each semester
- Asks the question, "How are we doing now?"

Confirmative Evaluation: Who? How? What?
- Who will use the evaluation information? The instructor and administration
- What is being evaluated?
  - Students' long-term retention of learning and the course's usefulness to their long-term goals
  - The long-term effectiveness of the course within the program
  - The LMS and other technology and tools

Confirmative Evaluation: Questions to Ask
- Are program/department needs being met (accreditations, prerequisites for other courses, competencies)?
- What are the trends in student satisfaction? Is the course valuable and meaningful to students' long-term program and career goals?
- Is the course scalable? Is it sustainable?
- Are the learning environment, technology, and tools still meeting our needs? Is it time for an LMS evaluation or a major course redesign?

Confirmative Evaluation: Gathering Data
- Program student surveys
- Departmental administrative opinions
- Faculty peer review of learning materials
- Employer surveys
- Retention data
- Help desk logs
- LMS effectiveness study or survey

Common LMS Evaluation Criteria
- Cost: are costs rising at a reasonable rate?
- Server space and maintenance: are needs being met?
- Vendor/software standards: are the vendor and the software in compliance with required standards?
- Reliability: how reliable has the system been?
- Security: have there been any security concerns?
- Customization: what level of customization is possible within the system?
- Course structure and presentation: are we satisfied with the structure and presentation of courses, and with the authoring tools provided?

Common LMS Evaluation Criteria (continued)
- Tracking: are we satisfied with the tracking capabilities of the system?
- Assessment: are we satisfied with the testing engine and/or assessment tools available in the system?
- Collaboration tools: are faculty, students, and staff satisfied with the collaboration tools (discussion areas, journaling, help desk, whiteboard) provided through the system?
- Productivity tools: are faculty, students, and staff satisfied with the productivity tools (calendar, help files, search engine) provided through the system?
- Documentation and training: is student, faculty, and staff documentation or training sufficient? How usable do students, faculty, and staff find the tools?
- Vendor: what is the vendor's reputation and position in the industry?
A simple way to roll these criteria up into a comparison score is sketched below.
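A minimal weighted-scoring sketch for comparing LMS candidates against the criteria above; the weights, the subset of criteria, the candidate names, and the 1-5 ratings are all illustrative assumptions, not recommended values.

```python
# Minimal sketch: weighted comparison of LMS candidates.
# Weights, criteria, candidates, and 1-5 ratings are illustrative.

WEIGHTS = {
    "cost": 0.15, "reliability": 0.20, "security": 0.15,
    "customization": 0.10, "collaboration": 0.15,
    "documentation": 0.10, "vendor": 0.15,
}

candidates = {
    "LMS A": {"cost": 3, "reliability": 5, "security": 4, "customization": 3,
              "collaboration": 4, "documentation": 4, "vendor": 5},
    "LMS B": {"cost": 5, "reliability": 3, "security": 4, "customization": 4,
              "collaboration": 3, "documentation": 3, "vendor": 3},
}

for name, ratings in candidates.items():
    score = sum(WEIGHTS[c] * ratings[c] for c in WEIGHTS)
    print(f"{name}: weighted score = {score:.2f} / 5")
```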

http://www.edutools.info/static.jsp?pj=4&page=HOME

Evaluation Activity
- Is it accessible?
- Is it scalable?
- Is it sustainable?
These questions will be asked and discussed for each learning object/activity idea on the flip chart. If the answer to any question is no, ideas for improvement will be discussed.

Jennifer Freeman jenni.z.freeman@gmail.com