PROGRAM EVALUATION 2013
R&D, FEBRUARY 12, 2014
DBE / EPE / DML

ABOUT THE PE2013 PROJECT
- A project of the SFL (DBE, DML, EPE)
- Carried out by R&D (Zeynep Akşit, Defne Akıncı Midas / Hale Kızılcık); interested researchers invited (FLE)

PROGRAM EVALUATION
A systematic collection and analysis of information necessary to:
- improve a curriculum,
- assess its effectiveness and efficiency, and
- determine participants' attitudes within the context of a particular institution. (Brown, 1995)

PURPOSE OF EVALUATION
- Summative judgment
- Learning and formative improvement
- Accountability
- Monitoring
- Development
- Knowledge generation

BENEFITS OF EVALUATION
- Accountability
- Information on specific strengths and weaknesses of the curriculum and its implementation
- Critical information for strategic changes and policy decisions
- Input for improved learning and teaching
- Indicators for monitoring
For:
- Students (and their families)
- Instructors
- School Administration
- University Board
- Society
- International Accreditors

OBJECTS OF EVALUATION
- Needs Analysis
- Objectives
- Testing
- Materials
- Teaching
(Brown, 1995)

WHAT DO WE EVALUATE?
- Curriculum design
- The syllabus & program content
- Classroom processes
- Materials
- The teachers
- Teacher training
- The students
- Monitoring of students' progress
- Learner motivation
- The institution
- Learning environment
- Staff development
- Decision making

WHAT ARE THE STEPS OF EVALUATION?
- What information do we need?
- What will the results be used for?
- Is evaluation necessary (do we already have this information)?
- How much time is available?
- What kind of information will be gathered?
- Who will be involved in / support the evaluation?
- How will we gather information (and who will be involved)?
- How do we present the findings?

OBJECTS OF EVALUATION
- Needs Analysis
- Objectives
- Testing
- Materials
- Teaching
- Evaluation
(Brown, 1995)

USEFUL TO WHOM?
Stakeholders:
- Students
- Instructors
- Test Writers
- Teacher Educators
- Academic Coordinators
- Administrators
- Faculty Members

HOW DO WE EVALUATE? A technical procedure (sketched in code below):
- Objectives: what do we intend to achieve?
- Variables: what type of information is needed? (context, processes, outcomes)
- Framework: what are the conditions for data gathering? (survey, environment analysis)
- Instrumentation: what instruments will be used to collect data?
- Data Collection: how will it be implemented?
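The procedure above is essentially a structured planning checklist. Purely as an illustration (not part of the original project), here is a minimal Python sketch of how such an evaluation plan could be recorded as data; every field name and example value below is an assumption, not a project artifact.

```python
# Illustrative only: a hypothetical record of the evaluation procedure above.
# All field names and example values are assumptions, not project artifacts.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    objectives: list[str]       # what do we intend to achieve?
    variables: list[str]        # context, processes, outcomes
    framework: str              # conditions for data gathering
    instruments: list[str]      # tools used to collect data
    data_collection: dict[str, str] = field(default_factory=dict)  # instrument -> how/when

plan = EvaluationPlan(
    objectives=["Determine how well the program addresses student needs"],
    variables=["context", "processes", "outcomes"],
    framework="survey + environment analysis",
    instruments=["student questionnaire", "teacher interviews", "classroom observations"],
    data_collection={"student questionnaire": "administered 13-17 Jan"},
)
print(plan.objectives[0])
```

Writing the plan down in one explicit structure like this makes it easy to check that every instrument maps back to an objective before data collection starts.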

IMPETUS MACRO
- A school-wide attempt at re-defining program outcomes (METU Strategic Plan)
- A need to coordinate the programs at the DBE & DML

IMPETUS MICRO
- The last curriculum project: 2003
- The last evaluation process: 2005 (problems indicated)

SURVEY OF EXISTING CONDITIONS
- The existing curriculum & syllabus
- The materials in use
- The teacher population
- The learners
- The resources of the program
(Dubin & Olshtain, 1990)

WHAT DO WE WANT TO ACHIEVE? To determine to what extent the current program effectively addresses students' needs, by examining:
- Curriculum / syllabus documents
- Materials
- Teachers
- Test batteries
- Environmental factors

EVALUATION & CURRICULUM RENEWAL

INITIAL PHASE
Evaluation & Needs Assessment → Analysis → Results & Decision Making

SECOND PHASE
Needs → Goals & Objectives → Content & Sequencing → Strategies → Materials → Assessment → Review / Revise
(Present Situation & Target Situation Analyses; Pilot Group)

INITIAL PHASE: STAFF INVOLVEMENT
- Needs Assessment: R&D Core Group + Voluntary Teachers
- Analysis: R&D Core Group + Voluntary Teachers
- Results & Decision Making: R&D Core Group + Teachers as Decision-Makers

TENTATIVE SCHEDULE
Input | Output | 1st Stage | 2nd Stage
Focus Group Interviews | DBE Student Questionnaire | 09 December | 25 Dec (+review)
Administration of the Questionnaire | | 13-17 Jan |
Student Questionnaire Initial Analysis | Student Interviews | 17-24 Jan | Feb
Classroom Observations | | 24 Feb-03 Mar |
Teacher Questionnaire | | 03-16 Mar (+review) |
Teacher Questionnaire Initial Analysis | Teacher Interviews | 17-30 Mar | Apr
Qual & Quan Data Analysis | Presentation of Results | 11 Apr-15 May | May 2014

WHERE ARE WE NOW?
- Review of previous studies, literature, and research reports
- Project outline / documentation
- Focus group interviews
- Student questionnaires

WHERE DO WE GO FROM HERE?
- Data collection (continued): DBE / DML Students, DBE / DML Instructors, Faculty Interviews, Faculty Documents
- Data Analysis (see the sketch below)
- Reporting
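Before reporting, the "Qual & Quan Data Analysis" step has to summarize questionnaire responses item by item. A minimal sketch, assuming the student questionnaire is exported to a CSV with one 1-5 Likert column per item; the file name, column layout, and 3.0 midpoint threshold are all hypothetical, not the project's actual pipeline.

```python
# A minimal sketch, not the project's actual analysis pipeline.
# Assumes a CSV with one numeric (1-5 Likert) column per questionnaire item;
# the file name and the 3.0 midpoint threshold are hypothetical.
import pandas as pd

responses = pd.read_csv("dbe_student_questionnaire.csv")

# Descriptive statistics per item: number of responses, mean, spread.
summary = responses.describe().loc[["count", "mean", "std"]].T
print(summary.sort_values("mean"))

# Items averaging below the scale midpoint are candidates to probe
# further in the follow-up student interviews.
weak_items = summary[summary["mean"] < 3.0].index.tolist()
print(weak_items)
```

Pairing the quantitative flags with the interview data is what lets low-scoring items be interpreted rather than just counted.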

Project framework → Tools → Data Collection → Data Analysis → Report → (Tentative) Program Revision
(Fall / Spring / Fall)

Graves, 2000