2010–2012 TITLE IIA(3) IMPROVING TEACHER QUALITY COMPETITIVE GRANTS PROGRAM


NEEDS DATA

- Evidence that partners meet federal criteria
- Student data: indication of student needs based on available test scores or other pertinent student data

NEEDS DATA, CONTINUED

- Participant needs survey data
  - Standard survey form, with a place to add project-specific questions
  - Available for four core subject areas
  - Available in paper/pencil or electronic form
  - SAMPI compiles and returns data for use in proposal development (electronic survey only)
- Submit a summary of the survey data and any additional data (student test scores, etc.) in the narrative portion of the proposal
- Submit raw data with code numbers in a database (databases available for the basic needs survey)

NEEDS DATA, CONTINUED

- Other data
  - Provide data pertinent to partner needs, if available
- Previous research/evaluation findings
  - Describe previous actions taken by the IHE partner to address identified needs
  - Describe the value of the proposed intervention based on previous findings
  - Category 1: describe the results of the previous Title IIA(3) project on which the proposal is based

NEEDS DATA, CONTINUED

- Teacher/school commitment forms
  - Informal "Understanding of Agreement" for teachers
  - Informal "Understanding of Agreement" showing partner school principal support
- Participation in Technical Assistance Session, Part II
  - Held prior to proposal submission, to focus on needs data from the survey and other sources

EVALUATION

EVALUATION GOALS

1) Improve programming through use of evaluation data.
2) Determine the impact of the project and the statewide effort on new and returning participants, and identify strengths and limitations of the projects.

PROJECT-LEVEL EVALUATION

- Use common cross-site pre/post surveys and pre/post lesson observations; interview a sample of participants; collect teacher artifacts; gather evidence of impact on students
- Use additional instruments/procedures specific to the project, as desired, to determine impact or gather implementation data
- Gather and report participation/program data
- Support a person with adequate time dedicated to evaluation
  - External or internal, with appropriate expertise and without major program coordination duties

CROSS-SITE STATE-LEVEL EVALUATION

Coordinated by SAMPI, Western Michigan University:

- Advise on local project use of pre/post surveys, pre/post lesson observations, interviews, teacher artifact collection, and student impact data collection procedures
- Advise on evaluation as requested
- Coordinate cross-site meetings (see below)
- Conduct observations of project PD activities
- Compile data from across projects
- Prepare periodic statewide reports

ELEMENTS OF THE EVALUATION

DETERMINE IMPACT ON TEACHERS/OTHER PARTICIPANTS

- Pre/post surveys (plus comparison with previous survey data for Category 1 projects)
- Pre/post lesson observations (plus comparison with previous observations for Category 1 projects)
- Interviews with a sample of participants
- Collection of a sample of teacher artifacts

DETERMINE IMPACT ON STUDENTS OF PARTICIPATING TEACHERS

- Student test data, as appropriate
- Procedures to identify changes in student learning (pre/post tests, pre/post surveys, focus group interviews, assessment of student work)
- Collection of a sample of teacher artifacts to show changes

PROJECT DIRECTOR AND PROJECT EVALUATOR PARTICIPATION IN REQUIRED CROSS-SITE ACTIVITIES

- First six months: one face-to-face meeting in Lansing, one webinar
- Remainder of the project funding period: two face-to-face meetings in Lansing, two webinars

REQUIRED COMMON REPORTING

- Participation levels and demographics, PD types, overall assessment of progress toward goals
- Final evaluation report
- End-of-project participant-specific data consistent with cross-site statewide data collection
- Sharing findings at cross-site sessions

END-OF-PROJECT REPORTING

- Director report
  - Use the standard report format, available in electronic version
  - SAMPI can provide an Access database to projects to maintain participation records and to facilitate report preparation

PART #1: SAMPLE REPORTING TABLES

Category                                                       | Number | Major Target Audience (Yes or No)
1. Number of different teachers served by the project          |        |
2. Number of different administrators served by the project    |        |
3. Number of different paraprofessionals served by the project |        |
4. Number of different parents served by the project           |        |
5. Number of "others" served by the project                    |        |
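Note that the Part #1 counts are of different (unique) individuals served, not total session attendances. As a minimal sketch of how a project might tally these figures from its own participation records (the log format, IDs, and sample rows below are hypothetical illustrations, not part of the required report format):

```python
from collections import Counter

# Hypothetical participation log: one (person_id, role) row per person
# per session attended. Roles: T=teacher, A=administrator,
# PP=paraprofessional, P=parent, O=other.
attendance = [
    ("t01", "T"), ("t01", "T"), ("t02", "T"),  # t01 attended two sessions
    ("a01", "A"), ("pp01", "PP"),
]

# De-duplicate first so each person is counted once per role,
# regardless of how many sessions he or she attended.
unique_by_role = Counter(role for _, role in set(attendance))

print(unique_by_role["T"])   # 2 different teachers served
print(unique_by_role["A"])   # 1 different administrator served
```

De-duplicating before counting is what distinguishes "number of different teachers served" from a simple attendance tally.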

PART #2: DIRECTOR PERCEPTIONS OF PROJECT ACCOMPLISHMENTS

- Rate progress toward outcomes
- Provide evaluation/other evidence to support the rating

PART #2: DIRECTOR PERCEPTIONS OF PROJECT ACCOMPLISHMENTS - SAMPLE FORM

Complete one of the following tables for each of the intended outcomes in your proposal. Outcomes are statements of the intended impacts or results of your professional development programming or other project interventions.

Section A, #1: Intended Outcome (type in the outcome statement):

Rating of Progress Toward Outcome: rate the degree of progress you believe was made in accomplishing this outcome on a 5-point scale, with 1 = no progress and 5 = fully accomplished.

Evidence to Support Rating (type in the rationale for your rating or, if pertinent, note that there is direct evidence in the evaluation report section; see below):

Discussion of Progress in Evaluation Section:  Yes / No    Page No.: ____

PART #3: NATURE OF PROFESSIONAL DEVELOPMENT/INTERVENTIONS

- Hours, participants, and schedule, by type of intervention
- Role of college faculty/content experts
- Problems planning and implementing the project
- Problems recruiting teachers

PART #3: NATURE OF PROFESSIONAL DEVELOPMENT/INTERVENTIONS - SAMPLE FORM

PD Format                                        | No. of Hrs. of This PD Provided Over Entire Project | Participants for the PD | When Was This PD Conducted?
Workshops (usually one-day or half-day sessions) |        |        |
Institutes (5 or more days, usually in summer)   |        |        |
College course work (for credit)                 |        |        |
E-learning courses (self-paced, web-based)       |        |        |

Participant key: T = Teachers, A = Administrators, PP = Paraprofessionals, P = Parents, O = Others
Schedule key: RSD = Regular School Day, AS = After School, EVE = Evening, SAT = Saturday, SUM = Summer

PART #3: NATURE OF PD/INTERVENTIONS, CONTINUED - SAMPLE FORM

3. Use the chart below to describe the participation in your project of higher education faculty (both content and education faculty) or other external experts.

Name | Position/Institution | Primary Role | Hours of Involvement

PART #4: REQUIRED COMPONENTS OF EVALUATION REPORT

- Data collection
- Progress toward project outcomes
- Lesson observation data
- Teacher and student artifacts
- Impacts on participating students
- Effectiveness of project partnership
- Other

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT

- Prepared by your internal evaluator
- Based on his or her evaluation work
- Evaluation reports will vary
- Sections should be appropriately labeled as core evaluation report components

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT, CONTINUED

Core Evaluation Report Component 1: Data Collection

- Describe the data collection activities that occurred over the course of the project.

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT, CONTINUED

Core Evaluation Report Component 2: Progress Towards Project Outcomes

- For each proposed outcome of the project, briefly summarize progress made toward its accomplishment, based on evaluation findings.

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT, CONTINUED

Core Evaluation Report Component 3: Lesson Observation Data

- Detailed findings from lesson observations should be included in the evaluator's report.

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT, CONTINUED

Core Evaluation Report Component 4: Teacher and Student Artifacts

- Detailed findings from collected teacher and student artifacts should be included in the evaluator's report.

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT, CONTINUED

Core Evaluation Report Component 5: Impacts on Students of Participating Teachers

- For those projects that have gathered data related to impact on students, detailed findings should be presented in the evaluator's report, if not part of Progress Towards Project Outcomes (Component 2) above.

PART #4: REQUIRED COMPONENTS OF A PROJECT EVALUATION REPORT, CONTINUED

Core Evaluation Report Component 6: Effectiveness of Project Partnership

- Briefly describe the effectiveness of the partnership in implementing project activities. Provide evidence for your findings.

OTHER REPORTING (OPTIONAL)

- Report as an appendix to the core required report
- Can include information the Director wants to share about the intervention, materials used, etc.
- Can include additional evaluation or other pertinent data about the project
- Can include samples of teacher or student materials

QUESTIONS?

Contact information:
Kristin Everett - SAMPI, Western Michigan University
(269)