“Scientifically Based Evaluation Methods” Presented by Paula J. Martin COE Conference, September 13, 2004.

Introduction
- US Department of Education definition
- Where is it now?
- What does it establish?
- Why should we be concerned?
- Potential impact on TRIO
- How can we prepare?
- Possible evaluations we can do now

Scientifically Based Research Methods: Definition
According to the US Department of Education, the priority favors program projects that propose an evaluation plan based on rigorous, scientifically based research methods to assess the effectiveness of a particular intervention. The priority is intended to allow program participants and the Department to determine whether a project produces meaningful effects on student achievement or teacher performance.

Where Is It Now?
- The stipulation currently exists within the Elementary and Secondary Education Act (ESEA) as reauthorized by the No Child Left Behind Act (NCLB). It is intended to ensure that Federal funds are used to support activities and services that work.
- It is a proposed priority for other education programs (Federal Register, Vol. 68, November 4, 2003). Comments were accepted through December 4, 2003.

What does it propose to establish?
If the priority is used as a competitive preference priority, points will be awarded according to the quality of the proposed evaluation method. In determining that quality, the Department will consider the extent to which the applicant presents a feasible, credible plan that includes:
- The type of design to be used (that is, random assignment or matched comparison). If matched comparison, the plan must discuss why random assignment is not feasible.

What does it propose to establish? (continued)
- Outcomes to be measured.
- A discussion of how the applicant plans to assign students, teachers, classrooms, or schools to the project and control groups, or to match them for comparison with other students, teachers, classrooms, or schools.
- A proposed evaluator, preferably independent, with the necessary background and technical expertise to carry out the proposed evaluation. An independent evaluator has no authority over the project and is not involved in its implementation.
In general, depending on the program or project, random assignment evaluation methods will receive more points under a competitive preference priority than matched comparison evaluation methods. (A minimal sketch of random assignment follows.)
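To make the random assignment option concrete, here is a minimal Python sketch, assuming nothing more than a pool of eligible students; the student IDs, the fixed seed, and the even split are illustrative assumptions, not a procedure prescribed by the Department.

    import random

    def randomly_assign(students, seed=2004):
        """Shuffle the eligible pool, then split it in half."""
        rng = random.Random(seed)   # fixed seed so the assignment can be audited and reproduced
        pool = list(students)
        rng.shuffle(pool)           # chance alone decides who lands in which group
        midpoint = len(pool) // 2
        return pool[:midpoint], pool[midpoint:]   # (treatment group, control group)

    eligible = [f"student_{i:03d}" for i in range(100)]   # hypothetical roster
    treatment, control = randomly_assign(eligible)
    print(len(treatment), len(control))                   # 50 50

The shuffle is the whole point of the design: because chance alone determines who receives services, a later difference in outcomes between the two groups can be attributed to the project rather than to preexisting differences.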

Why should we be concerned?
- The Department proposes that this priority may be implemented by any office, as needed, for various projects within the Department of Education.
- It contains key buzzwords: "evaluation," "consistent with statutory purpose," "project effectiveness," "evaluation method," and "outcomes to be measured."
- The current climate within the Department under NCLB brings increased pressure to hold the projects it oversees accountable, and increased monitoring of projects.
- There is fallout from ongoing and completed national evaluations of TRIO programs such as Upward Bound and Talent Search.

Potential Impact on TRIO
- It is very conceivable that, if this priority is adopted, we will see it or some facsimile appear within the evaluation section of our grant applications.
- If an external evaluator is required, there is no guarantee that additional grant money will be set aside to hire one.
- It will require many of us to incorporate evaluation methods that most of us know little about.
- It may require revamping the design of our projects in order to meet the evaluation requirements.

How can we prepare?
- Begin to study randomly assigned designs and quasi-experimental designs with matched comparison conditions, as well as alternative experimental designs such as regression discontinuity and single-subject designs (a sketch of a simple matched comparison follows this list).
- Look at evaluation frameworks such as logic models and theory-based approaches to see what best fits our projects, and then begin to implement them.
- Begin doing internal and external evaluations of our projects.
- Begin to look at comparison data.
- Begin to educate and solicit the cooperation of the outside parties who will be key to carrying out these designs: our target schools, agencies, institutions, participants, and so on.
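As a first study aid for the quasi-experimental option, the sketch below pairs each participant with the non-participant whose prior GPA is closest, matching without replacement. The records and the single GPA covariate are invented for illustration; a real matched comparison would match on several characteristics at once.

    def match_comparisons(participants, candidates, key=lambda s: s["gpa"]):
        """Greedy nearest-neighbor matching on a single covariate."""
        unmatched = list(candidates)
        pairs = []
        for p in participants:
            # pick the remaining non-participant closest to this participant
            best = min(unmatched, key=lambda c: abs(key(c) - key(p)))
            unmatched.remove(best)   # match without replacement
            pairs.append((p, best))
        return pairs

    participants = [{"id": "P1", "gpa": 2.8}, {"id": "P2", "gpa": 3.4}]
    candidates = [{"id": "C1", "gpa": 3.5}, {"id": "C2", "gpa": 2.7},
                  {"id": "C3", "gpa": 3.0}]
    for p, c in match_comparisons(participants, candidates):
        print(p["id"], "matched with", c["id"])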

Possible Evaluations We Can Do Now
- Compare the high school graduation rates of our participants with those of non-participants in our target schools (a simple sketch of such a comparison follows this list).
- Compare the college placement rates of our participants with those of non-participants in our target schools.
- Compare college graduation rates.
- Set up our own cohorts, if not already mandated as in the case of Upward Bound and now SSS, and follow their progress toward high school graduation, college placement, and college graduation.
- Collect pertinent data that will allow us to conduct comparative analyses.
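As one hedged illustration of the first comparison, the sketch below runs a standard two-proportion z-test on graduation counts. The counts are made up; only the arithmetic is the point, and a real analysis would also need to account for how students come to be participants in the first place.

    import math

    def two_proportion_ztest(x1, n1, x2, n2):
        """Return (z, two-sided p) for H0: the two graduation rates are equal."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)                        # common rate under H0
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))            # two-sided normal tail
        return z, p_value

    # Hypothetical counts: 88 of 100 participants graduated vs. 70 of 100 non-participants.
    z, p = two_proportion_ztest(88, 100, 70, 100)
    print(f"z = {z:.2f}, p = {p:.4f}")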

Highlights of my comments in response to the Federal Register proposed rules
- Cost needs to be considered: will there be additional monies for hiring an external evaluator and implementing the evaluation plan and method? Not all grantees are created equal, with the same amount of resources at their disposal.
- On what authority would grantees be able to demand that schools, teachers, and students participate in a study or provide data?
- A randomly assigned design might violate grant regulations that require project activities and services to be available to all participants on the basis of need.
- It would necessitate a reexamination of performance report criteria and reporting.