
Advancing Evidence-Based Reforms in Federal Math and Science Education Programs MSP Regional Conference – Dallas, TX David Anderson The Coalition for Evidence-Based Policy February 8, 2007

Coalition for Evidence-Based Policy Board of Advisors
Robert Boruch – Co-Chair, Campbell Collaboration
Jonathan Crane – Sr Fellow, Progressive Policy Institute
David Ellwood – Dean, Harvard’s JFK School
Judith Gueron – fmr President, MDRC
Ron Haskins – Sr Advisor to President for Welfare Policy
Robert Hoyt – Founder, Jennison Associates
Blair Hull – Founder, Hull Trading Co
David Kessler – fmr FDA Commissioner
Jerry Lee – President, Jerry Lee Foundation
Dan Levy – Researcher, Mathematica
Diane Ravitch – fmr Asst Secretary of Education
Laurie Robinson – fmr Asst Attorney General, DOJ
Howard Rolston – fmr Director of Research, HHS/ACF
Isabel Sawhill – fmr Associate Director of OMB
Martin Seligman – fmr President, American Psychological Assn
Robert Slavin – Co-Director of CRESPAR at Johns Hopkins Univ
Robert Solow – Nobel Laureate in Economics, MIT
Nicholas Zill – Vice-President, Westat

Problem: Social programs, set up to address important problems, often fall short by funding ineffective practices.
When gov’t-funded interventions are evaluated in rigorous studies, many are found to have small or no effects.
Interventions producing meaningful, sustained effects on important outcomes tend to be the exception.
This pattern also occurs in other fields where rigorous evaluations of intervention impact are conducted (e.g., medicine, psychology).

Why It Matters: We’ve made little progress in many areas of social policy
The U.S. has made no significant progress against drug/alcohol abuse since
The U.S. has made very limited progress in raising K-12 achievement over the past 30 years.
The U.S. poverty rate today is higher than it was in 1973.

Congressionally-Established Academic Competitiveness Council (ACC) Established by Congress in 2006 Comprised of cabinet secretaries and other top officials from the 13 federal agencies that manage federal STEM education programs. Charged with evaluating and improving the effectiveness and coordination of these programs.

One Result … Office of Management and Budget (OMB) has asked all federal STEM programs to submit information on their ongoing and planned rigorous evaluations of intervention impact. OMB will consider these plans in making FY 08 budget decisions.

What constitutes a “rigorous evaluation of intervention impact”:
Well-designed randomized controlled trials are the highest-quality evaluation design for determining an intervention’s impact.
Well-matched comparison-group studies are a second-best alternative when randomized controlled trials are not feasible. (Comparison-group studies are sometimes called “quasi-experimental” studies.)
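As an illustrative sketch (not part of the presentation), the core mechanic that makes a randomized controlled trial rigorous — assigning participants to treatment or control by chance alone, so the two arms are statistically comparable at baseline — takes only a few lines of Python:

```python
import random

def random_assignment(participants, seed=2007):
    """Split participants into treatment and control arms purely by chance."""
    rng = random.Random(seed)  # fixed seed makes the assignment auditable
    shuffled = list(participants)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]  # (treatment arm, control arm)

treatment, control = random_assignment(range(100))
```

Because chance alone determines group membership, any later difference in outcomes between the two arms can be attributed to the intervention rather than to pre-existing differences between the groups.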

Less Rigorous Study Designs
Less rigorous study designs include:
Unmatched or partially-matched comparison-group studies; and
Pre-post studies.
Such designs can be very useful in generating hypotheses about what works, but often produce erroneous conclusions regarding an intervention’s impact.
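To see how a pre-post design can mislead, consider a hypothetical simulation (not from the slides) in which everyone’s scores drift upward over time regardless of the intervention. The pre-post comparison credits the entire drift to the program; a randomized control group, which experiences the same drift, differences it away:

```python
from statistics import mean
import random

rng = random.Random(42)

TREND = 5.0        # everyone's score drifts up (maturation, retesting, etc.)
TRUE_EFFECT = 0.0  # the hypothetical intervention actually does nothing

# Treatment group, measured before and after the intervention.
pre_t = [rng.gauss(50, 10) for _ in range(5000)]
post_t = [s + TREND + TRUE_EFFECT + rng.gauss(0, 2) for s in pre_t]

# Randomized control group, measured at the same two points in time.
pre_c = [rng.gauss(50, 10) for _ in range(5000)]
post_c = [s + TREND + rng.gauss(0, 2) for s in pre_c]

# A pre-post design attributes the entire secular trend to the intervention:
pre_post_estimate = mean(post_t) - mean(pre_t)  # close to 5, not 0

# Subtracting the control group's change recovers the true (zero) effect:
controlled_estimate = (mean(post_t) - mean(pre_t)) - (mean(post_c) - mean(pre_c))
```

The pre-post estimate of roughly 5 points looks like a large impact even though the simulated intervention does nothing; only the comparison against a control group exposes the truth.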

Job Training Partnership Act: Impact on Earnings of Male Youth (Non-arrestees) [chart comparing program group vs. control group]

Our Review of Evaluations of Federal STEM Education Programs/Projects for the ACC
We reviewed 100+ evaluations submitted by 41 federal programs:
10 were rigorous evaluations of an intervention’s impact, of which 4 found meaningful positive effects.
65 were less-rigorous evaluations; almost all concluded that the intervention being evaluated was effective.
25 did not measure impact.

Examples of STEM Educational Interventions Shown Effective in Rigorous Impact Evaluations
University of Michigan program creating research partnerships between faculty and undergraduates (randomized trial of 1,300 students).
Direct Instruction in teaching 3rd and 4th graders how to design a simple, unconfounded scientific experiment (randomized trial of 112 students).
Incorporating peer-guided, small-group sessions into an undergraduate general chemistry course (well-matched comparison-group study of 264 students).

Suggestions on Evaluation Strategy for STEM Education Program Officials
Focus rigorous impact evaluations on the most promising interventions (as opposed to evaluating everything).
Make sure the type of evaluation is appropriate for the stage of an intervention’s development (e.g., early-stage development vs. scale-up of a mature intervention).
Recognize that well-designed randomized trials can sometimes be done at modest cost (e.g., $50,000).
Start by trying to get a small number of rigorous evaluations underway (e.g., one or two).

Evidence-Based Policy Help Desk for OMB/Agencies:
WWC’s Evidence-Based Education Help Desk
Social Programs That Work:
David Anderson