Narrowing the Gap between Evaluation and Developmental Science: Implications for Evaluation Capacity Building
Tiffany Berry, PhD, Research Associate Professor

Presentation transcript:

Narrowing the Gap between Evaluation and Developmental Science: Implications for Evaluation Capacity Building. Tiffany Berry, PhD, Research Associate Professor and Associate Director, Claremont Evaluation Center, Claremont Graduate University. June 29, 2015. It's my honor to be with you here tonight, and I appreciate the warm welcome, the hospitality, and the beauty of South Africa. Tonight I'm going to discuss how to strategically narrow the rather wide gap between evaluation as a discipline and developmental science, and how we can harness a bidirectional transaction between the two disciplines (focusing on what developmental science can offer the evaluation community as well as how evaluation can improve developmental science). Then I'll end with the implications these issues have for building evaluation capacity.

Slide diagram: my disciplinary training spans child and adolescent development and evaluation; my context of work spans the research university and full-time educational evaluation. My interest in this work stems from constantly being at the intersection of multiple worlds.

Slide diagram: two parallel cycles. In the evaluation process, youth programs feed into evaluation, which improves programs and youth outcomes. In the developmental research process, the research community conducts research, which improves theory and knowledge.

How can we strategically narrow the gap between developmental science and evaluation practice? How can we build capacity within both fields?

Slide diagram: developmental science offers evaluation practice "developmental sensitivity"; evaluation practice offers developmental science tools, mechanisms, value, and a view of development "in context".

Where does developmental sensitivity conceptually fit within the discipline of evaluation?

#1: AEA Guiding Principles. The American Evaluation Association (AEA) has officially acknowledged that different age populations require special understanding and consideration in its Guiding Principles for Evaluators, stating: "Evaluators have the responsibility to understand and respect differences among participants, such as differences in their culture, religion, gender, disability, age, sexual orientation and ethnicity, and to account for potential implications of these differences when planning, conducting, analyzing, and reporting evaluations" (AEA, 2004, emphasis added). Kids are a unique group: they change quickly, and their development is marked by peaks and valleys and by both qualitative and quantitative growth. What remains largely undetermined, however, is how evaluators "account for potential implications of these differences" when working with underage program participants.

#2: Cultural Competence. The culturally competent evaluator is one who "draws upon a wide range of evaluation theories and methods to design and carry out an evaluation that is optimally matched to the context" (AEA Cultural Competence Statement, 2011). Youth is a context to attend to…

Childhood as a context to understand

#3: Need. Many programs serve children and youth, including early childhood development (ECD) programs.

#3: Need. Evaluations with youth populations are insufficient and often of poor quality (Bialosiewicz & Berry, in preparation). We reviewed peer-reviewed articles from psychology/psychiatry, health, education, interdisciplinary studies, social work, zoology, and law. The evaluations were local in the US (41%), statewide or multistate in the US (6%), or international (53%). Articles were coded on 31 variables: description of the article (authors and affiliation, substantive field, year); description of the program (purpose, components, target population, etc.); and description of the evaluation (design, focus/purpose, participants, independent and dependent variables, moderators, implementation, measures, timing of assessments, etc.). The evaluation coding categories were intended to extract information that could be compared against best practices for developmentally sensitive assessment of child and youth populations. A checklist for developmentally sensitive youth program evaluation was developed first through an extensive literature review of developmental psychology and applied developmental psychology research, then expanded and refined through feedback from the evaluation community at a roundtable session at the 2011 conference of the American Evaluation Association. This presentation examines key findings in light of the checklist's guidelines: 57% of articles did not assess implementation; 47% did not take into account natural maturation; 45% did not take a whole-child approach (examining more than one limited outcome variable); 70% did not report the psychometric properties of the scales they used with young children; 46% did not disaggregate by key moderators (based on individual characteristics); and 88% did not disaggregate by level of participation.
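To make the coding process concrete, here is a minimal sketch of how articles coded against a developmental-sensitivity checklist could be tallied into failure percentages like those above. The criterion names and sample records are hypothetical illustrations, not the authors' actual coding scheme.

```python
# Hypothetical tally of coded articles against a developmental-sensitivity
# checklist. Each article is a dict of criterion -> bool (True = met).
from collections import Counter

CRITERIA = [
    "assessed_implementation",
    "accounted_for_maturation",
    "whole_child_outcomes",
    "reported_psychometrics",
    "disaggregated_by_moderators",
    "disaggregated_by_participation",
]

articles = [  # illustrative records, one per coded article
    {"assessed_implementation": True, "accounted_for_maturation": False,
     "whole_child_outcomes": True, "reported_psychometrics": False,
     "disaggregated_by_moderators": True, "disaggregated_by_participation": False},
    {"assessed_implementation": False, "accounted_for_maturation": True,
     "whole_child_outcomes": False, "reported_psychometrics": False,
     "disaggregated_by_moderators": False, "disaggregated_by_participation": False},
]

failures = Counter()
for article in articles:
    for criterion in CRITERIA:
        if not article.get(criterion, False):  # missing counts as not met
            failures[criterion] += 1

for criterion in CRITERIA:
    pct = 100 * failures[criterion] / len(articles)
    print(f"{pct:.0f}% did not meet: {criterion}")
```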

#3: Need. Early intervention holds the most promise for putting youth on positive developmental trajectories.

How do we define developmental sensitivity?

Principles of Development. Development arises from interactions between the environment and the individual, and it unfolds with age through sensitive periods and milestones, across multiple domains: cognitive, physical, social-emotional, moral, and behavioral. Age matters. In infancy, attachment is central, with consequences for social development in later childhood; in adolescence, a sense of autonomy is central, with consequences for self-esteem. Sensitive periods matter too: to master a second language (or language in general), exposure early in life is compulsory, and the same holds for perceptual development and attachment. Positive youth development (PYD) encompasses a broader model of development, including six domains of competencies to be developed (Catalano, Berglund, Ryan, Lonczak, & Hawkins, 2004). This more holistic approach (focusing on positive growth as well as the reduction of problems) motivates the inclusion of emotional, moral, and behavioral development. Within our evaluation, we attempted to assess students' development across multiple developmental domains. With the exception of grades, test scores, and academic attitudes, all the other constructs were assessed through our student survey. (Discuss how we measured the whole child via these constructs on the student survey.)

How do we apply developmental sensitivity to educational evaluation practice?

CDC Evaluation Framework. Engage stakeholders throughout. Phase I: Engage Stakeholders. Phase II: Describe the Program. Phase III: Focus the Evaluation Design. Phase IV: Gather and Analyze Evidence. Phase V: Justify Conclusions. Phase VI: Ensure Use and Share Lessons Learned.

Phase I: Engage Stakeholders. Do stakeholders have realistic expectations for development? Do they have knowledge of child development? Do they understand the importance of developing quality attachment relationships in early childhood? If they are running an adolescent program, do they understand the importance of developing identity and autonomy?

Phase II: Describe the Program. Is the program developmentally appropriate? Does it capitalize on multiple developmental domains? Is it aligned with sensitive periods? This phase involves articulating a logic model, clarifying program goals and objectives in relation to actual implementation, and analyzing program context. Developmentally appropriate: is a program targeting adolescents asking them to color to improve their cognitive ability? Are services integrated across developmental domains?

Phase III: Focus the Evaluation Design. Does the design account for maturation? For stability and variability across domains? For mediators and moderators? For the ecological context? This phase covers the purpose of the evaluation, how results are to be used, describing methods, identifying evaluation questions, and articulating evaluation procedures. Key developmental considerations include how to account for maturation; stability and variability across domains; risk and resiliency; and mediators, moderators, and individual-context interactions (see the sketch below).
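Since nearly half the reviewed evaluations ignored natural maturation, here is a minimal sketch, assuming a pre/post design with a no-treatment comparison group, of how a regression can separate a program effect from maturation and test a moderator. All column names, simulated data, and effect sizes are hypothetical; this illustrates the idea, not the author's actual models.

```python
# Hypothetical pre/post design: the comparison group's change captures
# maturation, and a treated-by-risk interaction tests moderation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # 1 = program participant
    "pre": rng.normal(50, 10, n),       # baseline score
    "risk": rng.integers(0, 2, n),      # hypothetical moderator (1 = high risk)
})
# Simulated outcome: everyone matures (+5), the program adds +3,
# and the program effect is +2 larger for high-risk youth.
df["post"] = (df["pre"] + 5 + 3 * df["treated"]
              + 2 * df["treated"] * df["risk"]
              + rng.normal(0, 5, n))

# Controlling for the pre-score and the comparison group separates
# the program effect from natural maturation.
model = smf.ols("post ~ pre + treated * risk", data=df).fit()
print(model.summary().tables[1])
```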

Phase IV: Gather and Analyze Evidence. Are developmental precursors measured? Are standardized assessments used across domains? Are the measures developmentally sensitive? Are mixed methods used? This phase involves decisions about what to measure, which assessments to use, and who will respond to assessments. Consider the role of developmental precursors (e.g., self-regulation as a precursor to academic achievement). Standardized assessments exist for cognitive and academic domains, but less so for socio-emotional domains, which makes multiple informants important. Use developmentally sensitive measures (e.g., appropriate response options); observations and qualitative research are important; and pilot test (a sketch of one psychometric check follows).
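One concrete psychometric property that the 70% of reviewed articles failed to report is internal consistency. Here is a minimal sketch of Cronbach's alpha on simulated scale data; the five-item scale and the data are hypothetical.

```python
# Cronbach's alpha for a multi-item scale (illustrative data).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x scale-items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
true_score = rng.normal(0, 1, (300, 1))           # latent trait per respondent
responses = true_score + rng.normal(0, 0.8, (300, 5))  # hypothetical 5-item scale
print(f"alpha = {cronbach_alpha(responses):.2f}")
```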

Phase V: Justify Conclusions. Are youth understood in context? This phase involves analyzing data and drawing conclusions about the program, linking conclusions to evidence of effectiveness. Understanding youth in context means embracing and measuring variability among program participants (why they joined, their levels of risk, their performance across contexts) and using statistical modeling to identify pathways and moderators (see the disaggregation sketch below).
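To illustrate disaggregating outcomes by level of participation and by a key moderator, here is a minimal pandas sketch. The dosage bins, column names, and data are hypothetical.

```python
# Hypothetical disaggregation of an outcome by program dosage and risk level.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 150
df = pd.DataFrame({
    "days_attended": rng.integers(0, 120, n),
    "risk_level": rng.choice(["low", "high"], n),
    "outcome": rng.normal(70, 12, n),
})
# Bin dosage rather than treating all participants identically.
df["dosage"] = pd.cut(df["days_attended"], bins=[0, 30, 60, 120],
                      labels=["low", "medium", "high"], include_lowest=True)

print(df.groupby(["dosage", "risk_level"], observed=True)["outcome"]
        .agg(["mean", "count"]).round(1))
```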

Phase VI: Ensure Use and Share Lessons Learned. Make stakeholders aware of the findings, and ensure the findings are considered in decisions that affect the program. Communicate findings back to the program AND to scholarly outlets to promote understanding of "youth in context".

Slide diagram: developmental sensitivity wraps around every phase of the CDC Evaluation Framework, from engaging stakeholders through ensuring use and sharing lessons learned.

Slide diagram (revisited): developmental science offers evaluation practice "developmental sensitivity"; evaluation practice offers developmental science tools, mechanisms, value, and a view of development "in context".

How can we conceptualize evaluation practice in a way that will inform developmental science?

Slide diagram: afterschool programs and service providers connect, through evaluation (informed by evaluation theory, program theory, and social science theory, and organized around context, mechanisms, collaboration, and dissemination), to the research community and to evaluation practitioners, improving both programs and theory. (Consistent attendance: what predicts that?)

We start with a problem: for example, society needs youth with, as Rich Lerner would say, the six Cs (competence, caring, connection, confidence, contribution, character). Youth programs are developed to try to build these skills, and the research community is engaged to understand how to measure positive youth development, how to conceptually define it, and what its antecedents and consequences are. If we harnessed information from both sources and used evaluation as a bridge, we would engage in evaluation processes with three features: understanding context, understanding mechanisms, and meaningful collaboration. By mechanisms, we mean not only the core processes that bring about intervention effects, but also developmental mechanisms (what mechanisms are responsible for a change in child development?). For context, when applied to youth development programs, we often think of Bronfenbrenner and his ecological systems of development: how is a child's ecology (home, school, community, etc.) interacting with the program services to produce intended effects? Or, how do these developmental or program mechanisms work in particular contexts? The interaction between context and the mechanisms that bring about change is the key. For example, I've done a lot of work in the afterschool context, and we've identified several different activity contexts that seem to relate to differential outcomes for youth (and the outcomes differ for different ages of youth). Too often in the basic research literature, we don't focus on practitioners or try to learn from them. Tapping into practitioners' knowledge through collaboration is key, something that Huey Chen, for example, has written about extensively as an undervalued part of the evaluation enterprise. Practitioners' institutional knowledge will serve us well as we investigate how different mechanisms play out in different contexts. Each of these features of the evaluation is informed by a different theory: evaluation theory (what framework is best applied to a particular mode of inquiry), program theory (how stakeholders think the program works), and social science theory (what critical frameworks, research, and theories are relevant to understanding the evaluand and the program theory). The results of this evaluation inquiry can be applied to improve programs as well as to improve and inform developmental theory for the broader research community. But another dissemination channel that's often missed is evaluation practitioners: all of you! How can we create better systems so that the broader evaluation community is learning from and engaging in relevant program work?

Slide diagram: developmental science informs evaluation practice, and evaluation practice feeds back into developmental science.

Implications for Building Evaluation Capacity. For programs: take an organizational learning approach, and let evaluators learn from you. For evaluators: invest in tools and measurement, social science theory, and age as a unique context. For institutions: promote interdisciplinary training, and develop and reward scholar-practitioners. For evaluation training programs: increase content knowledge in substantive domains related to the program's context, increase knowledge of child development, and increase program staff's knowledge through evaluation as education.

Evaluation is a bridge between theory and practice.

Tiffany.berry@cgu.edu; 909.607.1540