Recycling data: common metrics for research, policy, and evaluation. Samantha Becker, MLIS, MPA, Principal Research Scientist, University of Washington Information School.

How does digital inclusion work? Inputs: digital inclusion programs -> Outputs: increased digital literacy, technology use -> Outcome: opportunity -> Impact: health, wealth, happiness! What happens here?
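Read as data, the chain above is simply an ordered mapping from logic-model stage to description. The following is a minimal sketch in Python, purely illustrative and not part of the original slides; the stage names follow the slide, everything else is an assumption:

```python
# Illustrative only: one way a program could record its own logic model
# against the four stages named on the slide.
from collections import OrderedDict

logic_model = OrderedDict([
    ("inputs", "digital inclusion programs"),
    ("outputs", "increased digital literacy, technology use"),
    ("outcome", "opportunity"),
    ("impact", "health, wealth, happiness"),
])

# Print the chain in order, one stage per line.
for stage, description in logic_model.items():
    print(f"{stage:>7} -> {description}")
```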

The “digital divide” concept suggests we just need to build (the right) bridge and our work is done.

But there are many trails to opportunity.

Practitioners need help measuring. They don't always know the best ways to achieve their aims, how well they are doing, or what happens to their clients once they reach the other side; they are not experts in program design or in measuring what they have done. Community-based organizations understand the gaps they are trying to bridge, they understand the barriers to opportunity their populations encounter, and they are trusted allies. Meaningful metrics would allow them to communicate better with other practitioners, and a common language would help them identify best practices and learn from each other to deliver better programs. How can they know if they're doing good?

Researchers, evaluators, and policy makers need data. A lot has been invested in building bridges, but we still don’t have a common language to talk about impact. We don’t really know what it takes to get people to the other side of their own digital divide. We don’t even know whether getting them to the other side actually creates the opportunities we hope for. We need a model that is nuanced and multifaceted to account for the different levels of adoption, the different pathways to fluency, and the possibilities of impact for people at different stages of their lives. We also need to understand how digital inclusion for individuals impacts the development of communities. For this, we must have data that can be aggregated for meta-analysis and longitudinal evaluation.
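To make the aggregation point concrete, here is a minimal sketch in Python of how a shared outcome record could let different programs report comparable data that is then pooled. The field names, program names, and figures are hypothetical illustrations, not a proposed standard and not drawn from the study:

```python
# Illustrative only: a hypothetical "common metrics" record that different
# digital inclusion programs could all report, so results can be pooled later.
from dataclasses import dataclass
from statistics import mean

@dataclass
class OutcomeRecord:
    program_id: str          # which program reported the data
    period: str              # reporting period, e.g. "2014-Q3"
    participants: int        # people served in the period
    completed_training: int  # people who finished a digital literacy course
    reported_new_skill: int  # people self-reporting a new skill or technology use

def completion_rate(record: OutcomeRecord) -> float:
    """Share of participants who completed training in the period."""
    return record.completed_training / record.participants

# Records from two hypothetical programs, reported against the same fields
records = [
    OutcomeRecord("library_lab", "2014-Q3", 120, 84, 75),
    OutcomeRecord("community_center", "2014-Q3", 60, 39, 41),
]

print("pooled completion rate:", round(mean(completion_rate(r) for r in records), 2))
```

Because every program fills in the same fields, the same pooling step works regardless of who reported the data; that interchangeability is the practical payoff of common metrics for meta-analysis and longitudinal evaluation.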