
A Web Tool for Collecting County-Level Program Implementation Data for State-Level Evaluation
Alyssa M. Wechsler, M.Phil. and Humphrey Costello, M.A.
Wyoming Survey & Analysis Center at the University of Wyoming
Annual Conference of the American Evaluation Association, November 4, 2011, Anaheim, CA

What to Expect
- The benefits of web-based reporting systems
- Steps for building a web-based reporting system
- Lessons learned from WYSAC’s system, PAPR
- Discussion

Wyoming’s Needs: Guidance for Program Managers
- Goals, outcomes, indicators
- What gets measured gets done
Wyoming’s Needs: An evaluation and data collection tool

Wyoming’s Needs: A response to the changing political and economic environment

Benefits of Web-Based Systems: A complex approach to a complex system
Activity → Outcome → Goal
[Figure: Graphic representation of the WYSAC tobacco evaluation website, Website Visualizer application]
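The Activity → Outcome → Goal chain on this slide can be sketched as a small hierarchical data model. This is a hypothetical illustration only; the slides do not describe PAPR’s actual schema, and every class and field name here is an assumption:

```python
from dataclasses import dataclass, field

# Hypothetical goal -> outcome -> activity hierarchy mirroring the
# chain on the slide. Names are illustrative, not PAPR's real schema.

@dataclass
class Activity:
    description: str
    completed: bool = False

@dataclass
class Outcome:
    indicator: str
    activities: list = field(default_factory=list)

@dataclass
class Goal:
    title: str
    outcomes: list = field(default_factory=list)

    def activity_count(self):
        # Roll up total activities across all outcomes under this goal.
        return sum(len(o.activities) for o in self.outcomes)

goal = Goal("Reduce tobacco use", [
    Outcome("Fewer youth initiations", [Activity("School outreach")]),
    Outcome("More quit attempts", [Activity("Quitline promotion"),
                                   Activity("Provider referrals")]),
])
print(goal.activity_count())  # 3
```

Linking each activity to an outcome, and each outcome to a goal, is what lets a web system enforce the "what gets measured gets done" logic at data-entry time rather than at report time.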

Benefits of Web-Based Systems
1. Ease and consistency of data collection
2. Guidance for program managers
3. Ease of data management

Benefits of Web-Based Systems: Ease of data compilation and reporting
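The compilation benefit follows from structure: when every county submits through the same web form, the records share one shape and the state-level rollup becomes a single aggregation step. A minimal sketch, assuming hypothetical field names (the slides do not specify PAPR’s data layout):

```python
from collections import defaultdict

# Illustrative county submissions sharing one structure, as a central
# web system would enforce. Field names are assumptions, not PAPR's.
county_reports = [
    {"county": "Albany",  "program": "Cessation",  "participants": 40},
    {"county": "Laramie", "program": "Cessation",  "participants": 55},
    {"county": "Albany",  "program": "Prevention", "participants": 30},
]

def state_rollup(reports):
    """Sum participants per program across all county reports."""
    totals = defaultdict(int)
    for r in reports:
        totals[r["program"]] += r["participants"]
    return dict(totals)

print(state_rollup(county_reports))  # {'Cessation': 95, 'Prevention': 30}
```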

Building a Web-Based Planning and Reporting System: Lessons Learned
1. Identify needs
2. Identify users, user needs, and user roles
3. Build static mock-ups and flat files
4. Allow for repeated beta group input
5. Provide training for users

Key points:
- Identify needs early to avoid mission drift
- PAPR is not the same as paper!
- Be prepared to invest lots of time and multiple iterations
- Provide training, but also build the system to stand alone
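Step 2 above (identify users, user needs, and user roles) often ends up encoded as a role-to-permission map. The sketch below is a hypothetical illustration of that idea; the role and permission names are assumptions, not taken from PAPR:

```python
# Hypothetical role-to-permission map of the kind a county/state
# planning and reporting system might use. All names are illustrative.
ROLE_PERMISSIONS = {
    "county_manager":  {"enter_data", "view_own_county"},
    "state_evaluator": {"view_all_counties", "export_reports"},
    "administrator":   {"enter_data", "view_all_counties",
                        "export_reports", "manage_users"},
}

def can(role, permission):
    """Return True if the given role grants the given permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can("county_manager", "enter_data"))         # True
print(can("county_manager", "view_all_counties"))  # False
```

Writing the roles down this explicitly during design (even in a static mock-up) forces the "who sees what" questions to surface before the system is built.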

Discussion
- Questions?
- What are we missing? What are you dealing with?
- Do you think an online planning/reporting system would be useful for you?
- How could we improve PAPR?

Contact Information
Wyoming Survey & Analysis Center
Alyssa M. Wechsler, Assistant Research Scientist, (307)
Humphrey Costello, Assistant Research Scientist, (307)