2003 Annual NSDL PI Survey: A Collaborative Formative Evaluation


Purpose of Presentation
- Describe the community-supported formative evaluation efforts to evaluate the NSDL's distributed library and community organizational processes
- Elicit feedback on the results and recommendations from the 2003 Annual NSDL PI Survey

Timeline of the Evaluation Effort
- Dec. 2001: Evaluation Working Group Meeting articulated four questions to be addressed in a pilot study.
- February 2002: Evaluation Workshop developed a work plan to answer the four questions. An online survey would be used to identify: How are the distributed library building and community governance processes working?
- Feb.-July 2002: Survey developed and delivered. Preliminary survey data reported at JCDL.
- Dec. 2002: Final survey data presented to the community during the EIEC meeting. Committee agreed that an NSDL-wide implementation of the survey should be undertaken.
- Apr.-June 2003: Survey revised, tested for validity and usability.
- June-July 2003: Survey delivered.
- Oct. 2003: Survey analyzed and presented at the NSDL Annual Meeting.

Survey Details
- Primary question: How are the distributed library building and community governance processes working?
- Developed measures in four areas to answer this question:
  - NSDL communications infrastructure
  - Extent and success of NSDL projects' collaboration
  - Projects' participation level in NSDL organizational structures
  - Information needs and motivation factors for the NSDL community
- Note: The questions and scales have evolved with each iteration, but the four areas being evaluated have remained the same.
- Questions sought to identify:
  - Levels of use, effectiveness, and extent of participation
  - Information needs and motivations
  - Barriers to use, effectiveness, and participation, and suggestions for improvement
- Types of scales: Likert, multiple choice, open-ended (a data-model sketch follows below)
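The slides do not show the instrument itself, so the following is only a hypothetical sketch of how a survey organized around the four measurement areas and the three scale types named above could be modeled. All question texts, field names, and choices are invented for illustration.

```python
# Hypothetical sketch (not the actual NSDL instrument): modeling questions grouped
# into the four measurement areas, using the three scale types named on the slide.
from dataclasses import dataclass, field
from enum import Enum

class Scale(Enum):
    LIKERT = "likert"                    # e.g., 1-5 agreement rating
    MULTIPLE_CHOICE = "multiple_choice"
    OPEN_ENDED = "open_ended"

@dataclass
class Question:
    area: str                            # one of the four measurement areas
    text: str
    scale: Scale
    choices: list[str] = field(default_factory=list)   # only used for multiple choice

# The four areas from the slide; the question texts below are illustrative placeholders.
AREAS = [
    "communications infrastructure",
    "collaboration",
    "organizational structures",
    "information needs and motivation",
]

instrument = [
    Question(AREAS[0], "How often do you read the All-Projects listserv?", Scale.LIKERT),
    Question(AREAS[1], "How many other NSDL projects have you collaborated with?",
             Scale.MULTIPLE_CHOICE, ["0", "1-2", "3-5", "more than 5"]),
    Question(AREAS[3], "What information would help your project most right now?",
             Scale.OPEN_ENDED),
]

# Grouping questions by area keeps analysis aligned with the four reporting areas.
by_area = {a: [q for q in instrument if q.area == a] for a in AREAS}
```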

Survey Details (continued)
- Primary audience: NSDL PIs (though all project members were encouraged to contribute to survey completion)
- Distribution: survey available online for four weeks during June-July 2003; notification and reminders sent to the All-Projects and PI listservs
- Verification: one survey per project was included in the final analysis; the NSF award number was used to verify responses and was then disassociated from further analysis (sketched below)
- (Examples of survey questions were shown on the original slides.)
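A minimal sketch of the verification step described above, assuming responses arrive as dictionaries with an "award_number" field; the real data layout and the rule for choosing among multiple responses from one project are not given in the slides (the sketch keeps the first one seen).

```python
# Keep one response per NSF award number, check it against known awards, then drop
# the award number so it is disassociated from further analysis. Column names are assumptions.
import csv  # used in the commented example below

def verify_and_anonymize(rows, valid_awards):
    """Return cleaned responses: one per verified project, with the award number removed."""
    seen = set()
    cleaned = []
    for row in rows:
        award = row.get("award_number", "").strip()
        if award not in valid_awards:   # response cannot be tied to a funded project
            continue
        if award in seen:               # only one survey per project enters the analysis
            continue
        seen.add(award)
        anonymized = {k: v for k, v in row.items() if k != "award_number"}
        cleaned.append(anonymized)      # award number disassociated before analysis
    return cleaned

# Example usage with a hypothetical CSV export of the online form:
# with open("responses.csv", newline="") as f:
#     cleaned = verify_and_anonymize(list(csv.DictReader(f)), valid_awards={"0000001"})
```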

Summary of Key Findings from the Four Areas
- 1. Communication Infrastructure: The NSDL Annual Meeting is still the best place for projects to establish communication and identify potential collaborators. Other communication tools are perceived as not supporting projects in accomplishing their goals.
- 2. Collaboration: 80% of respondents indicated that they had collaborated with a median of 2 other NSDL projects. Respondents also reported incorporating something they learned through a collaborative relationship into their project's design and use. Respondents overwhelmingly identified lack of time and being unclear about other projects' goals and activities as barriers to further collaboration. (An illustrative computation of these figures follows this slide.)
- 3. Organizational Structures: A majority of respondents had not interacted with the NSDL organizational structures, and respondents gave a mid-level rating on the overall perceived effectiveness of NSDL groups. "I'm sure all this activity is necessary but it hasn't trickled down to the individual projects."
- 4.a. Information Needs: A majority of all respondents were highly aware of the standards and technologies needed to integrate projects into NSDL. 62% of respondents indicated that they had implemented some part of their project's evaluation plan in the past year. 100% of respondents reported that they had disseminated information about their projects through an average of 8 publications or presentations between June 2002 and June 2003.
- 4.b. Motivation: 85% of respondents indicated that it was important to participate in NSDL communication, organizational groups, and library-building activities because they wanted to help establish a National Science Digital Library.
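For illustration only, this is one way the kinds of figures quoted above (percentages of respondents, a median number of collaborating projects) can be computed from cleaned responses. The field names and sample data are invented, not the actual survey data.

```python
# Compute percentage-of-respondents and median-collaboration figures from cleaned rows.
from statistics import median

responses = [
    {"collaborations": 3, "implemented_eval_plan": True},
    {"collaborations": 2, "implemented_eval_plan": False},
    {"collaborations": 0, "implemented_eval_plan": True},
]

collab_counts = [r["collaborations"] for r in responses]
collaborated = [c for c in collab_counts if c > 0]        # respondents who collaborated at all

pct_collaborated = 100 * len(collaborated) / len(responses)
median_partners = median(collaborated) if collaborated else 0
pct_eval_plan = 100 * sum(r["implemented_eval_plan"] for r in responses) / len(responses)

print(f"{pct_collaborated:.0f}% collaborated; median of {median_partners} partner projects")
print(f"{pct_eval_plan:.0f}% implemented part of their evaluation plan")
```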

Results, Recommendations & Questions  Individual projects are not aware of whose expectations they should meet, what level of participation is expected or what the outcomes of participation should be when using the NSDL communications infrastructure, pursing collaborative opportunities and participating in NSDL organizational structures.  Recommendation: Establish a process for defining and communicating shared priorities and expectations.  Recommendation: Provide targeted “entry points” for project or individual participation  The motivation to expend extra time and resources on participation, collaboration and communication is often redirected by deadlines imposed by a two-year funding cycle.  Recommendation: Identify tangible benefits for projects to participate, aside from building NSDL.  Question: Is it necessary that every project participate in order for NSDL to be built?

Results, Recommendations & Questions  Respondents identified what type of support they would like from the Core Integration; however, this could be a moving target because projects are at different stages of development.  Recommendation: Identify specifically what projects need in terms of technical support.  Recommendation: Define and communicate what levels of support can be realistically expected from the Core Integration team.  Recommendation: Leverage the expertise of NSDL projects.  Respondents indicated that they were unsure about how collaboration was being defined and what were the expected results of collaborative efforts.  Recommendation: Establish a process to define and measure collaboration in a distributed environment.  Question: How much emphasis should be placed on collaborative achievements as factor in the success of NSDL?

Results, Recommendations & Questions  Projects are not sure how to identify others working on similar topics, outside the NSDL Annual Meeting. They expressed the need for a “human” connector to point them to the right collaborator; and, the need for direction about how to be involved in organizational structures.  Question: How could the current NSDL organizational structure adapt to meet the demand for connections on a more granular level than now exists?  The current communications infrastructure is not meeting projects’ information needs and t he multitude of communications channels present too many choices when deciding what to read regularly and where to look for information to support project work.  Recommendation: Establish one resource that is continually maintained and updated with basic technical information and other information relevant to project development.

Design & Delivery Issues  Audience:  Pilot - everyone working on a project – result was conflicting responses from one project.  one response per project, though everyone working 20 hrs/wk for past 3 mos. could contribute opinions. This is easier to validate, verify and analyze, but are we missing out?  Moving target topics…  Ur-question remains the same: How are the distributed library building and community governance processes working?  four issues to evaluate remain the same: NSDL communications infrastructure; the extent and success of NSDL projects’ collaboration; the projects’ level of participation in the NSDL organizational structures; and, the information needs and motivations of the community.  definitions, issues, tools have changed (i.e., governance -> organizational structure; CommPortal ->re-designed)

Design and Delivery Issues (continued)
- Reliability, validity, sampling:
  - Pilot: didn't do.
  - 2003: extensive review and comments from EIE Committee members and the Core Integration team; expert review and validity testing with NSDL PIs; lastly, think-aloud usability testing.
- Distribution:
  - Pilot: used proprietary software for distribution and analysis.
  - 2003: CI (Casey & Alex) hand-built a form. There has to be a better way.
- Timing:
  - Distribute when the maximum number of respondents are near their computers; both the Pilot and 2003 surveys were delivered in early summer.
  - Coordinate with other NSDL-wide data-gathering efforts.
- Re-use (of instrument and of data):
  - Design so the same basic instrument can be used regularly, for cost and time savings (a reuse sketch follows this list).
  - Pilot: sparked the idea of creating an "Evaluation" category for the Collaboration Finder and of creating the NSDL Progress Report.
  - 2003: provided data for the NSDL Progress Report.
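A small sketch of one way to support the re-use point above: keep a versioned instrument definition per survey year so the same basic survey can be fielded repeatedly and wording changes between iterations are easy to spot. The JSON file layout, directory name, and keys are assumptions, not part of the NSDL process.

```python
# Load a year-stamped instrument definition and compare two iterations of it.
import json
from pathlib import Path

def load_instrument(year: int, directory: str = "instruments") -> dict:
    """Load the question set for a given survey year (e.g. instruments/2003.json)."""
    path = Path(directory) / f"{year}.json"
    return json.loads(path.read_text())

def changed_questions(old: dict, new: dict) -> list[str]:
    """List question ids whose wording changed between two iterations, so
    year-over-year comparisons can note where the instrument itself moved."""
    return [qid for qid, text in new["questions"].items()
            if old["questions"].get(qid) != text]

# Usage (assuming the definition files exist):
# pilot, current = load_instrument(2002), load_instrument(2003)
# print(changed_questions(pilot, current))
```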

Process Issues & Questions  Clarify the purposes and timing of the Annual PI Survey relative to other data-gathering activities occurring across NSDL: 1) surveys from other standing committees; 2) the Collaboration Finder; and, 3) NSF reporting requirements.  The Annual PI Survey has functioned well for gathering data about perceptions and beliefs about organizational aspects of NSDL. Does it function well as a tool for identifying projects’ immediate technical or informational needs?  The results from this Annual PI Survey were used in the NSDL Progress Report. Re-purposing survey data reflects a larger organizational need for both formative evaluation and for data to support outreach efforts. How should the Annual PI Survey respond to shifting NSDL organizational needs and goals?