Tools for Assessing Perceptions and Uncovering Influence Jim Dearing Center for Health Dissemination and Implementation Research Kaiser Permanente Colorado.


Tools for Assessing Perceptions and Uncovering Influence
Jim Dearing, Center for Health Dissemination and Implementation Research, Kaiser Permanente Colorado
Synergy Project for Research, Practice and Transformation, January 10-12, 2010, Albuquerque, NM

Session Objective: To help you understand methods of data collection and measures for the assessment of perception and influence.

Formative evaluation is a type of applied research that is conducted prior to the introduction of an innovation to increase the likelihood of achieving scale.

Formative Evaluation of Two Types
1. Learning about the innovation
2. Learning about potential adopters

Learning about the Innovation
- Is based in the assessment of perception
- Whose perception most counts?
- From what other types of stakeholders might we want to gather data about perception?

Learning about the Innovation
- How can we learn of perceptions?
- Two methods of data-collection:
  - Interviews with open-ended response categories
  - Questionnaires with closed-ended response categories
- Under what conditions might we prefer to collect data by interview vs. questionnaire?

Interviews are Preferable When
- Interviewees are very high-ranking or few in number, when the topic is especially sensitive, or when we have reason to doubt that standard attributes will represent the characteristics of the innovation well
- Always pretest your interview protocol
- Is this human subjects research? IRB

Collection of Data via Interviews Has Its Downsides
- Real-time jotting down of comments (a 2nd person?)
- Digital recording: transcription, training, coding into attribute categories, organizing of data, analysis of frequencies, reporting

A Questionnaire Can Be Brief
1. Compatibility
2. Cost
3. Simplicity
4. Adaptability
5. Effectiveness
6. Observability
7. Trialability
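The seven attributes above can be turned into a brief closed-ended instrument. Below is a minimal sketch, not from the slides, of how such items might be generated; the 1-5 agreement scale and the example innovation name are assumptions made only for illustration.

```python
# Sketch of a brief seven-attribute questionnaire (attribute names from the slide;
# the 1-5 scale wording and the example innovation name are assumptions).

ATTRIBUTES = [
    "compatibility", "cost", "simplicity", "adaptability",
    "effectiveness", "observability", "trialability",
]

SCALE = {1: "Strongly disagree", 2: "Disagree", 3: "Neutral",
         4: "Agree", 5: "Strongly agree"}

def make_items(innovation_name):
    """Build one closed-ended item per attribute for a named innovation."""
    return [f"'{innovation_name}' rates well on {attr}." for attr in ATTRIBUTES]

if __name__ == "__main__":
    for item in make_items("team-based depression care"):  # hypothetical innovation
        print(item, "(rate 1-5)")
```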

Simple Tallies of Responses can be Insightful
- The more respondents the better: staff perception, staff portrayals, potential adopter perception
  - Which of these three types of respondents is most important?
- Use mean scores per attribute, whether for creating attribute matrices or going straight to an attribute profile (1x7 data representation per innovation)
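As a concrete illustration of this tallying, the sketch below computes mean scores per attribute for each respondent type (an attribute matrix) and then collapses all respondents into a single 1x7 attribute profile. The respondent-type labels follow the slide; the ratings themselves are invented.

```python
# Sketch: tally 1-5 ratings into mean scores per attribute.
# ratings[respondent_type] holds one 7-item rating list per respondent (invented data).
from statistics import mean

ATTRIBUTES = ["Compatibility", "Cost", "Simplicity", "Adaptability",
              "Effectiveness", "Observability", "Trialability"]

ratings = {
    "staff perception":             [[4, 3, 4, 3, 5, 4, 3], [5, 3, 4, 4, 4, 4, 3]],
    "staff portrayals":             [[4, 4, 3, 3, 4, 3, 3]],
    "potential adopter perception": [[3, 2, 4, 3, 4, 3, 2], [4, 3, 3, 3, 4, 3, 3]],
}

# Attribute matrix: one row of mean attribute scores per respondent type
matrix = {rtype: [round(mean(col), 2) for col in zip(*rows)]
          for rtype, rows in ratings.items()}

# Attribute profile: all respondents collapsed into a single 1x7 representation
all_rows = [row for rows in ratings.values() for row in rows]
profile = [round(mean(col), 2) for col in zip(*all_rows)]

for rtype, row in matrix.items():
    print(rtype, dict(zip(ATTRIBUTES, row)))
print("1x7 profile:", dict(zip(ATTRIBUTES, profile)))
```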

Tools for Assessing and Comparing Innovations
- Attribute matrix, innovation profile, and potential for adoption (PAR) score are simple ways of organizing numerical data
- Purpose is standardization, diagnosis, improvement, comparison
- Dearing JW, Meyer G (1994). An exploratory tool for predicting adoption decisions. Science Communication 16(1):43-57.
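One way to use these organized data for comparison is to collapse each innovation's 1x7 attribute profile into a single number. The sketch below simply averages the attribute means; the actual PAR scoring in Dearing & Meyer (1994) may weight or combine attributes differently, so treat this only as an illustration of comparing profiles, with invented numbers.

```python
# Sketch: compare innovations by collapsing each 1x7 attribute profile into one score.
# Averaging the attribute means is an assumption for illustration, not the published formula.
from statistics import mean

ATTRIBUTES = ["Compatibility", "Cost", "Simplicity", "Adaptability",
              "Effectiveness", "Observability", "Trialability"]

# Hypothetical 1x7 profiles (mean 1-5 rating per attribute) for two candidate innovations
profiles = {
    "Innovation A": [4.2, 3.1, 4.0, 3.4, 4.5, 3.8, 3.2],
    "Innovation B": [3.6, 4.0, 3.2, 3.9, 3.7, 3.1, 4.1],
}

scores = {name: round(mean(p), 2) for name, p in profiles.items()}
for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: overall score {score}")
```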

Discussion
- Do you have one priority innovation that you're considering for scale-up?
  - What sorts of feedback have you gotten to date about it, and from what types of stakeholders?
- How do you decide whom to collect formative evaluation data from?
  - Do you sample from the target population and, if so, how?

Does All This Suggest Too Much Rigor?
- Formative evaluation is usually done with small numbers of people (even questionnaires)
- These means of learning about the innovation are meant to supplement other means of helping you decide which innovations are ready
- You don't need to overcome bias, but you do want to understand it

Learning about Potential Adopters
- Data-collection can be done in one of four ways:
  - Informant interviews
  - Observation
  - Sociometric survey
  - Self-report
- All for assessing social influence

What Difference Does It Make if We Work with Influentials?
- Look at the degree of "reach" that people have in influence networks

A typical KP Colorado employee’s 2-step network neighborhood

An established KP Colorado employee’s 2-step network neighborhood

A KP Colorado bridging individual’s 2-step network neighborhood
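To make "reach" and the 2-step network neighborhood shown in these diagrams concrete, the sketch below computes each person's 2-step neighborhood (everyone they talk to, plus everyone those contacts talk to) from a who-talks-with-whom tie list. Names and ties are invented for illustration.

```python
# Sketch: measure "reach" as the size of each person's 2-step network neighborhood.
from collections import defaultdict

# Undirected "who talks with whom" ties, invented for illustration
ties = [("Ana", "Ben"), ("Ana", "Cal"), ("Ben", "Dee"),
        ("Cal", "Dee"), ("Dee", "Eli"), ("Eli", "Fay"), ("Fay", "Gus")]

neighbors = defaultdict(set)
for a, b in ties:
    neighbors[a].add(b)
    neighbors[b].add(a)

def two_step_neighborhood(person):
    """All people within two steps of `person`, excluding the person."""
    one_step = neighbors[person]
    two_step = set().union(*(neighbors[n] for n in one_step)) if one_step else set()
    return (one_step | two_step) - {person}

for person in sorted(neighbors):
    hood = two_step_neighborhood(person)
    print(f"{person}: reach = {len(hood)} ({', '.join(sorted(hood))})")
```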

Learning about Potential Adopters
- Informant interviews (snowball process)
  - Suited for large social systems
  - Identify a few informants with broad knowledge of many others (1st-round interviews)
  - Ask their opinion of whom among potential adopters they think are looked to by others (2nd-round interviews, beginning of data-collection)
  - 3rd-round interviews, 4th-round interviews, etc.
  - Stop when no new names are being generated
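The snowball logic above can be expressed as a simple loop that stops when a round of interviews generates no new names. In the sketch below, the nomination data are hypothetical stand-ins for what each interview would actually produce.

```python
# Sketch of the snowball stopping rule: interview newly named people each round
# and stop when no new names are generated. Nomination data are hypothetical.
nominations = {
    "informant_1": ["Lee", "Mia"],
    "informant_2": ["Mia", "Noa"],
    "Lee": ["Noa"],
    "Mia": ["Lee", "Omar"],
    "Noa": [],
    "Omar": ["Mia"],
}

def snowball(first_round):
    interviewed, to_interview, round_no = set(), list(first_round), 1
    while to_interview:
        named_this_round = set()
        for person in to_interview:
            interviewed.add(person)
            named_this_round.update(nominations.get(person, []))
        to_interview = sorted(named_this_round - interviewed)  # only new names
        print(f"Round {round_no}: new names for next round -> {to_interview}")
        round_no += 1
    return interviewed

snowball(["informant_1", "informant_2"])
```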

Learning about Potential Adopters
- Observation
  - Suited to small social systems
  - Train knowledgeable observers to recognize the behaviors associated with social influence (what are these?)
  - Recording sheets (roster format)
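A roster-format recording sheet pairs every member of the small social system with a set of observable influence behaviors to tally. The sketch below shows one possible structure; the behavior columns and names are assumptions included only to illustrate the roster layout.

```python
# Sketch of a roster-format recording sheet: one row per system member,
# one tally column per observed influence behavior (behavior list is an assumption).
import csv, io

ROSTER = ["Ana", "Ben", "Cal", "Dee"]
BEHAVIORS = ["asked for advice", "quoted by others", "convened discussion"]

def blank_sheet():
    """One tally row per person, one column per observed behavior."""
    return {person: {b: 0 for b in BEHAVIORS} for person in ROSTER}

sheet = blank_sheet()
sheet["Ben"]["asked for advice"] += 1  # example tally recorded during an observation session

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["person"] + BEHAVIORS)
writer.writeheader()
for person, tallies in sheet.items():
    writer.writerow({"person": person, **tallies})
print(out.getvalue())
```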

Learning about Potential Adopters
- Sociometric survey: who-to-whom questions
  - Well-suited to medium-size social systems
  - Simple administration, simple data-entry
  - A wholly different form of relational data as a result
  - Best method for identifying and representing influence
  - Partner with a grad student?
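Who-to-whom answers become relational data once each nomination is treated as a directed tie. As a first pass at identifying influentials, the sketch below counts how often each person is named (in-degree); the names and answers are invented.

```python
# Sketch: turn who-to-whom sociometric answers into directed ties and
# count nominations received (in-degree) as a simple influence indicator.
from collections import Counter

# Each respondent lists whom they go to for advice (the who-to-whom question)
answers = {
    "Ana": ["Dee", "Ben"],
    "Ben": ["Dee"],
    "Cal": ["Dee", "Ana"],
    "Dee": ["Ana"],
    "Eli": ["Dee"],
}

# Directed edges: respondent -> person named
edges = [(src, dst) for src, named in answers.items() for dst in named]
in_degree = Counter(dst for _, dst in edges)

for person, count in in_degree.most_common():
    print(f"{person}: named by {count} colleague(s)")
```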

Learning about Potential Adopters
- Self-report survey
  - Large social systems (marketing research)
  - Lower inherent validity, but validated instruments still available
  - Less intrusive; more typical types of questions
  - Partner with a grad student?

Working Through Opinion Leaders
- Does intervention with opinion leaders work to speed and spread the scale-up of innovations?
  - Yes, generally efficacious
  - Depends on how you recruit them
  - Depends on what you ask them to do

Recruitment & the Request
- Your appeal should be normative in nature
- Ask them to continue to act with the best interests of the network in mind
- To evaluate the pros and cons of innovations communicated to them
- To talk, to refer, to suggest, all within their own range of everyday behaviors

In Sum
- I have married an outstanding woman. I owe all of my accomplishments to her: she is brilliant, patient, orderly and neat, and sexy beyond words. And as a mother, she sets the example for all professional, hard-working women.

In Sum
- Formative data-collection can be conducted to improve the likelihood that your innovation can reach more potential adopters, more rapidly, and produce positive perceptions among them
- A range of tools exists for measurement of both personal perceptions and social influence

Dearing JW (2009). Applying diffusion of innovation theory to intervention development. Research on Social Work Practice 19(5):503-518.