Using Mixed Methods Research to Analyze Surveys

Using Mixed Methods Research to Analyze Surveys
Keith Wurtz, Senior Research Analyst, Chaffey College
Keith.Wurtz@chaffey.edu
www.chaffey.edu/research

What is Mixed Methods Research?
Difficult to define. Examples of definitions:
- The use of qualitative and quantitative techniques in both the collection and analysis of data
- Mixed methods research is given a priority in the research, and the integration of both the quantitative and qualitative results occurs at some point in the research process
- Research that includes both quantitative and qualitative data in a single research study, and either the QUAN or QUAL data provides data that would not otherwise be obtainable when using only the primary method
MM research is difficult to define because it is an emerging field. Here I provide some examples of the different definitions; there are more. I like the third one best because it implies a practical reason for using the MM approach.

Why is Mixed Methods Research Valuable?
- Answers questions that other modalities cannot
- Provides a deeper understanding of the examined behavior, or a better idea of the meaning behind what is occurring
- The inferences made with mixed methods research can be stronger
- Mixed methods research allows for more divergent findings
- MM research can include culture in the design by giving a voice to everyone involved in the behavior being examined
An example of how MM research allows for more divergent findings is that it can simultaneously answer exploratory and confirmatory theoretical questions, or answer both of the following: What is happening, and why is it happening? The examples presented here will show how all the voices were "heard": even though most people were satisfied with the programs being evaluated, there were still some identifiable issues once the MM piece was included in the analysis.

Collaborative MM Research
- Seeks to include stakeholders in the design and the research process
- Can be very beneficial when many of the stakeholders are more likely to be critics
- Includes less powerful groups and helps to ensure that they have an equitable impact on the research
- Collaboration has the ability to stimulate ways of thinking that might not occur when working individually on a project
This approach can work for any research design; the difference with MM research is that it includes the MM component as well. Collaboratively developed research projects may be very common for many of us. I have often collaborated with many stakeholders and gathered their input when conducting a study. Stakeholders are more likely to be better consumers of the information when they are included in the process.

Setting Up a Mixed Methods Research Study
- The key to any study is the research question(s), because the questions dictate the selection of the research methods
- In designing a study, the underlying purpose is the reason for doing it, and is a necessary component: Why are we doing the study?
- The quality of the study and the meaningfulness of the results are enhanced if we are clear about the purpose
Often, the underlying purpose for us is student success. The "why?" needs to connect to the research question. The purpose is the reason why the researcher is undertaking the study.

Six Categories of MM Research Designs
- Sequential Explanatory Design
- Sequential Exploratory Design
- Sequential Transformative Design
- Concurrent Triangulation Design
- Concurrent Nested Design
- Concurrent Transformative Design
I put these slides in here because I'm a researcher and I often like to categorize things. However, there aren't any hard and fast rules in terms of the designs. Hopefully, these may help you begin to think about how a MM research design might apply to some of your work.

Sequential Explanatory Design
- Collection and analysis of QUAN data followed by the collection and analysis of QUAL data
- Priority is usually given to QUAN data
- Integration of QUAN and QUAL data usually occurs in the interpretation phase of the study
- The purpose is usually to use the QUAL results to help explain the QUAN results
With the sequential designs, you may conduct a study that raises some additional questions, and then conduct a second study to help answer those questions. One drawback is that there are two separate phases. With the sequential explanatory design, the results from the first phase inform the development of the second phase.

Sequential Exploratory Design
- Conducted in two phases
- Priority is given to the first phase of QUAL data collection
- The second phase involves QUAN data collection
- Overall priority is given to QUAL data collection and analysis
- The findings are integrated in the interpretation phase
- The most basic purpose is to use QUAN data to help interpret the results of the QUAL phase
With the sequential exploratory design, the results from the first phase inform the development of the second phase. This type of design can help to identify or narrow the focus of possible variables in the QUAL phase and inform the development of the QUAN phase. It can also help in the development of a new instrument.

Sequential Transformative Design
- Has two distinct data collection phases
- A theoretical perspective is used to guide the study
- The purpose is to use methods that will best serve the theoretical perspective of the researcher
With the sequential transformative design, the results from the first phase inform the development of the second phase. Little research has been published using this type of design.

Concurrent Triangulation Design
- This is probably the most familiar MM design
- The QUAL and QUAN data collection are concurrent and happen during one data collection phase
- Priority could be given to either QUAL or QUAN methods, but ideally the priority between the two methods would be equal
- The two methods are integrated in the interpretation phase
- The integration focuses on how the results from both methods are similar or different, with the primary purpose being for the two sets of results to support each other

Concurrent Nested Design
- Gathers both QUAL and QUAN data during the same phase
- Either QUAL or QUAN dominates the design
- The analysis phase mixes both the QUAL and QUAN data
- The QUAL data is used to help explain or better understand the QUAN data
This is the design demonstrated here and the one that I have found most useful.

Concurrent Transformative Design
- Guided by a specific theoretical perspective
- The QUAN and QUAL data are collected during the same phase
- The integration of data occurs during the analysis phase
- The integration of data could also occur in the interpretation phase
Again, the purpose is to use methods that will best serve the theoretical perspective of the researcher.

Process of Integrating QUAL and QUAN Data
- The process of integrating QUAL and QUAN research needs to be well thought out prior to the study
- The QUAL portion needs to be constructed in a way that allows more novel information to be discovered
- Need to decide if the QUAL portion is exploratory or confirmatory
- If exploratory, the purpose is to identify other dimensions that the QUAN portion is missing
- If confirmatory, the purpose is to support the QUAN relationship
- QUAL results can also be used to explain why there wasn't a statistically significant difference
In the examples shown here, the purpose of the QUAL data was both exploratory and confirmatory. For instance, do the open-ended responses support the QUAN results and, if not, why not? You may not know whether the QUAL findings are confirmatory or exploratory until after the study is conducted.

Guidelines for Integrating QUAL and QUAN Results
- The selection of research methods needs to be made after the research questions are asked
- Some methods work well in some domains and not in others
- There is no model of integration that is better than another
- Even when the results support each other, it is possible that both the QUAN and QUAL results are biased and neither is valid
- The main function of integration is to provide additional information where the information obtained from one method alone is insufficient
- If the methods lead to divergent results, then more than one explanation is possible
You need to ask the research questions first in order to know what the best methods are; the research questions dictate the selection of research methods. The best model of integration needs to be decided each time a research project is conducted.

Integrating QUAL and QUAN Data
- One process of incorporating QUAL data with QUAN data is known as quantitizing, or quantifying the open-ended responses
- Dummy coding (i.e., binarizing) refers to giving a code of 1 when a concept is present and a code of 0 when it is not
- A helpful tool for dummy coding open-ended responses is SPSS Text Analysis, now known as PASW Text Analysis for Surveys 3.0, which runs about $1,300
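For readers who want to see what quantitizing can look like without a dedicated package like SPSS Text Analysis, here is a minimal sketch in Python/pandas. The respondent comments, concept names, and keyword lists below are hypothetical; in a real study the codes would come from reading the responses and building a codebook, not from a simple keyword match.

```python
import pandas as pd

# Hypothetical survey export: one row per respondent, with an open-ended comment.
responses = pd.DataFrame({
    "respondent_id": [1, 2, 3],
    "comment": [
        "The tutor was patient and explained the material clearly.",
        "She was often late and the sessions felt rushed.",
        "Great attitude, but struggled with classroom behavior.",
    ],
})

# Hypothetical codebook: concept -> keywords that signal the concept is present.
codebook = {
    "patience": ["patient", "patience"],
    "attendance_issue": ["late", "absent"],
    "behavior_management": ["behavior", "out of control"],
}

# Dummy code (binarize): 1 if any keyword for the concept appears, 0 if not.
for concept, keywords in codebook.items():
    pattern = "|".join(keywords)
    responses[concept] = (
        responses["comment"].str.contains(pattern, case=False).astype(int)
    )

print(responses[["respondent_id"] + list(codebook)])
```

Each resulting 0/1 column can then be analyzed alongside the quantitative items (frequencies, cross-tabs, and so on).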

Presenting MM Research Findings
- As with any research findings, if they cannot be communicated to the people who can use the information, then the findings are worthless
- Presenting MM research can be more challenging because we are trying to communicate two types of information to readers
- For instance, writing up QUAN research is very well defined, while QUAL research is more often about discovery

Ensuring that MM Findings are Relevant
- Include stakeholders in the planning of the research
- Using a MM research design may help a wider range of audiences connect to the material
- Make sure to define the language used in the report
- It is important to decide how the MM research findings are going to be written up: combined or separately
Defining the language used in the report is important because MM research is a new field and there isn't a lot of agreement on many of the definitions. For instance, triangulation may refer to methodology or to different points of view.

MM Research Study Example
- The IR Office at Chaffey was asked to examine the satisfaction of K-12 districts with Chaffey College students who were working at a K-12 school in Chaffey's district as paid tutors
- 29 tutors were evaluated
Here I am going to show how I broke a lot of the rules that I just covered, and why I chose this program first. This was the first MM study I conducted. It was very small and much easier to manage than the last one I conducted, which I will discuss later. My main motivation for wanting to conduct MM research is that I have done numerous satisfaction surveys, and presenting only the QUAN data usually indicates that students are mostly satisfied with everything the program offers, which in reality cannot be that accurate.

MM Research Study Example
- The evaluation form was not developed by IR
- Paid tutors were evaluated on five job qualification areas: job skills, job knowledge, work habits, communication skills, and attitude
- A three-point rubric was used to evaluate paid tutors: did not meet the requirement, met the requirement, exceeded requirements
- Evaluators were also asked to provide comments
This most closely matches a Concurrent Nested Design: it gathers both QUAL and QUAN data during the same phase, either QUAL or QUAN dominates the design, the analysis phase mixes both the QUAL and QUAN data, and the QUAL data is used to help explain or better understand the QUAN data.

MM Research Study Example
How did I combine the qualification ratings (QUAN) with the evaluator comments (QUAL)? I found an example of how to do this in Sandelowski (2003), who describes an approach where the QUAN responses are categorized and themes for each category are generated from the open-ended comments.

MM Research Study Example
The first step is to create the categories from the QUAN data. This step involves being very familiar with your data, and also some creativity. With the paid tutor evaluation it was fairly easy to develop the categories:
- Paid tutors who received a perfect rating in every category (n = 13)
- Paid tutors who had an average rating equal to or above the mean (n = 5)
- Paid tutors who had an average rating below the mean (n = 11)
Developing the categories for the paid tutor study was relatively easy; the two examples that I will go over later required a little more work. The overall average was computed by summing all of the job qualification category ratings and dividing by the number of job qualification categories (M = 2.51).
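A minimal sketch of this categorization step in Python/pandas, assuming the ratings sit in a table with one row per tutor and one column per job qualification area, coded 1-3 on the rubric; the column names and sample values are hypothetical.

```python
import pandas as pd

# Hypothetical layout: one row per paid tutor, each of the five job
# qualification areas coded on the 3-point rubric (assumed coding:
# 1 = did not meet, 2 = met, 3 = exceeded requirements).
areas = ["job_skills", "job_knowledge", "work_habits",
         "communication_skills", "attitude"]
evals = pd.DataFrame(
    [[3, 3, 3, 3, 3],
     [3, 2, 3, 3, 3],
     [2, 2, 1, 2, 2]],
    columns=areas,
)

# Overall average = sum of the area ratings / number of areas.
evals["overall"] = evals[areas].mean(axis=1)
grand_mean = evals["overall"].mean()  # M = 2.51 in the actual study

# The three QUAN categories used to group the QUAL comments.
def categorize(avg: float) -> str:
    if avg == 3.0:
        return "perfect rating in every area"
    if avg >= grand_mean:
        return "at or above the mean"
    return "below the mean"

evals["category"] = evals["overall"].apply(categorize)
print(evals[["overall", "category"]])
```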

MM Research Study Example
Mixing both the QUAL and QUAN data in the analysis phase: after I created the three categories, I printed out the comments associated with the paid tutors in each category and identified a theme for each one.
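Continuing the sketch under the same assumptions, once each tutor has a QUAN category the comments can be pulled together by group so a theme can be read out of each one; the frame and comments below are hypothetical, and the theme itself still comes from the researcher reading the comments.

```python
import pandas as pd

# Hypothetical frame: each tutor's QUAN category plus the evaluator's comment.
rated = pd.DataFrame({
    "category": [
        "below the mean",
        "at or above the mean",
        "perfect rating in every area",
        "below the mean",
    ],
    "evaluator_comment": [
        "Late several times; needs support in behavior management.",
        "Worked very well with students; attendance is a weakness.",
        "Reliable, hard working, and a wonderful communicator.",
        "Needed more initiative when working with small groups.",
    ],
})

# Print the comments grouped by category; the researcher reads each group
# and writes a theme for it (e.g., attendance, initiative, behavior management).
for category, group in rated.groupby("category"):
    print(f"\n== {category} ==")
    for comment in group["evaluator_comment"]:
        print("-", comment)
```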

MM Research Study Example
Evaluator comments about tutors with a lower than average (i.e., below 2.51) rating. Themes identified included the following: lack of initiative, low attendance, and poor behavior management skills.
Sample of evaluator comments:
- "[NAME] had plenty of subject smatter knowledge just needs support in behavior management. Perhaps that could be included in prep program at Chaffey."
- "She was late several times and therefore couldn't complete the task assigned. She was positive and caring with children. The students really liked her and were motivated, but she had some difficult to handle students who occasionally got out of control."

MM Research Study Example
Evaluator comments about tutors with an average or above average rating (2.57-2.99). Themes were very positive, but these paid tutors were rated low in one or two areas.
Sample of evaluator comments:
- "[NAME] worked very well with my students. She had a lot of patience with them."
- "[NAME] is an excellent role model for my students. His attendance is his weakness; we depend on him and it impacts our program when he doesn't come and work."

MM Research Study Example
Evaluator comments about tutors who were rated as exceeding job expectations in all areas. These tutors received very positive comments.
Sample of evaluator comments:
- "[NAME]'s enthusiastic attitude, ability to relate to students, and knowledge of content assisted him in helping our students become successful."
- "[NAME] was reliable, hard working, and a wonderful communicator to the student. [NAME] always offered to do more no matter what the task. Thorough tutor!"

Creating QUAN Categories for a Second MM Research Study
- Students in Fall 2007 and Spring 2008 rated SI Leaders in nine areas on a four-point agreement scale
- A much smaller percentage of students provided comments about their SI Leader
- An overall average was computed for those who commented by summing each student's scores and dividing by 9 (M = 3.45)
The n is left out here because the number of students who both commented and quantitatively rated their SI Leader on 6 or more of the 9 areas is very small.

Creating QUAN Categories for a Second MM Research Study
The categories in the SI study were a little more difficult to develop:
- Students who rated SI Leaders below the average of 3.45 (n = 7)
- Students who rated SI Leaders at the average up to 1 standard deviation above the mean (SD = .35, 3.45-3.64, n = 8)
- Students who rated SI Leaders 1 SD above the mean (3.65-4.00, n = 8)
This includes just the Fall data. I needed to do something a little different here to try to get a somewhat even distribution across categories.
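A sketch of the same step for the SI study, this time using the cutoffs reported above (3.45 as the mean and 3.65 as the start of the "1 SD above the mean" group); the student ratings in the frame are hypothetical.

```python
import pandas as pd

# Hypothetical frame: one row per student who both commented on and
# quantitatively rated their SI Leader; 'overall' is the student's
# nine-area average on the 4-point agreement scale.
ratings = pd.DataFrame({"overall": [3.10, 3.50, 3.60, 3.80, 4.00, 3.30, 3.70]})

# Cutoffs reported in the study: M = 3.45, with 3.65 marking the start
# of the "1 SD above the mean" group.
MEAN_CUTOFF = 3.45
UPPER_CUTOFF = 3.65

def si_category(avg: float) -> str:
    if avg < MEAN_CUTOFF:
        return "below the mean"
    if avg < UPPER_CUTOFF:
        return "mean up to 1 SD above the mean"
    return "1 SD above the mean"

ratings["category"] = ratings["overall"].apply(si_category)
print(ratings.groupby("category").size())
```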

Limitations
- The proportion of open-ended responses to quantitative responses
- The amount of time required to do any MM research study (how do you choose?)
Activity: ask participants to review the success center tables and find a theme for each category. Have people do different ones and compare, leading to the idea that another limitation is that interpretations may differ.
Mixed methods findings (see Tables 21-26): students who were less satisfied with the success centers expressed concerns about the noise level in the centers as well as feeling that they were not treated well.
- "The success center at times is extremely noisy. It has happened where I get frustrated and leave because I can not focus due to the distractions."
- "Need to improve customer service, front staff are rude."

Stakeholder Comment
"Based on survey results from the annual Student Satisfaction Survey, I have made several decisions regarding tutor training, center-related curriculum, and staffing. While the majority of students were satisfied with their center experience and thought the tutors were friendly and helpful (3.62 rating out of 4.0), students gave a lower rating to some other aspects of tutoring and center-related activities (see Table 19D in Spring 2008 Survey results). As a result, I asked my tutors this year to complete a self-assessment in order to cause them to think more about their tutoring and how they can improve their tutoring approach."

Stakeholder Comment
Even when presenting data in a variety of ways (i.e., charts, graphs, and other visuals), quantitative research seems difficult to absorb for many campus stakeholders. For those lacking a broader statistical context for understanding the information, even significant results can lose their impact. By combining quantitative data with narrative responses from open-ended questions, the 2008 Student Satisfaction Survey provided a more accessible tool to communicate program efficacy to the various constituent groups that support and rely on the Chaffey College Success Centers. When showcasing results in this manner to department faculty and administrators, individuals had a much clearer understanding of the information and had less difficulty relating that information directly to student success.

References
Blank, E.C., Venkatachalam, P., McNeil, L., & Green, R.D. (2005). Racial discrimination in mortgage lending in Washington, D.C.: A mixed methods approach. Review of Black Political Economy, 33, 9-30. Retrieved June 10, 2008 from the SocINDEX database.
Creswell, J.W., Clark, V.L.P., Gutmann, M.L., & Hanson, W.E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209-240). Thousand Oaks, CA: Sage.
Erzberger, C., & Kelle, U. (2003). Making inferences in mixed methods: The rules of integration. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 457-488). Thousand Oaks, CA: Sage.
Greene, J.C., & Caracelli, V.J. (2003). Making paradigmatic sense of mixed methods practice. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 91-110). Thousand Oaks, CA: Sage.
Mertens, D.M. (2003). Mixed methods and the politics of human research: The transformative-emancipatory perspective. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 135-164). Thousand Oaks, CA: Sage.
Miller, S. (2003). Impact of mixed methods and design on inference quality. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 423-455). Thousand Oaks, CA: Sage.
Miller, S.I., & Gatta, J.L. (2006). The use of mixed methods and designs in the human sciences: Problems and prospects. Quality & Quantity, 40, 595-610. Retrieved June 1, 2008 from the Academic Search Premier database.
Moghaddam, F.M., Walker, B.R., & Harre, R. (2003). Cultural distance, levels of abstraction, and the advantages of mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 111-134). Thousand Oaks, CA: Sage.
Morse, J.M. (2003). Principles of mixed methods and multimethod research design. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 189-208). Thousand Oaks, CA: Sage.
Newman, I., Ridenour, C.S., Newman, C., & DeMarco, G.M.P. (2003). A typology of research purposes and its relationship to mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 167-188). Thousand Oaks, CA: Sage.
Oermann, M.H., Galvin, E.A., Floyd, J.A., & Roop, J.C. (2006). Presenting research to clinicians: Strategies for writing about research findings. Nurse Researcher, 13, 66-74. Retrieved July 25, 2008 from the Academic Search Premier database.
Onwuegbuzie, A.J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351-383). Thousand Oaks, CA: Sage.
Posavac, E.J., & Carey, R.G. (2007). Program evaluation: Methods and case studies (7th ed.). Upper Saddle River, NJ: Prentice-Hall.
Sandelowski, M. (2003). Tables or tableaux? The challenges of writing and reading mixed methods studies. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 321-350). Thousand Oaks, CA: Sage.
Shulha, L.M., & Wilson, R.J. (2003). Collaborative mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 639-669). Thousand Oaks, CA: Sage.
Tashakkori, A., & Teddlie, C. (2003). Major issues and controversies in the use of mixed methods in the social and behavioral sciences. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 3-50). Thousand Oaks, CA: Sage.