1
Studying the use of research knowledge in public bureaucracies
Mathieu Ouimet, Ph.D.
Department of Political Science, Faculty of Social Sciences
CHUQ Research Center
KT National Seminar Series, December 9, 2010, 12:00-13:00 ET
2
Learning objectives
1. To define research knowledge and research use
2. To report the results of a cross-sectional study of the use of research evidence in health and non-health ministries
3. To invite researchers to use a variety of methodological approaches
3
1. Key definitions and clarifications
4
Defining scientific knowledge
- Seldom conceptually defined in research utilization (RU) studies
- The demarcation problem remains unresolved
- KKV* definition of scientific research:
  1. The goal is inference, causal or descriptive
  2. The procedures are public, to allow assessment
  3. The conclusions are uncertain, and this should be made explicit
  4. The content is the method, rather than the subject matter
- An object of contention: nomothetic vs. idiographic approaches

* King G, Keohane RO, Verba S. (1994). Designing Social Inquiry. Princeton (NJ): Princeton University Press.
5
Defining research utilization (RU)
- Instrumental / conceptual / symbolic use
- RU standards (Knott & Wildavsky, 1980) as outcomes to measure:
  1. Reception: when studies reach users
  2. Cognition: when studies are read, digested, and understood
  3. Reference: when studies change users' frame of reference
  4. Effort: when users fight for the adoption of studies' recommendations
  5. Adoption: when studies influence policy adoption
  6. Implementation: when studies influence policy implementation
  7. Impact: when policy stimulated by studies yields tangible benefits (outcomes)
6
2. Cross-sectional study of policy analysts in health and non-health ministries
8
STUDY AIM
OVERALL OBJECTIVE: to identify significant correlates of research use among policy analysts working at the ministerial level.
SPECIFIC OBJECTIVE: to provide empirical evidence on the magnitude of the association between direct interactions with researchers and research use, while adjusting for other correlates.
9
STUDY LIMITATIONS
- Cross-sectional nature of the data
- Self-reported data (social desirability bias, recall bias)
- The study does not document what determines which research articles, reports, or books the policy analysts read
- Observational rather than experimental (the study misses the step of demonstrating experimentally that changes in the correlates will have the desired effects and are not simply manifestations of some deeper cause)
10
METHODS /1
DESIGN: a random-digit-dialing cross-sectional telephone survey.
PARTICIPANTS: policy analysts, defined as civil servants belonging to 14 professional groups.
SETTING: 17 ministries, including Health & Social Services.
DATA COLLECTION: questionnaire administered by a small survey firm between 26 September and 25 November 2008 using computer-assisted telephone interviewing (CATI), which allows for simultaneous data entry and data coding.
n = 1614 (response rate = 62.48%; see the quick check below)
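As a quick sanity check on the figures above, a minimal sketch that back-computes the implied eligible sample size. It assumes the response rate is simply completed interviews divided by eligible sampled numbers; the slides do not state which response-rate formula was actually used:

```python
# Back-of-the-envelope check of the survey figures reported above.
# Assumption: response rate = completed interviews / eligible sample.

completed = 1614          # completed interviews (n reported on the slide)
response_rate = 0.6248    # 62.48% as reported

eligible = completed / response_rate
print(f"Implied eligible sample: ~{eligible:.0f} analysts")  # ~2583
```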
11
METHODS /2
SURVEY QUESTIONS: closed-ended questions only.
MAIN OUTCOMES:
1. Consultation of scientific articles
2. Consultation of academic research reports
3. Consultation of academic books (chapters)
12
METHODS /3
- Three ordinal regression models (one per outcome)
  - Modifiable correlates (e.g., direct interactions with researchers, perceived relevance of academic research)
  - Unmodifiable correlates (e.g., training type, disciplinary field of training, gender, age, policy sector, policy stage)
- Post-estimation simulations (see the sketch below)
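To make the modelling step concrete, a minimal sketch of one of the three ordinal regressions, using statsmodels' OrderedModel (proportional-odds logit). The variable names, the four-level response scale, and the synthetic data are illustrative assumptions, not the study's actual dataset or specification:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 1614  # same size as the survey, purely for illustration

# Hypothetical predictors (names assumed, not from the study instrument)
df = pd.DataFrame({
    "interactions_with_researchers": rng.integers(0, 2, n),
    "perceived_relevance": rng.uniform(0, 1, n),
    "db_access_at_workstation": rng.integers(0, 2, n),
})

# Synthetic latent propensity to consult scientific articles
latent = (1.2 * df["interactions_with_researchers"]
          + 1.5 * df["perceived_relevance"]
          + 0.8 * df["db_access_at_workstation"]
          + rng.logistic(size=n))

# Ordinal outcome: consultation frequency. The four-level scale is an
# assumption; the slides only show "monthly" and "weekly" categories.
outcome = pd.cut(latent, bins=[-np.inf, 0.5, 1.5, 2.5, np.inf],
                 labels=["never", "rarely", "monthly", "weekly"])

# Ordered logit model, analogous to one of the three fitted in the study
model = OrderedModel(outcome, df, distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```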
13
Percentage distribution for the correlates considered in the study /1
14
Percentage distribution for some correlates considered in the study /2
15
Percentage distribution for the types of documents consulted monthly or weekly (n = 1614)
16
Consultation of different types of documents – Health and Social Services (n = 100) (monthly & weekly consultation combined)
17
Consultation of scientific articles across policy sectors (monthly & weekly consultation combined)
18
Percentage distribution for the three outcome variables across policy sectors (monthly and weekly consultation combined)
19
Correlates positively and significantly associated with the three outcome variables
[Table of correlates not reproduced in the slide text]
Only two correlates were not significantly associated with any outcome variable: gender, and being solely involved in policy evaluation rather than in policy formulation.
20
Statistical simulations /1
Each unmanipulable correlate was fixed to a specific value, using descriptive statistics as a guideline. Fixed unmanipulable correlates:
- disciplinary field of training: human and social sciences
- English reading: yes
- type of studies preferred: quantitative studies
- age: 40–49 years
- gender: men
- production of written advice: yes
- proportion of working time spent in meetings: second quartile (6.68%–14.28%)
- policy stages: two policy stages
21
Statistical simulations /2
- Four manipulable correlates were shifted simultaneously from their minimum (0) to their maximum value (1):
  - interactions with researchers in human and social sciences
  - continuing professional development involving scientific content
  - reported access to electronic bibliographic databases from one's own workstation
  - perceived relevance of academic research evidence
- The combined marginal effect was computed 16 times, once for each policy sector, and then averaged.
- The same procedure was repeated after changing the training-type value from 'undergraduate' to 'research Master's/PhD'.
- Also reported: the lowest and highest combined marginal effects observed in specific policy sectors.
(A sketch of this simulation procedure follows.)
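To illustrate the simulation logic on the two preceding slides, a minimal, self-contained sketch: it fits a toy ordered-logit model on synthetic data (so it runs as-is), then shifts the four modifiable correlates from their minimum to their maximum and reports the change in the predicted probability of weekly consultation. Variable names, 0/1 coding, and coefficients are illustrative assumptions; the study's actual procedure also fixed the unmanipulable correlates at the values listed above and averaged over the 16 policy sectors:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 1614

# The four modifiable correlates from the slide, coded 0/1 here for
# simplicity (the exact coding used in the study is an assumption).
cols = ["interactions", "prof_development", "db_access", "perceived_relevance"]
X = pd.DataFrame(rng.integers(0, 2, size=(n, len(cols))), columns=cols)

# Synthetic ordinal outcome, purely to have a fitted model to simulate from
latent = X.to_numpy() @ np.array([1.0, 0.6, 0.8, 1.2]) + rng.logistic(size=n)
y = pd.Series(pd.cut(latent, bins=[-np.inf, 0.8, 1.8, 2.8, np.inf],
                     labels=["never", "rarely", "monthly", "weekly"]))

result = OrderedModel(y, X, distr="logit").fit(method="bfgs", disp=False)

# Combined marginal effect: shift all four correlates from their minimum
# to their maximum at once and compare P(weekly consultation).
lo = pd.DataFrame([[0, 0, 0, 0]], columns=cols)
hi = pd.DataFrame([[1, 1, 1, 1]], columns=cols)
p_lo = result.model.predict(result.params, exog=lo)[0, -1]  # P(weekly) at min
p_hi = result.model.predict(result.params, exog=hi)[0, -1]  # P(weekly) at max
print(f"Combined marginal effect: {100 * (p_hi - p_lo):+.1f} percentage points")
```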
22
Percentage-point increase or decrease in the probability of weekly consultation of scientific articles*
* Simulated marginal effect of a simultaneous change in modifiable correlates on weekly consultation of scientific articles
23
3. Some challenges for future research in this field
24
Research challenges
- Measuring research use objectively in public bureaucracies
- Measuring research use against other standards (e.g., benefits, health outcomes)
- Documenting the research knowledge infrastructure found in ministries and studying its effect on policy analysts' utilization behaviour
- Opening up the black box of research knowledge
- Opening up the black box of direct interactions
- Conducting experimental research in ministries and agencies
25
Thank you