Janet Maher November 1, 2011
Curriculum Development:
◦ Design and monitor a collaborative process for achieving consensus on core competencies
◦ Participant/Learner satisfaction with the learning experience
◦ Participant/Learner outcomes linked to the learning experience
To ensure that study protocols meet the highest possible standards with regard to:
◦ Quality of Evidence
◦ Format (engagement, ease of use)
◦ Practicality (usefulness in applied setting)
◦ Feasibility (ease of implementation)
◦ Maintenance (amount of time and cost required to maintain knowledge)
Background
◦ Historically, a disjuncture between quantitative and qualitative research over the rigour and trustworthiness of analysis and, by implication, of the data used: methodological debates developed 'parallel' standards for qualitative and quantitative research.
◦ Since 2000: a move back to reviewing both types in terms of validity and reliability and the strategies for achieving them.
◦ This presentation focuses primarily on quality of evidence and relies on two main sources, both posted on igloo: Morse et al. (2002) in nursing and Golafshani (2003) in education.
Reliability
Refers to
◦ Reproducibility of an outcome
Tested by
◦ Whether the study protocol sufficiently specifies a procedure so that others can use it to achieve consistent and stable results with other, similar populations
Validity
Refers to
◦ The extent to which a given test accurately represents the features of the phenomena it is intended to describe, explain, or theorize (Hammersley, 1992)
Tested by
◦ Agreement between two or more efforts to measure the same thing using different indicators or measures
The main objective is to persuade peers that the results can be generalized.
◦ The protocol should summarize criteria, including interest in typical or critical cases, in a similar fashion to quantitative strategies, including:
  - Sampling strategy
  - Review at several points by different coders/analysts during data collection, not just at the end (one way to quantify coder agreement is sketched below)
  - Triangulation: use and document different strategies to validate
  - Verification strategies: audit trail, member check
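An illustrative aside, not part of the original slides: when two coders/analysts apply codes to the same material, their agreement can be quantified with Cohen's kappa, a standard chance-corrected agreement statistic. The sketch below is a minimal Python illustration; the function name and the example codes are hypothetical and only meant to show the calculation.

```python
# Minimal, self-contained sketch (illustrative only): Cohen's kappa for two coders
# who assigned categorical codes to the same set of items.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' categorical codes."""
    assert len(coder_a) == len(coder_b) and coder_a, "need matched, non-empty codings"
    n = len(coder_a)

    # Observed agreement: share of items given the same code by both coders.
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Agreement expected by chance, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(coder_a) | set(coder_b))

    return (observed - expected) / (1 - expected)

# Hypothetical data: two coders classify ten interview excerpts into themes.
codes_a = ["barrier", "enabler", "barrier", "neutral", "barrier",
           "enabler", "neutral", "barrier", "enabler", "barrier"]
codes_b = ["barrier", "enabler", "neutral", "neutral", "barrier",
           "enabler", "neutral", "barrier", "barrier", "barrier"]

print(f"Cohen's kappa: {cohens_kappa(codes_a, codes_b):.2f}")  # about 0.68 here
```

Values commonly read as moderate-to-substantial agreement fall roughly between 0.4 and 0.8; the point of the sketch is only that "review by different coders" can be documented with a concrete agreement figure rather than an impression.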
Suggest we revisit the process of the past 10 days or so of collaborating on the design of our brand:
◦ Sampling strategy: the whole core team, with shared objectives and a range of expertise/competencies, some overlapping
◦ Review at several points by different coders/analysts during data collection, not just at the end: moderator(s) intervene and take account of contrary information
◦ Triangulation: not done formally, but could be done based on the materials on our forums
◦ Verification strategies: not done formally, but documentation is available; it would be good if it were all in one place
To summarize, consider the advantages of a mixed-methods approach and look at the other criteria for study tools:
◦ Format (engagement, ease of use)
◦ Practicality (usefulness in applied setting)
◦ Feasibility (ease of implementation)
◦ Maintenance (amount of time and cost required to maintain knowledge)
◦ Do you agree with the primary focus on quality of evidence?
◦ Can we commit to careful documentation of the process through the igloo community space(s)?
◦ Can you help me with some better search terms around polling and distance education?
Thanks, Janet