Developing a feedback questionnaire: Principles and steps
Workshop for NHS staff
28 Dec 1999 (Tuesday)
Kam-Por Kwan, EDU, 2766 6287



2 Workshop outline
Why use a feedback questionnaire?
How to develop a feedback questionnaire that gives useful and truthful information?
–What are the common problems?
–How to write good evaluation items?
–How to determine whether the questionnaire is valid and reliable?
How to interpret student feedback in a meaningful way?
Developing a student feedback questionnaire on clinical experience

3 Why use a feedback questionnaire?
Economical to administer to the whole group, in terms of both time and effort
Allows anonymity of responses
Allows respondents to control their own pace of response
BUT
Less chance to probe for clarification
Emphasis on the evaluator's rather than the respondents' perspectives
Reliability, validity, and usefulness depend on the items included

4 Common problems
Feedback questionnaires often fail to provide true and useful information because:
–the items are constructed on an ad hoc basis, without any theoretical framework behind them
–they ask about things that the (student) raters cannot validly comment on
–the items are ambiguous to (student) raters and/or difficult to understand or interpret
–the interpretations of the items are unclear
–the items are too standardised to be useful
–respondents are not motivated to complete the questionnaire seriously

5 A systematic approach
7 steps to developing a questionnaire:
–determining the focus of the evaluation
–identifying all the underlying dimensions and sub-dimensions involved
–drafting the questionnaire items
–designing the questionnaire: instructions, sequencing, etc.
–pilot testing the questionnaire
–revising the questionnaire and items
–implementing the questionnaire

6 Identifying focus and dimensions
[Slide diagram: clinical supervision is broken into dimensions such as clear guidelines, feedback, and support, all contributing to learning. The feedback dimension is illustrated by two sub-dimensions: providing regular feedback on students' clinical performance, and providing useful comments on what and how to improve.]

7 Examining the dimensions
Task 1
–Examine the draft questionnaire and identify, for each item: the underlying dimension that it purports to measure, and what kind of variable (presage, process, or product) is being measured.
–What other dimensions or variables do you think should be included in the questionnaire?

8 Problematic items
Task 2: What's wrong with the items?
In groups, discuss the problems of including the following items in a student feedback questionnaire.
–The teacher seemed to have an up-to-date knowledge of evidence-based nursing practice.
–The teacher worked hard to demonstrate clearly to me the proper skills of history-taking.
–My progress was a major concern of the teacher.

9 More problematic items
–I was provided with informed and constructive feedback on my performance by the teacher immediately after my clinical practice.
–I found it difficult to apply the theory I learned at university to my clinical practice.
–Every student was encouraged to participate in class discussions.
–The teacher did not discourage me from using techniques that are not evidence-based.
–Appropriate computer technology and AV aids were used by the teacher to facilitate learning.

10 Principles of item writing (1)
Use simple English and simple sentence structure
Avoid questions that the intended respondents do not have the knowledge to comment on
Avoid ambiguous questions or wordings that may have alternative interpretations
Avoid double-barrelled questions (items containing more than one idea)

11 Principles of item writing (2)
Avoid unnecessary jargon that may not be understood by the intended respondents
Avoid words like "all the time", "never", "every", ...
Avoid double negatives
Avoid questions with unwarranted underlying assumptions

12 Using open-ended questions
Limitations of ratings:
–reflect the evaluator's rather than the respondents' perspectives
–suggest whether improvements are needed, but not why or how
–give a false sense of objectivity and precision
Open-ended questions:
–allow the respondents' perspectives to emerge
–offer chances for respondents to clarify personal meanings and suggest changes

13 Optional questions
Standardised questions:
–allow comparisons across units or teachers
BUT
–cannot cater for individual needs
Optional questions:
–allow users to collect information on aspects specific to the individual units or teachers
–useful for improvement purposes

14 Revising the draft questionnaire
Task 3
In groups, suggest how the draft questionnaire might be further modified to make it more useful and valid. You might consider:
–adding new items / deleting redundant items
–rewording the items as needed to make their meaning clearer and less ambiguous
–inserting open-ended questions
–the possibility of allowing optional questions to be included

15 Good design
As short and sharp as possible (a few short questionnaires at different points may be better than one long questionnaire at the end)
Appeals to the intended respondents, with clear instructions
Questions arranged in good psychological order, from general to more specific
Attractive and neat in appearance
Clearly duplicated / printed
Easy to code and interpret (see the coding sketch below)
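To illustrate the last point, here is a minimal coding sketch of ours (not part of the original workshop), assuming a 5-point agreement scale and hypothetical item IDs; negatively worded items are reverse-scored so that a higher code always means a more favourable response:

    # Hypothetical coding scheme for a 5-point Likert agreement scale.
    LIKERT_CODES = {
        "strongly disagree": 1,
        "disagree": 2,
        "neutral": 3,
        "agree": 4,
        "strongly agree": 5,
    }

    # Hypothetical IDs of negatively worded items, to be reverse-scored.
    NEGATIVE_ITEMS = {"item_07", "item_12"}

    def code_response(item_id: str, answer: str) -> int:
        """Convert a verbal answer into a numeric code for analysis."""
        score = LIKERT_CODES[answer.strip().lower()]
        if item_id in NEGATIVE_ITEMS:
            score = 6 - score  # reverse-score on a 1-5 scale
        return score

    print(code_response("item_01", "Agree"))  # 4
    print(code_response("item_07", "Agree"))  # 2 (reverse-scored)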

16 Validity and Reliability (1)
Validity
–Does the questionnaire measure what it is supposed to measure?
–Do the items together measure the most significant aspects of the evaluation question?
–Improving validity by:
judgement of a panel of experts (one way to quantify this is sketched below)
pilot testing on a sample of intended respondents
relating the items to a theory of teaching
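The slides do not prescribe a particular way of summarising expert judgements, but one widely used index is Lawshe's content validity ratio (CVR). A minimal sketch, assuming each panelist rates an item as "essential" or not:

    def content_validity_ratio(n_essential: int, n_panelists: int) -> float:
        """Lawshe's CVR = (n_e - N/2) / (N/2); ranges from -1 to +1."""
        half = n_panelists / 2
        return (n_essential - half) / half

    # Hypothetical example: 8 of 10 experts rate an item "essential".
    print(content_validity_ratio(8, 10))  # 0.6 -> broad agreement to keep the item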

17 Validity and Reliability (2)
Reliability
–Does the questionnaire give consistent results for whatever it is measuring?
–Do the items yield results that agree with each other, and are they consistent over time?
–Improving reliability by establishing the:
internal reliability of the instrument, scales, and sub-scales (illustrated below)
test-retest reliability
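Internal reliability is commonly estimated with Cronbach's alpha. The following is a small illustration of ours, not taken from the workshop, using hypothetical data for five respondents and four items:

    def cronbach_alpha(scores):
        """Cronbach's alpha for a respondents-by-items matrix of scores.

        alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)
        """
        k = len(scores[0])  # number of items

        def variance(xs):
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        item_vars = [variance([row[j] for row in scores]) for j in range(k)]
        total_var = variance([sum(row) for row in scores])
        return k / (k - 1) * (1 - sum(item_vars) / total_var)

    # Hypothetical data: 5 respondents answering 4 items on a 1-5 scale.
    data = [
        [4, 5, 4, 4],
        [3, 3, 2, 3],
        [5, 5, 5, 4],
        [2, 2, 3, 2],
        [4, 4, 4, 5],
    ]
    print(round(cronbach_alpha(data), 2))  # 0.94; values above ~0.7 are usually taken as acceptable

Test-retest reliability can be checked analogously, by correlating total scores from two administrations of the same questionnaire to the same respondents.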

18 Student feedback: what research says
Quite reliable and consistent
Reasonably valid
Relatively uncontaminated by variables seen as sources of potential bias
Useful for a number of purposes
BUT
An imperfect measure of teaching
Must be interpreted in context
Useful as only one source of information among others
Can be abused if not interpreted properly

19 Nature of student feedback
Subjective perceptions
Based on what students have directly experienced
Influenced by students' own characteristics, such as prior knowledge, motivation, interest, etc.
Reflects students' implicit theories of teaching and learning

20 Some pitfalls in interpretation
Treating student feedback as if it were a totally objective, precise, and truthful indicator of teaching
Over-interpreting small differences in ratings
Comparing ratings across units or teachers without considering the context
Ranking units/teachers by their total scores

21 Interpretation guidelines
Avoid over-interpreting small differences: only crude judgements can be made (see the sketch after this list)
Focus on the relative strengths and weaknesses reflected in the profile of ratings, rather than on the total scores
Interpret feedback in context: take the features of the centre and the students into consideration
Consider ratings from different classes, and over a number of years
Check student feedback against other sources of evidence
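To make the first guideline concrete, here is a small illustration of ours (not from the slides): an approximate 95% confidence interval around the mean rating of two hypothetical classes shows that a small difference in means is well within sampling noise:

    import math

    def mean_ci(ratings, z=1.96):
        """Return the mean and an approximate 95% confidence interval."""
        n = len(ratings)
        mean = sum(ratings) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in ratings) / (n - 1))
        half_width = z * sd / math.sqrt(n)
        return mean, mean - half_width, mean + half_width

    # Hypothetical ratings (1-5 scale) from two classes of 12 students.
    class_a = [4, 5, 3, 4, 4, 5, 3, 4, 4, 3, 5, 4]
    class_b = [4, 4, 4, 5, 4, 4, 4, 3, 4, 4, 4, 5]

    for name, ratings in (("A", class_a), ("B", class_b)):
        m, lo, hi = mean_ci(ratings)
        print(f"Class {name}: mean {m:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")

    # The intervals overlap heavily, so the small gap between the two
    # means does not justify ranking one class's teaching above the other's.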

22 Some final words
Student feedback is a useful source of information, not a verdict
Student feedback cannot replace the professional judgement of the teacher
Teachers' self-reflection on the feedback collected is the key to improvement