Evaluating reaction to distance training programs: combining quantitative and qualitative methods

ABSTRACT
This study aims to trigger a discussion about the combined use of quantitative and qualitative methods to evaluate trainee reaction to training programs. Recent research conducted in Brazil produced a valid and consistent 12-item questionnaire that evaluates trainee reactions to traditional instructional procedures and to procedures based on new information technologies. Research on trainee reaction has also been done with qualitative methods, such as content analysis of learners' answers to open questions about strategies, content, layout, and other aspects of web-based courses.

INTRODUCTION
Because organizational environments constantly demand that employees learn continuously, it is crucial that training programs reach every individual effectively. According to Kirkpatrick (1976), training can be evaluated at four levels: reaction, learning, job performance, and results. Trainee reaction to training is a highly relevant variable, usually associated with high levels of training impact at work. Despite this importance, valid and reliable measures of trainee reaction are still rare (Abbad, Gama & Borges-Andrade, 2000). According to Borges-Andrade (2002), studying reaction usually involves a questionnaire with a Likert-type scale and at least one open question, which demands content analysis. Qualitative and quantitative analyses usually show high degrees of correspondence. This study presents two experiences in examining trainee reaction to distance training programs. The first is the validation of a questionnaire that measures trainee reaction to the instructional procedures used in two distance education courses. The second describes the use of qualitative methods in the formative evaluation of a web-based training program.
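The evaluation design just described (Likert ratings plus content-analysed open answers) can be sketched as a small analysis routine. The ratings and coding categories below are hypothetical, invented only to illustrate how a quantitative item summary and a qualitative category tally sit side by side:

```python
from collections import Counter
from statistics import mean, stdev

def summarize_item(ratings):
    """Mean and standard deviation for one Likert item (0 = awful, 10 = excellent)."""
    return round(mean(ratings), 2), round(stdev(ratings), 2)

def tally_codes(coded_answers):
    """Frequency of the thematic categories assigned during content analysis."""
    return Counter(coded_answers)

# Hypothetical ratings from six trainees for one instructional procedure
ratings = [9, 8, 10, 7, 9, 8]
print(summarize_item(ratings))            # -> (8.5, 1.05)

# Hypothetical categories coded from answers to the open question
codes = ["content", "exercises", "layout", "content", "tutoring", "content"]
print(tally_codes(codes).most_common(1))  # -> [('content', 3)]
```

In this sketch the two summaries can be compared directly, which is the kind of quantitative-qualitative correspondence the poster discusses.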
DISCUSSION AND RECOMMENDATIONS
Since both methods provide researchers with precious, useful information, their combined use is recommended. As Borges-Andrade (2002) points out, the process of creating an instrument for collecting quantitative data may itself include qualitative tools (interviews and open-question surveys). Therefore, it is suggested that qualitative and quantitative methods be used together to evaluate reaction. Study 2, in particular, exemplifies the qualitative approach in the formative evaluation of a web-based training program. Its results agree with previous findings from the technical evaluation (see the poster "Formative evaluation: using diverse tools in order to refine and improve instructional procedures in distance training"). This practice has proved valuable, since the qualitative methods added specific information that clarifies the data collected with quantitative tools. Combining qualitative and quantitative methods is recommended for further research on web-based training, since it allows a more holistic treatment of the multiple variables involved in the continuous learning process that should take place in a sustainable organization of work.

REFERENCES
Abbad, G., Gama, A. L. G., & Borges-Andrade, J. E. (2000). Treinamento: Análise do relacionamento da avaliação nos níveis de reação, aprendizagem e impacto no trabalho. Revista de Administração Contemporânea, 4.
Borges-Andrade, J. E. (2002). Desenvolvimento de medidas em avaliação de treinamento. Estudos de Psicologia, 7 (Número Especial).
Kirkpatrick, D. L. (1976). Evaluation of training. In R. L. Craig (Ed.), Training and Development Handbook. New York, NY: McGraw-Hill.
University of Brasília, Brazil
Institute of Psychology
Impacto: Research on Training and Organizations of Work
Authors: Lidia Parachin, André Wogel, Gardênia Abbad, Maria Emília Araújo, Talita Custódio, Karen da Matta
PRONEX / Fubra

METHOD
What was the context?
- Study 1: Two distance education courses: (1) Program 'MA', 60 hours at a distance plus 40 hours in presence, 710 participants; (2) Program 'FC', 60 hours at a distance, 223 participants.
- Study 2: A web-based training program whose main objective was to train consultants from a large Brazilian financial institution.

What was the object?
- Study 1: Trainee reaction to instructional procedures: (a) traditional procedures, common to both distance and face-to-face modalities (e.g., quality of instructional objectives, sequence of content); (b) procedures based on new information technologies (e.g., use of chat, tutoring).
- Study 2: Trainee reaction to different aspects of the course, such as: (a) instructional strategies; (b) content; (c) layout; (d) type of language used; (e) tutoring.

RESULTS
Study 1: Courses 'MA' and 'FC' were both well evaluated, as shown in Table 1. Factor analysis and internal consistency analysis yielded two factors: traditional procedures (Cronbach's alpha = .97) and procedures based on new information technologies (Cronbach's alpha = .88), as shown in Table 2.

Table 2: Factor loadings for the items in Study 1

Study 2: First, the researchers carried out a technical evaluation of the didactic material of the course, using the Analysis of the Didactic Material Checklist. Then participants of the course answered a questionnaire containing five open questions about:
1. the application of what participants learned;
2. which aspects of the participants' work improved because of the training program;
3. participants' opinion of the exercises;
4. which aspects of the training program should be improved;
5. what participants would have liked to learn in the course but had not.
How was it done?
- Study 1 (quantitative method): a 12-item questionnaire with a Likert-type scale (0 = awful; 10 = excellent).
- Study 2 (qualitative methods): qualitative analysis of the didactic material; content analysis of learners' answers to open questions.

Table 1: Descriptives in Study 1

Table 3: Results in Study 2
Technical evaluation:
- Weakness of the exercises
- Layout problems
- Navigability problems
- Lack of practical tools
- Lack of interactive alternatives
- Weakness of the content

Participants' answers:
1. Application of what was learned.
Positive: "Today I feel more prepared to sign a contract with a client, acting according to his style, and to negotiate." / "The content helped me in the selection of the consultants in my area."
Negative: "I haven't applied it yet, in any kind of consulting." / "The contents of the web training are theoretical, far from the practice and day-to-day work of an internal consultant."
2. Aspects of work that improved.
Positive: "I learned the differences between many types of consulting." / "We feel more secure to propose actions."
Negative: "Nothing. I already knew what was there."
3. Opinion of the exercises.
Positive: "The exercises were very good, because they helped me to memorize the contents." / "They are relevant."
Negative: "There were few exercises." / "In general, they were not good." / "I exercised my memory, not the application of the theory." / "Totally incompatible with the contents in the unit."
4. Aspects to be improved: navigability; interaction; grammar errors; applicability; sequence and distribution of content.
5. What participants would have liked to learn: "The use of tools for projects." / "The real role of a consultant in the organization." / "Practical guidelines about dealing with clients."
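The internal consistency figures reported for Study 1 (Cronbach's alpha of .97 and .88) come from a standard formula. A minimal sketch of the computation, using made-up response data on the same 0-10 scale (the ratings below are hypothetical, not the study's data):

```python
from statistics import variance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items table of Likert scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(scores[0])                                    # number of items
    item_vars = [variance(item) for item in zip(*scores)]
    totals = [sum(respondent) for respondent in scores]
    return k / (k - 1) * (1 - sum(item_vars) / variance(totals))

# Hypothetical ratings: 4 respondents x 3 questionnaire items on the 0-10 scale
ratings = [
    [8, 9, 8],
    [6, 7, 6],
    [9, 9, 10],
    [5, 6, 5],
]
print(cronbach_alpha(ratings))  # high: the items rank respondents consistently
```

Alpha rises toward 1 as items covary, which is why the two factors in Study 1, with alphas of .97 and .88, could be treated as internally consistent scales.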