EXPLORING THE EFFECTIVENESS OF RCR EDUCATION IN THE SOCIAL AND BEHAVIORAL SCIENCES. Jim Vander Putten (Department of Educational Leadership) and Amanda L. Nolen (Department of Teacher Education), University of Arkansas-Little Rock.

Presentation transcript:

EXPLORING THE EFFECTIVENESS OF RCR EDUCATION IN THE SOCIAL AND BEHAVIORAL SCIENCES
Jim Vander Putten, Department of Educational Leadership; Amanda L. Nolen, Department of Teacher Education; University of Arkansas-Little Rock
Paper presented at the 2009 Research Conference on Research Integrity, May 2009, Niagara Falls, NY.

Purpose and Background
- Critical issue analysis of problems in conducting research on responsible conduct of research education, instruction, and training (RCR EIT) in the social and behavioral sciences.
- From descriptive institutional data collected over two years, we recommend strategies for future research in this area.
- Studying less prestigious segments of professions matters: it can yield perspectives very different from those of the elite cultures that receive the majority of media attention.
- The University of Arkansas-Little Rock (UALR) is a metropolitan commuter institution in the Doctoral/Research-Intensive Carnegie classification.

Context
- August 2003: UALR implemented the CITI online social and behavioral sciences RCR EIT program.
- Between 2004 and 2006, approximately 875 UALR faculty, staff, and students participated in the CITI training program, with an 88.5% (n = 770) completion rate: Students (n = 551), Principal Investigators (n = 197), and IRB Members (n = 22).
- Non-completers (11.5%, n = 97) identified themselves in one of several research roles: Student (n = 74), Principal Investigator (n = 18), or IRB Member (n = 5).
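These figures can be cross-checked with a few lines of arithmetic. The sketch below uses only the counts reported on the slide; because the participant total is reported as approximate, the computed rate lands near, rather than exactly at, the reported 88.5%.

```python
# Counts taken from the slide above.
completers = {"Student": 551, "Principal Investigator": 197, "IRB Member": 22}
non_completers = {"Student": 74, "Principal Investigator": 18, "IRB Member": 5}

n_complete = sum(completers.values())                # 770
n_total = n_complete + sum(non_completers.values())  # 867, vs. "approximately 875" reported

rate = n_complete / n_total
print(f"Completion rate: {rate:.1%}")  # ~88.8%, in line with the reported 88.5% given the approximate total
```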

Institutional Data
- Between 2004 and 2006, approximately 625 IRB proposals were submitted for review.
- Roughly 25% were unsuccessful (defined as being disapproved on first submission) for inappropriate research practices.
- Most common reasons for IRB protocol rejection: incomplete IRB protocols, inappropriate consent processes, missing elements in informed consent documents, insufficient data security measures, and coercion in the data collection process.
- Inspection of CITI training completion dates on unsuccessful IRB proposals indicated that the vast majority had been completed within two weeks prior to IRB proposal submission and review (see the analysis sketch below).
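The completion-date inspection in the last bullet is straightforward to reproduce. This is a sketch only, assuming a hypothetical export of unsuccessful proposals with citi_completed and irb_submitted date columns; neither the file name nor the column names come from the paper.

```python
import pandas as pd

# Hypothetical export of unsuccessful IRB proposals; file and column names are assumptions.
proposals = pd.read_csv(
    "unsuccessful_irb_proposals.csv",
    parse_dates=["citi_completed", "irb_submitted"],
)

# Days elapsed between CITI training completion and IRB proposal submission.
gap_days = (proposals["irb_submitted"] - proposals["citi_completed"]).dt.days

# Proposals whose training was finished within two weeks of submission.
last_minute = proposals[gap_days.between(0, 14)]
print(f"{len(last_minute)} of {len(proposals)} unsuccessful proposals "
      f"completed CITI training within 14 days of submission")
```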

Analysis of Issues
- Evaluation surveys of the CITI basic and refresher courses "have been very positive."
- In the social and behavioral sciences, the least useful form of assessment is indirect measurement: polling students and asking them how they feel about their education.
- At UALR, the short interval between CITI training completion and IRB proposal rejection yields no insight into what researchers actually learn.
- Braunschweiger (2007) noted that "the very fact that only 2% of survey responders reported that they preferred traditional classroom instruction to the Web-based approach that CITI provides clearly indicates that old paradigms must be reviewed and new ways to effectively deliver ethics education tested."
- Question: "Although students are the best judges of what they want, are they the best judges of what they need?"

Critical Issues in 'Learning'
Learning: a relatively permanent change in the capacity of an organism to make a response (adaptation) that cannot be explained by maturation or the passage of time.
Issues:
- CITI training validity
- Problem solving: algorithm vs. heuristic
- Knowledge transfer

CITI Training Validity
- Format: presents content, case studies, and scenarios; the researcher then answers a series of multiple-choice and true/false items based on that content.
- Multiple-choice and true/false items assess the researcher's understanding at the knowledge level.

TAXONOMY OF LEARNING
- KNOW: recall or recognition
- UNDERSTAND: internalize
- APPLY: apply to a new, similar problem
- ANALYZE: reduce a problem into its parts
- EVALUATE: identify criteria and judge quality
- CREATE: solve a novel problem
(Anderson et al., 2001; Bloom et al., 1956)

TAXONOMY OF LEARNING (annotated)
[The slide repeats the six levels above, marking where CITI training assessment sits (the knowledge level) against where actual RCR behavior operates (the higher levels).]
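One way to make the annotated slide concrete is to treat the taxonomy as an ordered scale and compare where the assessment sits against where the desired behavior sits. This sketch simply restates the slides' claim that multiple-choice/true-false items assess at the knowledge level; placing RCR behavior at the Evaluate level is an assumption made only for illustration.

```python
from enum import IntEnum

class BloomLevel(IntEnum):
    """Revised taxonomy levels, ordered from lowest to highest cognitive demand."""
    KNOW = 1
    UNDERSTAND = 2
    APPLY = 3
    ANALYZE = 4
    EVALUATE = 5
    CREATE = 6

citi_assessment = BloomLevel.KNOW    # what multiple-choice/true-false items measure
rcr_behavior = BloomLevel.EVALUATE   # assumed placement of responsible conduct in practice

gap = rcr_behavior - citi_assessment
print(f"Assessment targets {citi_assessment.name}; behavior draws on {rcr_behavior.name}: "
      f"a gap of {gap} levels")
```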

Problem Solving
- Algorithm: for well-defined problems. If X, then apply Y, and the result will be Z (a recipe, a math problem, a science experiment).
- Heuristic: for poorly defined problems, such as social science research involving human subjects with free will.
(Ormrod, 2008)
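A small illustration of the contrast in this setting, using invented rules that are not part of the CITI curriculum: the algorithmic check is a deterministic if-X-then-Y procedure, while the heuristic is a fallible rule of thumb for an ill-defined judgment.

```python
# Algorithmic (well-defined problem): a deterministic checklist either passes or fails.
REQUIRED_CONSENT_ELEMENTS = {
    "purpose", "risks", "benefits", "confidentiality", "voluntary_participation",
}

def consent_form_complete(elements_present: set) -> bool:
    """Same input always yields the same answer: complete only if nothing required is missing."""
    return REQUIRED_CONSENT_ELEMENTS <= elements_present

# Heuristic (poorly defined problem): a rule of thumb that guides judgment about
# human behavior but cannot guarantee a correct answer.
def coercion_warning(recruiter_is_instructor: bool, participation_graded: bool) -> bool:
    """Flag the protocol for closer human review when warning signs co-occur."""
    return recruiter_is_instructor and participation_graded

print(consent_form_complete({"purpose", "risks"}))  # False: required elements are missing
print(coercion_warning(True, True))                 # True: flag for reviewer judgment
```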

Knowledge Transfer
- Applying knowledge, skills, and abilities from the classroom or hypothetical setting to real situations with real consequences.
- Multiple models/examples
- Relevant models
- New knowledge built on prior knowledge
(Bandura, 1986)

Conclusions and Recommendations
- Create a more appropriate assessment of researchers' knowledge, skills, and abilities before completing RCR education modules.
- Teach researchers appropriate heuristic problem-solving techniques.
- Provide examples and scenarios from across the disciplines that more closely resemble the research being conducted at that university.
- Provide a breadth of examples of ethical issues involving human participants (breadth and depth).