Student Consensus on RateMyProfessors.com
April Bleske-Rechek, Amber Fritsch, and Brittany Henn, University of Wisconsin-Eau Claire
Background

- RateMyProfessors.com (RMP.com) launched in 1999 as an avenue for students to offer public ratings and commentary about their university instructors.
- The site hosts ratings of 1,000,000 instructors from more than 6,000 schools.
- For various reasons, the website may seem laughable:
  - It contains ratings of instructor attractiveness, which some argue are irrelevant and detract from the potential validity of the site [1, 2].
  - Limited research using focus groups of RMP.com posters suggests that students post ratings when they want to express negative views about an instructor [3].
- For other reasons, the website should be taken seriously:
  - As with traditional student evaluations of instruction [4], ratings of various instructor characteristics (such as helpfulness and clarity) correlate positively, and the overall distribution of ratings is more positive than negative [3].
  - RMP.com gives students a chance to voice their opinions voluntarily and anonymously, which may be important at UWEC, where student evaluation of instruction is not standardized and is not conducted for all instructors.
- In the current study, we surveyed a broad sample of college students at UWEC (rather than focus groups) to determine rates of use, forms of use, and reasons for posting on RMP.com.
- We also investigated whether RMP.com users and non-users differ by gender, academic goals, academic performance, and class status.

Research Questions

1. What percentage of students at UWEC view ratings and post ratings on RateMyProfessors.com?
2. What information do students perceive as most important when viewing an instructor's page on RateMyProfessors.com?
3. For those who have posted ratings on RateMyProfessors.com, why did they post?
4. Do students who view or post ratings on RateMyProfessors.com differ from students who do not?

Method

Participants
- 208 UWEC students were surveyed in fall 2008 regarding their perceptions of instruction, academic goals, academic performance, and use of RMP.com.
- The sample was broadly representative of class status: 26.5% freshmen, 19.1% sophomores, 17.6% juniors, 25% seniors, and 11.8% senior+.

Materials and Procedure
- Students reported the following demographic data: gender, class status, GPA, and major.
- Students completed a 32-item scale measuring learning orientation (e.g., "It is important for me to understand the content of my courses as thoroughly as possible.") and grade orientation (e.g., "My goal in my courses is to get a better grade than most of the other students.") [5].
- Students then answered questions about their use of RMP.com. First, they reported the number of times they had viewed ratings and the number of times they had posted ratings. Second, they rank-ordered by importance the seven pieces of information about an instructor available on RMP.com (e.g., quality rating, number of ratings, attractiveness rating). Third, students who had posted ratings rated 18 different reasons for posting about an instructor; sample reasons include "I thought the workload was too heavy" and "I thought the instructor was an excellent teacher."

Results

Associations between student characteristics and use of RMP.com:

Variable               Associated with viewing ratings?             Associated with posting ratings?
Gender                 No                                           Men > Women
Class Status           No
GPA                    Yes, weak negative association (r = -.19)    No
Learning Orientation   Yes, weak negative association (r = -.18)    No
Grade Orientation      Yes, weak positive association (r = .15)     No

Mean rank of importance assigned to each piece of information on an instructor's RMP.com page (1 = most important, 7 = least important):

Item                                        Mean Rank (SD)
Quality Rating                              2.61 (1.53)
Helpfulness Rating                          3.07 (1.47)
Clarity Rating                              3.30 (1.48)
Easiness Rating                             3.47 (1.74)
Open-ended Comments                         4.09 (1.73)
Number of Postings                          5.02 (1.68)
Hotness Total (number of chili peppers)     6.38 (1.56)

Discussion

- We designed this study to investigate use of RateMyProfessors.com among a typical, broad sample of college students. To our knowledge, this is the first nomothetic study of students' experiences with the website.
- The majority of students reported having viewed ratings on RMP.com, and the majority of those had viewed ratings five or more times. However, only 23% of students had actually posted ratings on RMP.com.
- Students rated quality as the most important piece of information and attractiveness as the least important when viewing an instructor, suggesting that students take the website more seriously than people might assume.
- As in research using focus groups, students in the current study reported that they posted ratings both to exclaim and to complain.
- We failed to document robust characteristics distinguishing RMP.com users from non-users. Students with lower GPAs, and students who were more grade-oriented and less learning-oriented, viewed ratings more frequently; however, learning and grade orientations were not associated with having posted ratings on the website. Do students view ratings for reasons that differ from those given by students who post the ratings?
- Future research needs to head in two directions: (1) test the reliability among multiple students' ratings of the same instructor, within a course and across courses; and (2) assess similarities and differences between viewers' and posters' ratings of a common list of reasons for use.

Acknowledgments

We thank the Office of Research and Sponsored Programs and the Center for Excellence in Teaching and Learning for supporting this research.

References

1. Felton, J., Koper, P. T., Mitchell, J., & Stinson, M. (2008). Attractiveness, easiness, and other issues: Student evaluations of professors on Ratemyprofessors.com. Assessment & Evaluation in Higher Education, 33.
2. Felton, J., Mitchell, J., & Stinson, M. (2004). Web-based evaluations of professors: The relations between perceived quality, easiness and sexiness. Assessment & Evaluation in Higher Education, 29.
3. Kindred, J., & Mohammed, S. N. (2005). "He will crush you like an academic ninja!": Exploring teacher ratings on Ratemyprofessor.com. Journal of Computer-Mediated Communication.
4. Marsh, H. W., & Roche, L. A. (1997). Making students' evaluations of teaching effectiveness effective: The critical issues of validity, bias, and utility. American Psychologist, 52.
5. Milton, O., Pollio, H. R., & Eison, J. A. (1986). Learning for grades versus learning for its own sake. In Making sense of college grades. San Francisco, CA: Jossey-Bass.
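Analysis sketch

For readers who want to produce the same kinds of summaries from comparable survey data, the sketch below shows one way statistics like those in the Results section (Pearson correlations with viewing frequency, and mean importance ranks with standard deviations) could be computed. This is a minimal illustration under assumed inputs, not the authors' actual analysis: the file name, column names, and use of the pandas and SciPy libraries are assumptions.

# Minimal sketch, not the authors' analysis. The file "rmp_survey.csv" and all
# column names are hypothetical stand-ins for the survey data described above.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("rmp_survey.csv")  # one row per respondent

# Pearson correlations between student characteristics and RMP.com viewing frequency
for predictor in ["gpa", "learning_orientation", "grade_orientation"]:
    r, p = pearsonr(df[predictor], df["times_viewed"])
    print(f"{predictor}: r = {r:.2f}, p = {p:.3f}")

# Mean rank (and SD) of each piece of instructor information
# (respondents ranked the seven items from 1 = most important to 7 = least important)
rank_items = ["quality", "helpfulness", "clarity", "easiness",
              "open_ended_comments", "number_of_postings", "hotness"]
print(df[rank_items].agg(["mean", "std"]).T.round(2))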