eXPlorance Blue: The new electronic student course evaluation system


Current status of Blue
- Implementation of Blue has been in effect since October
- Presentations to Colleges/Schools and Departments are currently ongoing
- The Electronic Course Assessment Implementation (ECAI) committee (a subcommittee of the FDAI committee) is supervising the implementation, in conjunction with the Provost's Office and OIT
- ECAI is the Faculty Senate's point of contact for all issues concerning electronic evaluation
- You can find information about online course evaluation at:

Adoption of Blue
- What are the implications?
- What can be done to improve confidence in the survey for each class?

Implications of adopting Blue
Inter-system comparison:
- Consistency of evaluation over time
- Quality
Intra-system comparison:
- Representativeness
- Accuracy
- Quality

Inter-system comparison
- Q: How does my evaluation in Blue measure against IAS?
- Qualitatively, the two questionnaires are different (for example, no "effectiveness of teaching" item is assessed).
- Quantitatively, the scores are expected to differ because the structure and content of the two surveys differ.
- Each Blue survey should be compared to campus-wide Blue aggregate data.
- A few rounds of evaluation will be needed before comparisons with IAS can be made, if needed.
- Unit criteria for tenure and promotion should be revised accordingly.

Inter-system comparison: quality
- Q: Students who are strongly negative about the course or the instructor are the group most likely to complete the online evaluation.
- Results from many studies have shown this to be a misconception: results from online evaluations are as trustworthy as those from paper-based evaluations (Liu, 2005; Thorpe, 2002; Johnson, 2002).
- A large-scale study of the Individual Development and Educational Assessment (IDEA) student rating system between 2002 and 2008 (Benton et al., October 2010) examined 651,587 classes that used paper-based evaluations and 53,000 classes that used web-based evaluations; the comparison showed no meaningful differences between survey methods.
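With samples that large, almost any difference reaches statistical significance, so "meaningful" is normally judged by effect size. A minimal sketch of the standard measure, Cohen's d, with invented class-mean ratings rather than IDEA data:

```python
import statistics

def cohens_d(a, b):
    """Cohen's d: difference in group means, in pooled-standard-deviation units."""
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    pooled = ((len(a) - 1) * var_a + (len(b) - 1) * var_b) / (len(a) + len(b) - 2)
    return (statistics.mean(a) - statistics.mean(b)) / pooled ** 0.5

# Hypothetical mean ratings for paper-based vs. web-based sections.
paper = [4.1, 3.8, 4.3, 4.0, 3.9, 4.2]
web = [4.0, 3.9, 4.2, 4.1, 3.8, 4.3]
print(f"Cohen's d = {cohens_d(paper, web):.2f}")  # near 0: no meaningful difference
```

By convention, |d| below roughly 0.2 is read as a negligible effect, which is the sense in which the IDEA comparison found "no meaningful differences".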

Intra-system comparison: representativeness
- Q: Those who respond to the survey have very different views from those who do not, so the results would not reflect the opinion of the population as a whole.
- A link between response rate and non-response bias has not been established (Marketing Research and Intelligence Association, October 2003 and 2011; Curtin et al., 2000; Langer, 2003; Holbrook et al., 2005).
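The distinction can be made concrete with a small simulation (illustrative only; all ratings are invented). A low response rate by itself leaves the estimated mean unbiased; bias appears only when willingness to respond is correlated with opinion:

```python
import random

random.seed(1)
# Hypothetical class of 200 students with true ratings on a 1-5 scale.
population = [random.choice([1, 2, 3, 4, 4, 5, 5]) for _ in range(200)]
true_mean = sum(population) / len(population)

def observed_mean(pop, response_prob):
    """Mean over responding students; response_prob(r) maps a student's
    rating to that student's probability of completing the survey."""
    responses = [r for r in pop if random.random() < response_prob(r)]
    return sum(responses) / len(responses), len(responses)

# Case 1: uniform 30% response rate, independent of opinion.
m1, n1 = observed_mean(population, lambda r: 0.30)
# Case 2: dissatisfied students (rating <= 2) are twice as likely to respond.
m2, n2 = observed_mean(population, lambda r: 0.60 if r <= 2 else 0.30)

print(f"true mean: {true_mean:.2f}")
print(f"random non-response:         {m1:.2f} from {n1} responses")
print(f"opinion-driven non-response: {m2:.2f} from {n2} responses")
# With non-response independent of opinion (case 1) the estimate is
# unbiased at any response rate; in case 2 it is biased low on average.
```

This is why the response analyses described later look for correlates of responding rather than at the raw response rate alone.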

Intra-system comparison: accuracy
- Q: A small sample size leads to a greater margin of error in the results.
- It depends on the class size and the skewness of the opinions: a course section is a small, finite population, so the error shrinks as responses cover more of the class, and nearly unanimous opinions have little variance to begin with (see the sketch below).
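As a back-of-the-envelope illustration (not part of the committee's materials; all numbers are hypothetical), the margin of error for a class-mean rating can be sketched with the finite population correction:

```python
import math

def margin_of_error(sample_sd, n, class_size, z=1.96):
    """Approximate 95% margin of error for a mean rating estimated from n
    of class_size students. The finite population correction (FPC) shrinks
    the error as the sample covers more of the class."""
    fpc = math.sqrt((class_size - n) / (class_size - 1))
    return z * (sample_sd / math.sqrt(n)) * fpc

# The same 10 responses give very different precision depending on class size:
print(f"{margin_of_error(0.9, 10, 12):.2f}")   # ~0.24 for 10 of 12 students
print(f"{margin_of_error(0.9, 10, 100):.2f}")  # ~0.53 for 10 of 100 students
```

The skewness point works the same way: when opinions are nearly unanimous, the sample standard deviation is small and the margin of error shrinks with it, so even a small sample can be informative.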

What can be done to increase confidence in this system
- Response analysis will reveal biases and potential correlations with the polled cohort (one possible form is sketched below).
- Results of the analysis will be shared with students and instructors.
- If the wording and/or content of one or more questions appears to skew the quality of responses, those questions will be re-evaluated and reworded, eliminated, or substituted.
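One simple form such a response analysis could take (a sketch with invented counts, not the committee's actual procedure) is to compare response rates across subgroups of the polled cohort; markedly uneven rates flag a representativeness problem worth reporting alongside the scores:

```python
# Hypothetical enrollment and respondent counts by class standing.
roster = {"freshman": 120, "sophomore": 90, "junior": 60, "senior": 30}
respondents = {"freshman": 30, "sophomore": 35, "junior": 30, "senior": 17}

overall = sum(respondents.values()) / sum(roster.values())
print(f"overall response rate: {overall:.0%}")
for group, enrolled in roster.items():
    # Subgroup rates far from the overall rate suggest the respondent
    # pool does not mirror the enrolled cohort.
    print(f"{group:>9}: {respondents[group] / enrolled:.0%}")
```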

What can be done to increase confidence in this system
- Response rates:
  - Showing that evaluation matters
  - Communication
  - Making it easy for students
  - Providing incentives
- UAF evaluation portal:
- Intensifying the use of Blue:
  - Mid-term evaluations
  - Department-specific questions