Slide 1: Crossing the Rubricon: Assessing the Instructor
Ned Fielden, Mira Foster
San Francisco State University, San Francisco, California, USA
Slide 2: Case Study: Assessment of Librarian Instructors
- Literature review
- Theoretical issues
- Rubric design and implementation
- Preliminary review
Slide 3: Instructor Assessment: Several Methods
- Supervisor review
- Peer evaluation
- Surveys
- Performance assessment (learning outcomes of students assessed)
Slide 4: Institutional Need for Instructor Assessment
- Retention of probationary candidates; tenure and promotion
- CSU as a public institution: criteria-based, with strict rules about personnel review
- Summative vs. formative assessment
Slide 5: Process
- Literature review
- Identify a suitable mechanism for review
- Create a draft
- Consult with the Library Education Committee
- Formally adopted by the library faculty
Slide 6: Rubrics
- Powerful, easy to use, standardized
- Considerable literature on rubric use for students, programs, and outcomes
- Little on use with library instructors
Slide 7: Value of Rubrics
- Standardized
- Easy to use (minimal training)
- Ensures all review criteria are met
- Possibilities for quantitative data analysis and the introduction of new values
- Can be employed for both summative and formative assessment
Slide 8: Rubric Basics
A glorified "checklist," annotated to establish criteria and distinct items. For example:
A. Preparation
  1. Communicated with the course instructor before the session to determine learning objectives and activities
  2. Learned about course assignment(s) specifically related to library research
  3. Customized the instruction session plan to the curriculum, specific course assignments, and/or faculty/student requests
Slide 9: Rubric Complexity
Rubrics may be designed to reflect highly nuanced categories.* Example criterion from an information literacy rubric:

Evaluation criterion: Articulates Criteria
- Beginning (0): Student does not address authority issues
- Developing (1): Student addresses authority issues but does not use criteria terminology
- Exemplary (2): Student addresses authority issues and uses criteria terminology such as author, authority, authorship, or sponsorship
Student learning outcome: LOBO 3.1.1, "The student will articulate established evaluation criteria" (ACRL 3.2 a)

*Oakleaf, M. L., 2006. Assessing information literacy skills. Dissertation, University of North Carolina.
Slide 10: Types of Rubrics
Analytic
- Specific criteria
- Isolated facets
- Capacity for highly granular scoring
- Analytic rubrics "divide … a product or performance into essential traits or dimensions so that they can be judged separately…"*
Holistic
- Big picture
- Fuzzier focus
- An "overall, single judgment of quality"*

*Arter and McTighe, Scoring Rubrics, 2001.
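The analytic/holistic distinction above can be sketched in code. The following is a minimal, illustrative Python sketch (not part of the presented rubric): the criterion names, levels, and scoring function are hypothetical, with the "Articulates Criteria" levels adapted from the Oakleaf example, and it shows how an analytic rubric keeps per-trait scores separate for the quantitative analysis mentioned on slide 7.

```python
# Illustrative sketch of analytic rubric scoring; all names are hypothetical.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    levels: dict[int, str]  # score -> level descriptor

# Criterion adapted from the Oakleaf (2006) example on slide 9.
articulates = Criterion(
    "Articulates Criteria",
    {0: "Does not address authority issues",
     1: "Addresses authority issues without criteria terminology",
     2: "Addresses authority issues using criteria terminology"},
)
# Hypothetical second criterion, for illustration only.
preparation = Criterion(
    "Preparation",
    {0: "No session customization", 1: "Partial", 2: "Fully customized"},
)

def analytic_score(scores: dict[str, int], rubric: list[Criterion]) -> dict:
    """Keep each trait's score separate (analytic), plus a total.

    A holistic rubric would instead record one overall judgment.
    """
    by_name = {c.name: c for c in rubric}
    for name, level in scores.items():
        if level not in by_name[name].levels:
            raise ValueError(f"invalid level {level} for {name}")
    return {"per_criterion": dict(scores), "total": sum(scores.values())}

result = analytic_score({"Articulates Criteria": 2, "Preparation": 1},
                        [articulates, preparation])
print(result["total"])  # 3
```

Because each trait is scored in isolation, the per-criterion values can be aggregated across many sessions or evaluators, which is what enables quantitative analysis; a holistic score collapses that detail into a single number.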
Slide 11: Rubric Design
- What criteria to include
- Opportunity to introduce specific values into the program
- Involvement of all constituents (evaluators and evaluatees)
Slide 12: Rubric Implementation
Formative
- Raw data given to the candidate
- Pre- and post-consultation
- Candidate uses the data however desired
Summative
- Framework for the formal letter in the retention, tenure, and promotion (RTP) file
Slide 13: Summary
- A powerful, easy-to-use tool; levels the playing field; highly customizable
- Issues with mixing formative and summative functions
Slide 14: Further Study
- Explore different varieties of instructor assessment tools
- Test different rubrics
- Establish the balance point between depth of data and ease of use
- Evaluate outcomes
Slide 15: Crossing the Rubricon: Assessing the Instructor
- Bibliography: http://online.sfsu.edu/~fielden/rbib.html
- Sample rubric: http://online.sfsu.edu/~fielden/rubrics.html
- Bridge photo used with permission from robep: http://www.flickr.com/photos/robep/