New Faculty Orientation August 2019

New Faculty Orientation August 2019 Assessment: Challenges and Opportunities Title page for the presentation. Thank you for the invitation, and thanks to Dr. Kesarwani for supporting the event. The need for faculty across all faculties to have an opportunity … to address challenging issues.

Assessment comes from the Latin assidere, which means “to sit beside.” “To sit beside” brings to mind such verbs as to engage, to involve, to interact, to share, to trust. It conjures up team learning, working together, discussing, reflecting, helping, building, collaborating. It makes one think of cooperative learning, community, communication, coaching, caring, consultation… “Sitting beside” implies dialogue and discourse, with one person trying to understand the other’s perspective before giving value judgments. (Braskamp and Ory, 1994) This is not current; however, it very much embraces the spirit of the newer authentic or alternative assessment.

Historical Baggage: Yale, 1783. Optime (honor men), Second Optime (pass men), Inferiores (charity passes), Pejores (unmentionables). Prior to this, read from the book.

Pillars of Sound Assessment Transparency Standardization

Learning Outcomes
State the relationship between Learning Outcomes and Assessment Activities
Identify methods for incorporating Bloom’s Cognitive Taxonomy into assessment methods
Identify strategies for constructing effective MCQ test items, including methods for writing items at higher cognitive levels
Apply the results of Item Analysis to ensure the effectiveness of test items and to help identify items which test at higher cognitive levels
Identify strategies to help test takers perform at optimal levels to ensure valid test scores

Bloom’s Taxonomy
Knowledge: Recalling facts
Comprehension: Demonstrating understanding
Application: Transferring previous learning to solve new problems
Analysis: Understanding the organizational structure
Synthesis: Applying knowledge or skills to create something new
Evaluation: Applying judgment

Types of Assessments
Formative assessment: the goal of formative assessment is to monitor student learning.
Summative assessment: the goal of summative assessment is to evaluate student learning at the end of an instructional unit.
Criterion: what does the student know? Norm: what does the student know relative to others? What sustains bell curving? Meyer (1908): 3% excellent, 22% superior, 50% medium, 22% inferior, 3% failure. “Mensa and Densa.” The appeal of nature’s distribution. Grade inflation. Fear of success.

Assessment Drives BOTH Teaching and Learning
Assessment methods build the bridge between teaching and learning. Assessment impacts student perception of its own role: it can make assessment merely a method for grades, experienced as divorced from learning.

RELIABILITY
What reliability represents. How we establish reliability: criteria for scoring, acknowledging subjectivity, sufficient opportunities.

The concept of reliability addresses the question: to what extent does a particular score a candidate obtains in an assessment context actually reflect the candidate’s “true score”? The true score refers to the score a candidate would get in a perfect world where such factors as the technical characteristics of the test, the abilities of the assessors, the testing conditions, and the motivation of the candidate were perfect in nature.

Reliability coefficients range from 0.0 to 1.00. The higher the coefficient, the more reliable the test scores are and, consequently, the more valid the pass/fail decisions. Coefficients in the range of .70 to .80 are highly desirable for licensure decisions. There are a number of reliability coefficients one could use, but the most common are internal consistency estimates of reliability. Such coefficients provide evidence of how well the assessment instrument is measuring a single underlying construct, such as case history skills or physical examination skills. These coefficients are generated through the application of statistical techniques and are commonly part of the output of any item analysis procedure.

In performance-based examinations (like ours), the issue of reliability is most concerned with the reliability of the judgements, and therefore the scores, made by the assessors. To enhance the reliability coefficient, and therefore the confidence we have in our pass/fail decisions, we employ a number of strategies. First, we have well-developed checklists that are clear and unambiguous. Second, we have highly competent practitioners as examiners who receive extensive training for each administration. Finally, the reliability of our scores is enhanced by our practice of reviewing the videos of candidates who have fallen below the cut-scores.
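The internal consistency estimates mentioned above can be illustrated with Cronbach’s alpha, a common such coefficient. This is a hedged sketch, not the presentation’s own tooling; the response matrix below is hypothetical.

```python
# Sketch: Cronbach's alpha as an internal-consistency reliability estimate.
# The item-response matrix is hypothetical (4 candidates x 3 items, 1 = correct).
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: one row per candidate, one column per item (e.g. 0/1)."""
    k = len(scores[0])                                  # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]
    total_var = pvariance([sum(row) for row in scores])  # variance of totals
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [
    [1, 1, 1],
    [1, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(round(cronbach_alpha(responses), 2))  # 0.75
```

A value in the .70–.80 range, as the notes suggest for licensure decisions, indicates the items are measuring a single underlying construct reasonably consistently.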

VALIDITY What Validity represents How we establish Validity Meaningful activities Representativeness Appropriate format

USABILITY
Provision for feedback
Appropriate weightings
Reasonable frequency
You can probably elaborate on these areas quite well

Types of Scoring Methods
Criterion-referenced: provides absolute comparisons; describes student performance in relation to a pre-set standard.
Norm-referenced: provides relative comparisons; describes individual performance in comparison to others.
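The distinction between the two scoring methods can be made concrete with a small sketch: the same scores interpreted against a pre-set standard (criterion-referenced) and against the group (norm-referenced). The names, scores, and cut-score below are hypothetical illustrations.

```python
# Sketch: criterion- vs norm-referenced interpretation of the same scores.
# All data are hypothetical.
scores = {"Ana": 62, "Ben": 75, "Chi": 88, "Dev": 91}

# Criterion-referenced: absolute comparison against a pre-set standard.
CUT_SCORE = 70
criterion = {name: ("pass" if s >= CUT_SCORE else "fail")
             for name, s in scores.items()}

# Norm-referenced: relative comparison — percentile rank within the group.
values = list(scores.values())
norm = {name: 100 * sum(v < s for v in values) / len(values)
        for name, s in scores.items()}

print(criterion)  # Ana fails the standard regardless of the group
print(norm)       # Dev outperforms 75% of this group
```

Note that the criterion-referenced verdicts stay fixed as the cohort changes, while the norm-referenced ranks shift with every new group — the root of the “bell curving” issues raised in the notes.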

GRADING
Grading that reflects the actual ability of the student
Grading as a personal communication
Show video sequences

Feedback on your Assessment
Selected response: Item Analysis
Constructed response: Descriptive stats
Peer/Student feedback

Item Analysis Example
Prop. Answering Correctly: 0.74   Discrimination Index: 0.42   Point Biserial: 0.40

Alt.    Total   Low (27%)   High (27%)   Point Biserial   Key
A       0.74    0.44        0.86          0.40            *
B       0.09    0.17        0.03         -0.13
C       0.04    0.06        0.08         -0.03
D       0.13    0.33                     -0.38
E       0.00
Other

We need to reflect
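The point-biserial statistic in the table can be sketched directly: it correlates a single item (right/wrong) with candidates’ total scores, so a positive value means the stronger candidates tended to get the item right. This is an illustrative example with hypothetical data, not the presentation’s own item-analysis output.

```python
# Sketch: point-biserial correlation for one test item.
# item: 1 = answered correctly, 0 = incorrectly; totals: total test scores.
# Data below are hypothetical.
from statistics import mean, pstdev

def point_biserial(item, totals):
    correct = [t for i, t in zip(item, totals) if i == 1]
    incorrect = [t for i, t in zip(item, totals) if i == 0]
    p = len(correct) / len(item)     # proportion answering correctly
    q = 1 - p
    return (mean(correct) - mean(incorrect)) / pstdev(totals) * (p * q) ** 0.5

item = [1, 1, 1, 0, 0]               # the three highest scorers got it right
totals = [9, 8, 7, 5, 4]
r_pb = point_biserial(item, totals)  # strongly positive: item discriminates well
```

A negative point-biserial, like alternative D in the table above when chosen as a distractor by high scorers, signals an item worth reviewing.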

COLLABORATIVE LEARNING SCORING RUBRIC 4- Thorough Understanding 3- Good Understanding 2- Satisfactory Understanding 1- Needs Improvement

COLLABORATIVE LEARNING SCORING RUBRIC 4- Thorough Understanding Consistently works toward group goals Is sensitive to needs of all the group Willingly fulfills individual role within the group Consistently contributes knowledge, opinions, skills Values the knowledge of all group members Helps group identify necessary group changes

COLLABORATIVE LEARNING SCORING RUBRIC 1- Needs Improvement Works only when prompted Contributes only when prompted Needs reminders to be sensitive to others Participates in needed changes when prompted

Summary Suggestions Integrate assessment into your teaching Seek out feedback regarding your assessment techniques Attempt to introduce one new assessment technique