Troy University eTROY Colloquium April 17-18, 2012.

Assessment Roles of Faculty and Adjuncts in eTROY
Dr. Wendy Bailey – Sorrell College of Business
Dr. Christina Martin – College of Health & Human Services
Dr. Isabelle L. Warren – College of Education

Review the purpose and goals of assessment of student learning
Clear up concerns about assessment
Provide an introduction to the assessment cycle: university and program linkages, outcomes, data collection/reporting, action plans
Define the assessment role of faculty and adjunct professors
Explain how Blackboard Outcomes can help, and the timeline for implementation

Assessment is the ongoing process of:
1. Establishing clear, measurable expected outcomes of student learning tied to program and university missions
2. Ensuring that students have sufficient opportunities to achieve those outcomes in the curriculum
3. Systematically gathering, analyzing, and interpreting evidence to determine how well student learning matches our expectations
4. Using the resulting information to make changes that improve student learning
5. Continuing to measure, evaluate, and make further changes as part of a process of continuous improvement
- adapted from Linda Suskie, Assessing Student Learning

Traditionally, programs designed a curriculum they thought would give students the background they needed in a particular field. Instructors taught courses in the curriculum and assigned grades based on overall student performance. If students did well in their courses, we assumed they had learned what we wanted them to... but did they?

Learning is a function of the curriculum, not the course. Curricula need to be more than a collection of courses: they need to ensure that every student has ample opportunity to achieve key institutional and program learning goals. Institutional and program goals are broader than course goals and should be reinforced throughout the curriculum. Students also need to see the connections among their courses and other learning experiences, as this makes learning deeper and longer lasting.

The goal of assessment is to improve our academic programs. Good assessment programs:
1. Help us to stay focused on and to continually evaluate our stated program goals
2. Bring faculty and staff together to discuss important issues related to teaching and the standards used
3. Help faculty, staff, and students see how courses link together to achieve program goals
4. Identify issues that may impede student learning
5. Allow us to make better decisions based on data, rather than hunches, anecdotes, or intuition

When a professor assesses the work done in a course, assessment has a different purpose than at the program level. In a class, every student is assessed, the professor sets their own criteria and standards (explicit or not), the professor evaluates the work, and the student gets the result. In program-level assessment, sampling is acceptable, the faculty (not one professor) set explicit criteria and standards, the faculty or outside evaluators do the evaluation, and the feedback goes to the program faculty.

Program-level assessment checks whether students are on track to achieve, or have achieved, important program goals. Since these goals are reinforced throughout the curriculum, an assessment in one course doesn't measure what that professor did or did not do, but what the program as a whole has achieved to that point. Evaluation of individual faculty should never be the goal of program-level assessment, and the results should never be used that way. Assessment is a team sport: we celebrate our victories together, and we figure out how to change things together if results are disappointing.

“I have assessed students! I assigned them a course grade.” Course grades are not sufficient for program-level assessment. An overall course grade of B doesn't tell us which skills and concepts a student has mastered. Sally may have earned a C average on exams and an A on her project, while John earned an A average on his exams and a C on his project. Both earned a B, but the grade alone won't tell us what each has mastered and which skills need more work.
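The Sally/John point is simple arithmetic, and a small sketch makes it concrete. The example below is a minimal illustration, assuming a hypothetical 50/50 exam/project weighting and a 90/80/70 letter scale (neither comes from the presentation): two very different mastery profiles collapse into the same course grade.

```python
# Minimal sketch: two students with identical course grades but
# different component-level mastery. Weights, scores, and letter
# cutoffs are hypothetical, not from the presentation.
WEIGHTS = {"exams": 0.5, "project": 0.5}

students = {
    "Sally": {"exams": 75, "project": 95},  # C-average exams, A project
    "John":  {"exams": 95, "project": 75},  # A-average exams, C project
}

def letter(score: float) -> str:
    """Map a numeric score to a letter grade (hypothetical scale)."""
    return "A" if score >= 90 else "B" if score >= 80 else "C" if score >= 70 else "F"

for name, scores in students.items():
    overall = sum(WEIGHTS[c] * s for c, s in scores.items())
    parts = ", ".join(f"{c}={letter(s)}" for c, s in scores.items())
    print(f"{name}: course grade {letter(overall)} ({parts})")

# Both print a course grade of B, but the component letters differ:
# Sally: course grade B (exams=C, project=A)
# John:  course grade B (exams=A, project=C)
```

The overall grade is identical in both rows; only the component-level records reveal which skills need more work, which is why program-level assessment looks below the course grade.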

“Mandating program-level assessment violates the principle of academic freedom.” “Academic freedom does not absolve instructors of their responsibility to ensure that all students in their program… have sufficient opportunity to achieve those goals that the faculty collectively agree are essential… Academic freedom also does not relieve faculty of the obligation to assess student learning of their subject…” -- Linda Suskie, Assessing Student Learning: A Common Sense Approach

“I do not have time to conduct assessments – it is not within my job description.” Assessment is everyone's responsibility. Accrediting agencies want to see that faculty know the program's goals and are involved in assessment activities. To save time, be smart about assessment: course-embedded assessments can serve double duty. Example: a capstone research project. The instructor evaluates it for the course grade (content, etc.), while samples of the same projects can be evaluated by faculty using two different rubrics, one for writing and one for critical thinking, to assess achievement of program-level goals.

The assessment cycle:
Mission → Set goals, objectives, and outcomes → Align curriculum with outcomes → Choose how outcomes will be assessed and set criteria → Gather the data → Evaluate, report, and share the data → Use the data to make meaningful changes → (back to Mission)

Program goals are broad, conceptual statements that express the long-term aim or purpose of the entire course of study. Goals should be related to the college's mission, as well as the institution's.
Program objectives are more specific than goals and are more short-term. They indicate the intended consequences of instruction within a timeframe.
Program student learning outcomes (SLOs) describe significant and essential learning that students should achieve or reliably demonstrate at the end of a program. SLOs must be specific, observable, and measurable.

Next, examine your curriculum. Where are the concepts related to each outcome introduced, reinforced, or mastered? Students should be given multiple opportunities to master important program goals. Are there gaps in your curriculum? An example curriculum map follows.

Example curriculum map (I = introduced, R = reinforced, M = mastered):

Course     Program SLO 1   Program SLO 2   Program SLO 3   Program SLO 4
ENG1101    I               I
ENG1102    R               I
ENG3341    R               M               I
ENG4430    M               R
ENG4442    M               M
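A curriculum map like this is also easy to hold in code, which makes gap-checking mechanical. The sketch below is a minimal illustration (the course-to-column placement follows the reconstructed table above, which is itself approximate): it flags any SLO that is never introduced, reinforced, or mastered anywhere in the curriculum.

```python
# Minimal sketch: a curriculum map as a dict plus a gap check.
# I = introduced, R = reinforced, M = mastered; placements follow
# the example table above and are illustrative only.
curriculum_map = {
    "ENG1101": {"SLO 1": "I", "SLO 2": "I"},
    "ENG1102": {"SLO 1": "R", "SLO 2": "I"},
    "ENG3341": {"SLO 1": "R", "SLO 2": "M", "SLO 3": "I"},
    "ENG4430": {"SLO 1": "M", "SLO 2": "R"},
    "ENG4442": {"SLO 1": "M", "SLO 2": "M"},
}

all_slos = ["SLO 1", "SLO 2", "SLO 3", "SLO 4"]

for slo in all_slos:
    # Collect every level (I/R/M) at which any course addresses this SLO.
    levels = {row.get(slo) for row in curriculum_map.values()} - {None}
    missing = {"I", "R", "M"} - levels
    if missing:
        print(f"{slo}: gap -- never {', '.join(sorted(missing))}")

# SLO 3 is introduced but never reinforced or mastered, and SLO 4
# never appears at all: both are gaps the program should address.
```

Running the check surfaces exactly the question the slide asks: where in the curriculum do students get multiple opportunities to master each outcome, and where do they get none?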

Assessments measure whether students have achieved the program learning outcomes we've set.
◦ Direct assessment measures provide the strongest evidence of student learning because they are based on actual student performance on a task.
◦ Indirect assessment measures supplement direct measures. They provide information about student perceptions of their learning experiences and attitudes toward the learning process, as well as program quality.
No assessment of learning outcomes should be based on indirect measures of learning alone!

Criteria are the standards of performance required to meet an objective or outcome, that is, the quality to be judged in the assessment task. Quality words often used in criteria: clarity, accuracy, depth, legibility, impact, relevance, etc. Example: “Clarity of explanation” is a criterion for “Students will be able to explain how concepts in the subject interrelate.” A criterion may be expressed as a percentage, a target number of accomplishments, a rate, an increase over a previous result, completion of a task or event, etc. Example: 85% of the students will be able to analyze … using the correct statistical procedures.
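Checking a percentage criterion like the 85% example is straightforward arithmetic. The sketch below is a minimal illustration; the scores, the 80% satisfactory cutoff, and the 85% target are hypothetical stand-ins, not values from the presentation.

```python
# Minimal sketch: check a percentage-based criterion against a set
# of assessment scores. All numbers here are hypothetical.
scores = [92, 78, 85, 88, 67, 95, 81, 74, 90, 83]  # one score per student

PASSING_SCORE = 80   # what counts as "satisfactory" on this task
TARGET_RATE = 0.85   # criterion: 85% of students satisfactory

satisfactory = sum(1 for s in scores if s >= PASSING_SCORE)
rate = satisfactory / len(scores)

print(f"{satisfactory}/{len(scores)} students satisfactory ({rate:.0%})")
print("Criterion met" if rate >= TARGET_RATE else "Criterion not met")

# With these scores: 7/10 satisfactory (70%), so the criterion is
# not met -- the kind of finding an action plan should address.
```

Note that the output deliberately reports the raw count (7/10) alongside the rate, matching the documentation format recommended later in this presentation.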

Assessment data needs to be collected before it can be analyzed. Accrediting agencies will often allow sampling, rather than a census, provided the sample is representative and data are reported by program and location. This is one place where products such as Blackboard Outcomes can be very valuable.
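“Representative, by program and location” is essentially stratified sampling. The sketch below is a minimal illustration of that idea (the student records, program/location names, and 20% sampling fraction are hypothetical; this is not Blackboard Outcomes' implementation): it samples proportionally within each program-location group rather than from the pool as a whole.

```python
# Minimal sketch: stratified sampling of student work by program and
# location, so every group is represented. Data are hypothetical.
import random
from collections import defaultdict

records = [
    {"id": i, "program": p, "location": loc}
    for i, (p, loc) in enumerate(
        [("Business", "Troy"), ("Business", "eTROY"), ("Education", "eTROY")] * 20
    )
]

FRACTION = 0.2  # sample 20% of each stratum

# Group records into strata keyed by (program, location).
strata = defaultdict(list)
for r in records:
    strata[(r["program"], r["location"])].append(r)

# Draw a proportional sample from each stratum, never fewer than one.
sample = []
for key, group in strata.items():
    k = max(1, round(FRACTION * len(group)))
    sample.extend(random.sample(group, k))

print(f"Sampled {len(sample)} of {len(records)} records across {len(strata)} strata")
```

Sampling within each stratum guards against a convenience sample that, for example, draws only on-campus work and misses eTROY sections entirely.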

Assessment data needs to be evaluated, and the results communicated to others, in order for it to inform decisions about programs. Assessment doesn't bring improvements in student learning; analysis and use of the results do. If your assessment data is lying in a corner gathering dust, ask yourself whether the information gathered is useful; if not, figure out why. Assessment results also need to be communicated to others (faculty, students, stakeholders) who can use them to make decisions.

Assessment is only valuable if the results of our analyses are used to make meaningful changes. “Closing the loop” simply means using the data to make changes. These changes need not be huge, but they should be meaningful. Examples include new or modified courses, better coordination among courses or sections, modifications to concentrations, curriculum development grants, new course sequencing or prerequisites, opportunities for remedial work, and new common assignments to address weaknesses.

Be knowledgeable about your program's student learning outcomes.
1. Know the courses that are selected for assessment activity.
2. If you are the instructor of record for assessment courses, know what measures are being used for assessment activity, and USE them.
3. Ensure completion of the assessment activity (it can be a graded or non-graded assignment).
4. Ensure that the assessment activity is evaluated and documented.
5. Maintain this documentation and report findings to your assessment point person (program/department chair).

In maintaining your assessment documents, record the raw number of students who completed the identified assessments satisfactorily or unsatisfactorily. Example: 7/10 students scored 80% or higher (satisfactory) on the diversity project. This information helps faculty make informed decisions and enhance program quality. (See the Troy University HOMER Report Sample.)

Blackboard Outcomes supports:
Curriculum maps
Tagging of questions or assignments to particular learning objectives
Gathering and sampling of assessment data
Analysis of assessment results
And more!
Timeline for implementation at Troy

Questions? Wendy Bailey, Christina Martin, Isabelle Warren