Doing Assessment, not just talking about it ISETL conference October 2009 Philadelphia, PA Steve Culver Office of Academic Assessment Penny Burge Educational Research & Evaluation Virginia Tech

What is Assessment of Student Learning Outcomes? “... the systematic gathering of information about student learning, using the time, resources, and expertise available, in order to improve the learning.” – Walvoord. In short: what we want students to know and be able to do when they leave our programs.

How program outcomes fit in the academic schema (broadest to narrowest): Institutional outcomes → College outcomes → Program outcomes → Course outcomes → Course session outcomes

Setting the Context Federal Government -- Spellings Commission Report: “Whether students are learning is not a yes-or-no question. It’s how? How much? And to what effect?” (Chronicle, ) SACS: “The institution identifies expected outcomes (including student learning outcomes) in its educational programs... and provides evidence of improvement based on analysis of the results.” (SACS Standard ) Professional Accrediting Agencies, Example: ABET: “…describe what students are expected to know and be able to do by the time of graduation; …identify, collect, and prepare data to evaluate the achievement of program outcomes; …results in decisions and actions to improve the program.”

What is the Process for Assessing Student Learning Outcomes? 1. Identify and Articulate Student Learning Outcomes. 2. Gather and Analyze Information About Student Achievement of Outcomes. 3. Use the Information Gathered to Improve Student Learning.

Identifying & Articulating Student Learning Outcomes Where to begin? Conduct a faculty meeting to brainstorm: what would an ideal graduate know, understand, and be able to do? Think of Bloom’s Taxonomy. Consult the websites of professional/disciplinary organizations and other institutions. Start with courses. Identify a list of possible outcomes, knowing the list will change. Be careful with the “understand” outcome; it is difficult to observe and measure directly. Narrow to 3-5 key outcomes to consider first.

Student Learning Outcomes Examples Students will be able to demonstrate the approach, logic, and application of the scientific method and be able to apply these principles to real-life problems. Students will be able to identify, and describe to the lay person, the important institutions and determinants of economic activity at the local, regional, national, and international levels, including the basics of fiscal and monetary policy and how each affects the economy. Students will demonstrate mastery of note-taking techniques by correctly using at least 3 different note-taking methods for classroom lectures.

An important caveat Whatever is measured becomes important.

Measurement lingo Qualitative assessments = more in-depth, contextual information with smaller samples. Quantitative assessments = a broad overview, typically using larger samples. Reliability & validity (quantitative); trustworthiness (qualitative).
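
The slide names reliability without defining a statistic. One common index of internal-consistency reliability for a multi-item instrument is Cronbach’s alpha; the sketch below is purely illustrative (the talk prescribes no tooling or formula) and computes alpha from per-item score columns.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha: internal-consistency reliability of k instrument items.

    item_scores is a list of k columns, one per item, each holding the
    scores every student received on that item.
    """
    k = len(item_scores)
    sum_item_vars = sum(pvariance(col) for col in item_scores)
    totals = [sum(row) for row in zip(*item_scores)]  # each student's total score
    return (k / (k - 1)) * (1 - sum_item_vars / pvariance(totals))
```

With two perfectly consistent items alpha is 1.0; values around 0.7-0.8 or higher are conventionally taken as acceptable.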

Practical tips Don’t have one person do all the work. Cooperate with other departments, even at different schools. Borrow methods and instruments (if they fit) and use existing data where possible (IR, Registrar’s Office, etc.). Do as much as possible in the context of what you are already doing. Make your instrument, assignment, or observation as short as it can be while still providing the information you want. Have a data collection/reporting plan.

Indirect & Direct measures Indirect measures: surveys, student interviews, focus groups, enrollment trends, job placement numbers, % who go on to graduate school, grades. Direct measures: student work samples from course work, portfolios, eportfolios, observation of student behavior, standardized tests, externally reviewed internships, performance on national licensure exams.

The place for grades Course grades provide one source of information about student achievement, but they are typically not direct measures of what students can do in terms of a program outcome. Grades measure overall proficiency in a class: they offer some information about strengths & weaknesses, but they are imperfect measures of specific program outcomes.

Two ways to approach the development of measures #1 – Take an inventory of measures already used in your program (in courses, at entry, at exit, etc.) #2 – Using one of your outcomes, brainstorm at least three ways (at least one direct measure) to measure this outcome.

Lots of data – now what? Turn data into information “We are being pummeled by a deluge of data and unless we create time and spaces in which to reflect, we will be left with only our reactions.” -- Rebecca Blood

Transparency Assessment must be a transparent process. Changes made as a result of the assessment process need to be made public. All constituencies (faculty, students, others) should be involved. There is nothing worse than being asked to fill out surveys and never seeing anything change as a result.

Procedural notes Not all students’ work needs to be scored; a sample will likely suffice. A group of faculty needs to be involved in scoring the products. The same product is used twice: once for the individual student’s class grade, once for program assessment. Course-embedded assessment overcomes motivation issues. The student product must measure, in some way, at least one program outcome.

Confidentiality & the use of student work Institutional Review Board (IRB) Family Educational Rights & Privacy Act of 1974 (FERPA) When to use names/student i.d.’s Involvement of and feedback from students in your process

Student privacy & security Restrict access to student work to departmental faculty and campus administrators involved in assessment. If student work is stored online, ensure security (e.g., use password protection) to limit access to departmental faculty and campus administrators. Whenever possible, do not keep sensitive (e.g., self-stigmatizing) identifiable student work online. Instead of collecting work products from all students, collect a sample. – If one student work product contains sensitive information or the like, replace it with another student work product. – If one student work product gives information about others that violates the rules of consent, replace it with another student work product. – If a student requests that their work not be part of the assessment process, comply.
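
The sampling advice above can be reduced to a few lines. The helper below is a hypothetical sketch (the names are mine, not the presenters’): draw a seeded simple random sample of work products, so the selection is reproducible and documentable, with individual products swappable afterward as the bullets describe.

```python
import random

def sample_work_products(products, n, seed=2009):
    """Draw a reproducible simple random sample of student work for scoring.

    A fixed seed lets the program document exactly how the sample was
    drawn; individual products can still be replaced afterward (e.g., if
    one contains sensitive information or a student opts out).
    """
    rng = random.Random(seed)
    return rng.sample(products, min(n, len(products)))
```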

Anonymity & Confidentiality Anonymize student work: remove any information that identifies individual students (i.e., name and/or ID number) from any student work collected or used for assessment. Consider distributing a consent form to students.
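
A minimal sketch of the de-identification step described above (the field names are assumptions for illustration, not from the talk): strip name/ID fields from each record before it enters the assessment file.

```python
def deidentify(record, id_fields=("name", "student_id")):
    """Return a copy of a student-work record with identifying fields removed.

    id_fields is illustrative; list whatever keys identify students in
    your own records.
    """
    return {field: value for field, value in record.items() if field not in id_fields}
```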

Reporting & communication Report in the aggregate to avoid identifying individual students. Share your process, (aggregated) findings, and plans for program improvement with program faculty, your students, and campus administrators. Appoint a single faculty member knowledgeable of the program’s assessment process to respond to student concerns, complaints, and/or grievances. Include this person’s contact information in the report.
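
Reporting “in the aggregate” just means collapsing individual scores before anything leaves the scoring group. A hypothetical sketch, with data shapes of my own choosing:

```python
from collections import defaultdict
from statistics import mean

def aggregate_by_outcome(scored_work):
    """Collapse (outcome, score) pairs into a mean score per program outcome,
    so reports never expose any individual student's result."""
    by_outcome = defaultdict(list)
    for outcome, score in scored_work:
        by_outcome[outcome].append(score)
    return {outcome: mean(scores) for outcome, scores in by_outcome.items()}
```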

Copyright Students own the copyright to their own work, which gives them the exclusive right to reproduce or distribute it. While it is appropriate for faculty to review student work to assess achievement of educational criteria/expectations, if the evaluation involves copying any or all portions of the work, students should be notified before submitting their work.

In summary... The assessment process is only worth doing if we focus on improvement rather than accountability. Documenting what we do – the conversations, the curricular changes, our thoughtful reflections on teaching & learning – is the accountability piece.