Consider Your Audience


UNDERGRADUATE LEARNING ASSESSMENT Quick Guide (August 2016)
Office of Assessment, DASA, NC State University
http://assessment.dasa.ncsu.edu

Purpose & Basic Process

NC State academic assessment is intended to provide systematic data for the continual enhancement of programs rather than to provide proof for accountability. As such, the process is designed so that faculty determine what is important (curricular outcomes), how it is taught, how it is measured (using direct evidence), how the data are interpreted (identifying strengths and areas for improvement), and what actions, if any, should be taken based on the findings to enhance the curriculum.

In addition to facilitating the academic assessment process, the Office of Assessment evaluates General Education competencies, the THINK (QEP) program, and the co-curricular programs in DASA.

The reporting process is intended to be a snapshot of the assessment within each degree program, providing information about how outcome data are used for decisions by academic leadership in the department, college, and university. Examples of plans and reports can be found here: https://assessment.dasa.ncsu.edu/academic-assessment/undergraduate-academic-assessment-process/

Key Definitions

Direct Evidence: information that requires students to display their knowledge and skills, rather than the student or others indicating that they believe the student learned something (the latter is indirect evidence).

Embedded Assessment: gathering information about student learning that is built into the teaching/learning process. This may include test questions, projects, presentations, and other work that is already part of the course or curriculum.

Value-added Assessment: using pre/post data to determine how much learning has occurred over a specified period of time.

Objective: a broad, general statement of [1] what the program wants students to be able to do and to know, or [2] what the program will do to ensure what students will be able to do and to know (as defined by CUPR faculty at NC State).

Outcome: a more detailed and specific statement derived from the objectives/goals; outcomes are detailed and meaningful enough to guide decisions in program planning and improvement and decisions regarding pedagogy and practice (as defined by CUPR faculty at NC State). Curricular outcomes are less specific than course outcomes and need to be defined so that the content elements within the outcome can be measured, providing information about strengths and areas for improvement.

Consider Your Audience

When determining your outcomes, the evidence to be collected, and your analysis strategies, it is important to take your audience into account. Think about how they will respond to each part of the process and whether your procedures meet both the spirit of the process and its requirements. What will each audience require to support the interpretation of your findings and any decisions made based on the data? Do they agree with the curricular outcomes? What type of data will they expect? What projects, and from which courses?

Potential audiences:
- Program Faculty
- Program, Department, or College
- Department Head
- Associate Dean/Dean
- Office of Assessment
- Accreditors: SACS/COC (Regional Accreditor) and program accreditors

Basic Requirements

Each degree program and transcripted certificate:
1. Will have a set of comprehensive student learning outcomes (often 4 to 7) which are measurable and can all be assessed within a 3 to 5 year cycle;
2. Will use direct measures of learning that are specifically aligned with the outcomes;
3. Will analyze the data at a level that allows for the identification of strengths and areas for improvement within the curricular outcome (e.g., no holistic rubric scores and no test or course grades, but scores for elements within the rubric, or sets of test questions mapped to an element of the outcome, are suited for analysis; see the sketch after this list);
4. Will make clear decisions based on the data collected (e.g., a change to the curriculum, a change to a course, a change to an assignment, or the determination that no change is currently necessary).
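To make requirement 3 concrete, here is a minimal sketch in plain Python. The question IDs, content elements, and student responses are hypothetical placeholders (not NC State data); the point is only to show test questions mapped to the elements of one outcome and reported as a percent correct per element rather than as a single test or course grade.

```python
# Minimal sketch of requirement 3: report results for elements within an
# outcome rather than a single test or course grade. The question IDs,
# element names, and student responses are hypothetical placeholders.

# Map each content element of one outcome to the exam questions that measure it.
QUESTION_MAP = {
    "interpret data displays": ["Q1", "Q4", "Q7"],
    "select an appropriate method": ["Q2", "Q5"],
    "communicate conclusions": ["Q3", "Q6", "Q8"],
}

# One record per student: question ID -> 1 (correct) or 0 (incorrect).
students = [
    {"Q1": 1, "Q2": 1, "Q3": 0, "Q4": 1, "Q5": 0, "Q6": 1, "Q7": 1, "Q8": 0},
    {"Q1": 1, "Q2": 0, "Q3": 1, "Q4": 0, "Q5": 0, "Q6": 1, "Q7": 1, "Q8": 1},
    {"Q1": 0, "Q2": 1, "Q3": 0, "Q4": 1, "Q5": 1, "Q6": 0, "Q7": 1, "Q8": 0},
]

def percent_correct_by_element(records, question_map):
    """Aggregate question-level scores into a percent correct for each element."""
    summary = {}
    for element, questions in question_map.items():
        answers = [record[q] for record in records for q in questions if q in record]
        summary[element] = 100 * sum(answers) / len(answers)
    return summary

for element, pct in percent_correct_by_element(students, QUESTION_MAP).items():
    print(f"{element}: {pct:.0f}% correct")
```

An element-level summary like this makes visible which element is a strength and which is an area for improvement, distinctions that a single exam average would hide.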

Assessment Cycle Timeline

The cycle moves through four stages (collecting evidence at the program level, analysis, decisions, and reporting) on the following schedule:

Spring 2017 Report
- Fall 2015, Spring 2016, and/or Fall 2016: collect and analyze data from the appropriate* semesters
- Fall 2016 through January 2017: faculty review the data and make decisions
- January-February 2017: write the Spring 2017 report
- February 28, 2017: Spring 2017 report due

Spring 2018 Report
- Fall 2016, Spring 2017, and/or Fall 2017: collect and analyze data from the appropriate* semesters
- Fall 2017 through January 2018: faculty review the data and make decisions
- January-February 2018: write the Spring 2018 report
- February 28, 2018: Spring 2018 report due

Spring 2019 Report
- Fall 2017, Spring 2018, and/or Fall 2018: collect and analyze data from the appropriate* semesters
- Fall 2018 through January 2019: faculty review the data and make decisions
- January-February 2019: write the Spring 2019 report
- February 28, 2019: Spring 2019 report due

* Appropriate semesters of data for any report may include data from the previous few semesters or years, depending on the assessment cycle, when courses were taught, and the size of the program. The same data cannot be used to measure the same outcome more than once.

Collecting Evidence: Program Level

Many options for collecting evidence do not involve additional measures; for example, data from an upper-level course (test questions, projects, etc.) can be used. The intention is not to assess the course or the faculty member, but to assess the curriculum. By the time students reach upper-level or capstone courses, they are usually displaying knowledge gained throughout the curriculum when creating projects or completing other tasks, and such courses often have assignments that can be used to measure multiple, if not all, curricular outcomes. Some common ways faculty collect evidence of student learning for curricular assessment:

Comprehensive Discipline Exam: Can be created in-house or by using a national instrument. The vital points to keep in mind are how you will retrieve the data upon completion and that you are provided with more than holistic scores, so that you can determine strengths and areas for improvement.

Test Questions: Some faculty take groups of test questions and map them to the content elements within an outcome. This provides evidence at the appropriate level to determine strengths and areas for improvement.

Rubric: "a scoring tool that lays out the specific expectations for an assignment" (Stevens & Levi, 2005, p. 3). For assessment of the curriculum, a rubric is a way of organizing criteria to systematically determine whether the outcome is met by articulating the key elements within the outcome. Faculty often assess curricular outcomes by applying rubrics to activities in a senior-level or capstone course, such as presentations, capstone projects, portfolios, research papers, or case studies. There are multiple types of rubrics with varying levels of detail; the common types are checklist, rating scale, descriptive, holistic, and structured observation guide. For a presentation on creating, applying, and analyzing rubric data, please see the Rubric Presentation.

Analysis

Data must be reported at a level such that strengths and areas for improvement within the outcome can be identified. Overall means or other holistic scores, such as grades, do not allow for the identification of either. For information on how to report data at the appropriate level, please see the Assessment Data Presentation. When using a rubric, data should be presented at the item/element level (i.e., for each aspect of the rubric), as in the sketch below.
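Here is a minimal sketch of element-level rubric reporting in plain Python (not an NC State tool); the rubric criteria, the 1-4 rating scale, and the student scores are hypothetical. For each rubric criterion it prints a mean together with the frequency of each rating level.

```python
# Minimal sketch of element-level rubric reporting: a mean plus a frequency
# distribution for each rubric criterion, instead of one holistic score.
# The criteria, the 1-4 rating scale, and the scores are hypothetical.
from collections import Counter

RATING_SCALE = [1, 2, 3, 4]  # e.g. 1 = beginning ... 4 = exemplary

# One row per student: a capstone project scored on every rubric criterion.
scores = [
    {"thesis": 3, "evidence": 2, "organization": 4, "citations": 2},
    {"thesis": 4, "evidence": 3, "organization": 3, "citations": 1},
    {"thesis": 3, "evidence": 2, "organization": 4, "citations": 2},
    {"thesis": 2, "evidence": 3, "organization": 3, "citations": 3},
]

for criterion in scores[0]:
    ratings = [row[criterion] for row in scores]
    mean = sum(ratings) / len(ratings)
    counts = Counter(ratings)
    distribution = ", ".join(
        f"{level}: {100 * counts[level] / len(ratings):.0f}%" for level in RATING_SCALE
    )
    print(f"{criterion:<12}  mean {mean:.2f}  ({distribution})")
```

In this hypothetical output, the citations criterion surfaces as the area for improvement even though a holistic project grade might look acceptable.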
When reporting means, frequencies/percentages should also be provided, as these can help uncover patterns or trends useful in highlighting strengths and areas for improvement.

Decisions

For each area for improvement, the report must include new actions that the program has already begun to implement to improve students' achievement of the specific outcome, such as "We developed…," "We revised…," or "We implemented our plan to…" Note the use of past tense; please do not use future-tense or indecisive language such as "We are considering…" or "We may…" If there is no evidence that any change is needed, it is fine to say something like "Based on the data, no changes are needed at this time." Not all needed changes are big: some changes will be things like additional class time, materials, assignments, or practice on a given topic in an existing course.

Reporting

The report is just a snapshot of the process; it does not need to be long to demonstrate that the faculty are engaged. Clear alignment from the outcome to the decisions is very important. Be sure it is apparent how the method used measures that particular outcome, how the data identify strengths and areas for improvement, and how any areas for improvement (and sometimes strengths) were addressed with a specific decision.

Resources/References

NC State University Office of Assessment (https://assessment.dasa.ncsu.edu)
Suskie, L. (2010). Assessing Student Learning: A Common Sense Guide.