Beginner’s Guide to Assessment

Presentation transcript:

Beginner’s Guide to Assessment
NACADA 2017, St. Louis, MO
Matt Dunbar & Theresa Duggar

Introduce speakers.

Learning Outcomes
Recognize different types of assessment
Identify common techniques for assessing advising programs
Judge the applicability of assessment methods
Develop questions to further your knowledge of assessment

Demonstrating effectiveness is crucial for advising offices as we seek to serve our students better and meet the accountability calls from stakeholders at all levels. In this session, participants will learn basic assessment types, lenses, and applications. Participants will also experience parts of the assessment process during the program.

Outline
Introductions
Defining “Assessment”
Why Assess?
The Assessment Cycle
Types of Assessment
Lenses of Assessment
Q/A (a summative, value-added assessment)

The presentation will focus on the different types and methods of assessment and the different lenses assessment personnel utilize. The goal of each section is to provide participants with enough knowledge to ask informed questions and seek out more in-depth learning; the sections are not intended to be comprehensive.

Who’s in the Room?
Name, institution, comfort/familiarity with assessment

Ask attendees their familiarity or comfort level with assessment to gauge the direction of the conversation.

Defining Assessment
“Any effort to gather, analyze, and interpret evidence which describes program effectiveness.” (Upcraft & Schuh, 1996)
“An ongoing process aimed at understanding and improving _______.” (Angelo, 1995)

To ground our discussion in the literature, we will primarily utilize Upcraft and Schuh’s (1996) work from Assessment in Student Affairs. Their work provides approachable, straightforward language for those who have not done assessment previously, and their book is a springboard for further discussions of assessment. In this presentation, Upcraft and Schuh’s definitions of terms will inform the majority of the discussion.

“. . . a lack of assessment data can sometimes lead to policies and practices based on intuition, prejudice, preconceived notions, or personal proclivities - none of them desirable bases for making decisions.” (Upcraft & Schuh, 2002, p. 20)

Why Assess?
Why should you? Why do you? Why don’t you?

Advisors are a part of the University process! The goal of assessment is development (improvement, getting better, better meeting students’ needs, etc.), and data provide the information to do that. The literature points to the importance of advising for student success, thriving, persistence, graduation, and more.

Reasons for assessment:
To evaluate attainment of goals (effectiveness)
To engage faculty, students, administration, and staff in continuous, systematic cycles of question asking, feedback, and refinement (improvement)
To generate information for stakeholders (accountability)
To serve the university’s mission (student success: retention, progression, graduation - RPG)
To enhance advisement, student learning, and program design

The Assessment Cycle

Types of Assessment
Qualitative vs. Quantitative
Direct vs. Indirect
Formative vs. Summative

Qualitative
Dealing with quality, value, words, ideas
“What did you learn at the advisement session?”
“How do you feel about the advisement session?”

Some methods include:
Interviews
Observations
Focus groups
Open-ended questionnaires
Documents

Quantitative
Dealing with numbers, statistics, counts
“How strongly do you (dis)agree with the following statement?”
“Rate the following skills from 1-10.”

Some methods include:
Experiments
Closed questions on a questionnaire (rating scales)
National Survey of Student Engagement (NSSE)
CORE Institute Alcohol and Drug Survey
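
Since the quantitative examples above center on rating scales, here is a minimal sketch of how such responses might be summarized. The question wording and the ratings are hypothetical, not data from the presentation:

```python
from statistics import mean, stdev
from collections import Counter

# Hypothetical responses to "Rate your advisement session from 1-10."
ratings = [7, 9, 8, 6, 10, 8, 7, 9, 5, 8]

print(f"n = {len(ratings)}")
print(f"mean = {mean(ratings):.1f}, sd = {stdev(ratings):.1f}")

# Frequency distribution, lowest rating first
print("distribution:", dict(sorted(Counter(ratings).items())))
```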

Questions to pose to attendees - have them guess qualitative or quantitative:
What kind of emotions and attitudes motivate individuals to participate in advisement? (qualitative)
How often do students participate in advisement? (quantitative)
How regularly do students go home for holiday breaks? (quantitative)
How do juniors living in housing perceive their situation, and how are they dealing with it? (qualitative)

Direct
Measures learning in specific ways by having students demonstrate knowledge
“How many guests may you sign into the residence hall at one time?”
“Where is Campus Life located?”

Examples:
Direct observation of advising interactions
Pretest/posttest
Standardized tests of students’ learning
Counts of use of advising appointments
Advisor-to-student ratios
Portfolios
Performance appraisals

Advantages:
Require students to demonstrate knowledge
Provide data that directly measure achievement of expected outcomes
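
One of the direct measures listed above, pretest/posttest, comes down to a paired before-and-after comparison. A minimal sketch, with invented scores on a hypothetical quiz about campus policies:

```python
# Hypothetical scores for the same five students on a policy quiz,
# given before (pre) and after (post) an advisement session.
pre = [40, 55, 60, 35, 70]
post = [65, 80, 75, 60, 85]

gains = [after - before for before, after in zip(pre, post)]
avg_gain = sum(gains) / len(gains)
print(f"per-student gains: {gains}")
print(f"average gain: {avg_gain:.1f} points")
```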

Indirect
Captures reflections on learning and feelings about the experience
“How do you plan to use today’s advisement in your daily life?”
“What emotions describe how you feel after today’s session?”

Examples:
Focus groups
Interviews
Surveys/questionnaires
Tracking students’ perceptions (ratings of advisors, ratings of service, advisor satisfaction)
Tracking advisors’ perceptions (student preparedness, estimation of student learning)
Archival data
Final grades (indirect because they include adjustments not related to learning outcomes, such as extra credit or penalties for unexcused absences)

Advantages:
Ask students to reflect on their learning
Provide clues about what could be assessed directly
Easy to administer
Particularly useful for ascertaining values and beliefs

Questions to pose to attendees - have them guess direct or indirect:
Data was gathered using pretests and posttests at new student orientation. (direct)
Data was gathered from students’ journals of their advisement experiences. (direct)
Data was gathered from an exit survey. (indirect)
Data was gathered from graduation and retention rates. (indirect)

Formative
Gathering information in order to shape our future work
“How comfortable are you with the student organization registration system?”
“Do you currently work on campus?”

Formative assessments:
Help students identify their strengths and weaknesses and target areas that need work
Help advisors recognize where students are struggling and address problems immediately
Help advisors assess their own progress

Types of formative assessment:
Observations
Homework
Reflection journals
Advising appointments
Student feedback

Summative
Following up to see where participants ended up

Types of summative assessment:
Examinations
Portfolios
Student evaluation of advisement
Advisor self-evaluation

Questions to pose to attendees - have them guess formative or summative:
Advising appointments. (formative)
Final exam in a senior seminar course. (summative)
Asking students to submit one to two sentences identifying the main point of the advisement meeting. (formative)
Advising portfolios. (summative)

An advising portfolio might include an advising philosophy, advising goals/objectives, advisee demographics, advising responsibilities, evidence of growth, and an essay about the artifacts.

Assessment Lenses
Multiple lenses of assessment:
Criterion: comparing to a goal or set of requirements (criteria)
Benchmark: comparing to peers, standards, or others
Longitudinal: comparing to our past selves
Value-added: comparing to where we were before the experience

Assessment Lenses
Hypothetical scenario: 66% of students register for courses on time on their registration day.

Criterion
Key concept: How did we do against a predetermined standard?
In action: The institution set a goal that every student (100%) will register on time on registration day.

Benchmark
Key concept: How do we compare to our peers?
In action: Nationwide, institutions report that 80% of students register for courses on time.

Longitudinal
Key concept: Have we improved or declined?
In action: On-time registration has improved by 15% over the last three years. What caused the change? Did we actively impact this increase?

Value-Added
Key concept: Think about the outcomes you want to achieve. Is the targeted population better off?
In action: More students are registering on time. What benefits does this entail? Advisors are effective; departments can prepare future faculty and semester schedules; institutions can report higher and more accurate enrollment numbers; etc.
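
To tie the lenses back to the hypothetical scenario, the first three comparisons are simple arithmetic on the same 66% figure. A minimal sketch, assuming the 15% longitudinal improvement means percentage points (so the rate three years ago would have been 51%):

```python
on_time = 0.66          # scenario: share of students registering on time
criterion = 1.00        # institutional goal: 100% register on time
benchmark = 0.80        # hypothetical nationwide figure from the slides
three_years_ago = 0.51  # implied by the 15-point improvement (assumption)

print(f"criterion gap: {on_time - criterion:+.0%}")              # -34%
print(f"benchmark gap: {on_time - benchmark:+.0%}")              # -14%
print(f"longitudinal change: {on_time - three_years_ago:+.0%}")  # +15%
```

The value-added lens resists this kind of arithmetic: it asks what the change is worth to students and the institution, not how large it is.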

Summary
Assessment is important: Why do we assess?
Types of assessment: What are you comfortable with? What type of data fulfills your needs?
Lenses of assessment: It is important to apply context to data.

Activity: Start Your Assessment Process!
University Mission
Advising Goals
Center Mission
Center Goals
Student Learning Outcomes
Assessment Plan

Questions?