Assessment Methods
Office of Institutional Research, Planning, and Assessment
Neal F. McBride, Ed.D., Ph.D., Associate Provost

How would you assess these SLOs?
SLO 1: Graduates are able to critique a brief draft essay, pointing out the grammatical, spelling, and punctuation errors, and offer appropriate suggestions to correct identified deficiencies.
SLO 2: Senior undergraduate psychology majors perform above the national average on the GRE Psychology Subject Test.
Method for SLO 1: In a “capstone course” during the final semester prior to graduation, students are required to critique a supplied essay containing predetermined errors; the critiques are evaluated by a 3-person faculty panel (criterion: appropriate suggestions to remediate 90% of the errors).
Method for SLO 2: The GRE Psychology Subject Test, completed during the senior year and required for graduation; compare average GRE Psychology Subject Test scores with the average scores of all examinees nationwide.
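A minimal Python sketch of how the second method might be carried out, using invented cohort scores and a placeholder national average rather than real ETS figures:

    # Minimal sketch (hypothetical numbers): checking whether seniors perform
    # above the national average on the GRE Psychology Subject Test.
    from statistics import mean, stdev

    program_scores = [640, 610, 700, 580, 660, 690, 620, 650]  # invented cohort scores
    national_average = 615  # placeholder; use the published national mean for the test year

    cohort_mean = mean(program_scores)
    print(f"Cohort mean: {cohort_mean:.1f} (n={len(program_scores)}, sd={stdev(program_scores):.1f})")
    print(f"National average: {national_average}")
    print("Outcome met" if cohort_mean > national_average else "Outcome not met")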

Assessment Methods
Assessment methods are ways to ascertain (“measure”) student achievement levels associated with stated student learning outcomes (SLOs).
“Outcome” is a generic term for goals, objectives, and/or aims.

Basis for Selecting Appropriate Assessment Methods
Mission/Vision → University Student Outcomes → Student Learning Outcomes → Assessment Methods
A specific assessment method (or methods) is selected for a specific outcome... “How do I ‘measure’ this outcome?”

Assessment Methods
Assessment methods include both direct and indirect approaches... We’ll define these terms in a few minutes. First, let’s explore a few criteria or considerations to keep in mind as you select appropriate assessment methods...

Qualitative Versus Quantitative Methods
Qualitative assessment: collects data that do not lend themselves to quantitative methods but rather to interpretive criteria; the “data” or evidence are often representative words, pictures, descriptions, examples of artistic performance, etc.
Quantitative assessment: collects representative data that are numerical and lend themselves to numerical summary or statistical analysis.
Programs are free to select assessment methods appropriate to their discipline or service... choices must be valid and reliable.
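To make the distinction concrete, here is a minimal sketch using invented data: numeric scores support statistical summary, while qualitative evidence is first coded into interpretive categories and, at most, tallied.

    # Minimal sketch (invented data): quantitative vs. qualitative evidence.
    from statistics import mean
    from collections import Counter

    # Quantitative evidence: numeric scores lend themselves to statistical summary.
    exam_scores = [78, 85, 92, 67, 88, 74]
    print(f"Mean exam score: {mean(exam_scores):.1f}")

    # Qualitative evidence: interpretive categories; coded themes can only be tallied.
    interview_codes = ["applies theory", "cites sources", "applies theory", "weak synthesis"]
    print(Counter(interview_codes).most_common())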

Valid and Reliable Methods
Valid: the method is appropriate to the academic discipline and measures what it is designed to measure.
Reliable: the method yields consistent data each time it is used, and persons using the method are consistent in implementing the method and interpreting the data.
Basic aim: “defensible methods”
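One practical check on rater consistency is inter-rater agreement: when two faculty members score the same student artifacts, how often do their ratings match? A minimal sketch with hypothetical rubric scores, using simple percent agreement (a chance-corrected statistic such as Cohen’s kappa is another common choice):

    # Minimal sketch (hypothetical scores): percent agreement between two raters
    # scoring the same ten student artifacts on a 4-point rubric.
    rater_a = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
    rater_b = [3, 4, 2, 2, 1, 4, 3, 3, 4, 3]

    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    agreement = matches / len(rater_a)
    print(f"Exact agreement: {agreement:.0%}")  # 80% here; low agreement signals a need for rater training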

Locus of Assessment
Embedded assessment: “measurement” strategies included as part of the requirements within existing courses, internships, or other learning experiences; “double duty” assessment, e.g., “critical assignments”
Ancillary assessment: “measurement” strategies added on, or in addition to, requirements within existing courses, internships, or other learning experiences; “additional duty” assessment

Sources for Finding Assessment Methods
- Professional associations and organizations
- Other programs/departments at CBU
- Similar programs/departments at other universities
- Published resources, e.g., Dunn, D. S., Mehrotra, C. M., & Halonen, J. S. (2004). Measuring Up: Educational Assessment Challenges and Practices for Psychology. Washington, DC: APA.
- The web... in general or for your specific area
- A literature search by a professional librarian
- Personal experience, yours or your colleagues’

When selecting any assessment method, here are some questions to consider carefully:
- Does it “fit” the SLO?
- Did the faculty or student services staff select the method, and are they willing to participate in its use?
- Will all students in the program or provided service be included in the assessment (ideally, yes) or a sample of students (maybe)?
- How much time is required to complete the assessment method? Determine how this affects faculty, staff, and students.

- When and where will the assessment be administered?
- Are there financial costs? Are program and/or university resources available?
- Is the method used at one point in time (cross-sectional method) or with students over several points in time (longitudinal method)?
- Do the program faculty/staff have the skills and/or knowledge necessary to use the method and analyze the results?
- Most importantly... WHO is responsible for making certain the assessment is accomplished?

TIP
Ideally, as you write or rewrite SLOs, keep in mind the question: “What method(s) can I use to assess this SLO?”
Why is this tip potentially useful?

Direct Methods
Direct assessment methods are “measurement” strategies that require students to actively demonstrate achievement levels related to institutional and program-specific learning outcomes.
Direct assessment methods focus on collecting evidence of student learning or achievement directly from students, using work they submit (assignments, exams, term papers, etc.) or by observing them as they demonstrate learned behaviors, attitudes, skills, or practice.

Direct Methods: Examples
Capstone or senior-level projects, papers, presentations, performances, portfolios, or research evaluated by faculty or external review teams... effective as assessment tools when the student work is evaluated in a standard manner, focusing on student achievement of program-level outcomes
Exams: locally developed comprehensive exams or entry-to-program exams, national standardized exams, certification or licensure exams, or professional exams
Internship or practicum: evaluations of student knowledge and skills from internship supervisors, faculty overseers, or from student participants themselves; this may include written evaluations from supervisors focused on specific knowledge or skills, or evaluation of student final reports or presentations from internship experiences

Direct Methods, Continued
Portfolios (hard-copy or web-based): reviewed by faculty members from the program, faculty members from outside the program, professionals, visiting scholars, or industrial boards
Professional jurors or evaluators: engaged to evaluate student projects, papers, portfolios, exhibits, performances, or recitals
Intercollegiate competitions: useful for assessment when students are asked to demonstrate knowledge or skills related to the expected learning outcomes within appropriate programs
Course assessments: projects, assignments, or exam questions that directly link to program-level expected learning outcomes and are scored using established criteria; common assignments may be included in multiple sections taught by various professors (assuming prior agreement)

Direct Methods: Advantages
- Require students to actively demonstrate knowledge, attitudes, and/or skills
- Provide data that directly measure expected outcomes
- Demand less abstract interpretation
- Are usually “easier” to administer
Direct methods are always our first choice; indirect methods support but cannot replace direct methods.

Achievement Levels or Criteria
- Rarely does every student achieve all SLOs completely (100%), nor can we expect this
- What “level” of achievement is acceptable? This is identified in the “OPlan”
- Rubrics recognize varying achievement levels
- Rubrics are a scoring method or technique appropriate to many assessment methods

A Rubric Example
Outcome: Correctly analyzes research data
- Novice (1): Limits analysis to correct basic descriptive analysis
- Developing (2): Selects and executes correct basic statistical analyses
- Proficient (3): Selects, articulates, and executes an inferential statistical analysis
- Accomplished (4): Selects, articulates, and executes the statistical analysis suitable to the research question
Excellent resource: Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus.
CBU utilizes 4-point rubrics, with the specific level criteria appropriate to the outcome in question.
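A minimal sketch, with invented ratings, of how a program might record 4-point rubric scores for this outcome and summarize the distribution of achievement levels (the 80% criterion mentioned in the comment is purely illustrative):

    # Minimal sketch (hypothetical ratings): summarizing 4-point rubric scores.
    from collections import Counter

    LEVELS = {1: "Novice", 2: "Developing", 3: "Proficient", 4: "Accomplished"}

    # Invented faculty ratings for "Correctly analyzes research data"
    ratings = [3, 2, 4, 3, 1, 3, 4, 2, 3, 4]

    counts = Counter(LEVELS[r] for r in ratings)
    for level in LEVELS.values():
        n = counts.get(level, 0)
        print(f"{level:13s}: {n:2d} students ({n / len(ratings):.0%})")

    # A program might define "acceptable" as, say, 80% of students at Proficient or above.
    proficient_or_above = sum(r >= 3 for r in ratings) / len(ratings)
    print(f"Proficient or above: {proficient_or_above:.0%}")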

Guidelines for Implementing Embedded, Direct Assessment
- Link class assignments to both SLOs and course objectives
- If multiple sections of the same course exist and the intent is to aggregate data across sections, ensure that the assessment is the same in all sections (same assignment and grading process)
- Make certain faculty collaboration underpins assessment across multiple course sections
- Tell students which assignment(s) is being used for SLO assessment as well as course assessment... Why?

Indirect Methods
Indirect methods require the faculty and student life staff to infer actual student abilities, knowledge, and values rather than observing direct evidence of learning or achievement.
Indirect assessment gathers information through means other than looking at actual samples of student work... e.g., surveys, exit interviews, and focus groups.
Indirect methods provide the perceptions of students, faculty, or other people (often alumni or employers) who are interested in the program, service, or institution.
Indirect methods expand on or confirm what is discovered after first using direct methods.

Indirect Methods, Continued
Exit interviews and student surveys: to provide meaningful assessment information, exit interviews and/or student surveys should focus on students’ perceived learning (knowledge, skills, abilities) as well as students’ satisfaction with their learning experiences, including such things as internships, participation in research, independent projects, numbers of papers written or oral presentations given, and familiarity with discipline tools

Indirect Methods, Continued
Faculty surveys: aimed at getting feedback about faculty perceptions of student knowledge, skills, values, academic experiences, etc.
Alumni surveys: aimed at evaluating perceptions of the knowledge, skills, and values gained while studying in a particular program... surveys frequently target alumni who are 1 and 5 years post-graduation and include program-specific questions

Indirect Methods, Continued
Surveys of employers/recruiters: aimed at evaluating specific competencies, skills, or outcomes
Tracking student data: data related to enrollment, persistence, and performance... may include graduation rates, enrollment trends, transcript analysis (tracking what courses students take and when they take them), and tracking student academic performance overall and in particular courses
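A minimal sketch of the kind of tracking this slide describes, using entirely hypothetical records and field names (the course code PSY101 is invented):

    # Minimal sketch (hypothetical records): a cohort's graduation rate and
    # mean performance in a particular course.
    cohort = [
        {"id": 1, "graduated": True,  "PSY101": 3.7},
        {"id": 2, "graduated": False, "PSY101": 2.3},
        {"id": 3, "graduated": True,  "PSY101": 3.0},
        {"id": 4, "graduated": True,  "PSY101": 3.3},
    ]

    grad_rate = sum(s["graduated"] for s in cohort) / len(cohort)
    mean_psy101 = sum(s["PSY101"] for s in cohort) / len(cohort)
    print(f"Graduation rate: {grad_rate:.0%}")
    print(f"Mean PSY101 grade: {mean_psy101:.2f}")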

Indirect Methods, Continued
External reviewers provide peer review of academic programs; the method is widely accepted for assessing curricular sequences, course development and delivery, as well as faculty effectiveness... using external reviewers is a way to assess whether student achievement reflects the standards set forth in student learning and capacity outcomes... skilled external reviewers can be instrumental in identifying program strengths and weaknesses, leading to substantial curricular and structural changes and improvements

Indirect Methods, Continued
Curriculum and syllabus analysis: examining whether the courses and other academic experiences are related to the stated outcomes... often accomplished in a chart or “map.”
Syllabus analysis is an especially useful technique when multiple sections of a course are offered by a variety of instructors... it provides assurance that each section covers essential points without prescribing the specific teaching methods used in helping the students learn the outcomes.

Indirect Methods, Continued
Keeping records or observing students’ use of facilities and services... these data can be correlated with test scores and/or course grades.
Example: logs maintained by students or staff members documenting time spent on course work, interactions with faculty and other students, internships, and the nature and frequency of use of the library, computer labs, etc.

Advantages of Indirect Methods
- Relatively easy to administer
- Provide clues about what could/should be assessed directly
- Able to flesh out subjective areas that direct assessments cannot capture
- Particularly useful for ascertaining values and beliefs
- Surveys can be given to many respondents at the same time

Indirect Methods Advantages, Continued
- Surveys are useful for gathering information from alumni, employers, and graduate program representatives
- Exit interviews and focus groups allow questioning students face-to-face; exploring and clarifying answers is done more easily
- External reviewers can bring objectivity to assessment and answer questions the program or department wants answered, or questions based on discipline-specific national standards

Disadvantages of Indirect Methods
- Indirect methods provide only impressions and opinions, not “hard” evidence of learning
- Impressions and opinions may change over time and with additional experience
- Respondents may tell you what they think you want to hear
- Survey return rates are often low and, consequently, not representative
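As a small illustration of the return-rate concern, a sketch with invented numbers that computes a survey response rate before results are reported; the 30% threshold is purely illustrative, not a standard from this presentation:

    # Minimal sketch (invented numbers): survey response rate as a quick
    # representativeness check.
    invited = 250        # alumni who received the survey
    responded = 62       # completed responses

    response_rate = responded / invited
    print(f"Response rate: {response_rate:.0%}")
    if response_rate < 0.30:  # illustrative threshold
        print("Caution: results may not represent the full population.")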

Indirect Methods Disadvantages, Continued
- You cannot assume those who did not respond would have responded in the same way as those who did respond
- Exit interviews take considerable time to complete
- Focus groups usually involve a limited number of respondents who are not representative
- Unless the faculty agree upon the questions asked during exit interviews and focus groups, there may not be consistency in responses

Suggestions for Implementing Indirect, Ancillary Assessment
- Use “purposeful samples” when it is not possible to include all students (including all students is always the first choice)
- Offer incentives to participants
- Anticipate low turnout and therefore over-recruit
- Plan logistics and question design carefully (i.e., for surveys, interviews, and focus groups)
- Train group moderators and survey interviewers

Implementation Suggestions, Continued
- Consider using web-based or telephone interviews or focus groups as well as face-to-face ones
- Set time limits for focus groups and interviews
- Develop and provide very careful, explicit directions
- Be mindful of FERPA regulations when using archival records
- Only use archival records that are relevant to specific outcomes

Implementing Assessment in General
- Capitalize on what you are already doing
- Integrate embedded assessment as much as possible
- Schedule ancillary assessment during regular class times or times when students are present
- Make assessment a graduation requirement
- Plan an “assessment day”
- Seek to make assessment a routine activity within your curriculum or student services programs

REVIEW: Assessment Strategy Combinations
Depending on the specific SLO, there are four assessment strategies or frames:
- Embedded, direct assessment
- Embedded, indirect assessment
- Ancillary, direct assessment
- Ancillary, indirect assessment
REMEMBER: There is more than one way to assess any given SLO! It’s your choice as long as it is valid and reliable.