
1 Assessment of Academic Advising: A Primer
The presenters acknowledge and appreciate the contributions of NACADA colleagues Susan Campbell, Charlie Nutt, Mike Kirk-Kuwaye, Lynn Higa, Tom Grites, and Eric White in preparation of materials for this presentation.
NACADA Executive Office, Kansas State University, 2323 Anderson Ave, Suite 225, Manhattan, KS 66502-2912
Phone: (785) 532-5717  Fax: (785) 532-7732  e-mail: nacada@ksu.edu
© 2010 National Academic Advising Association. The contents of all material in this presentation are copyrighted by the National Academic Advising Association, unless otherwise indicated. Copyright is not claimed as to any part of an original work prepared by a U.S. or state government officer or employee as part of that person's official duties. All rights are reserved by NACADA, and content may not be reproduced, downloaded, disseminated, published, or transferred in any form or by any means, except with the prior written permission of NACADA, or as indicated below. Members of NACADA may download pages or other content for their own use, consistent with the mission and purpose of NACADA. However, no part of such content may otherwise or subsequently be reproduced, downloaded, disseminated, published, or transferred, in any form or by any means, except with the prior written permission of, and with express attribution to, NACADA. Copyright infringement is a violation of federal law and is subject to criminal and civil penalties. NACADA and National Academic Advising Association are service marks of the National Academic Advising Association.
The Global Community for Academic Advising

2 This General Session serves as an overview of the assessment process for academic advising… …a more in-depth, applied opportunity will be offered as a Topical Session on Assessment on XX from XX - XX

3 Agenda
Understanding Assessment
Definitions of Assessment
Purposes for Doing Assessment
Evaluation versus Assessment
Some Key Terms
Engaging in Assessment
Use of outcome data

4 Assessment “Assessment is a process that focuses on student learning, a process that involves reviewing and reflecting on practice as academics have always done, but in a more planned and careful way” (Ewell, 2000)

5 Assessment “Assessment is an ongoing process of collecting information* that is aimed at understanding and improving student learning and personal development” (Angelo, 1995) * what we like to call “evidence”

6 Assessment “Assessment is the systematic collection, review, and use of information about educational programs* undertaken for the purpose of improving student learning* and development*” (Marchese, 1993) * Advising is part of the educational process, not simply a “service”

7 Assessment “Assessment is the means used to measure the outcomes of education and the achievement of students with regard to important competencies” (Pellegrino, Chudowsky, and Glaser, 2001)

8 Assessment “Assessment reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time” (Banta, 1996)

9 Assessment
For Academic Advising… "Assessment is the process through which we gather evidence about the claims we are making with regard to student learning and the process/delivery of academic advising in order to inform and support improvement" (Campbell, 2008)

10 What Is Assessment – The Intentions
Assessment is intended to be a positive process, yet its connotations are often negative
The focus has often been on activities that demonstrate accountability to the exclusion of those that are aimed at improvement

11 Definitions of "Quality" have often been externally defined, reinforcing the accountability focus
Marketability
Productivity
Efficiency
Effectiveness

12 Assessment Has Multiple Purposes
Program effectiveness
Program improvement
Program accountability
Activities aimed at both improvement and accountability are important
Most compelling purpose is "institutional curiosity" (Maki, 2002, 2004) – i.e., student learning and student achievement

13 What Assessment is NOT
Assessment is NOT episodic
Assessment is NOT just about measurement
Assessment is NOT about evaluating the performance of an individual staff / faculty / student
Assessment is NOT solely an administrative process
Assessment is NOT easy or quick

14 Assessment is
An on-going cycle of activity
A gathering of a variety of information and data
Using this feedback for improvement of individual or program performance
A team effort with faculty, staff, and administrators actively engaged
A complex process of comparison

15 Goals of Assessment
Improving academic advising: process, delivery, programs
Enhancing student success: persistence, retention

16 "…a lack of assessment data can sometimes lead to policies and practices based on intuition, prejudice, preconceived notions, or personal proclivities – none of them desirable bases for making decisions" Upcraft and Schuh (2002, p. 20)

17 Assessment or Evaluation? What Distinguishes Assessment from Evaluation?
evaluation usually measures advisor effectiveness
assessment usually measures programmatic outcomes
evaluation of individual performance and evaluation of effectiveness of processes may be used as part of an overall assessment designed to measure program outcomes

18 Assessment or Evaluation? What Distinguishes Assessment from Evaluation?
assessment should be continuous and embedded in the culture, while evaluation is episodic
assessment focuses on programmatic issues, while evaluation focuses on individual performances of advisors

19 Advisor Evaluation Topical Session on XX from XX – XX

20 The Assessment Cycle (Maki, 2002, 2004)
Mission/Purposes and Educational Objectives inform a continuous cycle: Identify Outcomes → Gather Evidence → Interpret Evidence → Implement Change → back to Identify Outcomes

21 The Assessment Flowchart (adapted from Darling, 2005)
Values → Vision → Mission → Goals → Programmatic Outcomes
Student Learning Outcomes (Cognitive, Psychomotor, Affective) and Process/Delivery Outcomes
Mapping the Experience: What experiences? When or by when?
Gathering Evidence: When gathered? Where gathered? How often gathered? From whom gathered? How gathered? Minimum performance criteria for success?
Sharing and Acting Upon the Results: Interpret how results inform practice; How and with whom to share interpretation; Follow up on implemented changes; Start the process all over again!

22 Mapping of Outcomes – The Assessment Matrix/Table
Columns: Institutional Mission Statement | Local Mission Statement | Specific Goal or Objective | Specific Process/Delivery Outcome or Student Learning Outcome | Where Outcome Occurs | When or By When Outcome Occurs | Outcome Measure | Minimum Performance Criteria for Success | Data Instruments | Action(s) Based on Outcome Data
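If a program keeps its matrix electronically, one way to hold a single row is sketched below. This is only an illustration, not a NACADA-prescribed format: the field names mirror the column headings above, and every value in the example row is hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentMatrixRow:
    """One row of the assessment matrix; fields follow the column headings above."""
    institutional_mission: str
    local_mission: str
    goal_or_objective: str
    outcome: str  # process/delivery outcome or student learning outcome
    where_outcome_occurs: str
    when_or_by_when: str
    outcome_measure: str
    minimum_criteria_for_success: str
    data_instruments: List[str] = field(default_factory=list)
    actions_based_on_outcome_data: str = ""

# Hypothetical example row (all values invented for illustration)
row = AssessmentMatrixRow(
    institutional_mission="Foster student learning and timely degree completion",
    local_mission="Advising helps students build coherent educational plans",
    goal_or_objective="Students understand General Education requirements",
    outcome="Student can list the components of the Gen Ed curriculum (cognitive SLO)",
    where_outcome_occurs="Orientation workshop; first-year advising sessions",
    when_or_by_when="By end of first year",
    outcome_measure="Gen Ed checklist completed during an advising session",
    minimum_criteria_for_success="80% of first-year advisees complete the checklist correctly",
    data_instruments=["Gen Ed checklist", "advising session notes"],
    actions_based_on_outcome_data="Revise orientation module if criterion is not met",
)
print(row.outcome, "->", row.minimum_criteria_for_success)
```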

23 Identifying Key Stakeholders: Who Should Be Involved?
Colleagues, faculty, administrators, institutional researchers, staff, students, and the institutional community
Decide how the assessment team will interact with, overlap with, and/or support other institutional efforts
Encourage stakeholders on and off campus
Continuous communication and feedback are a must!

24 Benefits of a Collective and Collaborative Process
Building shared trust
Building shared motivation
Building a shared language
Building institution-wide support for academic advising
The result is shared ownership of and belief in the process

25 Need to Involve Stakeholders at Each Step
Pre-assessment: establishment of vision, mission, goals, and outcomes
Planning for assessment: development of a shared definition and philosophy of academic advising and assessment; identification of assessment criteria and methodology
Implementation
Reporting of results
Facilitating change

26 Key Terms
Values – What is considered important in regard to academic advising on the campus
Vision – The aspirations of what academic advising can be on the campus
Mission – The statement which reflects the purpose of academic advising on the campus and serves as the institution's roadmap to reach its vision and affirm its values for academic advising

27 Developing a Mission Statement
What does your institution value, and what does it believe the purpose and goals of academic advising are across the institution?
AND
Is that clearly articulated/communicated to all constituents?

28 Key Terms
Goal Statements – how the mission will be achieved; how values, visions, and missions will be enacted
Programmatic Outcomes – what you expect to occur; what you expect students to know, do, and appreciate

29 Key Terms
Process/Delivery Outcomes – Articulate the expectations for how academic advising is delivered and what information should be delivered through the academic advising experience
Student Learning Outcomes (SLOs) – Articulate what students are expected to know (cognitive learning), do (behavioral learning), and appreciate (affective learning) as a result of involvement in the academic advising experience
Mapping – The process of determining when, where, and how the outcomes for advising will be accomplished over the students' academic careers

30 Cognitive SLOs
What do we want students to KNOW as a result of participating in academic advising?
Know general education requirements
Know about academic support services
Know how to use the student information system to register
Know how to use the catalog
Etc.

31 Behavioral/Psychomotor SLOs
What do we want students to DO as a result of participating in academic advising?
Generate their degree audit
Make advising appointments
Keep advising appointments
Ask for help
Access course descriptions and degree requirements using the online catalog
Etc.

32 Affective SLOs
What do we want students to VALUE or APPRECIATE as a result of participating in academic advising?
Value/Appreciate general education
Value/Appreciate the advising relationship
Value/Appreciate the process of learning
Etc.

33 Mapping the Learning Experience
What should be learned: e.g., student will know the components of the institution's General Education requirements
Where it should be learned: e.g., orientation workshops, advising sessions, personal reading of catalog or curriculum guide
When or By When it should be learned: e.g., prior to first year (orientation); by end of first year (via advising sessions); by end of first year (via personal reading)

34 Mapping of Outcomes – The Assessment Matrix/Table
Columns: Institutional Mission Statement | Local Mission Statement | Specific Goal or Objective | Specific Process/Delivery Outcome or Student Learning Outcome | Where Outcome Occurs | When or By When Outcome Occurs | Outcome Measure | Minimum Performance Criteria for Success | Data Instruments | Action(s) Based on Outcome Data

35 Once the desired process/delivery and student learning outcomes have been identified, as well as when and where they will occur, the next step is to determine who or what will be measured and how the data will be gathered… …using multiple measures of varying types

36 Types of Measurement and Data
Qualitative – open-ended survey or focus group questions; exploratory, emerging information from in-depth responses
Quantitative – descriptive, structured, numerical interpretation of data (statistical) from surveys, questionnaires, GPAs, retention rates
Direct – may be qualitative or quantitative; direct observations, counts, ratios, or other direct measure of student learning
Indirect – may be qualitative or quantitative; reports of past behavior or perceptions, such as interviews, surveys
Multiple measures!!!

37 Examples of Existing Tools
To be used as just one measure among multiple measures:
– ACT Survey of Academic Advising
– Noel-Levitz Student Satisfaction Inventory (SSI)
– Winston and Sandor's Academic Advising Inventory (AAI)
– CAS Standards for Advising: www.nacada.ksu.edu/Clearinghouse/Research_Related/CASStandardsForAdvising.pdf
– NACADA Assessment of Advising Commission: www.nacada.ksu.edu/Commissions/C32/index.htm

38 Measures can (and should) include existing institutional data
Information from Institutional Research, Admissions, the Registrar, etc. can provide tracking data, GPAs, retention rates, and other information you can utilize as assessment data – this can be a source of some of the multiple measures utilized (in addition to formal instruments, satisfaction inventories, and others)
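As a rough illustration of folding such institutional data into the evidence base, the sketch below assumes a hypothetical CSV export named records.csv with invented columns advisee_id, advising_contacts, and retained_next_term; real Registrar or Institutional Research extracts will look different, so treat this only as a starting point.

```python
import csv

# Hypothetical export from Institutional Research / the Registrar (invented columns):
# advisee_id, advising_contacts, retained_next_term (1 = re-enrolled, 0 = did not)
def retention_by_advising(path="records.csv", min_contacts=2):
    """Compare retention rates for students with and without regular advising contact."""
    advised, not_advised = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            group = advised if int(row["advising_contacts"]) >= min_contacts else not_advised
            group.append(int(row["retained_next_term"]))
    rate = lambda values: sum(values) / len(values) if values else float("nan")
    return rate(advised), rate(not_advised)

if __name__ == "__main__":
    advised_rate, other_rate = retention_by_advising()
    print(f"Retention with >=2 advising contacts: {advised_rate:.1%}; with fewer: {other_rate:.1%}")
```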

39 Dangers of Satisfaction Surveys
There is often a difference between an advisee receiving good, effective academic advising and being satisfied with the advising process:
– if any negative information is exchanged during the advising interaction, the student may respond negatively to the survey items even though the information provided was correct and the process of the interaction was appropriate
– the student will likely rate the advising provided based on the type of interaction desired (e.g., informational, relational)

40 For both process/delivery and student learning outcomes, you need to identify the minimum criteria for success of the outcome measure, e.g.:
number of students exhibiting a specific learning performance
percentage of students exhibiting a specific learning performance
advisor rating of student performance
student rating of a specific aspect of the advising process
advisor rating of a specific aspect of the advising process
etc.
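A minimal sketch of turning one such criterion into an explicit check, assuming the evidence has already been scored pass/fail per student; the scores and the 80% threshold below are hypothetical.

```python
def criterion_met(scores, minimum_pct=80.0):
    """Return the percentage of students exhibiting the performance and whether it meets the minimum criterion."""
    pct = 100.0 * sum(1 for passed in scores if passed) / len(scores)
    return pct, pct >= minimum_pct

# Hypothetical scored evidence: True = student exhibited the specific learning performance
scores = [True, True, False, True, True, True, False, True, True, True]
pct, met = criterion_met(scores, minimum_pct=80.0)
print(f"{pct:.0f}% of students exhibited the performance; minimum criterion met: {met}")
```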

41 Mapping of Outcomes – The Assessment Matrix/Table
Columns: Institutional Mission Statement | Local Mission Statement | Specific Goal or Objective | Specific Process/Delivery Outcome or Student Learning Outcome | Where Outcome Occurs | When or By When Outcome Occurs | Outcome Measure | Minimum Performance Criteria for Success | Data Instruments | Action(s) Based on Outcome Data

42 Advising Syllabus as a Tool for Communicating Key Elements
If academic advising is teaching…
Advisors are teachers
Teachers have a discipline
The advisor's discipline is academic advising
Academic advising is a discipline
Individual academic advisors offer the "course" academic advising
Individuals in disciplines author unique courses
Each course has a syllabus

43 What Does an Advising Syllabus Include?
Purpose of academic advising
Scope of academic advising
Tools, texts, resources
SLOs for academic advising
Advisor responsibilities
Advisee responsibilities
Criteria for successful academic advising experiences (outcome measurements)
Other elements as individually appropriate

44 What do we want to know or demonstrate as a result of academic advising?
Focus on student learning
Connect learning to the mission, vision, values, and goals in your advising program
– How will your program contribute to student learning?
– Who, what, where, when, and how will learning take place?
Define measures of student learning
– Gather evidence, set levels of expected performance

45 So I Have The Data – Now What? Sharing and Acting Upon Results
– Interpret results regarding how they inform the advising process/delivery, student learning, and decision-making
– Determine with whom and how the results are reported
– Decide how you will implement changes based on the results
– Start the assessment cycle again…

46 Determine How and With Whom Results Are Shared
Administration: President, Provost, various committees – via annual report, strategic plan, white paper, Web sites, etc.
Faculty: all faculty, curricular committees, faculty advisors – via performance reviews, annual reports, strategic plans, Web sites, etc.
Students: all students, student advisees, student senate, student groups – via newsletters, annual reports, Web sites, etc.
Budgeting entities – via annual reports, budget requests, Web sites, etc.
Accreditors – via self-studies, accreditation reports, Web sites, etc.

47 Interpret How Results Will Inform Decision Making
Revise pedagogy or curriculum
Develop/revise advisor training programs
Design more effective programming – advising, orientation, mentoring, etc.
Increase out-of-class learning opportunities
Shape institutional decision making – planning, resource allocation
Other…

48 Decide How You Will Follow Up on Implemented Changes
Timetable to implement changes – implement all or specific components on a schedule
Assessment of implemented changes – repeat the assessment cycle
Continuous assessment – assessment is on-going

49 Professional Development
Using assessment to inform and support professional development
Revise advisor training and development programming accordingly
Demonstrate the need for additional training and development
Demonstrate the need for additional resources to meet goals

50 Building a Culture and Capacity for Assessment
The Culture: Commitment, Communication, Collaboration
The Capacity: Support

51 At the end of the day, assessment of academic advising is all about…
developing consensus around collective expectations about student learning
gathering evidence in order to understand student learning
using this evidence to support improvements in advising that will contribute to improvements in student learning

52 Remember:
Topical Session on Assessment
Topical Session on Advisor Evaluation

53 Thank You!

