
Assessment Strategies, Assessment networks Session II Preliminary Findings from Virginia Tech Using Assessment Data to Improve Teaching & Learning.

Presentation transcript:

1 Assessment Strategies, Assessment networks Session II Preliminary Findings from Virginia Tech Using Assessment Data to Improve Teaching & Learning

2 Introductions Office of Academic Assessment 101 Hillcrest Hall (0157) – Ray Van Dyke, 231-6003, rvandyke@vt.edu – Steve Culver, 231-4581, sculver@vt.edu – Kate Drezek, 231-7534, kmdrezek@vt.edu – Yolanda Avent, yavent@vt.edu Others here today

3 Today’s agenda Review of Results from Office of Academic Assessment (OAA) SACS 3.3.1.1 Departmental Assessment Report Suggestions for using identified tools/strategies for assessment to more explicitly incorporate direct assessment of student learning outcomes into program-level changes/improvements Open discussion

4 Overview: What is Assessment of Learning Outcomes? “Assessment of student learning is the systematic gathering of information about student learning, using the time, resources, and expertise available, in order to improve the learning.” – Walvoord A student learning outcome states a specific skill/ability, knowledge, or belief/attitude students are expected to achieve through a course, program, or college experience. Example: Upon completion of a B.A. degree in English, a student will be able to read critically and compose an effective analysis of a literary text.

5 What is The Process for Assessing Student Learning Outcomes? 1. Identify And Articulate Student Learning Outcomes 2. Gather and Analyze Information About Student Achievement Of Outcomes 3. Use Information Gathered To Improve Student Learning

6 Big Question: How do we turn this… 1. Identify And Articulate Student Learning Outcomes 2. Gather and Analyze Information About Student Achievement Of Outcomes 3. Use Information Gathered To Improve Student Learning …Into a concrete plan?

7 Departmental Assessment Report Part of SACS reaccreditation process: – Standard 3.3.1. The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)

8 Departmental Assessment Report Part of SACS reaccreditation process: – 3.3.1.1 educational programs, to include student learning outcomes – 3.3.1.2 administrative support services – 3.3.1.3 educational support services – 3.3.1.4 research within its educational mission, if appropriate – 3.3.1.5 community/public service within its educational mission, if appropriate

9 Departmental Assessment Report Process: – Interview all department heads late fall/early spring 2008-2009 academic year – Approximately 1 hour in length – Conducted by OAA graduate research assistant Yolanda Avent (Educational Psychology)

10 Departmental Assessment Report Process: – Focus on concrete changes implemented by departments related to overall programmatic improvements, improvements in advising, & improvements in specific courses – Participants asked to identify the specific assessment tools/strategies/approaches they used that provided the justification for implementing the identified changes

11 Departmental Assessment Report Process: – Yolanda Avent used detailed notes from interviews to synthesize information provided by participating departments – Preliminary results reported here today based on interviews with 50+ departments representing all colleges at the University – Common trends/themes identified by Ms. Avent/Kate Drezek for purposes of preliminary report

12 Departmental Assessment Report Caveat: Current Context – Assessment is not done in a vacuum – Other pressures facing departments also contributed to decisions to make certain program, advising, and course changes – OAA Argument – Systematic assessment is equally if not more key to departments’ ability to innovate/improve in tough times, as it can highlight strategic areas and help prioritize efforts

13 Departmental Assessment Report Preliminary Results: Program Changes

14 Changes: Programs Curricular Mapping – Reconfiguration of majors – Elimination of duplication – Re-sequencing of courses

15 Changes: Programs Programmatic Learning Outcome Identification – Explicitly embedding essential learning outcomes/core competencies (e.g., critical thinking, information literacy) in multiple classes – Incorporation of VIEWS requirements in program – Creative incorporation of “non-traditional” learning outcomes within curriculum (e.g., global awareness)

16 Changes: Programs Development of Standardized Measures of Student Performance Across Program – Common Rubrics for Project Evaluation (Undergraduate & Graduate) – Common Measurable Outcomes for Thesis/Senior Capstone Students

17 Changes: Programs Program Innovations/Incorporation of Current Pedagogical “Best Practices” – Undergraduate: – Undergraduate Research Opportunities, including field experiences – Service Learning Opportunities

18 Changes: Programs Program Innovations/Incorporation of Current Pedagogical “Best Practices” – Graduate: – Teaching Mentoring Programs for Grad Students – Incorporation of “high demand” skills – presentation skills, peer review writing process, ethics, grant-writing – into existing seminars – Creation of new courses and programs around similar topics

19 Departmental Assessment Report Preliminary Results: Advising Changes

20 Changes: Advising Change in Advising Structure – From advising professional to distribution among faculty – From distribution among faculty to advising professional – Single faculty model

21 Changes: Advising Change in Advising Structure – Hybrid – Use of introductory courses as opportunities to advise

22 Changes: Advising Change in Advising Philosophy/Culture – Informal advising opportunities, e.g., Brown Bag Lunches – Creation of advising centers to make advising more visible, holistic, student-friendly – Plans of Study submitted to Advisor and Chair of Department

23 Changes: Advising Leveraging of Technology to Enhance Advising – On-line “self-help” – On-line “tracking” of students for “force-adding” into courses – Carrot/Stick approach – blocking course registration until students see their advisor

24 Departmental Assessment Report Preliminary Results: Course Changes

25 Changes: Courses Revision/Reinvention of Instructional Design in Specific Courses – “Special Topics” courses – Move to online instruction – Use of best available technology as PEDAGOGICAL tool (e.g., Tablets)

26 Changes: Courses Revision of Course Objectives to Ensure Alignment with Larger Learning Goals

27 Departmental Assessment Report Preliminary Results: Assessment Tools/Strategies that Provided Justification for Change

28 Question: Tools/Strategies for Justification Tools/Strategies Explicitly Mentioned by Participants: PROGRAM-LEVEL DATA – Enrollment numbers – Retention rates – Course-taking patterns

29 Question: Tools/Strategies for Justification Tools/Strategies Explicitly Mentioned by Participants: STUDENTS – Informal feedback – Course evaluations – Focus groups

30 Question: Tools/Strategies for Justification Tools/Strategies Explicitly Mentioned by Participants: STUDENTS – Senior Survey – In-class surveys – Exit surveys (students leaving major as well as students graduating)

31 Question: Tools/Strategies for Justification Tools/Strategies Explicitly Mentioned by Participants: FACULTY – Informal Feedback – Observation, Reflection – Faculty study group feedback – Feedback via Assessment Committee, Curriculum Committee members – Guided Faculty Reflection Pieces

32 Question: Tools/Strategies for Justification Tools/Strategies Explicitly Mentioned by Participants: EXTERNAL CONSTITUENCIES – Alumni: surveys, Alumni Advisory boards – Professionals in Industry: informal feedback from employers, graduate schools; Advisory Boards

33 Question: Tools/Strategies for Justification Tools/Strategies Explicitly Mentioned by Participants: DIRECT, SYSTEMATIC ASSESSMENT OF STUDENT LEARNING – Infrequently cited as a tool/strategy justifying programmatic changes – Significance? Mentioned more often as part of changes to courses based on assessment; not explicitly acknowledged or utilized to its fullest potential for program review

34 OAA Preliminary Conclusion: Draw better connections between existing practices and tools

35 Connecting the dots Student Learning Assessment for Programmatic Improvement – “Double Dip” with course data – Explicit faculty reflection opportunities – Common measures across courses – Tap into advisory boards as external reviewers – Office of Academic Assessment tools, e.g., national survey data, VALUE metarubrics – Importance of networking across departments, colleges – proven best practices How can we best facilitate this sharing of workable strategies?

36 Discussion

37 Final Thought: “We are being pummeled by a deluge of data and unless we create time and spaces in which to reflect, we will be left with only our reactions.” – Rebecca Blood

