
1 Doing Assessment, Not Just Talking About It ISETL Conference, October 2009, Philadelphia, PA. Steve Culver, Office of Academic Assessment; Penny Burge, Educational Research & Evaluation, Virginia Tech

2 What is Assessment of Student Learning Outcomes? "... the systematic gathering of information about student learning, using the time, resources, and expertise available, in order to improve the learning." -- Walvoord. In short: what we want students to know and be able to do when they leave our programs.

3 How program outcomes fit in the academic schema
- Institutional outcomes
- College outcomes
- Program outcomes
- Course outcomes
- Course session outcomes

4 Setting the Context
Federal Government -- Spellings Commission Report: "Whether students are learning is not a yes-or-no question. It's how? How much? And to what effect?" (Chronicle, 11-4-06)
SACS -- "The institution identifies expected outcomes (including student learning outcomes) in its educational programs... and provides evidence of improvement based on analysis of the results." (SACS Standard 3.3.1.1)
Professional Accrediting Agencies -- Example: ABET
- "...describe what students are expected to know and be able to do by the time of graduation"
- "...identify, collect, and prepare data to evaluate the achievement of program outcomes"
- "...results in decisions and actions to improve the program"

5 What is the Process for Assessing Student Learning Outcomes?
1. Identify and articulate student learning outcomes.
2. Gather and analyze information about student achievement of outcomes.
3. Use the information gathered to improve student learning.

6 Identifying & Articulating Student Learning Outcomes
Where to begin?
- Conduct a faculty meeting to brainstorm: what would an ideal graduate know, understand, and be able to do? Think of Bloom's Taxonomy.
- Consult websites of professional/disciplinary organizations and other institutions.
- Start with courses.
- Identify a list of possible outcomes, knowing the list will change.
- Be careful with the "understand" outcome; it is difficult to observe and measure directly.
- Narrow the list to 3-5 key outcomes to consider first.

7 Student Learning Outcomes Examples
- Students will be able to demonstrate the approach, logic, and application of the scientific method and be able to apply these principles to real-life problems.
- Students will be able to identify, and describe to a lay person, the important institutions and determinants of economic activity at the local, regional, national, and international levels, including the basics of fiscal and monetary policy and how each affects the economy.
- Students will demonstrate mastery of note-taking techniques by correctly using at least 3 different note-taking methods for classroom lectures.

8 An important caveat Whatever is measured becomes important.

9 Measurement lingo
- Qualitative assessments = more in-depth, contextual information, typically with smaller samples.
- Quantitative assessments = a broad overview, typically using larger samples.
- Reliability & validity; trustworthiness

10 Practical tips
- Don't have one person do all the work.
- Cooperate with other departments, even at different schools.
- Borrow methods and instruments (if they fit), and use existing data where possible (IR, Registrar's Office, etc.).
- Do as much as possible in the context of what you are already doing.
- Make your instrument, assignment, or observation as short as it can be while still providing the information you want.
- Have a data collection/reporting plan.

11 Indirect & Direct Measures
Indirect measures -- surveys, student interviews, focus groups, enrollment trends, job placement numbers, percentage who go on to graduate school, grades
Direct measures -- student work samples from course work, portfolios, e-portfolios, observation of student behavior, standardized tests, externally reviewed internships, performance on national licensure exams

12 The place for grades
Course grades provide one source of information about student achievement, but they are typically not direct measures of what students can do in terms of a program outcome. A grade measures overall proficiency in a class: it offers some information about strengths and weaknesses, but it is an imperfect measure of any particular program outcome.

13 Two ways to approach the development of measures
#1 -- Take an inventory of measures already used in your program (in courses, at entry, at exit, etc.).
#2 -- Using one of your outcomes, brainstorm at least three ways (including at least one direct measure) to measure this outcome.

14 Lots of data -- now what?
Turn data into information.
"We are being pummeled by a deluge of data and unless we create time and spaces in which to reflect, we will be left with only our reactions." -- Rebecca Blood

15 Transparency
Assessment must be a transparent process. Changes made as a result of the assessment process need to be made public, and all constituencies (faculty, students, others) should be involved. There is nothing worse than having to fill out surveys and never seeing anything change as a result.

16 Procedural notes
- Not all students' work needs to be scored; a sample will likely suffice.
- A group of faculty should be involved in scoring the products.
- The same product is used twice -- once for the individual student's class grade, once for program assessment. Course-embedded assessment overcomes motivation issues.
- The student product must measure, in some way, at least one program outcome.

17 Confidentiality & the Use of Student Work
- Institutional Review Board (IRB)
- Family Educational Rights & Privacy Act of 1974 (FERPA)
- When to use names/student IDs
- Involvement of and feedback from students in your process

18 Student privacy & security
- Restrict access to student work to departmental faculty and campus administrators involved in assessment.
- If student work is stored online, ensure security (e.g., use password protection) to limit access to departmental faculty and campus administrators.
- Whenever possible, do not keep sensitive (e.g., self-stigmatizing) identifiable student work online.
- Instead of collecting work products from all students, collect a sample.
  - If a student work product contains sensitive information, replace it with another student work product.
  - If a student work product gives information about others that violates the rules of consent, replace it with another.
  - If a student requests that their work not be part of the assessment process, comply.

19 Anonymity & Confidentiality
- Anonymize student work: remove any information that identifies individual students (e.g., name and/or ID number) from any student work collected or used for assessment.
- Consider distributing a consent form.

20 Reporting & communication
- Report in the aggregate to avoid identifying individual students.
- Share your process, (aggregated) findings, and plans for program improvement with program faculty, your students, and campus administrators.
- Appoint a single faculty member knowledgeable about the program's assessment process to respond to student concerns, complaints, and/or grievances. Include this person's contact information in the report.

21 Copyright
Students own the copyright to their own works, which gives them the exclusive right to reproduce or distribute them. While it is appropriate for faculty to review student work to assess the achievement of educational criteria/expectations, if the evaluation involves copying any or all portions of the work, students should be notified of this before submitting their work.

22 In summary... The assessment process is only worth doing if we focus on improvement rather than accountability. Documenting what we do – the conversations, the curricular changes, our thoughtful reflections on teaching & learning – is the accountability piece.

