1
WASC Assessment Leadership Academy Oakland, California August 1, 2011 Presentation by Trudy W. Banta Professor of Higher Education and Senior Advisor to the Chancellor for Academic Planning and Evaluation Indiana University-Purdue University Indianapolis 355 N. Lansing St., AO 140 Indianapolis, Indiana 46202-2896 tbanta@iupui.edu http://www.planning.iupui.edu
2
Outline The Great Testing Debate Alternatives to standardized tests of generic skills © TWBANTA-IUPUI
3
Group Assessment Has Failed to Demonstrate Institutional Accountability
Focus on improvement at unit level
Rare aggregation of data centrally
Too few faculty involved
HE scholars focused on K-12 assessment
4
© TWBANTA-IUPUI 2006 Commission on the Future of Higher Education
We need a simple way to compare institutions
The results of student learning assessment, including value added measurements (showing skill improvement over time) should be... reported in the aggregate publicly.
5
© TWBANTA-IUPUI Now We Have the Press to Assess with a Test
6
My History
Educational psychology
Program evaluation & measurement
Performance funding in Tennessee
1990: USDOE effort to build a national test
1992: Initiated evidence-based culture at IUPUI
© TWBANTA-IUPUI
7
2007 Voluntary System of Accountability ~ Assessment of Learning ~ defined as critical thinking, written communication, analytic reasoning © TWBANTA-IUPUI
8
VSA Recommendations (over my objections) Collegiate Assessment of Academic Proficiency (CAAP) Measure of Academic Proficiency & Progress (MAPP) Collegiate Learning Assessment (CLA) (College BASE) © TWBANTA-IUPUI
9
TN = Most Prescriptive (5.45% of Budget for Instruction)
1. Accredit all accreditable programs (25 points)
2. Test all seniors in general education (25 points)
3. Test seniors in 20% of majors (20 points)
4. Give an alumni survey (15 points)
5. Demonstrate use of data to improve (15 points)
Total: 100 points
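To make the weighting arithmetic concrete, here is a minimal Python sketch. The five criteria, their point values, and the 5.45% figure come from the slide; the assumption that points prorate the supplement linearly, along with the campus scores, budget figure, and function name, are invented for illustration and do not reproduce Tennessee's actual formula.

```python
# Hypothetical illustration of Tennessee's performance-funding arithmetic.
# Criteria, weights, and the 5.45% cap are from the slide; the linear
# proration, scores, and budget below are assumptions for this sketch.

WEIGHTS = {
    "accredit all accreditable programs": 25,
    "test all seniors in general education": 25,
    "test seniors in 20% of majors": 20,
    "give an alumni survey": 15,
    "demonstrate use of data to improve": 15,
}
assert sum(WEIGHTS.values()) == 100

def performance_award(scores: dict[str, float], instruction_budget: float) -> float:
    """Prorate the 5.45% supplement by points earned out of 100.

    `scores` maps each criterion to the fraction of its points earned (0-1).
    """
    points = sum(weight * scores[criterion] for criterion, weight in WEIGHTS.items())
    return (points / 100) * 0.0545 * instruction_budget

# Hypothetical campus performance:
scores = {
    "accredit all accreditable programs": 1.0,
    "test all seniors in general education": 0.8,
    "test seniors in 20% of majors": 0.7,
    "give an alumni survey": 1.0,
    "demonstrate use of data to improve": 0.9,
}
print(f"${performance_award(scores, 100_000_000):,.0f}")  # -> $4,768,750
```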
10
© TWBANTA-IUPUI At the University of Tennessee CAAP Academic Profile (now MAPP) COMP (like CLA and withdrawn by 1990) College BASE
11
© TWBANTA-IUPUI In TN We Learned
1) No test measured more than about 30% of our gen ed skills
2) Tests of generic skills measure primarily prior learning
3) Reliability of value added = .1
4) Test scores give few clues to guide improvement actions
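Why value-added figures are so unreliable even when the underlying tests are not: the reliability of a difference (gain) score shrinks as the pre-post correlation approaches the tests' own reliabilities. The formula below is a standard psychometric result rather than anything from the slides, and the .85 reliabilities and .82 pre-post correlation are hypothetical values chosen only to illustrate the effect.

```latex
% Reliability of a gain (value-added) score D = Y - X,
% assuming equal pretest and posttest variances:
\[
\rho_{DD'} \;=\; \frac{\rho_{XX'} + \rho_{YY'} - 2\,\rho_{XY}}{2\,(1 - \rho_{XY})}
\]
% With hypothetical but plausible values (pre/post reliabilities of .85,
% pre-post correlation of .82):
\[
\rho_{DD'} \;=\; \frac{.85 + .85 - 2(.82)}{2\,(1 - .82)} \;=\; \frac{.06}{.36} \;\approx\; .17
\]
```

The closer the pre-post correlation climbs toward the two reliabilities, the closer gain-score reliability falls toward the roughly .1 observed in Tennessee.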
12
An Inconvenient Truth: .88 = correlation of VSA-recommended test scores with SAT/ACT scores; thus 77% of the variance in institutions’ scores (.88² ≈ .77) is due to students’ prior learning © TWBANTA-IUPUI
13
How Much of the Variance in Senior Scores is Due to College Effects?
Student motivation to attend that institution (mission differences)
Student mix based on age, gender, socioeconomic status, race/ethnicity, transfer status, college major
14
© TWBANTA-IUPUI How Much of the Variance in Senior Scores is Due to College Effects? (continued)
Student motivation to do well
Sampling error
Measurement error
Test anxiety
College effects
Together, these account for the remaining 23%.
15
Threats to Conclusions Based on Test Scores 1. Measurement error 2. Sampling error 3. Different tests yield different results 4. Different ways of presenting results 5. Test bias 6. Pressure to raise scores - Daniel Koretz “Measuring Up” Harvard U. Press - 2008 © TWBANTA-IUPUI
16
Using NSSE: 95% of the variance in responses is WITHIN institutions - Pike, Kuh, McCormick, Ethington & Smart (2011) © TWBANTA-IUPUI
17
Student Motivation Samples of students are being tested Extrinsic motivators (cash, prizes) are used We have learned: Only a requirement and intrinsic motivation will bring seniors in to do their best
18
Recent University of Texas Experience 30–40% of seniors at flagships earn the highest CLA score (ceiling effect); flagship campuses have the lowest value-added scores © TWBANTA-IUPUI
19
Concerns About Value Added Student attrition Proportion of transfer students Different methods of calculating Unreliability Confounding effects of maturation © TWBANTA-IUPUI
20
Word from Measurement Experts Given the complexity of educational settings, we may never be satisfied that value added models can be used to appropriately partition the causal effects of teacher, school, and student on measured changes in standardized test scores. - Henry Braun & Howard Wainer Handbook of Statistics, Vol. 26: Psychometrics Elsevier 2007 © TWBANTA-IUPUI
21
Employing currently available standardized tests of generic skills to compare the quality of institutions is not a valid use of those tests. © TWBANTA-IUPUI
22
Consequences of Scores Below Expectations? Public criticism Loss of students Reduced funding Corrective actions imposed Prep courses for students Profits for coaching consultants
23
2009 NILOA Survey Campus-wide Approaches to Assessment 1. A national survey (76%) 2. Standardized test of general skills (39%) 3. Portfolios, specialized tests, external judges © TWBANTA-IUPUI
24
OECD’s AHELO for HEIs from 15 countries 1. Generic skills (CLA) 2. Disciplines (Engineering and Economics) 3. Value added 4. Contextual information indicators © TWBANTA-IUPUI
25
Designing Effective Assessment: Principles & Profiles of Good Practice Trudy W. Banta Elizabeth A. Jones Karen E. Black Jossey-Bass (Wiley) 2009 © TWBANTA-IUPUI
26
Profiles Invited over 1000 Received 146 Selected 49 for use in full Categorized all 146 and published Web sites © TWBANTA-IUPUI
27
Outline for Profiles Background and Purpose Methods over ? Years Resources Required Findings Use of Findings Impact of Using Findings Success Factors Web sites © TWBANTA-IUPUI
28
Standardized tests of generic skills (e.g., writing, critical thinking): used by just 8%, always supplemented © TWBANTA-IUPUI
29
© TWBANTA-IUPUI Standardized tests CAN initiate conversation
30
Advantages of standardized tests of generic skills: promise of increased reliability & validity; norms for comparison © TWBANTA-IUPUI
31
Limitations of standardized tests of generic skills: cannot cover all a student knows; narrow coverage, need to supplement; difficult to motivate students to take them! © TWBANTA-IUPUI
32
Better Ways to Demonstrate Accountability Performance Indicators 1. Access (to promote social mobility) 2. Engaging student experience 3. Workforce development 4. Economic development 5. Civic contribution of students, faculty, staff, graduates © TWBANTA-IUPUI
33
Effects of Education on Social Mobility Relate data on high school courses and completion, college courses and completion to:
Career placements & earnings
Private and public employment
Military enlistments
Incarcerations
- Florida DOE © TWBANTA-IUPUI
34
Australian DOE 1. Increase participation of individuals from low SES backgrounds (as undergraduates, as graduate students) 2. Improve engagement & satisfaction (survey of student engagement, first-year retention, satisfaction of completers) © TWBANTA-IUPUI
35
Workforce Development % graduates placed in field % graduates placed locally Degree, certificate, CE programs aligned with regional priorities Internships/coop programs Increased earnings of graduates - APLU © TWBANTA-IUPUI
36
Economic Development Creation of intellectual property Patent and license awards # start-up companies Sponsored research $ % NSF, NIH Use of academic facilities by industry Alignment of assets to support regional economic clusters - APLU © TWBANTA-IUPUI
37
Civic Contribution Internships supported by institution Volunteer hours for students, faculty, staff Service learning placements $ contributed to United Way et al. Pro bono legal & health services No-cost presentations to community groups © TWBANTA-IUPUI
38
Land-Grant Extension 1. Useful information for agricultural producers, small business owners, rural families 2. After-school STEM programs 3. Driver-ed for teen traffic offenders 4. Green jobs in energy fields 5. Sustainable agriculture practices © TWBANTA-IUPUI
39
National Governors Association Center for Best Practices A simple graduation rate: penalizes colleges serving low SES students; may discourage open enrollment; may lead to lower graduation standards © TWBANTA-IUPUI
40
National Governors Association Center for Best Practices Track intermediate student milestones Successful completion of remedial and core courses Advancement from remedial to credit courses Transfer from 2- to 4-year college Credential attainment © TWBANTA-IUPUI
41
Oregon Community Colleges track: % needing remedial courses % completing remedial or ESL courses # credits earned each year toward degree or certificate Semester-to-semester and fall-to-fall persistence - Inside HE 10/12/09 © TWBANTA-IUPUI
42
Peter Ewell: “... shift state funding formulas so that colleges receive money based on how many students are still enrolled by the end of the academic term rather than at the beginning.” - Inside HE 11/18/09 © TWBANTA-IUPUI
43
If We Must Measure Learning Let’s Use: 1. Standardized tests in major fields licensure and certification tests ETS Major Field Tests 2. Internship performance 3. Senior projects 4. Study abroad performance 5. Electronic portfolios 6. External examiners © TWBANTA-IUPUI
44
Western Governors University offering Competence-based on-line degrees 1) Define the domain of knowledge/skill 2) Develop objective test items 3) Design performance tasks 4) Evaluate performance 5) Use findings to improve curriculum, instruction, student services 6) Continuously improve assessment quality and validity
45
2009 NILOA Survey Program Level Approaches 1. Portfolios (80% in at least 1 area) 2. Performance assessments 3. Rubrics 4. External judges 5. Student interviews 6. Employer surveys © TWBANTA-IUPUI
46
© TWBANTA-IUPUI Student Electronic Portfolio Students take responsibility for demonstrating core skills Unique individual skills and achievements can be emphasized Multi-media opportunities extend possibilities Metacognitive thinking is enhanced through reflection on contents - Sharon J. Hamilton IUPUI
47
More use of RUBRICS: locally developed; VALUE from AAC&U © TWBANTA-IUPUI
48
VALUE Rubrics Critical thinking Written communication Oral communication Information literacy Teamwork Intercultural knowledge Ethical reasoning © TWBANTA-IUPUI
49
Accountability Report 85% achieve Outstanding ratings in writing as defined... 78% are Outstanding in applying knowledge and skills in internships 75% are Outstanding in delivering an oral presentation © TWBANTA-IUPUI
50
For External Credibility Collaborate on rubrics Use employers as examiners Conduct process audits © TWBANTA-IUPUI
51
E-Port Challenges Reliability of rubrics Student motivation if used for assessment (Barrett, 2009) Differences in topics for products to be evaluated (Secolsky & Wentland, 2010) © TWBANTA-IUPUI
52
Obstacles to Using Performance-Based Measures Defining domains and constructs Obtaining agreement on what to measure and definitions Defining reliability and validity Creating good measures - Tom Zane WGU © TWBANTA-IUPUI
53
Will it take 80 years... ? 3 Promising Alternatives E portfolios Rubrics Assessment communities - Banta, Griffin, Flateby, Kahn NILOA Paper #2 (2009) © TWBANTA-IUPUI
54
PART 2 Building An Evidence-Based Culture Evidence at several levels An institutional example © TWBANTA-IUPUI
55
Organizational Levels for Assessment National Regional State Campus College Discipline Classroom Student
56
Evidence at the Classroom Level Background Knowledge Probe Minute paper in class Just-in-time teaching online © TWBANTA-IUPUI
57
Background Knowledge Probe (Pre-Test – Indirect Measure) 1. ARCHAEOLOGY A. Have never heard of this B. Have heard of it, but don’t really know what it means C. Have some idea what it means, but not too clear D. Have a clear idea what this means and can explain it - Classroom Assessment Angelo and Cross
58
Primary Trait Scoring Assigns scores to attributes (traits) of a task
STEPS
1. Identify traits necessary for success in the assignment
2. Compose a scale or rubric giving clear definition to each point
3. Grade using the rubric (a scoring sketch follows the next slide's rubric)
59
Assessment of Group Interaction
The Student Participant:
Listened to others
Actively contributed to discussion
Challenged others effectively
Was willing to alter own opinion
Effectively explained concepts/insights
Summarized/proposed solutions
5 = Consistently excellent  3 = Generally satisfactory  1 = Inconsistent and/or inappropriate
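As a concrete sketch of primary trait scoring (slide 58) applied to the group-interaction rubric above: a minimal Python illustration, assuming a simple unweighted average of trait ratings; the function name, the validation logic, and the sample ratings are hypothetical, not from the presentation.

```python
# Primary-trait-scoring sketch using the group-interaction rubric above.
# The trait list and the 1/3/5 anchors are from the slide; the unweighted
# average, function name, and sample ratings are illustrative assumptions.

TRAITS = [
    "listened to others",
    "actively contributed to discussion",
    "challenged others effectively",
    "was willing to alter own opinion",
    "effectively explained concepts/insights",
    "summarized/proposed solutions",
]

def primary_trait_score(ratings: dict[str, int]) -> float:
    """Check that every trait is rated 1-5, then average the ratings."""
    missing = [trait for trait in TRAITS if trait not in ratings]
    if missing:
        raise ValueError(f"unrated traits: {missing}")
    if any(not 1 <= rating <= 5 for rating in ratings.values()):
        raise ValueError("ratings must fall between 1 and 5")
    return sum(ratings[trait] for trait in TRAITS) / len(TRAITS)

# Hypothetical participant: strong overall, weaker at challenging others.
ratings = {trait: 4 for trait in TRAITS}
ratings["challenged others effectively"] = 2
print(round(primary_trait_score(ratings), 2))  # -> 3.67 (generally satisfactory)
```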
60
Evidence of Generic Skills ePort with rubrics Standardized tests © TWBANTA-IUPUI
61
Evidence at the Program Level Individual and team projects Research papers Internships Electronic portfolios Peer review © TWBANTA-IUPUI
62
Assessment in Sociology and Anthropology Focus groups of graduating students
Given a scenario appropriate to the discipline, a faculty facilitator asks questions related to outcomes faculty have identified in 3 areas: concepts, theory, methods.
2 faculty observers use a 0-3 scale to rate each student on each question
GROUP scores are discussed by all faculty
- Murphy & Goreham, North Dakota State University
63
© TWBANTA-IUPUI Internships Evaluated against specific criteria by Students Faculty Field-based supervisors
64
Elements of Program Review Self Study Review by Respected Peers Recommendations Follow-up © TWBANTA-IUPUI
65
Evidence at the Institutional Level Learning outcomes Questionnaires, interviews, focus groups Productivity measures Cost analyses Management ratios Program evaluation Peer review Accreditation © TWBANTA-IUPUI
66
Outcomes Assessment Requires Collaboration In setting expected program outcomes In developing sequence of learning experiences (curriculum) In choosing measures In interpreting assessment findings In making responsive improvements
67
Faculty and Staff Development Focus faculty and student affairs professionals on improving learning in and outside class Attend conferences together Study literature on student learning Provide workshops on teaching and learning Provide resources (e.g., grants, summer salary, release time) © TWBANTA-IUPUI
68
Some Evaluative Questions If we undertake a new approach: Is instruction more effective? Are students learning more? Are students more satisfied? Are faculty more satisfied? Do outcomes justify costs? © TWBANTA-IUPUI
69
Campus Interest in Assessment WHAT WORKS in…. increasing student retention? general education? use of technology in instruction? curriculum in the major? © TWBANTA-IUPUI
70
Good assessment is good research... An important question An approach to answer the question Data collection Analysis Report -Gary R. Pike (2000) © TWBANTA-IUPUI
71
Involve Students 1. Set learning expectations in recruiting 2. Communicate learning outcomes in orientation 3. Involve student leaders in promoting learning 4. Involve students in evaluating courses/curricula 5. Let students know their recommendations are used.
72
Student Advisory Council at Montevallo A way to provide continuous student assessment Student Recommendations
1. Develop a statement of expected ethical behaviors for students
2. Add a second research course with lab
3. Increase comparative psychology
4. Add terminals for statistics lab
5. Increase opportunities for research, writing, and speaking
73
Engage Graduates Alverno Penn State © TWBANTA-IUPUI
74
Involve Employers Developing curriculum Assessing student learning © TWBANTA-IUPUI
75
Involving Employers Combination of survey and focus groups for employers of business graduates Identified skills, knowledge, personality attributes sought by employers Encouraged faculty to make curriculum changes Motivated students to develop needed skills Strengthened ties among faculty, students, employers - Kretovics & McCambridge Colorado State University
76
Colorado State University College of Business Curriculum changes based on employer suggestions: 1 credit added to Business Communications for team training and more presentations Ethics & social responsibility now discussed in intro courses New Intro to Business course emphasizing career decision-making More teamwork, oral & written communication, problem-solving in Management survey courses - Kretovics & McCambridge
77
Plan → Implement → Evaluate → Improve: a Culture of Evidence © TWBANTA-IUPUI
78
PLANNING 1. Campus mission, goals 2. Unit goals aligned 3. Programs based on assessable goals with PIs 4. Annual reports on the Web © TWBANTA-IUPUI
79
Outline for Annual Reports IUPUI Theme Unit Goal Objective Actions Taken Actions Planned Evidence of Progress © TWBANTA-IUPUI
80
Evaluation Services 1. Assessment of learning 2. Surveys 3. Program reviews 4. Performance indicators 5. Program cost analysis 6. Web-based evaluation tools 7. Program evaluation/action research 8. Accreditation © TWBANTA-IUPUI
81
Surveys 1. Enrolled students (our own surveys; NSSE) 2. Graduates 3. Employers 4. Stop outs 5. Faculty 6. Staff © TWBANTA-IUPUI
82
Information Gateway http://reports.iupui.edu/gateway/ Information about Students Faculty Staff Alumni Finances © TWBANTA-IUPUI
83
Since 1993 Campus-wide surveys have stimulated changes in Curricula Advising Increased writing practice Increased attention to first-year experiences Placement of graduates © TWBANTA-IUPUI
84
Goal and Objectives for Student Learning Enhance undergraduate student learning and success 1. Strengthen generic skills 2. Provide honors programming 3. Offer learning communities 4. Strengthen advising 5. Provide tutoring and mentoring © TWBANTA-IUPUI
85
Employ Multiple Methods 1) Direct: projects, papers, tests, observations 2) Indirect: questionnaires, interviews, focus groups 3) Unobtrusive measures: syllabi, transcripts © TWBANTA-IUPUI
86
Student Learning Oriented Course Evaluation 1. Learners held high expectations for one another 2. Learners interacted frequently with others 3. Learners participated in learning teams 4. Learners respected diverse talents and ways of learning -Cournoyer Advances in Social Work – Fall 2001
87
Since 1994 Assessment of Learning has stimulated changes in Student support programs Curriculum Methods of instruction Internships Methods of assessment © TWBANTA-IUPUI
88
What is ABC? ABC is a costing methodology based upon the fact that different activities and products consume different proportions of resources (flow: Resources → Activities → Products A, B, C) © TWBANTA-IUPUI
89
Some tasks within instruction: curriculum planning, course design, class preparation, class instruction, assessment, course evaluation © TWBANTA-IUPUI
90
What Is ABC? Traditional vs. ABC

Traditional Accounting Perspective        Activity-Based Perspective
Salary & wages        1,350,000           Teach courses         940,000
Benefits                495,000           Perform research      430,000
Travel                   45,000           Provide service       250,000
Facilities              220,000           Administer programs   350,000
Supplies                 90,000           Provide tech support  230,000
Total                $2,200,000           Total              $2,200,000

© TWBANTA-IUPUI
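A minimal sketch of the reallocation ABC performs, using the traditional line items from the slide. The resource-to-activity consumption shares are hypothetical (the slide does not give them), so the computed activity costs only roughly resemble the slide's activity-based column; the point the code makes is that the same $2,200,000 is redistributed across activities, not changed.

```python
# Activity-based costing sketch. The traditional line items and the
# $2,200,000 total are from slide 90; the consumption shares below are
# hypothetical assumptions for illustration.

resources = {
    "salary & wages": 1_350_000,
    "benefits": 495_000,
    "travel": 45_000,
    "facilities": 220_000,
    "supplies": 90_000,
}

# Hypothetical shares of each resource consumed by each activity
# (every row sums to 1.0, so no cost is created or lost).
shares = {
    "salary & wages": {"teach": 0.45, "research": 0.20, "service": 0.12,
                       "administer": 0.15, "tech support": 0.08},
    "benefits":       {"teach": 0.45, "research": 0.20, "service": 0.12,
                       "administer": 0.15, "tech support": 0.08},
    "travel":         {"teach": 0.10, "research": 0.60, "service": 0.20,
                       "administer": 0.10, "tech support": 0.00},
    "facilities":     {"teach": 0.50, "research": 0.20, "service": 0.05,
                       "administer": 0.10, "tech support": 0.15},
    "supplies":       {"teach": 0.40, "research": 0.25, "service": 0.10,
                       "administer": 0.10, "tech support": 0.15},
}

activity_cost: dict[str, float] = {}
for resource, amount in resources.items():
    for activity, share in shares[resource].items():
        activity_cost[activity] = activity_cost.get(activity, 0.0) + amount * share

# The same $2,200,000, viewed by activity instead of by line item:
assert round(sum(activity_cost.values())) == sum(resources.values())
for activity, cost in activity_cost.items():
    print(f"{activity:>12}: ${cost:>11,.2f}")
```

Because every row of shares sums to 1.0, the assert confirms that the two perspectives account for the identical total.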
91
Some Applications of Economic Model 1. Estimate costs of administrative services as compared to cost of outsourcing 2. Determine fees for various programs 3. Restructure processes to expedite work flow and minimize costs © TWBANTA-IUPUI
92
Since 1992 Activity-based Costing has stimulated changes in Planning Budgeting Assessment © TWBANTA-IUPUI
93
Elements of Program Review Self Study Review by Respected Peers Recommendations Follow-up © TWBANTA-IUPUI
94
Goals of Program Review at IUPUI To improve student learning To assess and improve program quality To increase cross-disciplinary collaboration To enhance community connections To reinforce importance of aligning unit and campus planning © TWBANTA-IUPUI
95
Following a Program Review 1. Program receives reviewers’ report 2. Faculty meet to consider findings 3. Faculty respond in writing 4. Program chair, dean, provost meet to consider written response 5. Improvements are implemented © TWBANTA-IUPUI
96
Since 1995 Program Reviews have stimulated changes in Planning for the future Research emphases Faculty hiring priorities Advisory councils Cross-disciplinary collaboration © TWBANTA-IUPUI
97
Program Review at IUPUI www.planning.iupui.edu/assessment/ © TWBANTA-IUPUI
98
THE TEAM 1. Chancellor, Provost 2. IMIR 3. Program Review & Assessment Committee (PRAC) 4. Faculty Development
99
© TWBANTA-IUPUI Open sharing of information and evidence-based decision-making Financial and satisfaction data for units Annual planning/budgeting hearings Performance indicators derived from unit reports over time Campus performance report for community
100
Program Review & Assessment Committee 2 reps from each school 2 librarians Other units: Student Life, Faculty Development, internship coordinator © TWBANTA-IUPUI
101
Program Review and Assessment Committee Provides a forum for exchange of information about assessment Oversees program review Suggests/provides faculty development Develops annual reports
102
© TWBANTA-IUPUI Characterizing the Culture ▪ Appointment of Assessment Specialists in Faculty Development, Library, Student Life, Service Learning, Enrollment Services, University College ▪ Appointment of Associate Deans for Assessment
103
© TWBANTA-IUPUI Characterizing the Culture New initiatives require assessment: University College student support programs, distance learning, new academic programs
104
© TWBANTA-IUPUI Characterizing the Culture Promotion & Tenure Guidelines Faculty/Staff Development Grants Awards
105
© TWBANTA-IUPUI Build Assessment into Valued Processes 1. Assessment of learning 2. Curriculum review and revision 3. Survey research 4. Program review 5. Scholarship of Teaching & Learning 6. Evaluation of initiatives 7. Faculty development 8. Promotion & tenure 9. Rewards and recognition
106
© TWBANTA-IUPUI Establishing a Culture of Evidence takes Strong leadership Support Time
107
PART 3 Examples of Effective Assessment Practice © TWBANTA-IUPUI
108
Profiles Invited over 1000 Received 146 Selected 49 for use in full Categorized all 146 and published Web sites © TWBANTA-IUPUI
109
Outline for Profiles Background and Purpose Methods over ? Years Resources Required Findings Use of Findings Impact of Using Findings Success Factors Web sites © TWBANTA-IUPUI
110
Plan → Implement → Evaluate → Improve: a Culture of Evidence © TWBANTA-IUPUI
111
~ Organization ~ of Principles & Profiles Planning Implementing Improving & Sustaining - Building a Scholarship of Assessment Banta & Associates Jossey-Bass 2002 © TWBANTA-IUPUI
112
Planning Principles 1. Engaging stakeholders 2. Connecting assessment to valued goals & processes 3. Creating a written plan 4. Timing assessment 5. Building a culture based on evidence © TWBANTA-IUPUI
113
Planning Profiles Brigham Young University Campus Wiki for degree learning outcomes USMA at West Point Interdisciplinary teams assess 10 mission-related goals for learners Kennesaw State University 2008 CHEA Award for linking assessment with planning, program review, faculty development © TWBANTA-IUPUI
114
USMA @ West Point 6 Developmental Domains 1. Intellectual 2. Physical 3. Military 4. Social 5. Moral-ethical 6. Human spirit © TWBANTA-IUPUI
115
USMA @ West Point Intellectual Domain 10 Goals (write, speak, think; engineering, math, info tech) A. Stated learner outcomes 1. Standards a. Rubrics developed by faculty © TWBANTA-IUPUI
116
USMA @ West Point Interdisciplinary Goal Teams use Curriculum-embedded direct measures of learning Student surveys (fr., sr.) Graduate survey (3 years after) Employer surveys Employer focus groups © TWBANTA-IUPUI
117
USMA @ West Point Use of Assessment Findings Review of core curriculum Changes in warranted areas: History, English, Engineering, Information Technology © TWBANTA-IUPUI
118
Implementation Principles 1. Providing leadership 2. Creating faculty/staff development 3. Placing responsibility with unit 4. Using multiple methods 5. Communicating findings © TWBANTA-IUPUI
119
Implementation Profiles California State University, Sacramento Strong leadership, multiple methods Texas Christian University Faculty learning communities Tompkins Cortland Community College Capstone rubrics © KBLACK-IUPUI
120
Cal State-Sacramento (1) Sources of Motivation for Assessment 1. New VP for Student Affairs 2. Reaccreditation looming 3. Enrollment & budget challenges 4. Pledge to become more data-driven and focused on student learning © TWBANTA-IUPUI
121
Cal State-Sacramento (2) 1. Align department & division missions 2. Develop SMART goals, 1 for student learning Specific Measurable Aggressive, yet attainable Results-oriented Timely © TWBANTA-IUPUI
122
Cal State-Sacramento (3) Measures Pre-post MC tests on policies, resources Essays with rubrics (reinstatement) Portfolios Observation of skills (Leadership, RA reports on scenarios, role-playing) © TWBANTA-IUPUI
123
Cal State-Sacramento (4) Findings 1. Some SLOs met 2. Some SLOs not met 3. Some measures not effective 4. Too few participants to assess 5. Too many participants to assess effectively © TWBANTA-IUPUI
124
Cal State-Sacramento (5) Use of Findings 1. Better training for RAs in reporting 2. Better training for peer mentors in orientation (emphasizing policies) 3. More time to discuss films 4. Better PowerPoint presentations 5. Increase participation in counseling 6. Redesign vague test items © TWBANTA-IUPUI
125
Implementation Profiles (Continued) Pennsylvania State University PULSE Survey Moravian College Using technology for curriculum maps Alverno College Portfolios in Teacher Education Northeastern Illinois University Multiple methods including national standardized test © KBLACK-IUPUI
126
A Look At The Profiles
Leadership: 18%
Faculty and Staff Development: 18%
Responsibility at Unit Level: 33%
Methods: Rubrics 37%; Surveys 33%; Electronic/Technology 20%; Portfolios 14%; National Standardized Tests 8%
© KBLACK-IUPUI
127
Improving/Sustaining Principles 1. Providing credible evidence of learning to multiple stakeholders 2. Reviewing assessment reports 3. Ensuring use of results 4. Evaluating the assessment process © EJONES-WVU
128
Improving/Sustaining Profiles San Jose State University Specialists in each college, awards, learning outcomes in 5-year plans Hocking Technical College Annual assessment work day Colorado State University Integration of learning outcomes in on-line template for program reviews © TWBANTA-IUPUI
129
Sustaining Professional Development: Faculty Learning Communities Texas Christian University --Seven areas in general education 1. religious traditions 2. historical traditions 3. literary traditions 4. global awareness 5. cultural awareness 6. social values 7. citizenship © EJONES-WVU
130
Sustaining Professional Development: Faculty Learning Communities Texas Christian University --Created faculty learning communities to address the following: a. identify and create assessment strategies b. share results of assessment processes c. discuss results to enhance teaching and learning experiences © EJONES-WVU
131
Required Resources To Implement and Sustain Assessment 1. Faculty release time 2. Stipends for faculty leaders 3. Assessment committee 4. New full-time assessment position created 5. External consultants © EJONES-WVU
132
Required Resources To Implement and Sustain Assessment 6. Financial resources to pay for tests and purchase surveys 7. Administrative support 8. Professional development 9. Technology © EJONES-WVU
133
Some Big Ideas Influence of accreditation is strong Engaging faculty may require extra pay Standardized tests of generic skills are not used alone Linking assessment with planning and program review works Impact is not measured in learning gains © TWBANTA-IUPUI
134
Group Assessment Has Failed to Demonstrate Institutional Accountability
Focus on improvement at unit level
Rare aggregation of data centrally
Too few faculty involved
Involved faculty return to discipline
HE scholars focused on K-12 assessment
135
Impact of Using Findings More attention to: improving assessment tools need to do assessment participating in faculty development using assessment findings © TWBANTA-IUPUI
136
Where Learning Has Improved Alverno College – Milwaukee, WI Truman State University – Kirksville, MO © TWBANTA-IUPUI
137
Computer-Based Testing at James Madison University Information Literacy Scientific Reasoning Quantitative Reasoning
138
University of South Florida Cognitive Level & Quality of Writing Assessment (CLAQWA) Rubric of 16 traits × 5 (Bloom’s) levels Used by peers and teachers Improves writing and thinking © TWBANTA-IUPUI
139
San Diego State University In portfolios, masters & doctoral students reflect on curricular & co-curricular learning and program learning outcomes Oral presentations of synthesized learning Evaluated by faculty, external professionals Synthesized learning has improved © TWBANTA-IUPUI
140
North Carolina State University DEAL Model for Critical Reflection (Description, Examination, Articulation of Learning) Rubric levels based on Bloom’s Taxonomy Improves higher order reasoning and critical thinking skills © TWBANTA-IUPUI
141
Plan → Implement → Evaluate → Improve: a Culture of Evidence © TWBANTA-IUPUI
142
Building A Scholarship of Assessment National Institute for Learning Outcomes Assessment (NILOA) AAC&U’s VALUE Project Teagle’s Wabash Study and Assessment Scholars Lumina’s Big Goal and Degree Qualifications Profile New Leadership Alliance for Student Learning & Accountability © TWBANTA-IUPUI
143
ASSESSMENT UPDATE
Bi-monthly, published by Jossey-Bass since 1989
Articles up to 2,000 words
4 columns
Book reviews
© TWBANTA-IUPUI
144
Scholarship Reconsidered Four kinds of scholarship Discovery Integration Application Teaching -Boyer (1990)
145
© TWBANTA-IUPUI SoTL differs from the scholarship of discovery in its focus on the classroom L.Shulman (2004)
146
© TWBANTA-IUPUI SoTL Approach to Classroom Research 1. Articulate learning goals. 2. Formulate a question about the learning situation based on the goals. 3. Design a way to collect data. 4. Teach to the goals. 5. Assess the student learning toward the goal. 6. Analyze the feedback. 7. Reflect on the results for future teaching decisions. 8. Share the results.
147
© TWBANTA-IUPUI SoTL involves 1. Systematic investigation of a research question 2. Study of related literature 3. Going public with findings 4. Critical review by peers 5. Use of research as foundation for further work H. Timberg (2007)
148
© TWBANTA-IUPUI Types of SoTL Questions The context of teaching: institutional factors, physical facilities, organizational support Example: Is it more effective to teach this class in one-hour or two-hour sessions? Example: How does sitting around tables rather than sitting in rows affect learning?
149
© TWBANTA-IUPUI Sound Familiar? The Scholarship of Assessment and the Scholarship of Teaching and Learning are integrally related:
SoA alone: can address topics besides learning (civic engagement)
Overlap: study of learning issues in actual settings, based on evidence, resulting in public sharing
SoTL alone: can include conceptual, non-empirical questions or issues
150
© TWBANTA-IUPUI Building a Scholarship of Assessment - Banta & Associates Jossey-Bass Publishers April 2002
151
© TWBANTA-IUPUI Scholarly Assessment Involves
selecting/creating assessment methods
trying the methods
reflecting on strengths/weaknesses
modifying the methods or trying new ones
improving assessment continuously
152
© TWBANTA-IUPUI Scholarly Assessment Conduct syllabus analysis - Is critical thinking emphasized? Develop student guide to assessment - Do students understand why and how they are assessed?
153
© TWBANTA-IUPUI The Scholarship of Assessment Involves basing assessment studies on relevant theory/practice gathering evidence developing a summary of findings sharing findings with the assessment community
154
© TWBANTA-IUPUI Scholarship of Assessment Compare two teaching methods - Is technology-enhanced instruction more effective? Validate a measure of student civility - Do interventions increase civility?
155
Barriers to Scholarship in Assessment Campus coordinators are trained in other disciplines Scholars in relevant fields don’t do outcomes assessment Assessment scholarship is not rewarded Campus coordinators return to their own disciplines Few graduate programs prepare assessors © TWBANTA-IUPUI
156
Some Research Traditions Underlying Assessment Program evaluation Organizational change and development Cognitive psychology Student development Measurement Informatics © TWBANTA-IUPUI
157
Assessment Methods Improve instruments to measure: content knowledge at more complex levels; affective development; effects of educational interventions; changes in learning over time © TWBANTA-IUPUI
158
Assessment Methods How can we use technology in assessment more effectively? How can we demonstrate the validity of locally developed instruments? How can faculty make consensual judgments about the quality of student performance? How can student feedback be designed to help faculty improve their teaching? © TWBANTA-IUPUI
159
Organizational Behavior & Development How can assessment be combined with other systemic changes to improve teaching & learning? What patterns of organizational behavior promote and sustain assessment? What methods of providing and managing assessment information are most effective? Which public policy initiatives are most effective in promoting improvement on campuses? © TWBANTA-IUPUI
160
Shared Reflective Practice Conduct meta-evaluations of approaches to assessment Determine what works best within disciplines Develop consortia of institutions to provide forums for reflection © TWBANTA-IUPUI
161
Engaging Faculty Introduce assessment as research Connect assessment with the scholarship of teaching Support learning about assessment through faculty development © TWBANTA-IUPUI
162
Targets for Research on Engaging Faculty How can we determine the interests and commitments of stakeholders? How should we educate stakeholders for choosing methods? How can we reduce costs and maximize assessment’s benefits? What ethical principles should guide our work? Derived from Michael Quinn Patton’s Utilization-Focused Evaluation (1997) © TWBANTA-IUPUI