Presentation transcript:

WASC Assessment Leadership Academy Oakland, California August 1, 2011 Presentation by Trudy W. Banta Professor of Higher Education and Senior Advisor to the Chancellor for Academic Planning and Evaluation Indiana University-Purdue University Indianapolis 355 N. Lansing St., AO 140 Indianapolis, Indiana iupui.edu

Outline The Great Testing Debate Alternatives to standardized tests of generic skills © TWBANTA-IUPUI

Group Assessment Has Failed to Demonstrate Institutional Accountability Focus on improvement at unit level Rare aggregation of data centrally Too few faculty involved HE scholars focused on K-12 assessment

© TWBANTA-IUPUI 2006 Commission on the Future of Higher Education  We need a simple way to compare institutions  The results of student learning assessment, including value added measurements (showing skill improvement over time) should be... reported in the aggregate publicly.

© TWBANTA-IUPUI Now We Have the Press to Assess with a Test

My History  Educational psychology  Program evaluation & measurement  Performance funding in Tennessee  1990 USDOE effort to build a national test  1992 Initiated evidence-based culture at IUPUI © TWBANTA-IUPUI

2007 Voluntary System of Accountability ~ Assessment of Learning ~ defined as critical thinking, written communication, analytic reasoning © TWBANTA-IUPUI

VSA Recommendations (over my objections)  Collegiate Assessment of Academic Proficiency (CAAP)  Measure of Academic Proficiency & Progress (MAPP)  Collegiate Learning Assessment (CLA)  (College BASE) © TWBANTA-IUPUI

TN = Most Prescriptive (5.45% of Budget for Instruction)
1. Accredit all accreditable programs (25)
2. Test all seniors in general education (25)
3. Test seniors in 20% of majors (20)
4. Give an alumni survey (15)
5. Demonstrate use of data to improve (15)
Total: 100
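
To make the stakes concrete, here is a minimal sketch of how such a point scheme could translate into dollars, assuming each criterion is scored against the weights above and the total (out of 100) scales the 5.45% supplement. The function, weight names, and scaling rule are illustrative, not Tennessee's actual formula.

```python
# Hypothetical sketch of a Tennessee-style performance-funding calculation.
# Criterion weights follow the slide; the scaling rule is an assumption.

WEIGHTS = {
    "program_accreditation": 25,
    "gen_ed_senior_testing": 25,
    "major_field_testing": 20,
    "alumni_survey": 15,
    "use_of_data_to_improve": 15,
}

def performance_supplement(scores: dict, instruction_budget: float,
                           max_share: float = 0.0545) -> float:
    """Scale the 5.45% budget supplement by points earned out of 100."""
    earned = sum(min(scores.get(k, 0), w) for k, w in WEIGHTS.items())
    return instruction_budget * max_share * earned / 100

# Example: 85 of 100 points on a $50M instruction budget
print(performance_supplement(
    {"program_accreditation": 25, "gen_ed_senior_testing": 20,
     "major_field_testing": 15, "alumni_survey": 15,
     "use_of_data_to_improve": 10},
    50_000_000))  # 2316250.0
```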

© TWBANTA-IUPUI At the University of Tennessee
CAAP
Academic Profile (now MAPP)
COMP (like CLA, and withdrawn by 1990)
College BASE

© TWBANTA-IUPUI In TN We Learned 1) No test measured more than 30% of gen ed skills 2) Tests of generic skills measure primarily prior learning 3) Reliability of value added = 0.1 4) Test scores give few clues to guide improvement actions
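
Why value-added reliability collapses even when the underlying tests are reliable: classical test theory gives the reliability of a difference (gain) score, and it shrinks toward zero as the pre/post correlation approaches the tests' own reliabilities. A sketch assuming equal score variances; the plugged-in numbers are illustrative, not the Tennessee data:

```latex
% Reliability of a gain score D = post - pre, equal variances assumed:
\rho_{D} \;=\; \frac{\tfrac{1}{2}\left(\rho_{XX'} + \rho_{YY'}\right) - \rho_{XY}}{1 - \rho_{XY}}
% Example (illustrative): \rho_{XX'} = \rho_{YY'} = .85,\ \rho_{XY} = .83
% \Rightarrow \rho_{D} = (.85 - .83)/(1 - .83) \approx .12,
% the same order as the .1 reliability observed in TN.
```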

An Inconvenient Truth: .88 = correlation of VSA-recommended test scores with SAT/ACT scores; thus 77% of the variance in institutions' scores is due to students' prior learning © TWBANTA-IUPUI
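
The 77% figure is simply the squared correlation (the coefficient of determination):

```latex
r = .88 \quad\Longrightarrow\quad r^{2} = (.88)^{2} = .7744 \approx 77\%
```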

How Much of the Variance in Senior Scores is Due to College Effects? Student motivation to attend that institution (mission differences) Student mix based on age, gender, socioeconomic status, race/ethnicity, transfer status, college major

© TWBANTA-IUPUI How Much of the Variance in Senior Scores is Due to College Effects? (continued)
Student motivation to do well
Sampling error
Measurement error
Test anxiety
College effects: 23%

Threats to Conclusions Based on Test Scores 1. Measurement error 2. Sampling error 3. Different tests yield different results 4. Different ways of presenting results 5. Test bias 6. Pressure to raise scores - Daniel Koretz “Measuring Up” Harvard U. Press © TWBANTA-IUPUI

Using NSSE: 95% of the variance in responses is WITHIN institutions - Pike, Kuh, McCormick, Ethington & Smart (2011) © TWBANTA-IUPUI
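
In variance-components terms, an institution-level intraclass correlation of about .05 follows directly: institutions barely differ relative to the spread among their own students. This is a standard decomposition, not shown on the slide:

```latex
\rho_{\mathrm{ICC}}
= \frac{\sigma^{2}_{\mathrm{between}}}{\sigma^{2}_{\mathrm{between}} + \sigma^{2}_{\mathrm{within}}}
\approx \frac{5}{5 + 95} = .05
```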

Student Motivation Samples of students are being tested Extrinsic motivators (cash, prizes) are used We have learned: Only a requirement and intrinsic motivation will bring seniors in to do their best

Recent University of Texas Experience  30–40% of seniors at flagships earn highest CLA score (ceiling effect)  flagship campuses have lowest value-added scores © TWBANTA-IUPUI

Concerns About Value Added Student attrition Proportion of transfer students Different methods of calculating Unreliability Confounding effects of maturation © TWBANTA-IUPUI

Word from Measurement Experts Given the complexity of educational settings, we may never be satisfied that value added models can be used to appropriately partition the causal effects of teacher, school, and student on measured changes in standardized test scores. - Henry Braun & Howard Wainer Handbook of Statistics, Vol. 26: Psychometrics Elsevier 2007 © TWBANTA-IUPUI

Employing currently available standardized tests of generic skills to compare the quality of institutions is not a valid use of those tests. © TWBANTA-IUPUI

Consequences of Scores Below Expectations?  Public criticism  Loss of students  Reduced funding  Corrective actions imposed  Prep courses for students  Profits for coaching consultants

2009 NILOA Survey Campus-wide Approaches to Assessment 1. A national survey (76%) 2. Standardized test of general skills (39%) 3. Portfolios, specialized tests, external judges © TWBANTA-IUPUI

OECD’s AHELO for HEIs from 15 countries 1. Generic skills (CLA) 2. Disciplines (Engineering and Economics) 3. Value added 4. Contextual information indicators © TWBANTA-IUPUI

Designing Effective Assessment: Principles & Profiles of Good Practice Trudy W. Banta Elizabeth A. Jones Karen E. Black Jossey-Bass (Wiley) 2009 © TWBANTA-IUPUI

Profiles  Invited over 1000  Received 146  Selected 49 for use in full  Categorized all 146 and published Web sites © TWBANTA-IUPUI

Outline for Profiles  Background and Purpose  Methods over ? Years  Resources Required  Findings  Use of Findings  Impact of Using Findings  Success Factors  Web sites © TWBANTA-IUPUI

Standardized tests of generic skills (e.g., writing, critical thinking)  used by just 8%  always supplemented © TWBANTA-IUPUI

© TWBANTA-IUPUI Standardized tests CAN initiate conversation

Advantages of standardized tests of generic skills  promise of increased reliability & validity  norms for comparison © TWBANTA-IUPUI

Limitations of standardized tests of generic skills  cannot cover all a student knows  narrow coverage, need to supplement  difficult to motivate students to take them! © TWBANTA-IUPUI

Better Ways to Demonstrate Accountability Performance Indicators 1. Access (to promote social mobility) 2. Engaging student experience 3. Workforce development 4. Economic development 5. Civic contribution of students, faculty, staff, graduates © TWBANTA-IUPUI

Effects of Education on Social Mobility
Relate data on
  high school courses and completion
  college courses and completion
to career placements & earnings:
  private and public employment
  military enlistments
  incarcerations
- Florida DOE © TWBANTA-IUPUI

Australian DOE 1. Increase participation of individuals from low SES backgrounds  as undergraduates  as graduate students 2. Improve engagement & satisfaction  survey of student engagement  first year retention  satisfaction of completers © TWBANTA-IUPUI

Workforce Development % graduates placed in field % graduates placed locally Degree, certificate, CE programs aligned with regional priorities Internships/coop programs Increased earnings of graduates - APLU © TWBANTA-IUPUI

Economic Development Creation of intellectual property Patent and license awards # start-up companies Sponsored research $  % NSF, NIH Use of academic facilities by industry Alignment of assets to support regional economic clusters - APLU © TWBANTA-IUPUI

Civic Contribution  Internships supported by institution  Volunteer hours for students, faculty, staff  Service learning placements  $ contributed to United Way et al.  Pro bono legal & health services  No-cost presentations to community groups © TWBANTA-IUPUI

Land-Grant Extension 1. Useful information for  agricultural producers  small business owners  rural families 2. After-school STEM programs 3. Driver-ed for teen traffic offenders 4. Green jobs in energy fields 5. Sustainable agriculture practices © TWBANTA-IUPUI

National Governors Association Center for Best Practices A simple graduation rate  penalizes colleges serving low SES students  may discourage open enrollment  may lead to lower graduation standards © TWBANTA-IUPUI

National Governors Association Center for Best Practices Track intermediate student milestones  Successful completion of remedial and core courses  Advancement from remedial to credit courses  Transfer from 2- to 4-year college  Credential attainment © TWBANTA-IUPUI

Oregon Community Colleges track:  % needing remedial courses  % completing remedial or ESL courses  # credits earned each year toward degree or certificate  Semester-to-semester and fall-to-fall persistence - Inside HE 10/12/09 © TWBANTA-IUPUI

Peter Ewell: “... shift state funding formulas so that colleges receive money based on how many students are still enrolled by the end of the academic term rather than at the beginning.” - Inside HE 11/18/09 © TWBANTA-IUPUI

If We Must Measure Learning Let’s Use: 1. Standardized tests in major fields  licensure and certification tests  ETS Major Field Tests 2. Internship performance 3. Senior projects 4. Study abroad performance 5. Electronic portfolios 6. External examiners © TWBANTA-IUPUI

Western Governors University offering Competence-based on-line degrees 1) Define the domain of knowledge/skill 2) Develop objective test items 3) Design performance tasks 4) Evaluate performance 5) Use findings to improve curriculum, instruction, student services 6) Continuously improve assessment quality and validity

2009 NILOA Survey Program Level Approaches 1. Portfolios (80% in at least 1 area) 2. Performance assessments 3. Rubrics 4. External judges 5. Student interviews 6. Employer surveys © TWBANTA-IUPUI

© TWBANTA-IUPUI Student Electronic Portfolio Students take responsibility for demonstrating core skills Unique individual skills and achievements can be emphasized Multi-media opportunities extend possibilities Metacognitive thinking is enhanced through reflection on contents - Sharon J. Hamilton IUPUI

More use of RUBRICS  locally developed  VALUE from AAC&U © TWBANTA-IUPUI

VALUE Rubrics Critical thinking Written communication Oral communication Information literacy Teamwork Intercultural knowledge Ethical reasoning © TWBANTA-IUPUI

Accountability Report  85% achieve Outstanding ratings in writing as defined...  78% are Outstanding in applying knowledge and skills in internships  75% are Outstanding in delivering an oral presentation © TWBANTA-IUPUI

For External Credibility  Collaborate on rubrics  Use employers as examiners  Conduct process audits © TWBANTA-IUPUI

E-Port Challenges Reliability of rubrics Student motivation if used for assessment (Barrett, 2009) Differences in topics for products to be evaluated (Sekolsky & Wentland, 2010) © TWBANTA-IUPUI

Obstacles to Using Performance-Based Measures Defining domains and constructs Obtaining agreement on what to measure and definitions Defining reliability and validity Creating good measures - Tom Zane WGU © TWBANTA-IUPUI

Will it take 80 years... ? 3 Promising Alternatives E-portfolios Rubrics Assessment communities - Banta, Griffin, Flateby, Kahn NILOA Paper #2 (2009) © TWBANTA-IUPUI

PART 2 Building An Evidence-Based Culture  Evidence at several levels  An institutional example © TWBANTA-IUPUI

Organizational Levels for Assessment National Regional State Campus College Discipline Classroom Student

Evidence at the Classroom Level Background Knowledge Probe Minute paper in class Just-in-time teaching online © TWBANTA-IUPUI

Background Knowledge Probe (Pre-Test – Indirect Measure) 1. ARCHAEOLOGY A. Have never heard of this B. Have heard of it, but don’t really know what it means C. Have some idea what it means, but not too clear D. Have a clear idea what this means and can explain it - Classroom Assessment Angelo and Cross

Primary Trait Scoring Assigns scores to attributes (traits) of a task STEPS  Identify traits necessary for success in assignment  Compose scale or rubric giving clear definition to each point  Grade using the rubric
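
A minimal sketch of primary trait scoring in code, following the steps on the slide for a hypothetical writing assignment; the trait names, scale points, and definitions below are invented for illustration:

```python
# Illustrative primary trait scoring: identify traits, define each scale
# point, then grade with the rubric. All traits and definitions are
# hypothetical.

RUBRIC = {
    "thesis":       {3: "clear, arguable claim", 2: "claim present but vague",
                     1: "no discernible claim"},
    "evidence":     {3: "relevant and well integrated", 2: "relevant but thin",
                     1: "missing or off-topic"},
    "organization": {3: "logical flow throughout", 2: "occasional lapses",
                     1: "hard to follow"},
}

def score_paper(ratings: dict) -> tuple:
    """Sum the rater's trait scores; return (earned, possible)."""
    earned = sum(ratings[trait] for trait in RUBRIC)
    possible = sum(max(levels) for levels in RUBRIC.values())
    return earned, possible

earned, possible = score_paper({"thesis": 3, "evidence": 2, "organization": 3})
print(f"{earned}/{possible}")  # 8/9
```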

Assessment of Group Interaction The Student Participant: Listened to others Actively contributed to discussion Challenged others effectively Was willing to alter own opinion Effectively explained concepts/insights Summarized/proposed solutions 5=Consistently excellent 3=Generally satisfactory 1=Inconsistent and/or inappropriate

Evidence of Generic Skills ePort with rubrics Standardized tests © TWBANTA-IUPUI

Evidence at the Program Level Individual and team projects Research papers Internships Electronic portfolios Peer review © TWBANTA-IUPUI

Assessment in Sociology and Anthropology Focus groups of graduating students  Given a scenario appropriate to the discipline, a faculty facilitator asks questions related to outcomes faculty have identified in 3 areas: concepts, theory, methods.  2 faculty observers use 0-3 scale to rate each student on each question  GROUP scores are discussed by all faculty Murphy & Goreham North Dakota State University

© TWBANTA-IUPUI Internships Evaluated against specific criteria by Students Faculty Field-based supervisors

Elements of Program Review  Self Study  Review by Respected Peers  Recommendations  Follow-up © TWBANTA-IUPUI

Evidence at the Institutional Level Learning outcomes Questionnaires, interviews, focus groups Productivity measures Cost analyses Management ratios Program evaluation Peer review Accreditation © TWBANTA-IUPUI

Outcomes Assessment Requires Collaboration  In setting expected program outcomes  In developing sequence of learning experiences (curriculum)  In choosing measures  In interpreting assessment findings  In making responsive improvements

Faculty and Staff Development  Focus faculty and student affairs professionals on improving learning in and outside class  Attend conferences together  Study literature on student learning  Provide workshops on teaching and learning  Provide resources (e.g., grants, summer salary, release time) © TWBANTA-IUPUI

Some Evaluative Questions If we undertake a new approach:  Is instruction more effective?  Are students learning more?  Are students more satisfied?  Are faculty more satisfied?  Do outcomes justify costs? © TWBANTA-IUPUI

Campus Interest in Assessment WHAT WORKS in….  increasing student retention?  general education?  use of technology in instruction?  curriculum in the major? © TWBANTA-IUPUI

Good assessment is good research...  An important question  An approach to answer the question  Data collection  Analysis  Report -Gary R. Pike (2000) © TWBANTA-IUPUI

Involve Students 1. Set learning expectations in recruiting 2. Communicate learning outcomes in orientation 3. Involve student leaders in promoting learning 4. Involve students in evaluating courses/curricula 5. Let students know their recommendations are used.

Student Advisory Council at Montevallo A way to provide continuous student assessment Student Recommendations 1 Develop a statement of expected ethical behaviors for students 2 Add a second research course with lab 3 Increase comparative psychology 4 Add terminals for statistics lab 5 Increase opportunities for research, writing, and speaking

Engage Graduates Alverno Penn State © TWBANTA-IUPUI

Involve Employers Developing curriculum Assessing student learning © TWBANTA-IUPUI

Involving Employers Combination of survey and focus groups for employers of business graduates  Identified skills, knowledge, personality attributes sought by employers  Encouraged faculty to make curriculum changes  Motivated students to develop needed skills  Strengthened ties among faculty, students, employers - Kretovics & McCambridge Colorado State University

Colorado State University College of Business Curriculum changes based on employer suggestions:  1 credit added to Business Communications for team training and more presentations  Ethics & social responsibility now discussed in intro courses  New Intro to Business course emphasizing career decision-making  More teamwork, oral & written communication, problem-solving in Management survey courses - Kretovics & McCambridge

Plan → Implement → Evaluate → Improve: Culture of Evidence © TWBANTA-IUPUI

PLANNING 1. Campus mission, goals 2. Unit goals aligned 3. Programs based on assessable goals with PIs 4. Annual reports on the Web © TWBANTA-IUPUI

Outline for Annual Reports  IUPUI Theme  Unit Goal  Objective Actions Taken Actions Planned  Evidence of Progress © TWBANTA-IUPUI

Evaluation Services 1. Assessment of learning 2. Surveys 3. Program reviews 4. Performance indicators 5. Program cost analysis 6. Web-based evaluation tools 7. Program evaluation/action research 8. Accreditation © TWBANTA-IUPUI

Surveys 1. Enrolled Students  Our own  NSSE 2. Graduates 3. Employers 4. Stop outs 5. Faculty 6. Staff © TWBANTA-IUPUI

Information Gateway Information about  Students  Faculty  Staff  Alumni  Finances © TWBANTA-IUPUI

Since 1993 Campus-wide surveys have stimulated changes in  Curricula  Advising  Increased writing practice  Increased attention to first-year experiences  Placement of graduates © TWBANTA-IUPUI

Goal and Objectives for Student Learning  Enhance undergraduate student learning and success 1. Strengthen generic skills 2. Provide honors programming 3. Offer learning communities 4. Strengthen advising 5. Provide tutoring and mentoring © TWBANTA-IUPUI

Employ Multiple Methods 1) Direct  Projects, papers, tests, observations 2) Indirect  Questionnaires, interviews, focus groups  Unobtrusive measures Syllabi, transcripts © TWBANTA-IUPUI

Student Learning Oriented Course Evaluation 1. Learners held high expectations for one another 2. Learners interacted frequently with others 3. Learners participated in learning teams 4. Learners respected diverse talents and ways of learning -Cournoyer Advances in Social Work – Fall 2001

Since 1994 Assessment of Learning has stimulated changes in  Student support programs  Curriculum  Methods of instruction  Internships  Methods of assessment © TWBANTA-IUPUI

What is ABC? ABC is a costing methodology based upon the fact that different activities and products consume different proportions of resources. [Diagram: resources flow through activities into Products A, B, and C] © TWBANTA-IUPUI

Some tasks within instruction  curriculum planning  course design  class preparation  class instruction  assessment  course evaluation © TWBANTA-IUPUI

What Is ABC? Traditional vs. ABC
Traditional Accounting Perspective
  Salary & wages          1,350,000
  Benefits                  495,000
  Travel                     45,000
  Facilities                220,000
  Supplies                   90,000
  Total                  $2,200,000
Activity-Based Perspective
  Teach courses             940,000
  Perform research          430,000
  Provide service           250,000
  Administer programs       350,000
  Provide tech support      230,000
  Total                  $2,200,000
© TWBANTA-IUPUI
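
A sketch of the reclassification behind the two views above: the same dollars are mapped from line items to activities, with each resource pool consumed in different proportions (ABC's defining idea). The pools, activities, and percentage shares below are hypothetical, not the slide's actual allocation:

```python
# Illustrative activity-based costing: each resource pool is spread over
# activities in its own proportions. All figures and shares are hypothetical.

resources = {"salary_wages": 1_350_000, "facilities": 220_000}

# Assumed activity shares per resource pool; each row sums to 1.0.
drivers = {
    "salary_wages": {"teach": 0.50, "research": 0.25, "admin": 0.25},
    "facilities":   {"teach": 0.75, "research": 0.125, "admin": 0.125},
}

by_activity = {}
for pool, amount in resources.items():
    for activity, share in drivers[pool].items():
        by_activity[activity] = by_activity.get(activity, 0) + amount * share

# Both views account for the same total dollars.
assert sum(by_activity.values()) == sum(resources.values())
print(by_activity)  # {'teach': 840000.0, 'research': 365000.0, 'admin': 365000.0}
```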

Some Applications of Economic Model 1. Estimate costs of administrative services as compared to cost of outsourcing 2. Determine fees for various programs 3. Restructure processes to expedite work flow and minimize costs © TWBANTA-IUPUI

Since 1992 Activity-based Costing has stimulated changes in  Planning  Budgeting  Assessment © TWBANTA-IUPUI

Elements of Program Review  Self Study  Review by Respected Peers  Recommendations  Follow-up © TWBANTA-IUPUI

Goals of Program Review at IUPUI  To improve student learning  To assess and improve program quality  To increase cross-disciplinary collaboration  To enhance community connections  To reinforce importance of aligning unit and campus planning © TWBANTA-IUPUI

Following a Program Review 1. Program receives reviewers’ report 2. Faculty meet to consider findings 3. Faculty respond in writing 4. Program chair, dean, provost meet to consider written response 5. Improvements are implemented © TWBANTA-IUPUI

Since 1995 Program Reviews have stimulated changes in Planning for the future Research emphases Faculty hiring priorities Advisory councils Cross-disciplinary collaboration © TWBANTA-IUPUI

Program Review at IUPUI © TWBANTA-IUPUI

THE TEAM 1. Chancellor, Provost 2. IMIR 3. Program Review & Assessment Committee (PRAC) 4. Faculty Development

© TWBANTA-IUPUI Open sharing of information and evidence-based decision-making Financial and satisfaction data for units Annual planning/budgeting hearings Performance indicators derived from unit reports over time Campus performance report for community

Program Review & Assessment Committee  2 reps from each school  2 librarians  Other units:  Student Life  Faculty Development  Internship coordinator © TWBANTA-IUPUI

Program Review and Assessment Committee Provides a forum for exchange of information about assessment Oversees program review Suggests/provides faculty development Develops annual reports

© TWBANTA-IUPUI Characterizing the Culture ▪ Appointment of Assessment Specialists in: Faculty Development, Library, Student Life, Service Learning, Enrollment Services, University College ▪ Appointment of Associate Deans for Assessment

© TWBANTA-IUPUI Characterizing the Culture New initiatives require assessment  University College student support programs  Distance learning  New academic programs

© TWBANTA-IUPUI Characterizing the Culture  Promotion & Tenure Guidelines  Faculty/Staff Development Grants  Awards

© TWBANTA-IUPUI Build Assessment into Valued Processes 1. Assessment of learning 2. Curriculum review and revision 3. Survey research 4. Program review 5. Scholarship of Teaching & Learning 6. Evaluation of initiatives 7. Faculty development 8. Promotion & tenure 9. Rewards and recognition

© TWBANTA-IUPUI Establishing a Culture of Evidence takes  Strong leadership  Support  Time

PART 3 Examples of Effective Assessment Practice © TWBANTA-IUPUI

Profiles  Invited over 1000  Received 146  Selected 49 for use in full  Categorized all 146 and published Web sites © TWBANTA-IUPUI

Outline for Profiles  Background and Purpose  Methods over ? Years  Resources Required  Findings  Use of Findings  Impact of Using Findings  Success Factors  Web sites © TWBANTA-IUPUI

Plan → Implement → Evaluate → Improve: Culture of Evidence © TWBANTA-IUPUI

~ Organization ~ of Principles & Profiles  Planning  Implementing  Improving & Sustaining - Building a Scholarship of Assessment Banta & Associates Jossey-Bass 2002 © TWBANTA-IUPUI

Planning Principles 1. Engaging stakeholders 2. Connecting assessment to valued goals & processes 3. Creating a written plan 4. Timing assessment 5. Building a culture based on evidence © TWBANTA-IUPUI

Planning Profiles  Brigham Young University Campus Wiki for degree learning outcomes  USMA at West Point Interdisciplinary teams assess 10 mission-related goals for learners  Kennesaw State University 2008 CHEA Award for linking assessment with planning, program review, faculty development © TWBANTA-IUPUI

West Point 6 Developmental Domains 1. Intellectual 2. Physical 3. Military 4. Social 5. Moral-ethical 6. Human spirit © TWBANTA-IUPUI

West Point Intellectual Domain 10 Goals (write, speak, think; engineering, math, info tech) A. Stated learner outcomes 1. Standards a. Rubrics developed by faculty © TWBANTA-IUPUI

West Point Interdisciplinary Goal Teams use Curriculum-embedded direct measures of learning Student surveys (fr., sr.) Graduate survey (3 years after) Employer surveys Employer focus groups © TWBANTA-IUPUI

West Point Use of Assessment Findings  Review of core curriculum  Changes where warranted, in: History, English, Engineering, Information Technology © TWBANTA-IUPUI

Implementation Principles 1. Providing leadership 2. Creating faculty/staff development 3. Placing responsibility with unit 4. Using multiple methods 5. Communicating findings © TWBANTA-IUPUI

Implementation Profiles California State University, Sacramento Strong leadership, multiple methods Texas Christian University Faculty learning communities Tompkins Cortland Community College Capstone rubrics © KBLACK-IUPUI

Cal State-Sacramento (1) Sources of Motivation for Assessment 1. New VP for Student Affairs 2. Reaccreditation looming 3. Enrollment & budget challenges 4. Pledge to become more data-driven and focused on student learning © TWBANTA-IUPUI

Cal State-Sacramento (2) 1. Align department & division missions 2. Develop SMART goals, 1 for student learning  Specific  Measurable  Aggressive, yet attainable  Results-oriented  Timely © TWBANTA-IUPUI

Cal State-Sacramento (3) Measures  Pre-post MC tests on policies, resources  Essays with rubrics (reinstatement)  Portfolios  Observation of skills (Leadership, RA reports on scenarios, role-playing) © TWBANTA-IUPUI

Cal State-Sacramento (4) Findings 1. Some SLOs met 2. Some SLOs not met 3. Some measures not effective 4. Too few participants to assess 5. Too many participants to assess effectively © TWBANTA-IUPUI

Cal State-Sacramento (5) Use of Findings 1. Better training for RAs in reporting 2. Better training for peer mentors in orientation (emphasizing policies) 3. More time to discuss films 4. Better PowerPoint presentations 5. Increase participation in counseling 6. Redesign vague test items © TWBANTA-IUPUI

Implementation Profiles (Continued) Pennsylvania State University PULSE Survey Moravian College Using technology for curriculum maps Alverno College Portfolios in Teacher Education Northeastern Illinois University Multiple methods including national standardized test © KBLACK-IUPUI

A Look At The Profiles
Leadership 18%
Faculty and Staff Development 18%
Responsibility at Unit Level 33%
Methods:
  Rubrics 37%
  Surveys 33%
  Electronic/Technology 20%
  Portfolios 14%
  National Standardized Tests 8%
© KBLACK-IUPUI

Improving/Sustaining Principles 1. Providing credible evidence of learning to multiple stakeholders 2. Reviewing assessment reports 3. Ensuring use of results 4. Evaluating the assessment process © EJONES-WVU

Improving/Sustaining Profiles  San Jose State University Specialists in each college, awards, learning outcomes in 5-year plans  Hocking Technical College Annual assessment work day  Colorado State University Integration of learning outcomes in on-line template for program reviews © TWBANTA-IUPUI

Sustaining Professional Development: Faculty Learning Communities  Texas Christian University --Seven areas in general education 1. religious traditions 2. historical traditions 3. literary traditions 4. global awareness 5. cultural awareness 6. social values 7. citizenship © EJONES-WVU

Sustaining Professional Development: Faculty Learning Communities  Texas Christian University --Created faculty learning communities to address the following: a. identify and create assessment strategies b. share results of assessment processes c. discuss results to enhance teaching and learning experiences © EJONES-WVU

Required Resources To Implement and Sustain Assessment 1. Faculty release time 2. Stipends for faculty leaders 3. Assessment committee 4. New full-time assessment position created 5. External consultants © EJONES-WVU

Required Resources To Implement and Sustain Assessment 6. Financial resources to pay for tests and purchase surveys 7. Administrative support 8. Professional development 9. Technology © EJONES-WVU

Some Big Ideas Influence of accreditation is strong Engaging faculty may require extra pay Standardized tests of generic skills are not used alone Linking assessment with planning and program review works Impact is not measured in learning gains © TWBANTA-IUPUI

Group Assessment Has Failed to Demonstrate Institutional Accountability Focus on improvement at unit level Rare aggregation of data centrally Too few faculty involved Involved faculty return to discipline HE scholars focused on K-12 assessment

Impact of Using Findings More attention to:  improving assessment tools  need to do assessment  participating in faculty development  using assessment findings © TWBANTA-IUPUI

Where Learning Has Improved  Alverno College – Milwaukee, WI  Truman State University – Kirksville, MO © TWBANTA-IUPUI

Computer-Based Testing at James Madison University Information Literacy Scientific Reasoning Quantitative Reasoning

University of South Florida Cognitive Level & Quality of Writing Assessment (CLAQWA) Rubric of 16 traits × 5 (Bloom's) levels  Used by peers and teachers  Improves writing and thinking © TWBANTA-IUPUI

San Diego State University In portfolios, master's & doctoral students reflect on  curricular & co-curricular learning  program learning outcomes Oral presentations of synthesized learning Evaluated by faculty, external professionals Synthesized learning has improved © TWBANTA-IUPUI

North Carolina State University DEAL Model for Critical Reflection (Description, Examination, Articulation of Learning) Rubric levels based on Bloom’s Taxonomy Improves higher order reasoning and critical thinking skills © TWBANTA-IUPUI

Plan → Implement → Evaluate → Improve: Culture of Evidence © TWBANTA-IUPUI

Building A Scholarship of Assessment National Institute for Learning Outcomes Assessment (NILOA) AAC&U's VALUE Project Teagle's Wabash Study and Assessment Scholars Lumina's Big Goal and Degree Qualifications Framework New Leadership Alliance for Student Learning & Accountability © TWBANTA-IUPUI

ASSESSMENT UPDATE Bi-monthly Published by Jossey-Bass Since 1989 Articles up to 2000 words 4 Columns Book Reviews © TWBANTA-IUPUI

Scholarship Reconsidered Four kinds of scholarship  Discovery  Integration  Application  Teaching -Boyer (1990)

© TWBANTA-IUPUI SoTL differs from the scholarship of discovery in its focus on the classroom - L. Shulman (2004)

© TWBANTA-IUPUI SoTL Approach to Classroom Research  1. Articulate learning goals.  2. Formulate a question about the learning situation based on the goals.  3. Design a way to collect data.  4. Teach to the goals.  5. Assess the student learning toward the goal.  6. Analyze the feedback.  7. Reflect on the results for future teaching decisions.  8. Share the results.

© TWBANTA-IUPUI SoTL involves 1. Systematic investigation of a research question 2. Study of related literature 3. Going public with findings 4. Critical review by peers 5. Use of research as foundation for further work H. Timberg (2007)

© TWBANTA-IUPUI Types of SoTL Questions  The context of teaching: institutional factors, physical facilities, organizational support  Example: Is it more effective to teach this class in one-hour or two-hour sessions?  Example: How does sitting around tables rather than sitting in rows affect learning?

© TWBANTA-IUPUI Sound Familiar? The Scholarship of Assessment and the Scholarship of Teaching and Learning are integrally related:
[Venn diagram: SoA and SoTL overlap]
SoA: can address topics besides learning (civic engagement)
Overlap: study of learning issues in actual settings, based on evidence, resulting in public sharing
SoTL: can include conceptual, non-empirical questions or issues

© TWBANTA-IUPUI Building a Scholarship of Assessment - Banta & Associates Jossey-Bass Publishers April 2002

© TWBANTA-IUPUI Scholarly Assessment Involves  selecting/creating assessment methods  trying the methods  reflecting on strengths/weaknesses  modifying the methods or trying new ones  improving assessment continuously

© TWBANTA-IUPUI Scholarly Assessment Conduct syllabus analysis - Is critical thinking emphasized? Develop student guide to assessment - Do students understand why and how they are assessed?

© TWBANTA-IUPUI The Scholarship of Assessment Involves  basing assessment studies on relevant theory/practice  gathering evidence  developing a summary of findings  sharing findings with the assessment community

© TWBANTA-IUPUI Scholarship of Assessment Compare two teaching methods - Is technology-enhanced instruction more effective? Validate a measure of student civility - Do interventions increase civility?

Barriers to Scholarship in Assessment Campus coordinators are trained in other disciplines Scholars in relevant fields don’t do outcomes assessment Assessment scholarship is not rewarded Campus coordinators return to their own disciplines Few graduate programs prepare assessors © TWBANTA-IUPUI

Some Research Traditions Underlying Assessment  Program evaluation  Organizational change and development  Cognitive psychology  Student development  Measurement  Informatics © TWBANTA-IUPUI

Assessment Methods Improve instruments to measure  content knowledge at more complex levels  affective development  effects of educational interventions  changes in learning over time © TWBANTA-IUPUI

Assessment Methods How can we use technology in assessment more effectively? How can we demonstrate the validity of locally developed instruments? How can faculty make consensual judgments about the quality of student performance? How can student feedback be designed to help faculty improve their teaching? © TWBANTA-IUPUI

Organizational Behavior & Development  How can assessment be combined with other systemic changes to improve teaching & learning?  What patterns of organizational behavior promote and sustain assessment?  What methods of providing and managing assessment information are most effective?  Which public policy initiatives are most effective in promoting improvement on campuses? © TWBANTA-IUPUI

Shared Reflective Practice  Conduct meta-evaluations of approaches to assessment  Determine what works best within disciplines  Develop consortia of institutions to provide forums for reflection © TWBANTA-IUPUI

Engaging Faculty  Introduce assessment as research  Connect assessment with the scholarship of teaching  Support learning about assessment through faculty development © TWBANTA-IUPUI

Targets for Research on Engaging Faculty  How can we determine the interests and commitments of stakeholders?  How should we educate stakeholders for choosing methods?  How can we reduce costs and maximize assessment's benefits?  What ethical principles should guide our work? Derived from Michael Quinn Patton's Utilization-Focused Evaluation (1997) © TWBANTA-IUPUI