Assessing for Transformation Gary Brown The Center for Teaching, Learning & Technology.

We should not expect the guidance for change of this magnitude—in institutional culture and values—to come from the faculty ranks. After all, faculty are deeply rooted in the traditional values of higher education. Fundamentally, this is a leadership issue. —Carol Barone, Vice President, Educause

Agenda
Assumptions—the need for transformation
To arms, to arms…
About those jobs
What Transformation is NOT
What Transformative Assessment is NOT
Stories
Goals, Activities, & Processes (GAPs)
Inquisitions and Assessments
Critical Thinking—the rubric paradigm
The Sleeping Bear & Principles of Transformative Assessment
TAPS

To Arms, To Arms… Phoenix
Phoenix Online is growing faster than any of the university's 40 physical campuses:
1,700 online support specialists
7,000 mostly part-time faculty
49,400 students (a 70% increase)
$64.3 million in net income

Phoenix—from the Chronicle
"Some critics note that Phoenix strips faculty members of their central role in higher education."
"One of the greatest risks we face as a nation in the growth of new educational providers is the unbundling of teaching, research, and service functions." —Art Levine
However, counters Phoenix, "Professors at traditional universities who attempt online education are learning as they go, and often give students a bad experience as a result."

Cost
The USA currently has 3,500 colleges and universities with an enrollment of 14 million students. We spend $175 billion on education—about $12,500 per student. But now, eleven international mega-universities serve 3 million students.
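The per-student figure follows directly from the slide's own totals; a quick sketch of the arithmetic (variable names are illustrative):

```python
# Back-of-the-envelope check of the slide's per-student spending figure.
total_spending = 175e9  # $175 billion spent on higher education
enrollment = 14e6       # 14 million students enrolled

per_student = total_spending / enrollment
print(f"${per_student:,.0f} per student")
```

Dividing $175 billion across 14 million students does come out to $12,500, matching the slide.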

Corporate Tax Support

Job Growth: Predicted Percentage Increase (chart; US Department of Labor Statistics, 1993)

Job Growth: Or in Total Numbers (chart; US Department of Labor Statistics, 1993)

Job Market Projections (chart; US Department of Labor Statistics, 1993)

The Real Story
We need:
9 cashiers for every technical worker.
1.5 janitors for all the lawyers, accountants, investment bankers, stock brokers, and computer programmers combined.
"The projected shift … in educational requirements … can be accomplished if those entering the labor force have … one fourth-grade level more education than those retiring from the labor force." —Economic Policy Institute

College Work (chart; US Department of Labor Statistics, 1993)

Already, In Fact… Percentage of College Graduates Working in Blue-Collar Jobs (chart; US Department of Labor Statistics, 1993)

What Transformation Is NOT

Attrition Rates in WSU Distance Courses (chart)

Student Enrollments in WSU Online Learning Spaces

Transformation?

What Transformative Assessment is NOT

Evaluations as Barriers to Improvement Many evaluation instruments are subtly biased in favor of traditional instruction. They often penalize innovators, measuring what faculty aren’t doing but failing to measure what they are doing.

Purpose of Evaluation Is Not Clear
Most evaluation focuses on personnel decisions.
Inconsistent interpretation and use of evaluation results.
Little focus on the areas that might best benefit from feedback.
Evaluation usually occurs at the very end of a semester—too late for change.

Students Are in the Dark, Too The traditional evaluation process is not designed to help students become more involved with what will really help them learn. Students don’t take evaluations seriously.

Teaching-Learning Practices are NOT Linked to Learning Course evaluations rarely focus on practices that have been shown to produce better learning outcomes. Assessment rarely improves learning.

No Link Between Evaluation & Faculty Development
Inadequate support for addressing results.
"Literature from past years emphasizes staff development as key to effective evaluation practices that promote teacher growth and improvement" (Annunziata, in Stronge, 1997, p. 289).
Administrators frequently calculate an average score for all evaluation items and then rank faculty. This method does not foster dialogue among faculty about good practices.
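The "average and rank" practice criticized here can be made concrete with a minimal sketch; the instructor names and item scores below are hypothetical, not drawn from any evaluation data in this talk:

```python
# Illustrative sketch of the practice the slide criticizes: collapse all
# evaluation items into a single mean, then rank faculty by that number.
# Names and scores are hypothetical.
scores = {
    "Instructor A": [4.2, 3.8, 4.5],  # per-item evaluation means
    "Instructor B": [4.0, 4.6, 4.1],
    "Instructor C": [3.5, 3.9, 3.7],
}
averages = {name: sum(items) / len(items) for name, items in scores.items()}
ranking = sorted(averages, key=averages.get, reverse=True)
print(ranking)  # a rank order that hides which practices drove the scores
```

Note what the single number discards: Instructor B's high score on one item and low score on another average out, so the ranking says nothing about which teaching practices actually worked.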

The Goals, Activities, and Practices
Invite faculty to join in the process of formulating the assessment.
A series of three short, online surveys designed to provide formative assessment: one instructor survey and two student surveys.

The 3 Survey Goals
1. To help faculty gather useful data about their teaching goals, values, and strategies…
2. To help faculty learn how those strategies address and influence their students' goals, values, and learning behaviors…
3. And to examine the interaction between goals and practices that faculty might share with each other—and so establish a culture of evidence.

GAPs: Encouraging Participation
Brown bags to initiate the conversation on assessing teaching and learning.
We emailed each faculty member approximately 10 times.
Student surveys were linked to the course sites of instructors who answered the instructor survey.
We analyzed and returned results promptly: regular updates were sent to instructors with their class results and overall results.
We offered and have co-authored papers with faculty…
We have talked with chairs, deans, and assessment coordinators about including GAPs in teaching portfolios.
We wore out our shoes…

Sample Findings
Significant mismatch between faculty and student goals.
Online courses are significantly more effective than video-based courses.
Designed courses are significantly more likely than courses not formally designed to evidence principles of good practice.
Faculty motivation predicts perceptions of efficacy:
Faculty who want to keep abreast of the scholarship of teaching and learning report that online learning is positive.
Faculty who teach online primarily for the money…
Student age and gender predict perceptions of testing efficacy.

Time & Cost: Estimated Hours (chart)

The Assessment Gold Standard Participants Who Used GAPs Data to Transform….

However….
One of the two programs endured a budget-related, faculty-initiated inquisition. That program:
Demonstrated, by virtue of GAPs, systematic, formative data gathering and responsiveness to that data.
Demonstrated evidence of good practice.
Demonstrated the "creative use of technology" and gains in critical information literacy.
Demonstrated the program was cost-effective.
Improved freshman retention.
Demonstrated the program improved student grades, including "special admits" expected to struggle.

The Learning Context: Student WSU Cumulative GPA by Admissions Quartiles (chart)

Which has resulted in….
The faculty committee that initiated the inquiry was commended by the administration for its attentiveness to issues of quality.
They have subsequently been invited to examine other programs…
Which has transformed the CTLT: we are seeing the inklings of a culture of evidence sprouting in rough, research-dominated terrain…

small steps….

The Rubric Paradigm
Guide faculty grading
Guide student learning
Provide measures of growth

Dimensions of Critical Thinking
1. Identifies and summarizes the problem/question at issue (and/or the source's position).
2. Identifies and presents the STUDENT'S OWN perspective and position as it is important to the analysis of the issue.
3. Identifies and considers OTHER salient perspectives and positions that are important to the analysis of the issue.
4. Identifies and assesses the key assumptions.
5. Identifies and assesses the quality of supporting data/evidence and provides additional data/evidence related to the issue.
6. Identifies and considers the influence of the context on the issue.
7. Identifies and assesses conclusions, implications, and consequences.

Critical Thinking and Measures of Growth (scale: Emerging to Mastering)
Dimension: Identifies and summarizes the problem/question at issue (and/or the source's position).
Emerging: Does not identify and summarize the problem, is confused, or identifies a different and inappropriate problem. Does not identify or is confused by the issue, or represents the issue inaccurately.
Mastering: Identifies the main problem and subsidiary, embedded, or implicit aspects of the problem, and identifies them clearly, addressing their relationships to each other. Identifies not only the basics of the issue, but recognizes nuances of the issue.

Critical Thinking: 8 Courses—4 with CT, 4 without CT (chart)

Critical Thinking: One Course, Two Semesters (chart)

Faculty Development The critical thinking rubric was valuable for helping faculty articulate their goals and communicate expectations to students. Faculty who used the rubric were enthusiastic and expressed plans to integrate the rubric more intensively in future courses.

Critical Thinking Study—Results
Gains in courses when the rubric is used: when the faculty in this project integrated the WSU Critical Thinking Rubric into their instruction and assessment, evidence of student gains in critical thinking increased dramatically.
Gains from freshman to junior year: critical thinking was significantly higher among juniors than among freshmen. But even the writing of juniors had a mean of only 3.1 on a 6-point scale.

Additional Findings & Implications
The dimension of least gain was in students' abilities to articulate their own viewpoints.
The greatest gains by juniors reflect improved abilities to analyze issues from multiple perspectives.
Comparisons to WSU's writing assessment: as critical thinking scores rise, writing placement scores and portfolio exam scores sink…
The faculty questionnaire revealed a focus on grading over fostering critical thinking for broader life-long learning.

Transformation?

The Sleeping Bear
Those most oblivious to the teeth are often the first to become fodder…
The growling bear will elicit benchmarks & comparisons.
The bear will feast on standardization.
But bears are omnivorous…

Emerging Principles of Transformative Assessment
Institutional leadership is imperative.
Assessment focuses on institutional efforts to provide students with rich learning experiences.
Assessment includes student reports of their own increasingly unique experiences.
Qualitative measures are valued and may be supported by quantitative measures.
Emphasizes student learning defined by development and change over time.
Dissemination engages the responsibility for shaping as well as reflecting society's needs.
"High standards" is not the same as standardization.

Dimensions of Transformation
1. Purpose
2. Data Acquisition
3. Application
4. Dissemination

Criteria for Rating Transformation—The Scoring Form
Rater: _________   Project: _________
Rater role: Faculty / Designer / Student / Assessment Specialist / Community Colleague / Administrator / Other _________
Ratings:
Assessment Purpose
Data & Data Acquisition
Application
Dissemination
Bonus—Faculty Rewarded for Assessment
Total Score
Comments:

Pilot Findings
A table rating each of five universities (U#1–U#5) on the four dimensions—Purpose, Data, Application, and Dissemination—with a Dimension Average column and a University Average row.
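To make the table's structure concrete: each cell rates one university on one dimension, the Dimension Average column is a row mean, and the University Average row is a column mean. A minimal sketch, using hypothetical placeholder ratings rather than the pilot's actual scores:

```python
# Hypothetical ratings, one per university (U#1..U#5) for each dimension;
# these placeholders are NOT the pilot study's actual scores.
ratings = {
    "Purpose":       [3, 2, 4, 2, 3],
    "Data":          [2, 2, 3, 1, 2],
    "Application":   [2, 1, 3, 2, 2],
    "Dissemination": [1, 1, 2, 1, 2],
}
# Row means -> the "Dimension Average" column.
dim_avg = {dim: sum(vals) / len(vals) for dim, vals in ratings.items()}
# Column means -> the "University Average" row (dicts preserve insertion order,
# so zip(*ratings.values()) yields one column per university).
univ_avg = [sum(col) / len(col) for col in zip(*ratings.values())]
```

Under these placeholder numbers, for example, the Purpose row averages to 2.8 and U#1's column averages to 2.0; the real pilot's marginals would be read the same way.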

Key Comments U#1 Doesn't stretch student learning or acknowledge what students may bring. Focus satisfies industry's demand for skills. While faculty are included in this process, the audience is clearly the upper echelon of the institution and future employers.

Key Comments U#2 Not many rewards for assessment. No mention of disseminating results beyond unit; no feedback to faculty and students. Limited to classes.

Key Comments U#3 Assesses key concern: learning to learn. No plan to apply or fund results of findings. Faculty support neglected. Feedback to institution and external agencies is minimal—how are constituents engaged in process?

Key Comments U#4 Design for infrastructure for hybrid classes—assumes that tech alone equates with transformation.

Key Comments U#5 Good start but…