The NASULGC Agenda: Elements of Accountability; Competitiveness (STEM Teachers); International (1,000,000 abroad); Ag Act Reauthorization (Create 21)

Elements of Accountability for Public Universities and Colleges
By Peter McPherson, President, and David Shulenburger, Vice President, NASULGC

An Environment of Mistrust
- Request from government that universities demonstrate productivity, especially learning outcomes
- Spellings Commission
- New York Times!

Diffuse mistrust is hard to manage
- Results in a cycle: request for evidence, evidence supplied, dissatisfaction with evidence, request for different evidence, evidence supplied, and so on
- Breeds cynicism
- Wastes time and resources for all

We need to get it right this time!
- This is the motivation for this NASULGC/AASCU effort
- Requests for critique and suggestions from all quarters are genuine; we really do need to get it right this time!

What is this?
- A proposed set of voluntary accountability measures for public universities
- Intended to help improve student learning, improve the fit between student and university, and be entirely responsive to the needs of policy makers, Boards of Trustees, and others
- Composed of measures of sufficient rigor and thoroughness that they should be substituted for existing accountability measures
- A visible expression of the university's willingness to be open and accountable
- Broadens the conversation from the cost of a university education to the value it provides

First Principles
- Our mission is education; accountability measures must promote that end
- We can legitimately take credit only for the value we add
- Our measures must be transparent
- Only like institutions should be compared, and then only on specific measures
- For the sake of economy:
  - These measures should substitute for other accountability requirements
  - Sampling should be used, where possible, to reduce measurement cost

To Whom Do We Owe Accountability?
- Students, prospective students, and their parents
- Faculty and support staff
- Public policy makers, governing boards, various funders, alumni, and supporters

Why Are Current Accountability Measures Insufficient?
- The measures utilized vary so much from university to university that comparison of university performance is impossible
- And they are often kept confidential
- Result: a significant proportion of the public does not believe that we wish to be accountable, and others are simply confused by our diverse measures

What Do We Need to Produce?
- Comparable measures across comparable universities
- Publicly available
- Uniform conventions

Components of the Accountability Set of Measurements
- Consumer information, to improve the fit between student and university
- Campus learning climate data (NSSE, CIRP, ?)
- Core education outcomes (e.g., CLA, MAP, CAAP, GRE?), adjusted to reflect value added

What might be lost with outcomes testing?
- Diversity of mission
  - If the test is not mission-specific, its contents probably will not match your mission, since tests cover what is considered important
  - Teaching to the test might result
  - Examples: Baylor University, Haskell Indian Nations University
  - Mission changes to fit the test, and homogeneity results

But do our schools have any commonality of mission?
- Probably
- The core seems to include at least:
  - Critical thinking skills
  - Written communication skills
  - Analytical reasoning skills
- Our outcomes testing proposal is focused only on measurement of this common core of the teaching mission

Why Value Added?
- Dysfunction of input measures
  - ACT/SAT scores
  - Resource measures in general
- Value added promotes efficiency in the use of resources and promotes access
- That said, measuring value added is far easier said than done, as it is entangled with inputs
- Thus, value-added measurement is a goal, and an important goal, that will require some experimentation with measurement methods before we can attain it
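The slide's caveat that value added is "entangled with inputs" can be made concrete. One common approach (along the lines used by regression-adjusted instruments such as the CLA) estimates value added as a residual: predict each institution's mean outcome score from its mean entering-test score, and treat the gap between actual and predicted as the value-added estimate. The sketch below illustrates only the arithmetic; all institutions and scores are hypothetical.

```python
# Hedged sketch: value added estimated as a regression residual.
# All scores below are hypothetical, for illustration only.

sat = [1050, 1120, 1200, 1280, 1350]      # mean entering SAT by school
outcome = [1080, 1170, 1230, 1330, 1380]  # mean senior outcome score

n = len(sat)
x_bar = sum(sat) / n
y_bar = sum(outcome) / n

# Ordinary least squares with a single predictor:
# slope = cov(x, y) / var(x); intercept = y_bar - slope * x_bar
slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(sat, outcome))
         / sum((x - x_bar) ** 2 for x in sat))
intercept = y_bar - slope * x_bar

# A school's value-added estimate is its actual outcome minus the
# outcome predicted from its entering students alone.
value_added = [y - (intercept + slope * x) for x, y in zip(sat, outcome)]

for x, va in zip(sat, value_added):
    print(f"entering SAT {x}: value added {va:+.1f}")
```

The entanglement the slide warns about shows up directly here: the estimate depends entirely on how well the chosen input variable captures what students bring with them, and on the functional form of the adjustment, so different adjustment choices can produce different institutional rankings.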

Regional Accreditation and These Accountability Measures
- When the set of standards is complete, the six regional accreditors will be asked to consider substituting these measures for measures they currently accept in satisfaction of their standards
- Governing boards and other oversight agencies will be asked to engage in the same review

Time Frame
- General distribution of the discussion draft
- Comments welcomed from all
- Discussions with provosts this summer
- Reconvening the Kirwan committee this fall to consider comments and revise the draft
- Consideration by presidents at the annual meeting
- If agreement on the elements exists, then:
  - Establish working groups to develop specific measures
  - Two to three years after agreement is reached, individual schools begin public reporting of accountability measurements

Remaining Dynamic
- Higher education's environment is not static
- Accountability measures that are suitable today may not be suitable in the future
- A major advantage of a voluntary system over a government-dictated system is this ability to keep the system dynamic
- A continuing mechanism/authority must be established to ensure continued acceptability of the accountability measures agreed upon

Conclusion
- This constitutes rigorous self-regulation
- We believe it to be responsive to stakeholders, and we are testing that with this discussion draft
- It is voluntary by institution
- How much this effort really matters will ultimately be determined by the number of NASULGC/AASCU universities that subscribe to this public university accountability agreement