THE EDGE IN KNOWLEDGE Changes in the Carnegie Classifications: What They Mean for Colleges & Universities. Perry Deess, Ph.D., Director of Institutional Research.

THE EDGE IN KNOWLEDGE Changes in the Carnegie Classifications: What They Mean for Colleges & Universities Perry Deess Ph.D. Director of Institutional Research and Planning, NJIT Annual Meeting of the Association of NJ Graduate Schools March 24, 2006

THE EDGE IN KNOWLEDGE A Little History  1970: The Carnegie Commission on Higher Education creates a classification system to serve its research program  1973: Classification published to assist research on higher education  1976, 1987, 1994, 2000: revised editions  2005: major revisions [Photo: Clark Kerr]

THE EDGE IN KNOWLEDGE Original Design Principles  Seek comparability with respect to:  Functions of the institutions  Characteristics of students and faculty  Use empirical data about what institutions do  Secondary analysis of existing data

THE EDGE IN KNOWLEDGE Why Was The Classification Changed?  Higher education has changed  1970 framework has weaknesses and blind spots  A single framework is not sufficient  Value in acknowledging complexity BIG REASON  To reduce competition based on the classification system

THE EDGE IN KNOWLEDGE Summary of Changes Comprehensive (all-inclusive) schemes  Basic, with changes  Instructional Program  Undergraduate  Graduate  Student Profile  Overall  Undergraduate  Size & Setting Elective (voluntary) schemes  Outreach & Community Engagement  Undergraduate Education Inquiry & Support

THE EDGE IN KNOWLEDGE Basic Classification  Associate’s: subcategories  Doctorate-granting: index of research activity  Master’s: finer distinctions  Baccalaureate: “liberal arts” to “arts & sciences”  Special focus: sharper definition

THE EDGE IN KNOWLEDGE Doctorate-granting: index of research activity  Doctoral institutions are a key area of competition  Three categories now  Research universities—very high research activity  Research universities—high research activity  Doctoral/Research universities

THE EDGE IN KNOWLEDGE Defining Doctoral Institutions* (IPEDS-based doctoral conferrals; professional doctorates not counted toward the base of 20; research staff from the NSF Survey of Graduate Students and Postdoctorates in Science and Engineering)  “The research index is based on the following correlates of research activity: research and development expenditures in science and engineering (NSF R&D survey); research and development expenditures in non-science and engineering fields; science and engineering research staff; and doctoral conferrals in humanities fields, social science fields, science, technology, engineering, and mathematics fields, and professional fields. These data were statistically combined using principal component analysis to create two indices of research activity. The first index was based on aggregate levels of these factors. The second index, of per-capita research activity, used the expenditure and staffing measures divided by the number of full-time faculty members whose primary responsibilities were identified as research, instruction, or a combination of instruction, research and public service (from IPEDS).”
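The combination step described in the quote (take several correlated measures of research activity, then collapse them into one index with principal component analysis) can be sketched roughly as follows. The column choices and scaling details are assumptions for illustration; Carnegie published the method, not its exact computation.

```python
import numpy as np

def research_index(measures):
    """Combine correlated research-activity measures into a single index
    using the first principal component.

    `measures` is an (institutions x variables) array whose columns might
    be S&E R&D spending, non-S&E R&D spending, research staff, and
    doctoral conferrals by field -- illustrative stand-ins, not the
    actual Carnegie inputs.
    """
    X = np.asarray(measures, dtype=float)
    # Standardize each variable so no single measure dominates the index.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # First principal component = eigenvector of the covariance matrix
    # of the standardized data with the largest eigenvalue.
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc1 = eigvecs[:, np.argmax(eigvals)]
    # Fix the sign so larger raw measures yield a larger index value.
    if pc1.sum() < 0:
        pc1 = -pc1
    return Z @ pc1
```

Running the same computation twice, once on aggregate measures and once on the per-capita (per full-time faculty member) versions, yields the two indices the quote describes.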

THE EDGE IN KNOWLEDGE Defining Doctoral Institutions (continued)  “The values in each index were then used to locate each institution on a two-dimensional graph (scatterplot). Each institution’s distance from a common reference point was calculated, and the results were used to assign institutions to three groups based on their distance from the reference point. Thus the aggregate and per-capita indices were considered equally, such that institutions that were very high on either index were assigned to the ‘very high’ group, while institutions that were high on at least one index (but very high on neither) were assigned to the ‘high’ group.”  [The Chronicle of Higher Education, March 3, 2006]
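The distance-based grouping on the two indices can be sketched minimally as below. The reference point and the two cutoff distances are hypothetical placeholders; Carnegie described the geometry but did not publish these values.

```python
import math

def research_tier(aggregate_index, per_capita_index,
                  reference=(0.0, 0.0),
                  very_high_cutoff=2.0, high_cutoff=1.0):
    """Assign a doctoral institution to a research-activity tier from its
    position on the (aggregate, per-capita) plane.

    Being far out on either axis increases the distance from the
    reference point, so the two indices count equally. The reference
    point and cutoffs here are illustrative, not Carnegie's actual values.
    """
    distance = math.hypot(aggregate_index - reference[0],
                          per_capita_index - reference[1])
    if distance >= very_high_cutoff:
        return "Research universities (very high research activity)"
    if distance >= high_cutoff:
        return "Research universities (high research activity)"
    return "Doctoral/Research universities"
```

Note how an institution that is extreme on just one index still lands far from the reference point, which is why a single very high index score is enough for the top tier.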

THE EDGE IN KNOWLEDGE What does this mean? “Ain’t nobody gonna figger how ta game it.”  The point is to discourage competition and limit the explosion of doctoral programs created to climb a ranking system.  The system is fundamentally relational  The mathematics are virtually inscrutable  It IS competitive, but few schools will spend the time to work out how to compete

THE EDGE IN KNOWLEDGE How to game the doctoral ranking system?  Have a long talk with the people completing the NSF R&D Survey, the NSF Graduate Student and Postdoctorate Survey, and the IPEDS surveys.  If they carefully position the university’s reporting against the criteria described above, they can maximize your chances of reaching a higher tier.  Remember: you only need one VERY HIGH index score to achieve the VERY HIGH category.

THE EDGE IN KNOWLEDGE Instructional Program Undergraduate  Degree level  Balance of arts & sciences and professional fields  Correspondence with graduate programs Graduate  Degree levels  Mix of offerings  Comprehensive  Focused

THE EDGE IN KNOWLEDGE Student Profile Overall student profile  Mix of undergraduate and graduate/professional enrollments Undergraduate profile  Proportion full- & part-time  Achievement characteristics of first-year students  Transfer-in percentage

THE EDGE IN KNOWLEDGE Size and Setting  Total enrollment  Residential character

THE EDGE IN KNOWLEDGE Elective (voluntary) Schemes Outreach & community engagement  Mix of outreach and engagement activities Undergraduate education inquiry & support  Efforts to assess undergraduate education  Support for assessing & improving teaching & learning

THE EDGE IN KNOWLEDGE How to do peer analysis? (cont.) [screenshot slides; images not captured in transcript]

THE EDGE IN KNOWLEDGE Why was all of this done?  To facilitate peer analysis  To aid research  To develop generally non-competitive scales  To encourage more sophisticated ranking, particularly by US News

THE EDGE IN KNOWLEDGE Advantages  Complexity  Flexibility  More nuanced classification  Better matching of classification to purpose  Possibilities for customization  Responsibility  Make & justify choices

THE EDGE IN KNOWLEDGE How to do peer analysis?  Start at this site:
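Once each institution's categories have been looked up on the site, peer analysis amounts to filtering on the classification dimensions you care about. The record fields and category codes below are made-up illustrations of that filtering step, not actual Carnegie data.

```python
def find_peers(institutions, profile,
               keys=("basic", "size_setting", "undergrad_profile")):
    """Return institutions matching a reference profile on the chosen
    classification dimensions. Field names and values are hypothetical;
    the real categories come from the Carnegie Classifications lookup.
    """
    return [inst for inst in institutions
            if all(inst.get(k) == profile.get(k) for k in keys)]

# Illustrative records (category codes invented for the example).
schools = [
    {"name": "A", "basic": "RU/H", "size_setting": "L4/R",
     "undergrad_profile": "FT4/S"},
    {"name": "B", "basic": "RU/VH", "size_setting": "L4/R",
     "undergrad_profile": "FT4/S"},
    {"name": "C", "basic": "RU/H", "size_setting": "L4/R",
     "undergrad_profile": "FT4/S"},
]
reference = {"basic": "RU/H", "size_setting": "L4/R",
             "undergrad_profile": "FT4/S"}
peers = find_peers(schools, reference)
```

Choosing which `keys` to match on is the analytic decision the new multi-scheme design enables: different peer lists for different purposes.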

How to do peer analysis? (cont.) [screenshot slide; image not captured in transcript]

THE EDGE IN KNOWLEDGE Where to Learn More  Copies of slides:  For more information contact: Perry Deess