Assessing Research-Doctorate Programs: A Methodology Study

Committee Task
– Review and revise the methodology used to assess the quality and effectiveness of research-doctorate programs.
– Explore new approaches, new sources of information about doctoral programs, and new ways of disseminating these data.
– Recommend whether to conduct a full assessment using the methodology developed by the committee.

History of NRC Assessments
1982: “Assessment of Research-Doctorate Programs in the United States” – Lyle V. Jones (Co-Chair), Gardner Lindzey (Co-Chair)
1995: “Research-Doctorate Programs in the United States: Continuity and Change” – Marvin L. Goldberger (Co-Chair), Brendan Maher (Co-Chair)

Perceived Strengths of Prior NRC Assessments
– Authoritative source
– Comprehensive
– Clearly stated methodology
– Temporal continuity
– Widely quoted and utilized

Perceived Weaknesses of Prior NRC Assessments
– Spurious precision of program rankings
– Confounding of research reputation and educational quality
– Soft criteria for assessments of programs
– Ratings based on old data

Weaknesses continued…
– Poor dissemination of results for some audiences
– Taxonomy categories out of date
– Inadequate validation of data

Design of the Methodology Study
– Formation of a committee.
– Definition of tasks.
– Panel meetings to define questions and discuss methodology. Panels:
  – Taxonomy and interdisciplinarity
  – Quantitative measures
  – Student processes and outcomes
  – Reputation and data presentation
– Pilot trials of questionnaires and taxonomy.

Recommendations
Spurious precision issue: The committee recommends a new statistical methodology to make clear the probable range of rankings for each assessed academic unit.

Alternative Approach to Rankings to Convey Rating Variability
– Draw ratings at random.
– Calculate the rating for that draw.
– Repeat the process enough times to reach statistical reliability.
– Present the distribution of ratings from all the draws.
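
A minimal sketch of this resampling idea, assuming each program's rating is the mean of individual rater scores. The function name, program names, and scores below are hypothetical illustrations, not the committee's actual implementation:

```python
import random
import statistics

def ranking_ranges(ratings, n_draws=1000, lo=5, hi=95):
    """Estimate a probable range of rankings for each program.

    ratings: dict mapping program name -> list of individual rater scores.
    Returns: dict mapping program name -> (low, high) percentile ranks.
    """
    ranks = {name: [] for name in ratings}
    for _ in range(n_draws):
        # Draw ratings at random: resample each program's raters with
        # replacement and average the resampled scores.
        draw = {name: statistics.mean(random.choices(scores, k=len(scores)))
                for name, scores in ratings.items()}
        # Rank programs for this draw (1 = highest mean rating).
        for rank, name in enumerate(sorted(draw, key=draw.get, reverse=True), 1):
            ranks[name].append(rank)
    # Summarize the distribution of ranks from all draws as a percentile range.
    out = {}
    for name, rs in ranks.items():
        rs.sort()
        out[name] = (rs[len(rs) * lo // 100], rs[min(len(rs) * hi // 100, len(rs) - 1)])
    return out

# Hypothetical example: three programs, five rater scores each (0-5 scale).
scores = {"Program A": [4, 5, 4, 3, 5],
          "Program B": [3, 4, 4, 4, 3],
          "Program C": [2, 3, 3, 2, 4]}
print(ranking_ranges(scores))
```

Presenting the resulting percentile range of ranks, rather than a single rank, conveys how much the ordering depends on which ratings happen to be drawn.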

Recommendations continued…
Research versus education issue:
– Drop the reputational estimate of educational quality, as it is not independent of the reputational estimate of program quality.
– Add quantitative indicators of educational offerings and outcomes.

Program Measures and a Student Questionnaire
Questions to programs:
– Size
– Student characteristics and financing
– Attrition and time to degree
– Competing programs

Program Measures and a Student Questionnaire continued…
Questions to students in selected fields:
– Employment plans
– Professional development
– Program environment
– Infrastructure
– Research productivity

Recommendations continued…
Soft criteria issue: Add quantitative measures concerning research output, citations, student support, time to degree, etc.

Examples of Indicators
– Publications per faculty member
– Citations per faculty member
– Grant support and distribution
– Library resources (separating out electronic media)
– Laboratory space
– Interdisciplinary centers

Recommendations continued…
Poor dissemination issue:
– Add analytic essays to archival book output.
– Add updateable current web output.
– Add electronic assessment tools.
– Add links from professional societies.

Recommendations continued…
Taxonomy issue:
– Update the 1995 taxonomy.
– State clear criteria.
– Consult professional societies, administrators, and faculty.
– Allow for two academic categories (rated programs and emerging fields).
– Name subfields to help universities classify their programs.
– Allow faculty to be in more than one program.
– Include two sub-threshold humanities fields (classics and German) to maintain continuity.

Recommendations continued…
Validation issue: Conduct pilot studies and institute checks, both by institutional respondents and by external societies.

Pilot Institutions
– University of Maryland
– Michigan State University
– Florida State University
– University of Southern California
– Yale University
– University of Wisconsin at Milwaukee
– University of California, San Francisco
– Rensselaer Polytechnic Institute

What’s Next
Obtain financing for the full study from both federal and foundation sponsors. If funding is obtained:
– Full study would begin in spring 2004.
– Data collection in 2004/2005 for the previous academic year.
– Final report in summer 2006.

Conclusion
The study that the committee recommends is a big undertaking in terms of survey cost and the time of graduate programs and their faculty. Why is it worth it? It will provide faculty, students, and those involved with public policy an in-depth look at the quality and characteristics of the programs that produce our future scientists, engineers, and those who help us understand the human condition.

Committee
– Jeremiah Ostriker, Princeton U. (Astrophysics), Chair
– Elton Aberle, U. of Wisconsin (Agriculture)
– John Brauman, Stanford U. (Chemistry)
– George Bugliarello, Polytechnic U. of New York (Engineering)
– Walter Cohen, Cornell U. (Humanities)
– Jonathan Cole, Columbia U. (Social Sciences)
– Ronald Graham, UCSD (Mathematics)
– Paul Holland, ETS (Statistics)
– Earl Lewis, U. of Michigan (History)
– Joan Lorden, U. of Alabama-Birmingham (Biology)
– Louis Maheu, U. de Montréal (Sociology)
– Lawrence Martin, SUNY-Stony Brook (Anthropology)
– Maresi Nerad, U. of Washington (Sociology & Education)
– Frank Solomon, MIT (Bioscience)
– Catherine Stimpson, NYU (Humanities)

Subcommittee Panels
– Student Processes and Outcomes: Joan Lorden (Chair), University of Alabama-Birmingham
– Quantitative Measures: Catherine Stimpson (Chair), New York University
– Taxonomy and Interdisciplinarity: Walter Cohen (Co-Chair), Cornell University; Frank Solomon (Co-Chair), Massachusetts Institute of Technology
– Reputational Measures and Data Presentation: Jonathan Cole (Co-Chair), Columbia University; Paul Holland (Co-Chair), Educational Testing Service

Additional Panel Members
Student Processes and Outcomes:
– Adam Fagen, Harvard U. (Bioscience, graduate student)
– George Kuh, Indiana U. (Education)
– Brenda Russell, U. of Illinois-Chicago (Bioscience)
– Susanna Ryan, Indiana U. (English, Woodrow Wilson Fellow)
Quantitative Measures:
– Marsha Moss, U. of Texas (Institutional Research)
– Charles E. Phelps, U. of Rochester (Provost & Economics)
– Peter D. Syverson, Council of Graduate Schools

Additional Panel Members
Taxonomy and Interdisciplinarity:
– Richard Attiyeh, UCSD (Economics)
– Robert F. Jones, AAMC (Bioscience)
– Leonard K. Peters, VPI (Computer Science)
Reputational Measures and Data Presentation:
– David Schmidley, Texas Tech (President & Bioscience)
– Donald Rubin, Harvard (Statistics)

Project website