
INSTITUTIONAL RESEARCH AND DECISION-MAKING Vladimir Briller, Ed.D. Executive Director of Strategic Planning and Institutional Research Pratt Institute, New York, U.S.A. Higher School of Economics, Moscow October 18, 2012

Institutional Research Institutional Research is the practice whereby an institution assesses itself, its activities and its position within a given milieu. Higher Education Institutional Research offices conduct these assessments with the objective of serving as a comprehensive resource for information about the institution.

Institutional Research at Pratt The Office of Institutional Research (IR) at Pratt Institute is part of the President's Office. The IR mission is to support data-driven decision-making in the evaluation and planning efforts of the Institute's senior administration by initiating and conducting studies of Pratt's policies, academic programs, and environment.

The IR office: Gathers information from internal and external sources (e.g., students, parents, faculty, staff, other institutions, and external agencies) for assessment and strategic planning. Provides information and projections needed for planning. Coordinates Pratt's responses to reports required by the federal government and other agencies, including IPEDS, NYSED, NASAD, retention and graduation rate studies, etc.

IR Office (continued): Provides information required for certain institutional affiliations, such as accreditation reports, AICAD, and any special research projects in which Pratt Institute chooses to participate. Responds to external information requests and surveys that are determined to be of value to Pratt Institute.

Assignment You have decided that a new program should be opened or an ineffective one closed. What information will you request (and from whom) to make an informed decision?

Institutional Research The data resources usually comprise information derived from surveys, student records and other internal record systems, sectoral and national databases, reports, and published research. The actual assessments, analyses, and tested hypotheses cover issues requiring ongoing monitoring as well as the exploration of emerging issues to inform an institution’s decision-making with regard to its own development.

IR Support of Teaching and Development (examples) Grade ranges applied in particular subjects over time and their correlation with changing cohort characteristics, such as prior achievement. The impact of separate components (e.g., modules) on overall award classifications over time. The effect of the size of continuous assessment components on overall grades awarded.

IR Support of Teaching and Development (examples) The entry standard below which students have a substantially increased risk of failure. The importance of mathematical ability in overall performance in Science and Engineering. Application, acceptance, registration, and withdrawal figures for programs, reflecting demand, perception, and experience.

IR Support of Teaching and Development Example of a faculty question: Failure rates have risen dramatically in one of my courses, but I have not changed my methods and I can’t see why this has happened. Possible IR-based explanations: Changes in entry requirements. Changes in actual pre-entry educational achievement of the cohort (a worked sketch follows this list).

IR Support of Teaching and Development Achievement in core pre-entry subjects such as English or Mathematics. Changes in class size. Changes in the origins of the class (are all students in the class native English speakers?). Gender, age, educational and socioeconomic characteristics, and attendance-type profiles. The range of grades used over time in assessing the course.
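To make the cohort-change explanation concrete, here is a minimal sketch of how an IR analyst might set course failure rates alongside the cohort's average pre-entry achievement year by year. It assumes a hypothetical course-level extract; the column names (entry_year, admit_score, final_grade) and the data are illustrative only, written in Python with pandas.

# Minimal, hypothetical sketch: compare failure rates with the cohort's
# average pre-entry achievement (e.g., admission test scores) year over year.
# Column names and data are illustrative, not an actual institutional schema.
import pandas as pd

records = pd.DataFrame({
    "entry_year":  [2009, 2009, 2010, 2010, 2011, 2011],
    "admit_score": [1150, 1210, 1100, 1080, 1040, 1020],   # pre-entry achievement
    "final_grade": ["C",  "B",  "F",  "D",  "F",  "F"],
})
records["failed"] = records["final_grade"].eq("F")

by_year = records.groupby("entry_year").agg(
    failure_rate=("failed", "mean"),
    mean_admit_score=("admit_score", "mean"),
    n_students=("failed", "size"),
)
print(by_year)
# If failure rates rise while mean admit scores fall, a change in the incoming
# cohort (rather than in the instructor's methods) is a plausible explanation.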

IR Support of Teaching and Development Example of a Dean/Department Chair question: Student retention in my program is poor; I understand some of the reasons why, but I want to address the problem and need a comprehensive picture of what is happening. Possible IR-based actions: Analyze: The student profile now, how it has changed, and how it is likely to change in the future. Which program elements are contributing most consistently to non-completion (see the sketch after this list).

IR Support of Teaching and Development Analyze: The students’ perception of the program and overall college experience. Whether student expectations of the program were realistic prior to entry.

Analyze: Whether entry requirements need to be recalibrated based on changes in standards or curricula outside the Institution. Whether a change in program content, together with extra support in problem areas, would help students to progress.
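As a hedged illustration of identifying which program elements contribute most to non-completion, the sketch below cross-tabulates course failure against eventual program completion. The course names, the completed_program flag, and the data are assumptions for illustration, not institutional data.

# Illustrative sketch: flag which program elements (courses) are most often
# associated with non-completion. Layout and names are assumptions.
import pandas as pd

enrollments = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "course":     ["Studio I", "Art History", "Studio I", "Art History",
                   "Studio I", "Art History", "Studio I", "Art History"],
    "passed":     [True, True, False, True, False, False, True, True],
    "completed_program": [True, True, False, False, False, False, True, True],
})

# Share of students failing each course, split by whether they later completed.
summary = (enrollments
           .assign(failed=lambda d: ~d["passed"])
           .groupby(["course", "completed_program"])["failed"]
           .mean()
           .unstack("completed_program")
           .rename(columns={True: "completers", False: "non_completers"}))
print(summary)
# Courses with a large gap between the two columns are candidates for
# curricular review or added academic support.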

Case Study: Attitudes, experiences, and characteristics influencing student degree completion Focus on Freshmen The project uses information from the student database and from a series of three student surveys. The surveys track changing attitudes as well as academic progress through the first year. The study also identifies and sets aside factors that have no significant effect on student achievement.

Study Goals Explore a wide range of aspects of the undergraduate student experience with the specific purpose of identifying factors that may influence program completion. Identify the factors and relationships determining the qualitative nature of the student experience. Explore the relationship between pre-entry expectations and the reality of the university experience. Identify factors affecting student retention, with a view to focusing efforts and resources on the most influential factors.
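A minimal sketch of the analytic core such a study might use: link survey responses to the student record and model first-year completion. The variable names (hs_gpa, hours_working, sense_belonging), the toy data, and the choice of a regularized logistic model from scikit-learn are illustrative assumptions; in practice the office would screen many candidate factors and retain only those with a statistically and practically meaningful association.

# Hedged sketch: merge survey items with the student record and model
# first-year completion. Data, names, and the model choice are illustrative.
import pandas as pd
from sklearn.linear_model import LogisticRegression

students = pd.DataFrame({
    "hs_gpa":          [2.1, 3.8, 3.0, 2.5, 3.9, 3.2, 2.8, 3.6, 2.2, 3.4],
    "hours_working":   [30, 5, 10, 25, 0, 8, 20, 5, 35, 12],   # from survey wave 1
    "sense_belonging": [2, 5, 4, 3, 5, 4, 2, 5, 1, 4],          # from survey wave 2
    "completed_year1": [0, 1, 1, 0, 1, 1, 0, 1, 0, 1],
})

X = students[["hs_gpa", "hours_working", "sense_belonging"]]
y = students["completed_year1"]

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(X.columns, model.coef_[0]):
    print(f"{name:>16}: {coef:+.2f}")
# Positive coefficients suggest factors associated with completion; candidate
# factors with negligible association would be dropped from the final model.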

First Survey: Point of Admission Demographics, Self-evaluation of personal characteristics, including persistence, mathematical and writing ability, ambition, academic ability, and self-confidence, Factors affecting the decision to study at University, Level of prior understanding of the program, Anticipated time spent on specified work, study, and social activities, Difficulties anticipated,

First Survey: Point of Admission Perceived locus of responsibility for learning and the role of the lecturer, Priorities while at University, academic ambitions and career goals, Family educational background, Financial concerns, Perception of the experience of studying at higher education level in practical terms, and The anticipated best and worst elements of the experience of study at University.

Second Survey – Mid-year Compares student responses with the first survey: Self-evaluation of characteristics, Level of prior understanding of the program, Actual time spent on specific activities, Difficulties encountered, Perceived locus of responsibility for learning and the role of the lecturer, Priorities while at University, academic ambitions and career goals,

Second Survey – Mid-year Financial concerns, The best and worst elements of the experience thus far, Self-identified changes in perception of study at higher education level having spent one semester in the University, Support services accessed, and Integration into campus life/sense of belonging.

Third Survey – End-of-Year Academic history, including high school results and SAT (ACT) scores, and level of preference for the institution where the participants were accepted, Exam results achieved through the year, including continuous assessment grades, End-of-year results, Other official items of record, including withdrawal and reasons for withdrawal, changes in optional program elements, and transfer, and Completion rates at the institutional and program level.

Other Cases/Studies Factors affecting student retention and graduation. Barrier courses. Placement tests and their impact on subsequent student course performance. SAT scores as predictors of student persistence.

Enrollment Management

STUDENT RECRUITMENT The Educational Pipeline Understanding Student Choice: ~ Marketing studies that determine what factors influence students to apply, be admitted, and enroll at the institution. ~ Identifying databases and software analysis tools that facilitate the institution’s ability to locate, recruit, and attract students in the pipeline.

STUDENT RECRUITMENT ~ Generate a trend analysis that compares characteristics of this year’s applicants with applicants from previous years at the same point in time. ~ Compare admitted students who ultimately chose to enroll with those who did not. ~ Provide institutional data to college ranking services. ~ Provide data about student and parent perceptions of the institution’s image as compared with peers.

STUDENT RECRUITMENT Yield rates: ~ Admit yield rates ~ Enrollment yield rates Enrollment projections: 1. What are the needs of the institution? 2. What are the dimensions of the analysis? 3. What is the time horizon? 4. What methodology should be used? 5. How should qualitative and quantitative input be balanced?
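A minimal sketch of the arithmetic behind these rates and a naive enrollment projection, assuming one common set of definitions (admit rate = admits / applicants; enrollment yield = enrolled / admits); the figures are made up, and an office's own definitions may differ.

# Yield-rate arithmetic and a naive projection; all numbers are illustrative.
applicants, admits, enrolled = 5200, 2600, 780

admit_rate = admits / applicants            # share of applicants offered admission
enrollment_yield = enrolled / admits        # share of admitted students who enroll

print(f"Admit rate:        {admit_rate:.1%}")
print(f"Enrollment yield:  {enrollment_yield:.1%}")

# Naive projection: apply last cycle's yield to this cycle's admitted pool.
admits_this_cycle = 2750
projected_enrollment = admits_this_cycle * enrollment_yield
print(f"Projected entering class: {projected_enrollment:.0f}")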

STUDENT RECRUITMENT Financial Aid (FA) 1. What is the college’s FA policy? Who determines the policy? How well integrated are the admissions and FA policies? 2. What types of aid are available? How do students qualify? 3. How is FA packaged? How and when are students offered aid? How is it disbursed? 4. How are scholarships, loans, and student employment balanced? 5. How are the recruitment and retention functions of aid balanced?

STUDENT RECRUITMENT 6. What statistics are reported by the FA office? (examples) ~ How many students receive aid? New students? Continuing students? ~ How many receive scholarships? Loans? Work-study awards? ~ How many receive need-based aid? How many show unmet need? ~ How much FA is disbursed? What is the net tuition revenue? ~ What is the price of attendance?

STUDENT RECRUITMENT ~ What is the level of student indebtedness? ~ How those statistics vary by student demographics and other characteristics? ~ What are the trends over time? National Concerns: The interplay between FA, tuition and college price overall. The impact of federal and state policies on FA. Use of “discounting” for effective recruiting. The rapid escalation of student loans and indebtedness.

STUDENT FLOW Academic Preparation Selecting Students Student Placement Other Academic Assets The Curriculum Types of Studies Campus Climate Academic and Student Support Programs Formative (Process) Evaluation Summative (Outcome) Evaluation

STUDENT FLOW Graduation and Retention Rates 1. Increasing the institution’s retention and graduation rates 2. Increasing transfer rates (in) and baccalaureate degree completion of associate degree students 3. Reducing time to graduation 4. Closing the gap between underrepresented groups and other students 5. Increasing academic preparation – the link between recruitment & retention 6. Implementing & evaluating efficient & effective retention programs.
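For reference, a small illustrative sketch of the underlying rate calculations, using the common first-time, full-time cohort convention; the data and column names are assumptions.

# Cohort-based retention and graduation rates; data are illustrative.
import pandas as pd

cohort = pd.DataFrame({
    "student_id":    range(1, 9),
    "returned_fall2": [1, 1, 0, 1, 1, 0, 1, 1],   # enrolled the following fall
    "graduated_6yr":  [1, 1, 0, 0, 1, 0, 1, 1],   # completed within 150% of normal time
})

retention_rate = cohort["returned_fall2"].mean()
graduation_rate = cohort["graduated_6yr"].mean()
print(f"First-year retention rate: {retention_rate:.0%}")
print(f"Six-year graduation rate:  {graduation_rate:.0%}")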

STUDENT FLOW Descriptive Data Multivariate Analyses Qualitative Methods 1. Survey Research 2. Interviews 3. Focus Groups Peer Data

Student Flow Beyond Graduation: 1. The overall quality and training of an institution’s graduates/students. 2. The preparation of graduates in specific areas: writing skills, technical skills, quantitative reasoning, oral communication, leadership & teamwork. 3. The accessibility of the campus, and its students, to the employer for interviewing. 4. Trends in past hiring and expectations for the future.

Assessment Operational Terms Drivers of assessment Assessment of institutional effectiveness Assessment of student learning outcomes Blending assessments Benefits and cautions Questions and concerns

Assignment Please list all the factors you use to evaluate students / faculty / administrators.

Institutional Research and Assessment Assessment is the process of asking and answering questions that seek to align our stated intentions with documentable realities. As such, in higher education, it deals with courses, programs, policies, procedures, and operations.

Evaluation: An Operational Definition Evaluation focuses on individual performance in the sense of job completion and quality, typically resulting in merit raises, plans for future improvement, or—in less satisfying cases—probation and possibly firing.

Assessment vs. Evaluation Assessment focuses on the work to be done, the outcomes, and the impact on others—not on the individuals doing the work. Evaluation focuses on the work of the individuals—their contributions, effectiveness, creativity, responsibility, engagement, or whatever factors the organization deems most desirable.

Assessment of Institutional Effectiveness vs. Student Learning Institutional effectiveness = the results of operational processes, policies, duties and sites—and their success in working together—to support the management of the academy Student learning = the results of curricular and co-curricular experiences designed to provide students with knowledge and skills

What or who is driving assessment? Accreditors…  charged with distinguishing reputable from non-reputable institutions and programs  charged with checking on practices that affect the viability and sustainability of the institution and its offerings  represent disciplinary and institutional interests

Assessment drivers (cont’d.) The public: “Ivory Tower,” liberal bias, ratings/rankings Legislators: responsive to citizens’ concerns about quality, costs, biases….or? Prospective faculty: Quality and meaningful contributions to students’ lives Prospective parents: real learning and preparation for careers Prospective students: How will I measure up? And what kind of job can I get when I graduate? Funding agencies/foundations: evidence of commitment to learning and knowledge and evidence of [prior] success

Higher Education Realities Competitive nature of higher education –National rankings –Institutional research and data –Marketing –Niche markets Tuition Costs Consumer attitudes of students: learning outcomes and institutional effectiveness

Matters of Institutional Quality Can we justify costs/prices of attendance? Can we verify the quality of our educational offerings in measurable terms? Can we verify the effectiveness of operational contributors to a sustainable educational experience? Can we use data and other findings to improve the quality of our educational and operational offerings? Can we use those findings to align resources (financial, staff, curricular, co-curricular) to enhance desired outcomes?

Sites of Institutional Effectiveness Processes [existence and transparency] –Enrollment: Admissions, financial aid, registration –Curricular: Advising, progress toward degree completion –Budgeting: operations/salaries; capital; bond ratings and ratios; endowment management; benefits; etc. –Planning: strategic planning, compact planning, curricular planning, etc. –Judicial: education/training, communication, sanctions, etc. –Residence Life: housing selection, training for RAs, conflict resolution/mediation, etc.

Sites of Institutional Effectiveness Units/Offices of operations –Advancement –Admissions, Bursar, Registrar –Center for Advising, Academic Support, etc. –Campus Safety –Maintenance –IT –Institutional Research –Athletics –Student Engagement

The Assessment Cycle: Key Questions for Institutional Effectiveness  What services, programs, or benefits should our offices provide?  For what purposes or with what intended results?  What evidence do we have that they provide these outcomes?  How can we use information to improve or celebrate successes?  Do the improvements we make work?

What are we looking for? EXAMPLES of evidence:  Our admission of students for whom our institution is the first choice has risen 20%.  90% of students report satisfaction with the housing selection process.  Four faculty and two student committees participated in the last strategic planning cycle.  Overall, faculty, staff, and students report feeling safe on campus, following the new Campus Safety Improvement initiatives.

Where do we seek improvement [and what evidence will help us]? We need to raise the proportion of students who choose our institution as their first choice to 50% by a set target date. All faculty committees will be invited to participate in the next planning cycle. Students (95%) will report feeling safe on campus and in its neighborhood. 50 percent of credit hours will be taught by full-time faculty.

What qualities point to institutional effectiveness? A well-articulated set of processes for critical functions A clear line of responsibility and accountability for critical functions An alignment of the importance of the function and sufficient resources (staff, budget, training, etc.) to support the function Evidence of institution-wide knowledge of those critical functions, processes, and lines of responsibility

What kinds of evidence point to institutional effectiveness? Well-managed budgets Accreditation and governmental compliance Clearly defined and supported shared governance (board, president, administration, faculty, staff, and students) Communication pathways and strategies [transparency] Consensus on mission, strategic plan, goals, priorities, etc. Student (and other constituencies’) satisfaction

How do we measure institutional effectiveness? Tangible evidence: Audited budget statements, handbooks, enrollment data, institutional data Records/reports of activities and/or compliance Self-studies pointing to documented evidence Surveys of satisfaction, usage, attitudes, confidence, etc. Disciplinary accreditation reports

The Assessment Cycle: Key Questions for Student Learning  What should our students know or be able to do by the time they graduate?  What evidence do we have that they know and can do these things?  How can we use information to improve or celebrate successes?  Do the improvements we make work?

The Iterative Assessment Cycle for Institutional Effectiveness Mission/Purposes → Objectives/Goals → Outcomes → Implement Methods to Gather Evidence → Gather Evidence → Interpret Evidence → Make decisions to improve programs, services, or benefits; contribute to institutional experience; inform institutional decision-making, planning, budgeting, policy, and public accountability

Student Learning Assessment: What should students know or be able to demonstrate by the time they graduate? Civic engagement, Diversity appreciation, Communication skills, Professional responsibility, Ethics, Critical thinking, Collaborative learning, Leadership, Mathematical or quantitative competence, Technological competence, Scientific competence, Research skills, Cultural competence, Interdisciplinary competence, Civic responsibility, Global competence, Economic/financial competence, Social justice

What might our sources of evidence be? Essays/Theses, Portfolios (evaluated by faculty or external readers), Quizzes, Oral presentations, Homework assignments, Lab experiments, Tests, Journal entries, Projects, Demonstrations

What are we looking for? Evidence of students’ skill level (basic competency to mastery) –based on faculty-articulated standards of quality and judgments –applied evenly to all students’ work –indicative of aggregate evaluations of performance or knowledge –informative for course or program improvements
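As a hypothetical illustration of "aggregate evaluations of performance," the sketch below rolls up rubric judgments for one learning outcome across course sections; the rubric levels, section labels, and data are made up.

# Aggregate rubric scores for one learning outcome, overall and by section.
import pandas as pd

scores = pd.DataFrame({
    "section": ["A", "A", "A", "B", "B", "B", "C", "C"],
    "rubric_level": ["Mastery", "Competent", "Developing", "Competent",
                     "Mastery", "Competent", "Developing", "Competent"],
})

overall = scores["rubric_level"].value_counts(normalize=True)
by_section = pd.crosstab(scores["section"], scores["rubric_level"], normalize="index")
print(overall.round(2))
print(by_section.round(2))
# Summaries like these feed course- or program-level improvement discussions.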

Can we use the same processes and strategies to assess both arenas? Measuring learning versus effectiveness, efficiency, and/or satisfaction - BEYOND ANECDOTAL INTO EVIDENCE Methods of testing, projects, demonstrations versus surveys, records, reports - QUALIFY OR QUANTIFY THE OUTCOMES Use of results (revisions versus training) - MODIFY WHAT YOU DO TO AFFECT OUTCOMES

What is similar? A commitment to doing the very best job possible under whatever conditions exist A commitment to recognizing ways that altering those conditions can affect the outcomes A commitment to recognizing that altering the outcomes can affect the conditions

Ultimately…. We hold ourselves and our colleagues accountable for articulating the intentions of our work and then measuring the realities, resulting in designing and implementing strategies for improvement over time. How are we doing? How can we do better?

Common IR Services The Office of Institutional Research provides a wide variety of services: Fact book, Student retention and graduation reports, Federal reporting, National and professional surveys, In-house surveys, etc.

Assignment What parameters do you use to compare your institution, division, school or department to your peers?