Use of the Canadian Graduate & Professional Student Satisfaction Survey: A Local Approach
Joan Norris, Keith Flysak & Michael Bittle
Faculty of Graduate & Postdoctoral Studies

CGPSS Intent: to investigate sources and levels of satisfaction among enrolled graduate students in both research-intensive and professional programs.

Why measure satisfaction? HEQCO perspective (Spence, 2009; Zhao, 2012):
- Better understanding of graduate-level education processes
- Comparative analyses
- Provincial & national portraits of graduate education, with insights into funding, completion, institutional infrastructure & other areas of improvement
- Promoting relevant changes & appropriate adaptations to maintain a competitive international edge

But keep in mind the limitations of the CGPSS:
- Survey development was not systematic: many sources cited (an informal group of grad deans from Rutgers, Duke, Stanford; adopted and revised by MIT, Western & G13). Despite its origins, it is not often used in the U.S.
- Decision rules regarding category and question choice are unclear ("anointed correct")
- Reliability and validity are unknown (although factor analyses have been carried out)
- Different versions of the survey were administered, so a true cross-sectional analysis is difficult

And be cautious:
- Respondents' answers to any measure of "satisfaction" may be influenced by affective state, current context, future expectations, past events and social comparisons
- Findings will also be affected by sample size restrictions, bias, missing data, and rewards and incentives offered to participants

Our goals at Laurier:
- Examine the stability of positive findings regarding faculty mentoring and teaching strength
- Look for evidence of improvement in areas identified by the first two administrations
- Identify opportunities & challenges in individual programs
- Provide information for cyclical reviews, the integrated budgeting and planning exercise, and strategic enrolment management
- Benchmark across similarly sized institutions

Measures and Indices (HEQCO):
- General Assessment
- General Satisfaction
- Benchmarks of Satisfaction

General Assessment: How would you rate the quality of...
- your academic experience at this university?
- your student life experience at this university?
- your graduate/professional program at this university?
- your overall experience at this university?

General Satisfaction:
- If starting over, select same university?
- If starting over, same field of study?
- Would you recommend this university to someone considering your program?
- Would you recommend this university to someone in another field?
- If starting over, select same faculty supervisor?

Benchmarks of Satisfaction (items selected from factor analyses by the G13):
- Quality of Teaching (3 items)
- Opportunities to Present and Publish (5 items)
- Research Training and Career Orientation (9 items)
- Supportive Dissertation Advisor (12 items)

Our analyses have included:
- Frequencies (provided by Mosaic)
- Snapshots of each administration
- Development of unique indices and modelling
- Cross-sectional analyses of indices
- Program profiles and scorecards

Cross-sectional analyses used these variables:
- Composite general assessment index
- Composite satisfaction index
- Four benchmark indices:
  - Quality of Teaching
  - Opportunities to Present and Publish
  - Research Training and Career Orientation
  - Supportive Dissertation Advisor
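A minimal sketch of how the composite and benchmark indices could be built from respondent-level item scores, assuming a pandas DataFrame with one row per respondent. The column names and item groupings below are placeholders; the actual G13 item mappings are not reproduced on the slides.

```python
import pandas as pd

# Hypothetical mapping of CGPSS item columns to indices; the real item
# numbers and groupings come from the G13 factor analyses and are assumed here.
INDEX_ITEMS = {
    "general_assessment": ["q_academic", "q_student_life", "q_program", "q_overall"],
    "general_satisfaction": ["q_same_univ", "q_same_field", "q_recommend_prog",
                             "q_recommend_other", "q_same_supervisor"],
    "quality_of_teaching": [f"teach_{i}" for i in range(1, 4)],
    "present_publish": [f"pub_{i}" for i in range(1, 6)],
    "research_training": [f"rt_{i}" for i in range(1, 10)],
    "supportive_advisor": [f"adv_{i}" for i in range(1, 13)],
}

def add_indices(df: pd.DataFrame, min_frac: float = 0.5) -> pd.DataFrame:
    """Append a mean-score index for each item group; respondents who answered
    fewer than `min_frac` of a group's items get NaN for that index."""
    out = df.copy()
    for name, items in INDEX_ITEMS.items():
        answered = out[items].notna().sum(axis=1)
        out[name] = out[items].mean(axis=1).where(answered >= min_frac * len(items))
    return out
```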

Cross-sectional analyses (2007, 2010, 2013) with comparisons to mid-size and consortium groups (one-way ANOVAs with post-hoc comparisons).
- Response rates: approx. 40% for research-intensive programs & 25% for professional programs at each administration
- Run separately for master's and doctoral students; master's streams could not be separated because of changes to the survey:
  - 2007: with/without thesis
  - 2010: regular/professional
  - 2013: research & coursework streams/professional
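A sketch of the cross-administration comparison, assuming respondent-level data with an administration-year column. The slides do not name the post-hoc test, so Tukey's HSD is shown as one common choice; the column names are assumptions.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def compare_administrations(df: pd.DataFrame, index_col: str,
                            year_col: str = "admin_year"):
    """One-way ANOVA of a satisfaction index across the 2007/2010/2013
    administrations, followed by Tukey HSD post-hoc comparisons.
    Missing index scores are dropped listwise."""
    data = df[[year_col, index_col]].dropna()
    groups = [g[index_col].to_numpy() for _, g in data.groupby(year_col)]
    f_stat, p_value = stats.f_oneway(*groups)
    posthoc = pairwise_tukeyhsd(endog=data[index_col], groups=data[year_col])
    return f_stat, p_value, posthoc

# e.g. f, p, tukey = compare_administrations(doctoral_df, "supportive_advisor")
```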

Snapshot results:
- Areas of strength include the benefits of a small institution: high-quality faculty mentoring and teaching
  - Implications for expansion
- Areas of need included extracurricular training opportunities
  - Development of a professionalization suite of workshops, seminars & courses (ASPIRE)
  - Co-curricular record

Cross-sectional results:
- Satisfaction ratings consistent with same-sized universities in the consortium
- Overall high quality maintained in the context of rapid expansion (a doubling of programs)

Differences in "general" measures are often difficult to detect. Persona and program profiles and scorecards may be more useful:
- Contribute to the Strategic Enrolment Management project
- Developed persona groups: professional master's, research-intensive master's, doctoral
- Individual program results provide insights into quality enhancement

Supplemented satisfaction scores with:
- Student demographics (e.g., age, citizenship/visa status, gender, Canadian geographic area: KW, rest of ON, QC, East, West, North)
- Non-enrolment survey results
- Admissions conversion scores (efficiency rates)
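As an illustration of the scorecard idea, a hedged sketch that joins program-level mean indices with an admissions conversion rate. All column names, and the registrations-per-offer definition of "efficiency rate", are assumptions rather than Laurier's actual definitions.

```python
import pandas as pd

def program_scorecard(satisfaction: pd.DataFrame, admissions: pd.DataFrame,
                      program_col: str = "program") -> pd.DataFrame:
    """Per-program scorecard: mean satisfaction indices alongside an
    admissions conversion ('efficiency') rate."""
    index_cols = ["general_assessment", "general_satisfaction", "supportive_advisor"]
    sat = satisfaction.groupby(program_col)[index_cols].mean()
    adm = admissions.groupby(program_col).agg(
        offers=("offer_made", "sum"),          # assumed 0/1 flag per applicant
        registrations=("registered", "sum"),   # assumed 0/1 flag per applicant
    )
    adm["conversion_rate"] = adm["registrations"] / adm["offers"]
    return sat.join(adm, how="outer")
```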

Analyses of Variance in Persona Groups Using Satisfaction Indices

Research master’s persona group: Supervision satisfaction strengthening; publishing/presenting opportunities need more attention

Professional master’s persona group

Doctoral persona group: Significant improvement in Student Life & Quality; Supervision strong; presentation/publication opportunities need attention.

Final thoughts:
- Pressure to assess student views of their graduate experience will remain
- Satisfaction surveys provide one useful, but limited, means of assessment
- The CGPSS will continue to develop as an assessment method
- Benchmarking may be helpful, but within-institution scorecards are more likely to lead to quality improvement

Some things do improve over time! They look pretty satisfied…