Just before we get started…
- Who are we?
- How questions will be handled
- Resources available after the webinar
- Key QILT Dates & the AGS Item Review

The Graduate Outcomes Survey Trial
Quality Indicators for Learning and Teaching

Overview
1. AGS / GOS Alignment
2. GOS Sampling
3. The GOQ
4. GOS Trial Objectives

AGS / GOS Alignment

                     | AGS                                  | GOS
Research design      | Cross-sectional                      | Longitudinal
Sample frame         | Institutional data                   | HEIMS
In-scope population  | All university graduates             | All higher education graduates
Census / sample      | Census                               | Census
Data collection      | Online, telephone, hardcopy          | Online with optional telephone
Deployment           | Separate survey for each institution | Single survey
Target               | 50%                                  | Maximum achievable
Reference dates      | End of October, end of April         | Last four weeks
Deployment dates     | Various                              | First week of May, first week of November

GDS-CEQ/PREQ / GOQ Alignment

                        | AGS | GOQ
Labour force outcomes   | Yes | Yes
Further study outcomes  | Yes | Yes
CEQ                     | Yes | Yes (for 2015/16)
PREQ                    | Yes | Yes
Optional items          | No  | Yes
Additional populations  | No  | Yes

Telephone non-response follow up
CATI has been approved by the department in response to sectoral requests.
CATI is available for both the SES and the GOS.
All telephone interviewing will:
- Be undertaken at the conclusion of the online data collection period
- Be administered centrally by SRC
- Only include students or graduates selected by SRC (to top up underperforming strata)
- Be an optional, fee-for-service activity
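As a rough illustration of the ‘top-up’ idea, the sketch below shows how underperforming strata might be identified from interim response data. The stratum definitions, target rate, file name and column names are assumptions for illustration only, not the SRC specification.

```python
import pandas as pd

# Hypothetical interim response extract: one row per sampled graduate,
# with the stratum they belong to and a 0/1 flag for having responded online.
responses = pd.read_csv("interim_responses.csv")  # columns: stratum, responded

TARGET_RATE = 0.50  # illustrative per-stratum target, not an official figure

stratum_rates = (
    responses.groupby("stratum")["responded"]
    .agg(total="count", completed="sum")
    .assign(rate=lambda d: d["completed"] / d["total"])
)

# Strata below target are candidates for CATI follow-up; the non-respondents
# in those strata would form the telephone top-up pool.
underperforming = stratum_rates[stratum_rates["rate"] < TARGET_RATE].index
cati_pool = responses[
    responses["stratum"].isin(underperforming) & (responses["responded"] == 0)
]
print(cati_pool.groupby("stratum").size())
```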

Telephone non-response follow up (2)
Early booking is required to avoid disappointment:
- 30 June for the SES
- 31 August for the GOS (November round)
Please read the QILT CATI fact sheet if you’re interested, and feel free to ask for further information or request a quote.
We will be monitoring and modelling mode effects to make sure that the data collected are comparable to the online data.
We will also be making the required changes to the instruments (and any additional items) so they can be administered over the phone.

AGS / GOS alignment
The GOS has retained the majority of the key features of the AGS.
The GOQ has retained all of the key concepts of the GDS and the CEQ (for 2015/16).

GOS sampling – Best option
The validated Past Course Completions file, submitted on 30 April, covers students who completed their courses in the previous year. This is too late for the GOS.
For the October survey we need a Completions file in early September covering students who completed their courses between January and June.
For the April survey we need a Completions file in early March covering students who completed their courses between July and December of the previous year.

GOS sampling – Other options
The alternatives are ‘less optimal’:
- Institutions create the GOS sample file in-house to specifications provided by SRC. This file would be validated by SRC.
- Use Submission 2 and 3 HEIMS data to determine which final year students are likely to complete their courses. This is a refinement of the process used to identify final year students for the Student Experience Survey (SES).

GOS sampling – ‘Hybrid option’
Most institutions already create ‘preliminary’ Course Completions files for use in the October and April rounds of the Australian Graduate Survey (AGS).
Feedback indicates that the majority of institutions can submit ‘preliminary’ Course Completions files to the Department for use in the October and April rounds of the GOS.
Using these ‘preliminary’ Course Completions files as a base, the Department will append the demographic variables from HEIMS (date of birth, gender, Indigenous background, etc.) to the sample file.
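As a minimal sketch of what that append could look like, assuming pandas-style CSV extracts and hypothetical file and column names (the real sample file layout and HEIMS element names are defined by the Department and SRC):

```python
import pandas as pd

# Hypothetical extracts; real layouts come from the GOS sample file specification.
completions = pd.read_csv("preliminary_course_completions.csv")  # one row per completed course
heims_demog = pd.read_csv("heims_demographics.csv")              # date of birth, gender, Indigenous status

# Append the demographic variables to the preliminary completions file,
# matching on a common student identifier (assumed to be the CHESSN here).
sample_file = completions.merge(
    heims_demog[["chessn", "date_of_birth", "gender", "indigenous"]],
    on="chessn",
    how="left",
)

# Flag records that could not be matched so they can be checked manually.
sample_file["heims_match"] = sample_file["date_of_birth"].notna()
sample_file.to_csv("gos_sample_file_draft.csv", index=False)
```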

GOS sampling – Validating the file
As in the Student Experience Survey (SES), you will be asked to append students’ addresses and phone numbers and to update their mailing addresses if these are missing in the HEIMS data.
You will also be asked to provide information of interest to you which is not reported in HEIMS (Faculty and Campus details).
The sample file will then be returned to institutions with all of the selected graduates flagged.
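Continuing the same hypothetical layout, the institution-side step might look like the sketch below: fill in mailing addresses only where they are missing, and attach the local Faculty and Campus fields before the file goes back for selection flagging. It assumes the Department’s draft file already carries a (possibly empty) mailing address column; all names are illustrative.

```python
import pandas as pd

sample_file = pd.read_csv("gos_sample_file_draft.csv")  # hypothetical file from the Department
local = pd.read_csv("student_system_contacts.csv")      # phone, mailing address, faculty, campus

merged = sample_file.merge(
    local[["chessn", "phone", "mailing_address", "faculty", "campus"]],
    on="chessn",
    how="left",
    suffixes=("", "_local"),
)

# Only fill in mailing addresses that are missing in the HEIMS-derived file.
merged["mailing_address"] = merged["mailing_address"].fillna(merged["mailing_address_local"])
merged = merged.drop(columns=["mailing_address_local"])
merged.to_csv("gos_sample_file_institution.csv", index=False)
```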

GOS sampling – Sample file contents
The sample file will contain about 55 variables. A concordance with the AGS will be provided.
- The first component contains 30 variables required to execute the survey.
- The second component contains about 10 variables used to flag students that are out of scope for the GOS and to monitor response rates during the course of the survey.
- The third component contains 15 variables used for reporting purposes only (age, gender, etc.).

GOS sampling – HDR students
The GOS Sampling Expert Review Panel will (hopefully) start to examine options for sampling HDR students within the next week.
Panel members are welcome to submit preliminary ideas for sampling, or relevant research papers, to be tabled at the meeting.
Members will be asked to undertake (small-scale) investigations of potential options – yes, there is homework.
- If you have any concerns about accommodating additional tasks in your current workload, you may be more interested in the outcomes of the panel’s deliberations than in participating.

GOS sampling – Majors
A study is being planned which:
- Derives majors from the Discipline Codes (E464) assigned to the units of study undertaken by students in the courses they completed,
- Compares these with the Fields of Education assigned to courses (E461 and E462) and the Specialisation Codes (E463) recorded in the Course Completions file, and
- Compares these with the four major codes assigned by graduates in the AGS.
The object of the exercise is to avoid the use of graduate-assigned majors.
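A hedged sketch of the kind of derivation being planned: weight the unit-of-study Discipline Codes (E464) by study load within each completed course, take the dominant code as a candidate ‘administrative’ major, and line it up against the course-level codes (E461/E462, E463) and the graduate-nominated AGS majors. The file names, column names and EFTSL weighting are assumptions for illustration.

```python
import pandas as pd

# Hypothetical unit-of-study extract: one row per unit attempted within a completed course.
units = pd.read_csv("units_of_study.csv")  # columns: chessn, course_code, e464_discipline, eftsl

# Sum study load by discipline code and keep the dominant code per course completion.
weighted = units.groupby(
    ["chessn", "course_code", "e464_discipline"], as_index=False
)["eftsl"].sum()
derived_major = (
    weighted.sort_values("eftsl", ascending=False)
    .drop_duplicates(subset=["chessn", "course_code"], keep="first")
    .rename(columns={"e464_discipline": "derived_major"})
)

# Line the derived major up against course-level codes and graduate-nominated majors.
courses = pd.read_csv("course_completions.csv")  # chessn, course_code, e461, e462, e463
ags = pd.read_csv("ags_majors.csv")              # chessn, course_code, graduate_major_1 ... graduate_major_4
comparison = (
    derived_major.merge(courses, on=["chessn", "course_code"], how="left")
    .merge(ags, on=["chessn", "course_code"], how="left")
)
print(comparison.head())
```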

GOS sampling summary
GOS sampling will be centralised and standardised.
- We are confident that this will be the case for bachelor level graduates, and
- We are working with the sector to develop a solution for HDR graduates.

The Graduate Outcomes Questionnaire (GOQ)
Developed iteratively with the department and academic partners. The process broadly involved:
- Sectoral identification of essential and important AGS data elements,
- Standardisation of the GDS items with ABS items,
- Review of the literature to identify validated scales,
- Development of a draft GOQ,
- Peer and policy review of the GOQ and related documentation, and
- Review of the GOQ by the QILT Working Group (current).

GOQ review process
GOQ review materials will be distributed tomorrow. The review package will include:
- GOQ review guidelines
- A draft GOQ
- A concordance between the GDS and the GOQ
- A summary paper on a theoretical model of graduate outcomes
Feedback is due by the 8th of April.

GOQ review guidelines
Most of the items included in the GOQ are drawn from ABS surveys or are standardised scales.
There are a number of items on which we require feedback.
We would welcome suggestions regarding linking or explanatory text.
If the same gap is identified by a number of institutions, we will see if it can be addressed.

The draft GOQ
There are five modules in the draft GOQ:
- Introduction and course/major identification
- Labour Force Outcomes
- Further Study Outcomes
- CEQ (for 2015/16) and the PREQ
- ESS Bridge
The final GOQ will have an additional module for institutional items and an additional module (which may or may not be used) for emerging policy issues.

Why has the CEQ been retained?
Feedback from the AGS item review indicated that the core CEQ scales were frequently used by institutions but that the scales were no longer ‘fit-for-purpose’.
The scale concepts still seem to be relevant, but the individual items are not as contemporary as they could be.
Graduates typically want to provide subject feedback on their course, and the GOS won’t be ‘face valid’ without these types of items.
We will use the 2015/16 cycle to further investigate whether the CEQ should be ‘renovated or demolished’.

A model of graduate outcomes
The model was developed to provide a theoretical starting point for examining graduate outcomes.
It accommodates current approaches to measuring outcomes and contextualises these outcomes, where possible.
A set of indicators has been proposed but has not been finalised – please treat these as formative/indicative for the moment.

A model of graduate outcomes (diagram)

Cognitive testing
Cognitive testing will be undertaken with 12 recent graduates from a range of backgrounds. Graduates will be asked to comment on:
- Explanations of standard questions (where supporting text has been provided)
- The ‘flow’ of the instrument
- The non-standard items, and
- The transition to the ‘sample build’ (collection of employer details) for the ESS.

GOQ – next steps
Feedback will be collated from the QILT Working Group, the sector, and the cognitive testing process.
A revised GOQ will be drafted, and a summary of the changes made as a result of the feedback will be circulated.
The GOQ will then be programmed and tested for the GOS trial.

The GOQ
The GOQ measures standard graduate outcomes for both employment and further education.
These outcomes can be contextualised to better understand why graduates may not be in full-time work or are not employed in their ‘discipline area’.

GOS Trial objectives
Test the GOQ:
- Examine all paths through the instrument for all discipline groups
- Identify any respondent comprehension issues with the ABS or standardised items
- Improve the linking text between modules
- Assess the utility of courses vs admin majors vs student-supplied majors
Test the ‘ESS bridge’.
Test a new approach to incentivisation.
Explore the AGS vs GOS approach to sampling.
Compare AGS vs GOS identification of employment rates.

GOS Trial parameters
Potentially six universities and five NUHEIs will be participating.
50% of the university graduate population and all of the NUHEI population will be sampled.
The GOQ will be administered, then the CEQ or PREQ, and the ESS bridge for employed graduates.
Data collection will commence on 4 May and conclude in just over four weeks.
The second ESS trial will be run in parallel with the GOS trial.
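For illustration only, a sketch of how a sample could be drawn under those parameters, assuming a combined frame with an institution-type flag (the actual trial selection is carried out by SRC and may stratify further):

```python
import pandas as pd

frame = pd.read_csv("gos_trial_frame.csv")  # hypothetical columns: chessn, institution, institution_type

universities = frame[frame["institution_type"] == "university"]
nuheis = frame[frame["institution_type"] == "nuhei"]

# 50% simple random sample within each participating university; census of NUHEI graduates.
uni_sample = universities.groupby("institution").sample(frac=0.5, random_state=2015)
trial_sample = pd.concat([uni_sample, nuheis], ignore_index=True)

print(trial_sample.groupby(["institution_type", "institution"]).size())
```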

GOS Trial outputs
Data will be made available to all participating institutions once coding issues have been resolved (approximately three to four weeks).
Supporting documentation will be less detailed than usual.
Code for any derived variables will be provided.
Operational data and GDS/CEQ data will be compared with the AGS once the AGS has finished in field.
Outcomes from the GOS trial will be shared with the sector as soon as possible.

To summarise…
- The GOS has retained all of the key features and concepts of the AGS.
- Where possible, GOS sampling will be centralised and standardised.
- The GOQ measures standard graduate labour force outcomes.
- (If everything goes according to plan) the GOS Trial will provide the blueprint for the 2015/16 implementation of the GOS.

Message
The GOS will meet institutional requirements for outcome data in a timely and (hopefully) pain-free manner.