Platform Influences on Survey Delivery and Response Rate
Presenter: Phillip Adams, Evaluations Manager
Australasian Higher Education Evaluation Forum 2008
Friday 2 October, 2008
Presentation Outline
- CHEQ (Centre for Higher Education Quality)
- The Way We Were
- The Response
- The Outcomes
CHEQ
- Established in September 2000
- Leads and supports quality assurance and improvement across teaching, research, research training and support services
The Evolution
Two drivers for the change:
- Quality had become an integral part of the higher education sector
- Need for institutional measures of academic activity
Unit Evaluations: The Way We Were
1998 – 2002:
- Item bank; academics created their own surveys
2002 – 2004:
- Faculties took responsibility, using a faculty-wide questionnaire
- Each unit evaluated at least every five years
The Issues: 1998 – 2002
- A single, aggregated report per unit/subject
- No university benchmarking capacity
The Issues: 2002 – 2004
- Patchy take-up produced a mix of academic and faculty reports
- Still no university benchmarking capacity
- Few opportunities for monitoring improvement
The Major Change
- Common items across the university
- Up to 10 faculty items
- Conducted for each unit in each year it is offered
- Results posted on a common website
- Results systematically considered by faculties
Re-engineering Technology and Processes
In 2005, CHEQ introduced the Survey Management System (SMS). The change in technology and processes brought significant benefits:
- Both paper-based and online surveys for all units
- Large volumes of survey responses handled in a short turnaround time
- A system capable of storing data from any survey, online or paper, in a single location
- Improved access to results for staff and key stakeholders
- Reduced cost of survey development and processing
- Automated reporting on the web
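The single-location storage point is the architectural heart of this list: paper responses (scanned or keyed) and online responses land in one store, tagged by capture channel, so reporting never has to care where a response came from. As a minimal sketch of the idea (the table and column names are illustrative assumptions, not the actual SMS schema):

```python
import sqlite3

# Minimal sketch of a unified response store: one table holds responses
# from every survey, whatever the capture channel. Table and column
# names are illustrative assumptions, not the real SMS schema.
conn = sqlite3.connect("sms.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS response (
        survey_id TEXT NOT NULL,   -- which unit evaluation survey
        item_code TEXT NOT NULL,   -- common or faculty item identifier
        value     INTEGER,         -- scale rating (NULL for skipped items)
        channel   TEXT NOT NULL CHECK (channel IN ('paper', 'online'))
    )
""")

# Paper and online responses share the same insert path, so all
# downstream reporting is channel-agnostic.
conn.execute(
    "INSERT INTO response VALUES (?, ?, ?, ?)",
    ("ENG1001-2005-S2", "U01", 4, "online"),
)
conn.commit()

# Automated reporting then queries one location regardless of channel.
for row in conn.execute(
    "SELECT channel, COUNT(*), AVG(value) FROM response GROUP BY channel"
):
    print(row)
```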
The PDF Survey
Benefits:
- Could be filled in without downloading
- Appealing presentation
Drawbacks:
- Clumsy in practice
- Poor accessibility for visually impaired students
PDF Survey Limitations
Three main types of student queries/complaints:
- Students unfamiliar with fillable PDFs
- Browser version incompatibilities
- Web browser setup
In addition, the lack of accessibility disenfranchised visually impaired students.
Our Response
- Prepared “how-to” information sheets
- Initially assisted visually impaired students over the phone and with a customised survey
- Shifted from the PDF to an HTML form
Outcomes
- Student familiarity with standard HTML forms
- Accessible for visually impaired students
- Assistance still available over the phone
- Special form design considerations (see the sketch below)
- No browser compatibility issues
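In an HTML form, “special form design considerations” for visually impaired users largely means screen-reader friendliness: every control has a programmatically associated label, and related rating options are grouped under their question text. A minimal sketch, assuming a five-point Likert-style evaluation item (the item wording and field names are illustrative, not the actual CHEQ questionnaire):

```python
# Minimal sketch of an accessible HTML survey item, assuming a
# five-point Likert scale. Item wording and field names are
# illustrative, not the actual CHEQ questionnaire.
SCALE = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

def likert_item(item_code: str, prompt: str) -> str:
    """Render one rating item as screen-reader-friendly HTML.

    <fieldset>/<legend> groups the radio buttons under the question
    text, and each <label for=...> is tied to its input, so a screen
    reader announces both the question and the selected option.
    """
    rows = [f"<fieldset><legend>{prompt}</legend>"]
    for value, text in enumerate(SCALE, start=1):
        input_id = f"{item_code}-{value}"
        rows.append(
            f'<input type="radio" name="{item_code}" '
            f'id="{input_id}" value="{value}">'
            f'<label for="{input_id}">{text}</label>'
        )
    rows.append("</fieldset>")
    return "\n".join(rows)

print(likert_item("U01", "The unit was well organised."))
```

Plain radio buttons with explicit labels also sidestep the browser compatibility problems of the fillable PDF, since they need no plugin or particular browser setup.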
Outcomes
- Zero complaints about access
- Increase in the web-based response rate
Conclusion
- The quality cycle relies on the effectiveness of the evaluation system
- The changes better meet the needs of stakeholders
- The survey is more user friendly
- Positive influence on response rate