1
Evaluating the Quality of Online Programs
Diane Ruiz Cairns, Ed.S.
Lawrence Technological University, eLearning Services
2
Agenda
- Overview
- Why monitor quality?
- Methods for monitoring the quality of online programs
- Lawrence Tech eLearning Services experience
- Next steps
3
Overview
Program evaluation supports:
- Making data-driven decisions
- Performance improvement
- Alignment of resources, performance, and strategic goals
- Adding measurable value
4
Overview
The quality of online programs impacts student retention, enrollment, and graduation rates.
The online environment includes:
- Institutional technology
- Course development
- Course structure
- Teaching and learning
- Student and faculty support
- Methods of evaluation and assessment
5
Overview
Measures of success include:
- Enrollment reports
- Assessment of learning
- Student evaluation surveys
- Informal feedback
6
Why Monitor Quality?
- Results-based approach
- Measurable results
- Effective course content
- Efficiencies in operation
- Teaching effectiveness
7
Why Monitor Quality?
- Alignment of methods for measuring and assuring quality
- Stability of online programs
- Impact on student satisfaction
- Value of online programs
8
Evaluating Quality of Online Programs
- Views and methods for evaluating programs vary
- Adopting a comprehensive tool or method brings alignment
- A validated, industry-recognized tool can improve reliability
- Define requirements before adopting a tool
9
Evaluating Quality of Online Programs
- Repeatable data collection produces meaningful data
- Take a comprehensive approach
- Multiple collection cycles support reasonable and responsible data
10
Evaluating Quality of Online Programs
- Requires careful planning
- Data collection focuses on the mega, macro, and micro levels (systems approach)
- Data collected include:
  - Effective course content
  - Efficiencies in operation
  - Teaching effectiveness
11
Evaluating Quality of Online Programs
Mega
- Success of the online program at meeting university enrollment goals
- Support of teaching and learning goals
Macro
- Technological infrastructure
- Individual courses' support of teaching and learning guidelines
- Faculty engagement
Micro
- Instructional design impact
- Student, faculty, and staff use of technology
- Student and faculty support services
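As an illustration of how data collection might be organized around these three levels, here is a minimal sketch in Python; the `EvaluationLevel` enum and every metric name are assumptions added for illustration, not items from the presentation.

```python
from enum import Enum

class EvaluationLevel(Enum):
    MEGA = "mega"    # program-level outcomes against university goals
    MACRO = "macro"  # infrastructure and course-level guidelines
    MICRO = "micro"  # instructional design, technology use, support services

# Hypothetical metrics grouped by level to plan what gets collected where.
METRICS_BY_LEVEL = {
    EvaluationLevel.MEGA: [
        "enrollment_vs_university_goal",
        "support_of_teaching_and_learning_goals",
    ],
    EvaluationLevel.MACRO: [
        "lms_uptime_percent",
        "course_alignment_with_guidelines",
        "faculty_engagement_rate",
    ],
    EvaluationLevel.MICRO: [
        "instructional_design_review_score",
        "technology_use_by_students_faculty_staff",
        "support_service_satisfaction",
    ],
}

for level, metric_names in METRICS_BY_LEVEL.items():
    print(f"{level.value}: {', '.join(metric_names)}")
```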
12
Evaluating Quality of Online Programs
- Planning
- Timeline
- Participation
- Communication, communication, communication
- Conduct the evaluation
- Plan for interventions
13
Monitoring of Online Programs
- Create a dashboard
- Seven to ten metrics
- Method for reporting (communicating)
- Data collection periods
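As a minimal sketch of what "seven to ten metrics" with defined reporting methods and collection periods could look like in practice, the Python example below captures each metric as a small record; the field names and the three sample metrics are assumptions for illustration, not the presenter's actual dashboard.

```python
from dataclasses import dataclass

@dataclass
class DashboardMetric:
    """One of the seven to ten metrics tracked on the program dashboard."""
    name: str               # what is measured
    target: float           # goal the program is aiming for
    collection_period: str  # e.g. "per term", "monthly"
    report_method: str      # how results are communicated
    data_source: str        # where the raw data comes from

# Hypothetical example entries; real metrics would come from institutional goals.
metrics = [
    DashboardMetric("Online enrollment vs. goal", 1.00, "per term",
                    "dashboard + dean's report", "registrar enrollment report"),
    DashboardMetric("LMS uptime (%)", 99.5, "monthly",
                    "dashboard", "IT service logs"),
    DashboardMetric("Student evaluation survey mean (1-5)", 4.0, "per term",
                    "dashboard + department review", "course evaluation system"),
]

for m in metrics:
    print(f"{m.name}: collected {m.collection_period}, reported via {m.report_method}")
```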
14
Example of Data Collection Schedule
15
Building a Dashboard
Elements of organizational performance:
- Enrollment goals
- Teaching and learning goals
- Graduation rates
- Employment outcomes
- Technological metrics: uptime, type of support calls
- Quality of teaching and learning
- Faculty engagement: faculty training, participation
- Student evaluation survey data
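To show how these elements might be rolled up into one dashboard snapshot per collection period, here is a rough sketch; every input value below is hypothetical and included only to make the arithmetic concrete.

```python
# Hypothetical raw inputs for one collection period (illustrative values only).
enrolled_online = 1180
enrollment_goal = 1250
lms_minutes_up = 43050
lms_minutes_total = 43200
survey_means = [4.2, 3.8, 4.5, 4.0]   # per-course student evaluation survey means
trained_faculty = 46
total_online_faculty = 60

# Roll the raw inputs up into the values a dashboard would display.
dashboard_snapshot = {
    "enrollment_pct_of_goal": round(100 * enrolled_online / enrollment_goal, 1),
    "lms_uptime_pct": round(100 * lms_minutes_up / lms_minutes_total, 2),
    "student_eval_mean": round(sum(survey_means) / len(survey_means), 2),
    "faculty_trained_pct": round(100 * trained_faculty / total_online_faculty, 1),
}

for metric, value in dashboard_snapshot.items():
    print(f"{metric}: {value}")
```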
16
Dashboard Examples
17
Lawrence Tech Experience
Sloan-C
- Evaluation of the online program organization
- Self-assessment
Baldrige Education Performance Excellence
- Evaluation of the education organization
- Assessed by Baldrige evaluators
Blackboard Exemplary Course Rubric
- Evaluation of course development
- Self-assessment
Quality Matters
- Evaluation of course development
- Assessed by qualified evaluators
18
Lawrence Tech Experience
Quality areas: operation quality, course quality, course delivery quality
- Documenting standards
- Identifying metric requirements
- Adopting industry standards:
  - Sloan-C
  - Blackboard Exemplary Course Rubric
  - QM course design
  - Baldrige (future)
19
Getting Started
- Why do this? What will you do with the data?
- Benchmarking
- Building the team
- Confirming the plan
- Collecting data: what data?
- Reporting results
- Engagement across campus services
20
Confirming
- Monitoring schedule
- Reinforcement of quality measures
- Integration
- Policy and practices for monitoring, evaluating, and assessing
- Managing and planning for change
- Oversight
21
Change
- Be an agent of change
- See through the lens of students, employers, accrediting bodies, and stakeholders
- Define critical success practices
22
Timeline
23
Conclusion
- Confirm metrics
- Begin program evaluation (Sloan-C)
- Develop the dashboard
- Report
- Refine
- Apply interventions
24
Dashboard Data
26
Discussion
"There is nothing wrong with change, if it is in the right direction." -- Winston Churchill
27
References
Cokins, G. (2008, April 3). How are balanced scorecards and dashboards different? Information Management. Retrieved April 12, 2014, from http://www.information-management.com/news/10001076-1.html?zkPrintable=true
Cowan, K. (2013, December 15). Higher education's higher accountability. Accreditation and Standards, Winter(2014). Retrieved from http://www.acenet.edu/the-presidency/columns-and-features/Pages/Higher-Education%27s-Higher-Accountability.aspx
Dessinger, J. C., & Moseley, J. L. (2004). Confirmative evaluation: Practical strategies for valuing continuous improvement. San Francisco, CA: John Wiley & Sons.
Fitzpatrick, J. L., Sanders, J. R., & Worthen, B. R. (2004). Program evaluation: Alternative approaches and practical guidelines. Boston, MA: Pearson Education.
Frigo, M. (2012). The balanced scorecard: 20 years and counting. Strategic Finance, 49-53.
Griggs, V., Blackburn, M., & Smith, J. (2012). The educational scorecard: The start of our journey. The Electronic Journal of Business Research Methods, 10(2), 121-131.
Guerra-López, I. (2007). Evaluating impact: Evaluation and continual improvement for performance improvement practitioners. Amherst, MA: HRD Press.
Hell, M., Vidačić, S., & Garača, Ž. (2009). Methodological approach to strategic performance optimization. Management, 14(2), 21-42.
Hughes, K. E., & Pate, G. R. (2013). Moving beyond student ratings: A balanced scorecard approach for evaluating teaching performance. American Accounting Association, 28(1), 49-75.
Kaufman, R., Guerra, I., & Platt, W. A. (2006). Practical evaluation for educators: Finding what works and what doesn't. Thousand Oaks, CA: Corwin Press.
Kaufman, R., Oakley-Browne, H., Watkins, R., & Leigh, D. (2003). Strategic planning for success: Aligning people, performance, and payoffs. San Francisco, CA: Jossey-Bass/Pfeiffer.
Kesler, G., & Kates, A. (2011). Leading organization design: How to make organization design decisions to drive the results you want. San Francisco, CA: Jossey-Bass.
Laureate Education, Inc. (Producer). (2011a). Assessment and accountability in education: Dashboards, part 1. Baltimore, MD: Author.
Laureate Education, Inc. (Producer). (2011b). Assessment and accountability in education: Dashboards, part 2. Baltimore, MD: Author.
Popham, W. J. (2008). Transformative assessment. Alexandria, VA: Association for Supervision and Curriculum Development.
Shelton, K. (2010). A quality scorecard for the administration of online education programs: A Delphi study. Journal of Asynchronous Learning Networks, 14(4), 36-62.
Shelton, K., & Saltsman, G. (2005). An administrator's guide to online education. USDLA Book Series on Distance Learning.
The Sloan Consortium (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group and Quahog Research Group. Retrieved from http://sloanconsortium.org/publications/survey/changing_course_2012
U.S. Department of Education, NCES (2011, October 5). Learning at a distance: Undergraduate enrollment in distance education courses and degree programs (NCES 2012-154). Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2012154
United States Government Accountability Office (2011). Higher education: Use of new data could help improve oversight of distance education (GAO-12-39). Retrieved from http://www.gao.gov/assets/590/586340.pdf
U.S. News & World Report (2014, January 7). Online education. Retrieved from http://www.usnews.com/education/online-education