“Creating A More Educated Georgia” Using What You Have: Observational Data and the Scholarship of Teaching Catherine Finnegan Board of Regents of University System of Georgia

Agenda Introductions and Definitions Sources of Data in CMS Study Examples –Engagement –Retention –Instruction

University System of Georgia 35 public colleges and universities –4 Research Universities –15 Regional/State Universities –4 State Colleges –12 Associate Colleges –253,552 students –9,553 full-time faculty

Office of Information and Instructional Technologies Supports and coordinates the delivery of innovative technology resources, services, and solutions. Establishes a communications conduit among executive management for the university system about information and instructional technology.

Advanced Learning Technologies Provides academic enterprise systems and services for USG institutions. Fosters the development and implementation of collaborative online degree programs and training materials. Conducts research and evaluations to influence policy making, instructional practice and technology development.

Technology Use in Courses Adapted from the Campus Computing Study.

Rising Use of IT in Instruction Percentage of courses using course management tools, by sector. Adapted from the Campus Computing Study.

USG Faculty Use of CMS 2005 Nearly half (46.3%) of all USG faculty currently use a CMS in their instruction. Almost two-thirds of users have increased their usage over time. Over two-thirds of users believe that a CMS has provided important advantages in improving student engagement in learning. Over two-fifths of non-users would use a CMS if their issues were addressed.

What CMS was Used For 90.6% enhanced their face-to-face instruction 43.8% delivered fully online instruction 43.8% delivered hybrid courses * Based on 46.3% of respondents who were currently using a CMS.

CMS and Student Engagement Increased amount of contact with their students (55.6%) Increased student engagement with the course materials (63.5%) Allowed for inclusion of more interactive activities in their class (54.2%) Allowed them to accommodate more diverse learning styles (67.6%) * Based on 46.3% of respondents who were currently using a CMS.

Evaluation Measures the effectiveness of an ongoing program in achieving its objectives Aims at program improvement through a modification of current operations Two types of evaluations: –Project –Program

Assessment Systematic collection, review, and use of information about educational programs undertaken for the purpose of improving learning and development Two types of audience: –Accreditation –Accountability

Scholarship of Teaching Sustained inquiry into teaching practices and students’ learning in ways that allow other educators to build on one’s findings Directed toward other instructors in one’s field and beyond

Now Tell Me What are you interested in learning about your teaching practices and your students’ learning? What projects are you now conducting? What data are you using to investigate them?

CMS In Scholarship of Teaching E-learning System

Student Online Activity Example session events captured in the activity log: LOGON, RE-READ LECTURE NOTES, REPLY TO MESSAGE, READ MESSAGE, CREATE NEW MESSAGE, LOGOFF

Emergence of a New Data Set = Large Data Set

How is this data different from other inputs to pedagogical research? It’s what the students actually did –Compared to self-reporting It captures the steps of the process –Rather than the outcome alone It’s quantitative It’s easy to collect this data across a large number of students.

How can CMS data be used? See patterns and trends Tell a story that explains the results Identify areas of improvement and targeted change Evaluate impact of changes
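As a concrete sketch of the first of these uses (seeing patterns in what students actually did), the following minimal Python rolls raw CMS log events up into per-student activity counts and logged-on time. The event format, student IDs, and timestamps here are hypothetical; real CMS exports vary by vendor.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical event-log rows: (student_id, action, timestamp).
# Illustrative only -- real CMS activity exports differ by product.
events = [
    ("s1", "LOGON",        "2006-01-09 10:00"),
    ("s1", "READ_MESSAGE", "2006-01-09 10:05"),
    ("s1", "LOGOFF",       "2006-01-09 10:30"),
    ("s2", "LOGON",        "2006-01-09 11:00"),
    ("s2", "LOGOFF",       "2006-01-09 11:10"),
]

def activity_summary(rows):
    """Count events and total logged-on minutes per student."""
    counts = defaultdict(int)
    minutes = defaultdict(float)
    logon = {}
    for student, action, ts in rows:
        counts[student] += 1
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        if action == "LOGON":
            logon[student] = t
        elif action == "LOGOFF" and student in logon:
            minutes[student] += (t - logon.pop(student)).total_seconds() / 60
    return dict(counts), dict(minutes)

counts, minutes = activity_summary(events)
print(counts)   # {'s1': 3, 's2': 2}
print(minutes)  # {'s1': 30.0, 's2': 10.0}
```

From a table like this, patterns and trends (and later, evaluation of changes) become ordinary descriptive statistics.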

Patterns of Movement in Courses

New evidence for…? Course level inquiry Cross course and programmatic research College-wide policy review

Typical Sources of Data Student course evaluations and surveys Content analysis Grade distributions Interviews Portfolio review

CMS Data Sources Individual, course, group and institutional activity reports Assessment reports Survey reports Discussions Assignments Content analysis

Advantages of CMS Data Data captured automatically as students interact with software Reports available at each level (course, group, institution) Time parameters of reports allow more timely and granular review Consistency of data across time and course Instructor control of tools

Disadvantages of CMS Data Only reports actions – doesn’t explain them Access to data based on role “Canned” report data limited Data collection dependent on proper formatting of content and assessment

Activity Data Reports Available to Instructors Summary of Activity Tool Usage Components Usage Content File Usage Entry and Exit Pages Student Tracking

Entry Into Reports and Tracking Available from TEACH only

List of Available Reports Date and time parameters can be set.

Summary of Activity Reports Provides a general overview of student and auditor activity Information contained Total number of sessions Average session length Average sessions/day –by weekday –by weekend Most active day Least active day Most active hour of day Least active hour of day

Example Summary of Activity Report

Tool Usage Reports Provides an overview of how often tools are used Tools available Assessments Assignments Bookmarks Calendar Chat/Whiteboard Content File Discussions Mail Media Library Notes PowerLinks Proxy Tool SCORM Module Organizer Page URL Information contained Total number of sessions for each tool Average time per session Total time for all tool sessions Percent time for each tool compared with total time

Example Tool Usage Report

Component Usage Reports Provides an overview of how often students use each component of a course Component –which component student has accessed Visits –total number of times student has visited a component Average time/visit –average time students spend per visit Total time –total amount of time students spent for all components Percent of total visits –relates time spent in a given component compared to total time spent for all components
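The "percent of total" column in these reports is a simple proportion of each component's time (or visits) against the total. A sketch with made-up component times:

```python
# Hypothetical seconds spent per course component; invented numbers.
component_seconds = {
    "Content Files": 5400,
    "Discussions": 3600,
    "Assessments": 900,
    "Mail": 100,
}

total = sum(component_seconds.values())
# Percent of total time per component, as shown in the CMS report column.
percent = {c: round(100 * s / total, 1) for c, s in component_seconds.items()}
print(percent)  # {'Content Files': 54.0, 'Discussions': 36.0, 'Assessments': 9.0, 'Mail': 1.0}
```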

Example Component Usage Report

Entry and Exit Page Reports Provides an overview of pages used most frequently for course entry and exit Page Name –which page student entered or exited Tool Used –which tool was used to enter or exit Page Usage –total number of times student entered or exited from the page Percent of Total Usage –relates the number of times a page is used to enter or exit to total number of entries or exits
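Entry and exit pages can be derived from the ordered page views within each session: the first page viewed is the entry page and the last is the exit page. A sketch with hypothetical sessions and page names:

```python
from collections import Counter

# Each session is an ordered list of pages visited (hypothetical data).
sessions = [
    ["Home", "Discussions", "Content Module 1"],
    ["Home", "Mail"],
    ["Syllabus", "Content Module 1", "Discussions"],
    ["Home", "Assessments"],
]

entry_counts = Counter(s[0] for s in sessions)   # first page of each session
exit_counts = Counter(s[-1] for s in sessions)   # last page of each session

# Percent of total usage, as in the CMS report.
entry_pct = {p: 100 * n / len(sessions) for p, n in entry_counts.items()}
print(entry_pct)  # {'Home': 75.0, 'Syllabus': 25.0}
```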

Example Entry and Exit Page Reports

Content Usage Reports Provides an overview of the content files viewed by students Content file –the content file that students have accessed Sessions –the total number of content file sessions Percent of Total Sessions –relates the number of content file sessions to the total number of sessions for all content files

Content File Usage Report

Content File Usage Graph

Student Tracking Reports Provides an overview of student activities in the course, displaying both general and detailed statistics First Access Last Access Sessions Total Time Mail –Read Messages –Sent Messages Discussion –Read Messages –Sent Messages Calendar Chat and Whiteboard Assessments Assignments URL Media Library Content Files

Aggregate Student Tracking

Individual Student Tracking

Data from Quizzes and Surveys Performance –Displays student scores for quiz submissions Item Statistics –Displays performance statistics for individual questions. Compares the performance of selected students with the entire class Summary Statistics –Compares all students’ results in one table Class Statistics –Displays class performance for individual questions

Performance Displays student scores for quiz and survey submissions

Item Statistics Displays performance statistics for individual questions.

Item Statistics Displays performance statistics for individual questions. Compares the performance of selected students with the entire class

Summary Statistics Compares all students’ results in one table

Class Statistics Displays class performance for individual questions

Additional Data Sources Discussions and Mail Assignments Course Evaluations and Surveys Student Information Systems

Now Tell Me Considering the projects that you outlined earlier, –What data found in a CMS might be used to investigate your theories? –How would you collect this data? –Would you triangulate this data with other sources?

Typical Statistical Methods Frequency Distributions and Trends Measures of Central Tendency ANOVA Regression
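To make the simpler of these methods concrete, here is a small self-contained Python sketch computing a frequency distribution, measures of central tendency, and an ordinary least-squares regression line. The grades, hours, and scores are invented for illustration, not data from the studies.

```python
import statistics
from collections import Counter

# Hypothetical per-student data: letter grade, total CMS hours for the
# term, and final numeric score. Illustrative values only.
grades = ["A", "B", "B", "C", "A", "B", "F"]
hours  = [40.0, 30.0, 33.0, 12.0, 45.0, 28.0, 5.0]
scores = [92.0, 84.0, 86.0, 71.0, 95.0, 80.0, 40.0]

# Frequency distribution of grades
grade_freq = Counter(grades)

# Measures of central tendency for time online
mean_hours = statistics.mean(hours)
median_hours = statistics.median(hours)

def ols(x, y):
    """Ordinary least-squares slope and intercept for y ~ x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

slope, intercept = ols(hours, scores)  # positive slope: more hours, higher score
```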

Want to play with some data? Go to Create an account Upload data file: ExampleData.xls Run Summary Statistics

“Creating A More Educated Georgia” Studies on Student Persistence and Achievement

Research Setting: eCore® Fully online, collaboratively developed core curriculum courses offered jointly by institutions in the University System of Georgia. Supported by the University System. Courses include the humanities, social sciences, mathematics, and sciences. Over 25 courses and 2,000 enrollments in Spring semester

Underlying Problem: Student Retention Overall Course Retention: Fall 2000–Spring 2003

Findings from Four studies Predicting Student Retention & Withdrawal Tracking Student Behavior & Achievement Online Examining Student Persistence and Satisfaction Perspectives and Activities of Faculty Teaching Online

Study 1: Predicting Student Retention & Withdrawal Purpose: to investigate student withdrawal and retention in eCore courses. How well can a student’s group membership (completion & withdrawal) be predicted? A two-group Predictive Discriminant Analysis (PDA) is used to predict students’ withdrawals and completions in online courses. Authors: Morris, Wu, Finnegan (2005).
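The study's actual PDA procedure is not reproduced here. As a rough illustration of the underlying idea (assign each student to the completer or withdrawer group from predictor values, then check hit-rate accuracy), here is a simplified nearest-centroid sketch in Python. The GPA/SAT values are invented and this is a stand-in, not the discriminant analysis used in the paper.

```python
import statistics

# Hypothetical (hs_gpa, sat_math) predictors; True = completed the course.
data = [
    ((3.8, 640), True), ((3.5, 600), True), ((3.2, 550), True),
    ((2.4, 450), False), ((2.8, 480), False), ((3.6, 610), True),
]

def centroid(rows):
    """Mean of each predictor column."""
    return tuple(statistics.mean(col) for col in zip(*rows))

completers = [x for x, ok in data if ok]
withdrawers = [x for x, ok in data if not ok]
c_complete, c_withdraw = centroid(completers), centroid(withdrawers)

# Scale each predictor by its standard deviation so GPA and SAT units
# contribute comparably to the distance.
stds = tuple(statistics.stdev(col) for col in zip(*(x for x, _ in data)))

def dist(a, b):
    return sum(((ai - bi) / s) ** 2 for ai, bi, s in zip(a, b, stds))

def predict(x):
    """Classify to the nearer group centroid (True = predicted completer)."""
    return dist(x, c_complete) <= dist(x, c_withdraw)

# Hit rate: share of students whose predicted group matches their actual group.
accuracy = sum(predict(x) == ok for x, ok in data) / len(data)
```

On this tiny separable toy set the hit rate is perfect; the study's reported accuracies (62.8% and 74.5%) reflect real, noisier data.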

Variables Two grouping variables - student completers - student withdrawers Nine predictor variables - gender, age, verbal ability, math ability, current credit hours, high school GPA, institutional GPA, locus of control and financial aid.

Model A: Two-group PDA Predictive Model, Spring 2002 Grouping variable: Withdraw / Complete Predictors: Age, Institutional Cumulative GPA, HS GPA, SAT-Verbal, SAT-Math, Institutional Cumulative Credit Hours, Gender

Model A: Findings The most important predictors in Model A are –high school GPA –mathematical ability (SAT-Math) Model A predicted with 62.8% accuracy

Model B: Two-group PDA Predictive Model, Fall 2002 Grouping variable: Withdraw / Complete Predictors: Financial Aid (FA), Locus of Control

Model B: Findings Financial aid showed significant differences between the responses of withdrawers and completers (χ² = 4.84, df = 1, p < .05). Completers were more likely to receive financial aid than withdrawers. Locus of control showed significant differences between the responses of withdrawers and completers (χ² = 4.205, df = 1, p < .05). Completers were more likely to have internal motivation than withdrawers. Model B predicted with 74.5% accuracy

Study 1: Summary Students withdraw for a variety of reasons. Primary instructional reasons for withdrawing included too much work in the online course, preferred the classroom environment, and disliked online instruction. High school grade point average and mathematics SAT were related to retention in the online courses. Students who completed courses were more likely to have received financial aid. Students who completed courses were more likely to have a higher internal locus of control.

Study 2: Tracking Student Behavior & Achievement Online Purpose: to examine student behavior by tracking what students do online and how long they spend on each activity. Data: analyzed student access tracking logs. Coded over 300,000 student activities. Frequency: number of times student did a behavior Duration: time spent on the behavior Authors: Morris, Finnegan, Wu (2005)

Research Questions What are the differences and similarities between completers and withdrawers in various measures of student behavior online? How accurately can achievement be predicted from student participation measures in online learning courses?

Variables (n=8) Frequency and Duration of –viewing course content –viewing discussions –creating new discussion posts –responding to discussion posts Over 400 students and 13 sections of 3 courses

Frequency of Learning Activities Content Pages Viewed Discussion Posts Viewed

Frequency of Learning Activities Original Posts CreatedFollow-up Posts Created

Duration of Learning Activities (N=423) Time spent during term:

Group                               Viewing    Viewing      Creating        Creating         Avg. Overall
                                    Content    Discussions  Original Posts  Follow-up Posts  Time Per Week
Withdrawers (n=?)                   ? hours    2.6 hours    3 hours         <1 hour          ?
Non-Successful Completers (n=72)    18 hours   9 hours      6 hours         <1 hour          1.2 hours
Successful Completers (n=?)         ? hours    19 hours     1 hour          1.5 hours        3.75 hours

Findings: Completers & Withdrawers Completers had more frequent activity and spent more time on task on all 4 measures than unsuccessful completers and withdrawers. Withdrawers spent significantly less time and had less frequent activity than completers on all 4 measures (p < .001). Expected. Significant differences in participation also existed between successful and unsuccessful completers.

Multiple Regression Model for Impact of Participation on Achievement Successful and Non-Successful Completers n = 286

Findings: Successful and Unsuccessful Completers The participation model explained 31% of the variability in achievement. 3 of 8 variables were significant at the p < .05 level and were good predictors of successful completion (achievement/grades). –# of content pages viewed –# of discussion posts viewed –Seconds spent viewing discussions
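The "explained 31% of the variability" figure is an R² value. A minimal sketch of how R² is computed from observed and model-fitted grades (toy numbers, not the study's data):

```python
def r_squared(y, y_hat):
    """Share of variability in y explained by the fitted values y_hat."""
    mean_y = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, y_hat))   # residual sum of squares
    ss_tot = sum((a - mean_y) ** 2 for a in y)             # total sum of squares
    return 1 - ss_res / ss_tot

# Hypothetical observed grades and regression-model predictions.
grades = [90.0, 80.0, 70.0, 60.0]
fitted = [85.0, 82.0, 68.0, 65.0]
r2 = r_squared(grades, fitted)
print(r2)  # 0.884
```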

Summary: Study 2 Time-on-task matters; withdrawers did not engage significantly, in number or duration of activities, at the online site. Successful completers engaged significantly with the online course: –Going repeatedly to content pages (frequency) –Going repeatedly to discussion posts (frequency) –Spending significant time reading discussion posts (duration)

Study 3: Understanding Student Persistence and Satisfaction Purpose: To investigate issues that affect course completion, course withdrawal, and satisfaction with online courses. Survey (n=505, response rate 22%) In-depth interviews –8 withdrawers –8 completers Authors: Boop, Morris, Finnegan (2005)

Successful completers Felt “membership” in the course. Understood course layout, expectations, assignments. Faculty feedback was important. Clarity about course was important. Used words indicating “drive” and “persistence” to succeed. Could overcome course-related problems.

Withdrawers/ Unsuccessful Students Spoke of being “lost” & “confused” in the course. Needed more direction & help from faculty to understand the course goals, expectations, assignments & design. Needed more explicit help with discussions and understanding involvement. Needed more managerial and navigational help.

Study 4: Perspectives and Activities of Faculty Teaching Online Purpose: To explore the activities and perspectives of faculty teaching online Interviews (n=13) Analysis of archived courses (10) Authors: Morris, Xu, Finnegan (2005)

Classification of Faculty Roles

Summary: Study 4 Novice instructors are far less engaged with students online. Experienced faculty posted with a faculty-to-student post ratio of 1:6. Experienced faculty interchanged pedagogical, managerial, and social roles online. Students in courses with experienced faculty engaged more often in discussions. Faculty visibility is important to student participation. Novice faculty need extensive assistance to understand online instruction.

Best Practices: Students Students should be advised that for online courses –Time on task matters for successful achievement –Online courses may be activity- and time-intensive –Online courses require proactive, engaged students –Online courses will not be easier for academically marginal students –Students should directly (and as needed) seek instructor help to understand course structure and course-related objects and objectives

Best Practices: Faculty 1 Faculty should –Understand low participation early in the term as an indicator of withdrawal or unsuccessful completion –Monitor/track all students early in the course term to spot lags in participation –Understand the role of student expectations & attitudes in persistence –Understand the role of locus of control in withdrawal and unsuccessful completion

Best Practices: Faculty 2 Faculty should –Engage managerial functions to explain course layout, assignments, and expectations (at times this may be more important than the pedagogical function) –Understand that course layout and instructions are not necessarily intuitive to students –Seek to understand students’ previous academic preparation and make adjustments accordingly

Comparing Student Performance to Programmatic Learning Outcomes Link graded activities within courses to eCore ® common student learning outcomes Determine achievement of learning outcomes based on trends in grades Identify additional means of documenting student achievement of learning outcomes

Benefits of CMS Data New quantitative evidence –Complements survey, grades, and portfolio data –Very detailed information about engagement and learning process Reduce burden on faculty and staff –Automatically collects evidence –Leverages tools already in use

Opportunities for Studies Increase awareness of data sources available to study pedagogy and outcomes Encourage systematic analysis of existing data for pedagogical improvement Identify additional data elements within CMS and other data sources

Challenges for Studies Use of CMS neither widespread nor extensive Essential tools not used (e.g., gradebook) Siloed data sources (Green’s ERP Turtle)

Conclusions Data collected in CMS and other systems can be used to inform the scholarship of teaching –Systematic and ongoing New sources of data offer opportunities to study perennial questions from different perspectives.

Thank You! Catherine Finnegan Presentations and Citations Available at: