Measuring Instructor Effectiveness in Higher Education: A Mixed Methods Case Study
Dr. Gregory T. Bradley, Dr. Scott W. M. Burrus, Dr. Melanie E. Shaw, Dr. Kevin Stange
Overview
This presentation provides an overview of the methodology of a mixed methods case study conducted as part of phase two of the University of Michigan and University of Phoenix research partnership, which focuses on measuring instructor effectiveness, the factors that may influence it, and the relationship between effectiveness and student learning.
Methodology
As discussed by Yin (2014), this study will follow a mixed methods, multiple-case study approach. Multiple case studies provide the opportunity for what Yin calls "replication logic": selecting cases whose findings are theoretically predictable. This replication produces demonstrably stronger case studies and extends the study's relevance beyond the situated context, providing greater transferability. The unit of analysis will be the faculty member.
Study aims
Near-term: better understand the instructor backgrounds and behaviors that influence effectiveness
- Teaching experience and cross-institutional activities
- Instructor cost of living
- Course engagement
- Teaching effectiveness
- Student learning outcomes
Long-term: develop and evaluate policies that harness this understanding to improve student performance
- Hiring, teaching assignment, and other personnel policies
- Incentives for course engagement
Research Questions
- What are the characteristics of effective instruction?
- How do effective instructors engage in their courses?
- How do faculty define effective instruction?
- How do students define effective instruction?
- How many courses (or students) can effective instructors teach at one time?
- Is there a relationship between pay and effectiveness?
- What best practices can be implemented to facilitate effective instruction?
- What are the recommendations for instructor training, mentoring, and support?
Data collection methods
Examination of course room data (Mixed)
- Review of faculty-student engagement (e.g., discussion forums, assignments, other communications)
- Artifact assessment for student learning outcome mastery
- Academic metrics database for course room interactions
Faculty survey (Quantitative, follow-up to Phase 1)
- Background (work and teaching experience, education, etc.)
- Work commitments outside of UPX
- Philosophies and approaches to teaching
Data collection methods, continued
Focus groups/interviews (Qualitative)
- Faculty focus groups/interviews
- Student focus groups/interviews
Faculty location analysis (Quantitative, follow-up to Phase 1)
- Does faculty location or geography influence faculty engagement?
- How does cost of living relate to faculty engagement?
Sampling
For the mixed and qualitative components, the sample comprises two course pairs: Math 208/209 and Accounting 290/291. The sampling strategy is purposive. Adhering to principles of saturation, we will purposively sample a maximum of 12 Math 208/209 courses (six selected from the last year of the Phase 1 sample of courses taught by faculty designated as "effective," and six selected from courses taught by faculty designated as "ineffective") and a maximum of 12 Accounting 290/291 courses, or until we reach saturation. This approach is supported by Mason (2010) and Palinkas et al. (2015).
Sampling, continued
For the quantitative follow-up to Phase 1, the sample will include the faculty involved in Phase 1.
Analysis and Reporting
Because this is a mixed methods, multiple-case study, we will follow a cross-case analysis to explore how cases (unit of analysis: the faculty member) compare and contrast, building thematic evidence to inform and answer the research questions. Quantitative analysis will also be conducted, aligned with appropriate hypothesis testing. Reporting will initially be internal to key university stakeholders and then external, including peer-reviewed conferences and publications.
Assignments
Roles: Dr. Scott Burrus (UoP) and Dr. Kevin Stange (UM) are Co-Principal Investigators on the overall joint initiative for Phase 2. Quantitative components focused on faculty cost of living are under the auspices of Kevin Stange. For the mixed methods multiple case study, Scott Burrus will take the lead as Project Director, overseeing and project-managing the overall project. He will also participate in all data collection, analysis, and reporting aspects of the project.
Assignments, continued
- Dr. Melanie Shaw is a lead research fellow/research associate on the mixed methods case study; she will take the lead on the faculty survey and assist the other research associates on the other dimensions.
- Dr. Meena Clowes is a lead research fellow/research associate on the mixed methods case study; she will take the lead on the examination of the course room dimension and assist the other research associates on the other dimensions.
- Dr. Helen Zaikina-Montgomery is a lead research fellow/research associate on the mixed methods case study; she will lead the faculty and student focus group/interview component and assist the other research associates on the other dimensions.
Note: Research Center affiliates (e.g., SAS research faculty) will support various data collection activities.
Timeline
- June-August 2017: Finalize design, instrumentation, sample selection, and data access. Obtain COR and IRB approval. Begin data collection.
- August-October 2017: Data collection across all dimensions.
- October-November 2017: Begin analysis to find emerging themes.
- November 2017-January 2018: Continue initial analysis and determine whether further analysis is warranted.
- February-June 2018: Draft initial reporting for internal and external audiences and consider external venues (e.g., AERA, DLA, OLC).
References
Mason, M. (2010). Sample size and saturation in PhD studies using qualitative interviews. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 11(3).
Palinkas, L. A., Horwitz, S. M., Green, C. A., Wisdom, J. P., Duan, N., & Hoagwood, K. (2015). Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health, 42(5), 533-544.
Yin, R. K. (2014). Case study research: Design and methods (5th ed.). Thousand Oaks, CA: Sage.

Early Alert System (EAS) in Online Education: Student Demographic Profile
Dr. Helen Zaikina-Montgomery, Dr. Scott Burrus, Dr. Meryl Epstein, Dr. Elizabeth Young, Dr. Roger Gung, Dr. Pan Hu, Ms. Sue Yuan
Overview and Purpose of Study
- Examine the demographic characteristics of students who receive Early Alerts (EAs) in courses
- Compare demographics of students who receive EAs to students who do not
- Existing research on course intervention strategies such as EAs identifies a need to better understand the demographics of students who struggle in course work
- Build statistical models to evaluate the impact of demographic characteristics on the likelihood of receiving an Early Alert
- Further evaluate the impact of Early Alerts on students' probability of passing the course
[Ortagus, 2017]
EAS in the Present Study Context
- Implemented in 2007 with the goal of increasing course completion by alerting the student's Academic Counselor to contact the student and discuss concerns.
- Updated January 2016 to automatically alert the student's Academic Counselor (AC) if the student did not submit an assignment or participate in the online discussions in the course.
Current EAS process:
- Faculty files an EA through the classroom, or the EA is issued by the LMS
- The Academic Counselor is notified of the issue
- The Academic Counselor reaches out to the student
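For illustration only, a minimal sketch of the automated LMS trigger described above, reading the rule as "neither an assignment submission nor a discussion post in the period"; the record fields and function name are hypothetical, not the university's actual schema:

from dataclasses import dataclass

@dataclass
class WeeklyActivity:
    student_id: str
    submitted_assignment: bool
    posted_to_discussion: bool

def lms_early_alerts(records):
    # Alert the Academic Counselor for students with no activity of either kind.
    return [r.student_id for r in records
            if not r.submitted_assignment and not r.posted_to_discussion]

# Example: only s2 triggers an alert.
week = [WeeklyActivity("s1", True, False), WeeklyActivity("s2", False, False)]
print(lms_early_alerts(week))  # ['s2']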
Supporting Research Overview
- Student retention and graduation rates are a topic of institutional concern and academic examination
- Due to the difference in delivery modality, online courses are structured differently than traditional on-ground or blended courses
- Online courses require students to possess more intrinsic motivation and higher levels of organizational and self-management skills
[Allen, Seaman, Poulin, & Taylor, 2016; Braxton, 2002; Eaton, 2011; McElroy & Lubich, 2013]
Supporting Research Overview
- The National Postsecondary Student Aid Study (NPSAS) shows that being married, being a parent, and being employed full-time were positively associated with online course enrollment
- NPSAS data showed that minority students were less likely to engage in online education than their non-minority peers
- Effective interventions for students in online courses need to be well matched to the online learning environment and to the demographic characteristics of those most likely to struggle in their course work
[Allen et al., 2016; Donnelly, 2010; McElroy & Lubich, 2013; NPSAS, 2012; Ortagus, 2017]
Questions guiding this Research
- What is the demographic profile of students who received an Early Alert, and how do they differ demographically from those who did not receive an EA?
- What is the demographic profile of students who pass their courses, and can we measure the impact of an EA on passing the course?
Data Source
- Collected from the university Office of Business Analytics and Operations Research
- Included records from students who were issued an early alert either by the course instructor or by the learning management system (LMS) at some point during their course
- Included courses with start dates between November 24, 2015 and January 26
- A total of 26,573 student records were accessed and used in the study, of which 2.4% (n = 640) were students who received an Early Alert
Early Alert Profiling Results
What is the demographic profile of students who received an Early Alert?

Demographic Characteristic                  EAS Students (N = 640)   Non-EAS Students (N = 25,933)
Age in years, M (SD)                        33.9 (8.59)              35.1 (9.15)
Number of people in family, M (SD)          2.69 (1.41)              2.83 (1.50)
GPA, M (SD)                                 2.57 (0.59)              3.10 (0.62)
Total transfer credits into the university, M (SD)   14.17 (19.38)   15.8 (20.44)
Gender, N (%)
  Male                                      173 (27.4)               8,045 (31.3)
  Female                                    458 (72.6)               17,677 (68.7)
Marital status, N (%)
  Single                                    379 (63.5)               12,935 (54.7)
  Married                                   145 (24.3)               7,686 (32.5)
  Separated                                 36 (6.0)                 1,193 (5.0)
  Divorced                                  37 (6.2)                 1,819 (7.7)
Demographic Characteristic (continued)      EAS Students (N = 640)   Non-EAS Students (N = 25,933)
Have dependents, N (%)
  Yes                                       86 (14.5)                3,475 (14.8)
  No                                        507 (85.5)               19,944 (85.2)
Military status, N (%)
  Military (past or current)                140 (21.9)               5,446 (21.0)
  Not military                              500 (78.1)               20,487 (79.0)
Passed course, N (%)
  Yes                                       268 (41.9)               21,351 (82.3)
  No                                        372 (58.1)               4,582 (17.7)
Course grade, N (%)
  A                                         10 (1.6)                 5,890 (22.7)
  A-                                        -                        4,150 (16.0)
  B+                                        14 (2.2)                 2,052 (7.9)
  B                                         18 (2.8)                 2,011 (7.8)
  B-                                        28 (4.4)                 2,196 (8.5)
  C+                                        26 (4.1)                 1,297 (5.0)
  C                                         31 (4.8)                 1,331 (5.1)
  C-                                        67 (10.5)                1,243 (4.8)
  D+                                        22 (3.4)                 614 (2.4)
  D                                         42 (6.6)                 554 (2.1)
  D-                                        30 (4.7)                 546 (2.1)
  F                                         108 (16.9)               1,240 (4.8)
  W                                         234 (36.6)               2,800 (10.8)
Demographic Characteristic (continued)      EAS Students (N = 640)   Non-EAS Students (N = 25,933)
Form of payment, N (%)
  Student direct pay                        27 (11.2)                2,313 (8.9)
  Financial aid                             558 (87.2)               22,638 (87.3)
  Military                                  10 (1.6)                 797 (3.1)
  Scholarship (non-EAS only)                0 (0.0)                  185 (< 1.0)
Program level, N (%)
  Associate                                 172 (27.1)               6,546 (25.5)
  Undergraduate                             383 (60.4)               16,262 (63.3)
  Graduate                                  79 (12.5)                2,875 (11.2)
How do students who received an Early Alert differ demographically from students who did not?
- Female students were more likely to receive an EA than male students
- Younger students and students who were single were more likely to have an early alert
- Students who received an EA were less likely to pass the course in which the early alert was filed, three times more likely to withdraw from the course, and four times more likely to fail the course
How do students who received an Early Alert differ demographically from students who did not?
Chi-square tests of EAS status against categorical variables:

Variable          Chi-square   df   p-value   Phi / Cramer's V (p-value)
Gender            4.276*       1    .039      -.031 (.039)
Marital status    23.047**     3    .000      .031 (.000)
Dependents        .052              .820      -.001 (.820)
Military status   .288              .592      .003 (.592)
Passed course     **                          .159 (.000)
Course grade      **           13             .187 (.000)
Form of payment   4.362        2    .076      .088 (.076)
Program level     2.365             .307      .009 (.307)
* = Values significant at the .05 level
** = Values significant at the level
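As a sketch of how one of these tests can be reproduced, the gender-by-EAS counts from the profile table above give approximately the reported chi-square (assuming scipy is installed; for a 2x2 table Cramer's V equals |phi|, and the sign of phi depends on how the categories are coded):

import numpy as np
from scipy.stats import chi2_contingency

#                  EAS   non-EAS
table = np.array([[173,  8045],    # male
                  [458, 17677]])   # female

chi2, p, df, expected = chi2_contingency(table, correction=False)
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(round(chi2, 3), round(p, 3), df)  # about 4.28, .039, 1 -- close to the table
print(round(cramers_v, 4))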
Early Alert Modeling
Modeling Approach
Early Alert Variables Exploration
Early Alert Variables Coefficients

Factor                         Estimate   Std Error   p-value
Intercept                      -1.521     0.250       0.000
GPA: 2 < · <= 2.67             -0.406     0.121       0.001
GPA: 2.67 < · <= 3.14          -0.950     0.135
GPA: > 3.14                    -2.147     0.149
AGI: <= 10,647                 0.254      0.200       0.204
AGI: 10,647 < · <= 40,205      0.104      0.192       0.587
AGI: > 40,205                  -0.061     0.205       0.767
Gender: M                      -0.119     0.100       0.232
Gender: O                      0.623      0.353       0.078
Payment Option: DEF            -1.814     0.719       0.012
Payment Option: MIL            -0.771     0.374       0.039
Payment Option: OTPO           -0.501     0.134
Program Level: Graduate        1.166      0.153
Program Level: Undergraduate   0.044      0.096       0.648
AC Tenure: 5                   -0.048     0.279       0.864
AC Tenure: 5 < · <= 10         -0.320     0.101       0.002
AC Tenure: > 10                -0.031     0.128       0.809
CCC Flag: Non-CCC              -0.650
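A hedged sketch of how coefficients like these are typically estimated, using a logistic regression in statsmodels; the file and column names are hypothetical stand-ins for the university's data:

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("early_alert_records.csv")  # hypothetical extract

# early_alert is 0/1; each C(...) term enters a binned factor as dummies
# against a reference level, as in the table above.
model = smf.logit(
    "early_alert ~ C(gpa_bin) + C(agi_bin) + C(gender) + C(payment_option)"
    " + C(program_level) + C(ac_tenure_bin) + C(ccc_flag)",
    data=df,
).fit()
print(model.summary())  # estimates, standard errors, and p-values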
Odds Ratio
Note: the odds ratio is the odds of the positive outcome at one level of Xi relative to the odds of the positive outcome at the reference level of Xi.
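Concretely, the odds ratio for a factor is exp(coefficient). Taking two estimates from the coefficient table above:

import math

print(math.exp(-2.147))  # about 0.12: GPA > 3.14 students have roughly 88% lower
                         # odds of an early alert than the reference GPA group
print(math.exp(-1.814))  # about 0.16 for Payment Option: DEF (p = .012)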
Early Alert Model Performance
The early alert model (7 variables) has a C statistic of 0.736, which can be considered good model performance.
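The C statistic is the area under the ROC curve, so it can be computed from any set of predictions; a toy sketch with scikit-learn (the arrays here are illustrative, not study data):

from sklearn.metrics import roc_auc_score

y_true = [0, 0, 1, 1, 0, 1]               # 1 = student received an early alert
p_hat = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]   # model-predicted probabilities
print(roc_auc_score(y_true, p_hat))       # the deck reports C = 0.736 on real data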
Course-Pass Modeling
Course-Pass Variables Exploration
Course-Pass Variables Coefficients

Factor                         Estimate   Std Error   p-value
Intercept                      -3.782     0.184       0.000
EAS Flag: 0                    1.485      0.094
Age: 33 < · <= 48.4            0.086      0.042
Age: > 48.4                    0.133      0.074
Age: Missing                   0.414      0.393       0.292
GPA: 2 < · <= 2.67             1.010      0.066
GPA: 2.67 < · <= 3.14          1.688      0.070
GPA: > 3.14                    2.643      0.071
AGI: <= 10,647                 0.006      0.120       0.958
AGI: 10,647 < · <= 40,205      0.135      0.123       0.272
AGI: > 40,205                  0.285      0.128       0.026
Payment Option: DEF            0.206      0.188       0.274
Payment Option: MIL            0.499      0.161       0.002
Payment Option: OTPO           -0.139     0.068
AC Tenure: 5                   0.096      0.141       0.497
AC Tenure: 5 < · <= 10         -0.083     0.049
AC Tenure: > 10                -0.140     0.063       0.027
CCC Flag: Non-CCC              2.115      0.081
Num Family: 3                  -0.050     0.051       0.332
Num Family: 3 < · <= 6         -0.007     0.050       0.885
Num Family: > 6                -0.057     0.166       0.731
Num Family: Missing            0.339      0.079
Course Level: 2                -0.117     0.017
Course Level: 3                0.453
Course Level: 4                0.592
Course Level: 5                0.306                  0.001
Odds Ratio
Note: the odds ratio is the odds of the positive outcome at one level of Xi relative to the odds of the positive outcome at the reference level of Xi.
Course-Pass Model Performance
The course-pass model (9 variables) has a C statistic of 0.798, which can be considered good model performance.
Conclusions & Future Research
- Students who receive an EA tend to be more academically "at risk" than non-EA students, with a passing rate roughly 40 percentage points lower (41.9% vs. 82.3%)
- Some of the expected factors (those that add more responsibility, such as being married and having dependents) were not associated with EA status
- Additional data and further analysis, including hierarchical linear modeling to account for potential nested effects, should be pursued
- Universities can continue to develop interventions for "at risk" students, keeping in mind the factors identified in this study
- Students' GPA, collection status, form of payment, and income have a strong impact on whether students receive an EA
- For a non-EA student, the odds of passing a course are 4.4 times the odds for an EA student; the sketch below shows where that figure comes from
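A quick check of that figure against the course-pass coefficient table: the estimate for "EAS Flag: 0" (no early alert) is 1.485, and the corresponding odds ratio is its exponential:

import math
print(math.exp(1.485))  # about 4.41, the "4.4 times" odds reported above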
Questions & Discussion
Variables List
- Early Alert (categorical): 0 = no early alert; 1 = has early alert
- Passed Course: passing grade is C- for graduate students, D- for bachelor and associate students
- GPA: <= 2; 2-2.67; 2.67-3.14; > 3.14
- Campus Collection Center (CCC) Flag: "CCC" if the student's risk code is CCC, otherwise "Non-CCC"
- Age: <= 33; 33-48; >= 48
- AGI (Annual Gross Income): <= $10,647; $10,647-$40,205; > $40,205; Missing (missing may indicate cash students)
- Program Level: Associate, Undergraduate, Graduate
- Form of Payment (Payment Option): DBEB = direct bill or employee bill; MIL = military payment; DEF = deferred or delayed payment; OTPO = other payment options
Variables List (Cont'd)
- Total Transfer Credit (categorical): <= 20.67 credits; 20.67-55 credits; >= 55 credits
- Military Flag: Y, N
- Gender: F, M, O
- High School Diploma: Y, N
- Academic Counselor Tenure (years with the company): <= 4; 4-5; 5-10; >= 10
- Course Level: 1, 2, 3, 4, 5
- Number of People in the Family: <= 2; 3; 3-6; > 6
- Marital Status: Divorced; Married; Separated; Single
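A sketch of how binned variables like these can be derived with pandas; the cut points follow the lists above, but the dataframe and column names are hypothetical:

import numpy as np
import pandas as pd

df = pd.DataFrame({"gpa": [1.8, 2.5, 3.0, 3.6], "age": [22, 35, 51, 40]})

df["gpa_bin"] = pd.cut(df["gpa"], bins=[0, 2.0, 2.67, 3.14, 4.0])
df["age_bin"] = pd.cut(df["age"], bins=[0, 33, 48, np.inf])
print(df)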
Modeling Approach
Variable Rank Order Based on Information Value (IV)

Information Value   Predictive Power
> 0.5               Suspicious or too good to be true
0.3 to 0.5          Strong predictor
0.1 to 0.3          Medium predictor
0.02 to 0.1         Weak predictor
< 0.02              Useless for prediction
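For reference, a minimal sketch of the standard weight-of-evidence computation behind these IV values (names are illustrative; bins with zero events would need smoothing in practice):

import numpy as np
import pandas as pd

def information_value(binned, target):
    # IV = sum over bins of (%events - %non-events) * ln(%events / %non-events),
    # where target is 0/1 (e.g., 1 = early alert).
    ct = pd.crosstab(binned, target)
    pct_event = ct[1] / ct[1].sum()
    pct_non = ct[0] / ct[0].sum()
    woe = np.log(pct_event / pct_non)
    return float(((pct_event - pct_non) * woe).sum())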
Assessing the Validity and Reliability of the SmarterMeasure Learning Readiness Indicator™
Dr. Gregory T. Bradley, Dr. Scott W. M. Burrus, Dr. Melanie E. Shaw, Dr. Karen Ferguson
Overarching Objectives of the Inquiry
- To replicate the factor structure of the three factorable primary components and 18 subcomponents of the SmarterMeasure Learning Readiness Indicator™ (SMLRI) using an online student population (i.e., construct validity)
- To measure the item reliability of the subcomponents using the same population of students
Points of Guidance Relative to the Inquiry
With more than two decades of growth in online offerings at institutions of higher education (Allen & Seaman, 2013), institutional leaders are challenged to identify strategies to retain students. Student retention rates are historically lower in online courses than in similar face-to-face courses (U.S. News and World Report, 2015). Several categories have been identified as critical to student retention, including readiness for online learning and student support to ensure success (Harrell, 2008). To ascertain online learning readiness and needed support services, tools must be developed and used effectively. One commonly used tool is the SmarterMeasure Learning Readiness Indicator™ (SMLRI), an online assessment of student readiness to engage in online or technology-rich learning environments (SmarterServices, LLC, 2016).
SmarterMeasure™ Constructs
The 18 subcomponents of the SMLRI consist of a total of 79 items. The factorable components and subcomponents of the SMLRI are Learning Styles (Solitary, Logical, Aural, Verbal, Visual, Social, and Physical), Life Factors (Time, Place, Reason, Resources, and Skills), and Personal Attributes (Help Seeking, Time Management, Procrastination, Persistence, Academic Attributes, and Locus of Control). Of note, the two technology components (Technical Knowledge and Technical Competency) were not assessed in this study since their items are measured on a categorical scale.
Research Propositions
P1: Learning readiness is complex and motivated by multiple factors.
P2: With large sample sizes, replication of the SMLRI factor structure and similar loading coefficients will occur for the vast majority of the 18 subcomponents.
P3: Subsequently, each of the 18 subcomponents will exhibit relatively high item reliability.
Methodology
We followed a multi-stage research process to focus our analysis on the multiple dimensions of learning readiness as indicated by the SMLRI. Stage 1 included appending multiple files based on a unique non-identifying record number, linking SMLRI responses from 9,222 students at a large university offering both on-ground and online courses. The criteria for inclusion required that students started and completed at least one course during the study period, so that early academic success metrics could be examined after this factor analysis.
Methodology (continued)
Notably, the students in this sample answered all 110 questions within the SMLRI project scope; 79 of those questions are utilized in this analysis because they represent potential items of latent constructs and are factorable. Stage 2 consisted of analyzing the responses independently and attempting to replicate the factor structure of the SMLRI model, in lieu of conducting confirmatory factor analysis via structural equation modeling.
Methodology (continued)
Stage 3 consisted of measuring the item (i.e., internal) reliability of either the existing or reconstituted constructs via Cronbach's alpha. Note: the items in the SMLRI are questions and level-of-agreement statements linked either to constructs with established psychometric properties or to proprietary SMLRI models. Of the 110 items, 79 are represented in SMLRI manifest variables leading to 18 latent variables.
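As a sketch, Cronbach's alpha can be computed directly from a respondents-by-items matrix (here a pandas DataFrame with one column per item; this is a generic implementation, not the study's code):

import pandas as pd

def cronbach_alpha(items):
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale score
    return (k / (k - 1)) * (1 - item_vars / total_var)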
Philosophy of Methodology
Following the guidelines set forth by Osborne and Fitzpatrick (2012), the primary objective of this research was to replicate the factor structure of the aforementioned factorable variables identified through the SMLRI model. The authors' position is that, since replication with exploratory factor analysis (EFA) is itself exploratory and prefaces a more diligent and scrupulous confirmatory factor analysis (CFA), simple summary measures are preferred. Moreover, it is posited that with very large sample sizes, replication through additional EFA procedures should produce similar results for valid and reliable models.
Sample Constitution
Of the respondents included in this research, the vast majority (97.2%) are strictly online students. Respondents' ages range from 18 to 80, indicating a more nontraditional online student population. Regarding place of residence, all 50 states, Puerto Rico, and the Virgin Islands are represented in the sample.
Technicalities of Method
Following the guidance of Tabachnick and Fidell (2007), an oblique factor rotation (direct oblimin) with the 18 individual factors set forth by the SMLRI model was first specified to examine the correlation among factors. Since the factors exhibited little correlation (per Tabachnick and Fidell, the threshold is r < .32), an orthogonal rotation was clearly most appropriate for this analysis. Thus, a principal components analysis (PCA) with an orthogonal rotation (varimax) was selected for independent data reduction. Following conventional protocol for factor-analytic data reduction, the first step in the factoring process beyond rotation selection was to discard from the first correlation matrix any items with inter-item correlations below .40 (Hinkin, 1998). This serves to remove items that appear discrete to the common domain (Churchill, 1979; Kim & Mueller, 1978).
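A hedged sketch of this reduction pipeline using the factor_analyzer package; the input file is a hypothetical stand-in for the 9,222 x 79 matrix of factorable items:

import pandas as pd
from factor_analyzer import FactorAnalyzer

X = pd.read_csv("smlri_items.csv")  # hypothetical item matrix

# Drop items whose largest correlation with any other item is below .40
# (Hinkin, 1998).
corr = X.corr().abs()
keep = [c for c in X.columns if corr.loc[c].drop(c).max() >= 0.40]
X = X[keep]

# Principal components with an orthogonal (varimax) rotation.
fa = FactorAnalyzer(n_factors=19, rotation="varimax", method="principal")
fa.fit(X)
print(pd.DataFrame(fa.loadings_, index=X.columns))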
Results
In this phase of the data reduction process, since this was an attempt at replication, there was no prescribed number of factors. The resulting reconstituted factor solution incorporated 67 of the original 79 variables, generated 19 factors, and accounted for 48.02% of the total variance. There were only two item cross-loadings in the new factor solution. Moreover, the majority (56.7%) of the factor loading coefficients were above .60, with 19.4% above .70. Notably, negatively worded items were reverse coded to meet the criteria for reliability analysis.
Results (continued)
With the revised factor solution, the Kaiser-Meyer-Olkin measure of sampling adequacy suggested that the sample was factorable (KMO = .894). The approximate chi-square for Bartlett's test of sphericity was significant (χ²(3,081) = 119,595, p < .001), indicating that the correlation matrix was not an identity matrix.
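The same package exposes both adequacy checks reported here; continuing from the sketch above (X is the filtered item matrix):

from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

chi_square, p_value = calculate_bartlett_sphericity(X)
kmo_per_item, kmo_total = calculate_kmo(X)
print(kmo_total, chi_square, p_value)  # the deck reports KMO = .894, p < .001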
Results (continued)
Both the eigenvalue criterion and an inspection of the scree plot indicated a 19-factor solution, which accounted for 48.02% of the variance. Internal consistency for 18 of the 19 factors was examined using Cronbach's alpha. The alpha values for the 18 factors for which reliability could be measured were moderate to low, with 5 factors above .65 and the remaining factors ranging from .43 to .60. Hence, there are some concerns about unidimensional scale measurement. However, rules of thumb in this regard should be applied with caution when alpha has been computed from items that are not highly correlated.
Revised Factor Information and Interpretation
Interpretations of the 19 rotated components: Procrastination; Anxiety about education; Communication; Time commitment ability; Spatial utilization; Assimilation; Motivation; Aural learning style; Persistence; Precision; Solitude; Outcome control perception; Intellectual prowess; Extrovert; Outside distractions; Intellectual challenge; Intellectual equality; Hours worked; Other time distractions
Rotated Sums of Squared Loadings
[Table: initial eigenvalue, rotated % variance, cumulative % variance, and Cronbach's alpha per component. *The remaining components had an eigenvalue less than 1.000.]
Results Summary
The objective of this research was to replicate the factor structure of the SMLRI model. In lieu of conducting a full CFA, we followed the guidance of Tabachnick and Fidell (2007) and first implemented a straightforward, basic process, especially given a sample size of n = 9,222. Regardless of the direction provided in the literature on subject-to-factor ratios, this study far exceeds any recommended ratio minimum.
Results Summary
With such a large sample size and with over 40% of the items failing to load on the congruent factor, this replication attempt did not meet the basic test of items being assigned to the same factors. Hence, the second replication step, confirming that the factor loadings were roughly equivalent in magnitude, was unnecessary. A reconstituted factor structure was generated, explaining 48.02% of the variance with 19 factors. Ultimately, the validity and reliability of the aforementioned components of the SMLRI tool were not demonstrated in this research.
Discussion
Researchers have consistently documented the challenge of retaining nontraditional online students. This study surmised that the SMLRI is one tool to help identify at-risk factors for selected student populations (SmarterServices, 2017). Based on the findings, additional investigation is needed to determine the SMLRI model's validity and reliability with nontraditional students. A limitation of this study is that the researchers were unable to fully include the technology domains that previous research suggests may help predict student readiness.
Discussion
However, the sole purpose of this research was to assess the validity of the factor structure and the reliability of the scale items that were factorable. Another potential limitation is that the students included in this study took the assessment as part of a re-enrollment activity; however, upon further analysis, this population is similar to the "average" university student. More in-depth research should be conducted to better understand student perceptions of the factors that result in online program attrition.
Discussion (continued)
In particular, research should explore student reasons for withdrawal at various degree levels to provide a more complete picture of the interventions needed to support these students. Finally, long-term quantitative, experimental research should be conducted to determine whether interventions provided to students in response to SMLRI data result in positive retention gains.
Questions? Where do we go from here?
- From your perspective, what factors lead to online student attrition?
- What measurements or tools do you need to increase visibility into online student attrition, and at what phase in the program and at what degree level?