Comparing 1-Year Out Surveys from Three Concurrent Enrollment Programs


Comparing 1-Year Out Surveys from Three Concurrent Enrollment Programs Daniel R. Judd, Ph.D. Judd Research and Gillian B. Thorne, Ph.D. Chair NACEP Research Committee

Professional, valid surveys are important if NACEP is to be perceived by the academic community as credible.

The 1 Year Out Survey is a standardized questionnaire intended for use by all institutions seeking NACEP accreditation. Evaluation standard E1 states: "The CEP conducts annual program assessment and evaluation of its practices, including at least course evaluations by CEP students and follow-up of the CEP graduates who are college or university freshmen. Qualified evaluators/researchers and/or the college's or university's institutional research office conduct and analyze evaluations and assessments."

This comparison of 1 Year Out data from three institutions is intended to provide feedback on the usefulness of the survey questions and to serve as a basis for discussing improvements.

Student records submitted: 1,434
- Boise State University: 92
- Utah State University: 200
- University of Minnesota-Twin Cities: 1,145

The much larger number of records from UM-TC required sub-group analysis and reporting by percentages, as sketched below.
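As a minimal sketch (not the authors' actual pipeline) of how per-institution tallies and percentage-based reporting might be computed, assuming the responses sit in a flat file; the file name and column names here are hypothetical:

```python
# Tally records per institution and report a survey item as percentages,
# so the large UM-TC sample does not dominate raw counts.
import pandas as pd

records = pd.read_csv("one_year_out_responses.csv")  # hypothetical file

# Records submitted per institution
print(records["institution"].value_counts())

# Report each institution's result as a percentage rather than a raw count
agree = (
    records.groupby("institution")["better_prepared"]  # hypothetical item column
           .apply(lambda s: (s == "Agree").mean() * 100)
           .round(1)
)
print(agree)
```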

Big Question: Does the size of the CEP make a difference in what the 1 Year Out Survey measures?

The first challenge in comparing 1 Year Out data across the three institutions was identifying in-common variables: 20 in-common variables were identified.

The variables fit into five categories (the number in parentheses is the number of variables in each category):
1. Contribution of CEP credit to the student's postsecondary education (PSE) (5)
2. Difficulties encountered in transferring credit to a PSE institution (3)
3. Student satisfaction with the CEP (2)
4. Personal development from the CEP (4)
5. Demographics (4)

More variables are needed in each category. A research question was formulated from each category, and answers were sought from the variables contributing to each research question. These five research questions form the structure of this presentation.

Big Question: Are these categories sufficient, or are there other dimensions that you see as important to include? Let's discuss this.

1 - What contribution did CEP make to students' PSE?
- After high school graduation, did you attend a college, university, or professional school?
- Are you attending a 2-yr. or 4-yr. college?
- I was better prepared academically for college.
- I developed more realistic expectations about the academic challenges of college.

What is the population? "After high school, did you attend a college or university? 2 yr.? 4 yr.?" Is the population those who go on to college, or all those who graduate from high school with CEP credit? The standard says: "The CEP conducts annual program assessment and evaluation of its practices, including at least course evaluations by CEP students and follow-up of the CEP graduates who are college or university freshmen." This question demonstrates the need for NACEP to clarify the procedure for collecting data, thus ensuring that all institutions survey the same population.

"I was better prepared academically for college."
BSU: 86% Agree; USU: 96% Agree; UM-TC: 90% Agree. Average for the three institutions = 92%.

"I developed more realistic expectations about the academic challenges of college."
BSU: 78% Agree; USU: 96% Agree; UM-TC: 84% Agree. Average for the three institutions = 84%.

So it's clear that CEPs contribute to PSE. In what other ways could we measure that contribution? Let's explore this.

2 - Were difficulties encountered applying CEP credit to PSE?
- I was allowed to count some or all of the CEP credits toward my college degree.
- I was exempted from a required course.
- I was able to start in a more advanced course in college.

All three are yes-no questions, so they can easily be combined and reported as the percentage answering yes (see the sketch below).
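A sketch of how the three yes-no credit-transfer items might be combined and reported as the percentage answering "Yes" per institution; the file name and column names are again hypothetical:

```python
# Combine three yes-no items into a per-institution "percent Yes" report.
import pandas as pd

ITEMS = ["credits_counted", "course_exempted", "advanced_placement"]  # hypothetical

def percent_yes(df: pd.DataFrame) -> pd.Series:
    """Percentage of respondents answering 'Yes' to each item."""
    return df[ITEMS].apply(lambda s: (s == "Yes").mean() * 100).round(1)

records = pd.read_csv("one_year_out_responses.csv")  # hypothetical file
print(records.groupby("institution").apply(percent_yes))
```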

[Chart: Percentage answering "Yes"]

Could we just ask, "How difficult was it for you to transfer CEP credit?" Why not?

3 - Having had some college, overall how satisfied are students?
- Would you recommend CEP classes to current high school students?
- Rate your overall experience with CEP.

Would you recommend CEP classes? [Chart: percentage answering "Yes"]

Rate your overall experience with your CEP. Average "Excellent" for the three institutions = 58%.

4 - What skills or abilities do students report acquiring through CEP?
- I strengthened my study habits.
- I was more confident in my ability to succeed in college.
- I strengthened my writing skills.
- I strengthened my analytical thinking.

"I strengthened my study habits."
BSU: 72% Agree; USU: 74% Agree; UM-TC: 73% Agree. Average agreement across the three institutions = 73%.

"I was more confident in my ability to succeed in college."
BSU: 81% Agree; USU: 87% Agree; UM-TC: 83% Agree. Average agreement across the three institutions = 84%.

5 - Do demographics show significant differences in student responses?
- Did either parent attend college?
- Gender
- Reported low income
- Ethnicity

Did either of your parents attend college?

2-tailed significance (p < .05):
- Rate your overall experience with your CEP: 0.64
- I was better prepared: 0.43
- More realistic expectations: 0.09
- More confident in my ability to succeed: 0.33

An independent-samples t-test was performed to answer RQ5: Is there a statistically significant difference in students' responses based on parents' education? The table shows that parents' education did not make a significant difference in students' responses to these subjective variables; a p-value below .05 would have indicated a statistically significant difference. A sketch of such a test appears below.
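A sketch of the independent-samples t-test described above, using SciPy; the data file, grouping column, and item column are hypothetical stand-ins for the actual survey variables:

```python
# Independent-samples t-test: does the overall-experience rating differ
# between students whose parents did and did not attend college?
import pandas as pd
from scipy import stats

records = pd.read_csv("one_year_out_responses.csv")  # hypothetical file

group_yes = records.loc[records["parent_attended"] == "Yes", "overall_experience"]
group_no = records.loc[records["parent_attended"] == "No", "overall_experience"]

t_stat, p_value = stats.ttest_ind(group_yes.dropna(), group_no.dropna())
print(f"t = {t_stat:.2f}, two-tailed p = {p_value:.3f}")
# A p-value below .05 would indicate a statistically significant difference.
```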

Gender

2-tailed significance (p < .05):
- Rate your overall experience with your CEP: 0.01
- I was better prepared: 0.18
- More realistic expectations: < .05
- More confident in my ability to succeed: 0.86

The table shows a statistically significant difference based on gender in the satisfaction rating of the overall CEP experience and in agreement that the student developed more realistic expectations of the academic challenge of college.

Low Income

- Qualified for a free or reduced lunch: 11%
- Were eligible for a Pell grant: 15%

Eligible for a Pell grant? 2-tailed significance (p < .05):
- Rate your overall experience with your CEP: 0.02
- I was better prepared: 0.63
- More realistic expectations: 0.003
- More confident in my ability to succeed: 0.88

Ethnicity

Measuring differences based on ethnicity was problematic because two of the institutions, Boise State and Utah State, had 8 students returning a survey who reported an ethnic or racial background. The University of Minnesota-Twin Cities had 116, which is more but proportionally about the same as Boise State (< 10%). The relatively small numbers of students reporting an ethnic background resulted in no findings of statistically significant differences in responses to the four variables; however, additional effort is needed. The power sketch below illustrates why.
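One way to see why such small subgroups rarely yield significance is a power calculation. This sketch uses statsmodels and assumes a two-group comparison in which the smaller group has the 8 respondents mentioned on the slide; the power and alpha values are conventional defaults, not figures from the study:

```python
# Power analysis: what effect size is detectable with only 8 respondents
# in the smaller group, at 80% power and alpha = .05?
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
effect = analysis.solve_power(effect_size=None, nobs1=8, alpha=0.05,
                              power=0.8, ratio=1.0)
print(f"Minimum detectable effect size (Cohen's d): {effect:.2f}")
# The result is on the order of d = 1.5, a very large difference;
# smaller real differences would go undetected with samples this small.
```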

Observations

- The population to survey needs to be clearly stated in the accreditation standard.
- It may be of greater practical use for institutions to survey all CEP students, rather than just those who are attending a postsecondary institution.

These are my best crack at wise observations.

An evaluation of the quality of CEP programs hinges on defining the larger purposes of offering college courses to high school students, which may include:
- Personal development of the student/citizen
- Advancement of students' education or career
- Monitoring to improve institutions' processes
- Quality of the CEP courses they offer
- Transfer of earned CEP credit

Satisfaction with CEPs is very high, but a survey instrument should give sponsoring organizations an understanding of what's behind the curtain. What specific aspects of satisfaction should be measured? Teaching? Course content?

It’s been great to be with you! Thank you for your input.