Pilot Study of the CAAP Critical Thinking Test April 27, 2005 Lanette Raymond Research Associate, Suffolk County Community College.



CAAP Test Description
- 32-item multiple choice test
- Designed for use with college students
- Administered within a single class meeting
- Appeared relatively credible in an in-class administration protocol
- Provided documentation of reliability and validity across community college populations

CAAP Subscore Customization
- CAAP Critical Thinking test contents closely match the SUNY CT learning objectives
- The way the results are reported varies
- ACT developed a customized report for these subscores showing normative comparisons against ACT national community college data
- ACT provided the student data files to SCCC for further analysis of these data

Administration and Sample
- Fall 2004
- Administered in-class to 154 SCCC students in 7 general education courses
- Predominantly white (77%)
- Traditional age (60% 20 years old or younger; 25% between 21 and 25 years old)
- 50% male, 50% female
- Mostly sophomore status (46%)
- Full-time enrollment (85%)

Student Motivation
- No motivational tactics were employed
- The CAAP-CT instrument included an item addressing students' self-reported motivation levels
- One-third of students (n = 52) did not respond to the motivation item
- 5 students indicated that they "gave no effort" (n = 1) or "gave little effort" (n = 4) on the assessment test

Student Motivation
- Lower motivation results in poorer performance
- Less motivated students' scores are less reliable and less valid
- The reporting sample is based on data from the 97 students who reported moderate to best effort
- The reliability coefficient for Objective 1 (26 items), calculated from all 154 tests, is within the acceptable range (alpha = .80)
- Due to the small number of items (6) contributing to Objective 2, its reliability coefficient is much lower (alpha = .49)
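The reliability coefficients above are internal-consistency estimates (Cronbach's alpha). A minimal sketch of the computation, using illustrative 0/1 item scores rather than the actual SCCC response data:

```python
# Sketch of Cronbach's alpha for dichotomously scored items
# (1 = correct, 0 = incorrect). Data below are illustrative only.

def cronbach_alpha(items):
    """items: one inner list per test item, aligned across the
    same respondents (items[i][j] = item i score for student j)."""
    k = len(items)            # number of items
    n = len(items[0])         # number of respondents

    def var(xs):              # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(it) for it in items)
    totals = [sum(items[i][j] for i in range(k)) for j in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0]]))
```

As the slide notes, alpha tends to shrink as the number of items falls, which is why the 6-item Objective 2 subscale shows a much lower coefficient than the 26-item Objective 1 subscale.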

Results
- Confirmatory factor analysis substantiates the utility of the CAAP-CT test as a measure of 2 separate but related sets of critical thinking skills, based on the 2 GEAR learning objectives

2-factor model of Critical Thinking based on the GEAR Objectives

Results
- All of the items loaded significantly onto their respective factors, with item 1's loading statistic only slightly below the 1.96 criterion (at 1.81)
- The model shows an excellent fit to the data (χ²(463) = 466, p = .46, CFI = .94), providing additional construct validity evidence for the assessment
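The CFI reported above compares the tested model's chi-square to that of a baseline (independence) model. A sketch of the standard formula; the baseline values in the example are hypothetical, since the slides do not report them:

```python
# Comparative fit index (CFI) from model and baseline chi-squares.
# Baseline figures used below are illustrative, not from the study.

def comparative_fit_index(chi2_model, df_model, chi2_base, df_base):
    # Noncentrality estimates, floored at zero
    d_model = max(chi2_model - df_model, 0.0)
    d_base = max(chi2_base - df_base, d_model)
    return 1.0 - d_model / d_base if d_base > 0 else 1.0

# A model whose chi-square barely exceeds its df (as in the study,
# where chi-square = 466 on 463 df) produces a CFI near 1.0
print(comparative_fit_index(100, 90, 300, 200))  # hypothetical inputs
```

Because the model chi-square (466) is so close to its degrees of freedom (463), the noncentrality estimate is near zero, which is what drives the nonsignificant p value and the good fit.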

** Data for item 20 should be listed under objective 2

Standards

Does not meet standard: 59% or less
Approaches standard: 60% - 69%
Meets standard: 70% - 79%
Exceeds standard: 80% or more
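The cut points above amount to a simple mapping from percent-correct score to performance category. A sketch (the function name is illustrative):

```python
# Map a percent-correct score to the performance standards above,
# using the cut points taken directly from the slide.

def performance_standard(percent):
    if percent >= 80:
        return "Exceeds standard"
    if percent >= 70:
        return "Meets standard"
    if percent >= 60:
        return "Approaches standard"
    return "Does not meet standard"

print(performance_standard(75))  # → Meets standard
```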

Standards