Common Errors in Data Collection

Purpose of this Presentation
 To point out the importance of accurate, consistent data collection
 To illustrate trends in student outcomes that may then be used to improve services to meet students’ needs
 To give examples of how evaluation data may be used to improve programs and influence policymakers

Important Concepts in Evaluation: Validity
 Does the instrument measure what it is supposed to measure?
 If the instrument is valid, the interpretation of the results is sound and the results can be generalized.

Important Concepts in Evaluation: Reliability
 Does the instrument measure the same thing consistently over repeated administrations?
 If the instrument is reliable, responses are consistent each time they are measured and the scoring is consistent.
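One simple way to quantify "measures the same thing repeatedly" is test-retest reliability: administer the instrument twice and correlate the two sets of scores. A minimal sketch in Python; the correlation function and the scores below are illustrative, not from the presentation:

```python
# Test-retest reliability sketch: correlate two administrations of the
# same instrument. All scores below are hypothetical.
from statistics import mean, stdev

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

time1 = [12, 15, 9, 20, 17, 11]   # first administration (hypothetical)
time2 = [13, 14, 10, 19, 18, 12]  # second administration (hypothetical)

r = pearson(time1, time2)
print(f"test-retest reliability: r = {r:.2f}")  # values near 1.0 suggest consistency
```

With these made-up scores the correlation comes out close to 1, which is what a reliable instrument should show across repeated measurements.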

For example: it is important to correctly identify CTP and CUP/CTP completers at the secondary level in order to track them at the postsecondary level. If this is not done consistently, you will not have reliable data for year-to-year comparisons (you will be comparing apples to oranges).

Examples of Data Tracking & Comparisons
If these data were not collected consistently, we might report change that is not really happening, and any policy decisions based on that faulty data would be unsound.
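Consistent collection often comes down to coding the same category the same way every year. A sketch of normalizing completer codes before comparison; only the CTP / CUP/CTP / CUP category names come from the slides, while the variant raw spellings are invented for illustration:

```python
# Sketch: normalize completer-category codes so the same student type is
# labeled identically every year. Variant spellings are hypothetical.
CANONICAL = {
    "ctp": "CTP",
    "cup": "CUP",
    "cup/ctp": "CUP/CTP",
    "cup-ctp": "CUP/CTP",  # hypothetical variant seen in a raw file
}

def normalize(raw):
    """Map a raw completer code to its canonical label, or raise."""
    key = raw.strip().lower().replace(" ", "")
    if key not in CANONICAL:
        raise ValueError(f"unrecognized completer code: {raw!r}")
    return CANONICAL[key]

year_2007 = ["CTP", "cup/ctp", "CUP"]   # hypothetical yearly extracts
year_2008 = ["ctp ", "CUP-CTP", "cup"]

# Both years now carry identical labels, so year-to-year counts
# compare apples to apples.
print([normalize(r) for r in year_2007])
print([normalize(r) for r in year_2008])
```

Raising on an unrecognized code, rather than silently passing it through, surfaces inconsistent entries at collection time instead of letting them distort year-to-year comparisons.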

Review Slide Tracking Enrollment Trends at a Community College Over Time CUP completers show a decreasing enrollment while CTP completers show an increasing enrollment. This illustrates change at the consortium level.

Tracking Post-Secondary Intention Trends Over Time at the State Level

Remediation Rates 2004: Comparing National and State Statistics

Receive Feedback from Stakeholders for Program Improvement
Use Senior Survey results, business partner input, and faculty and staff feedback to improve programs.

Sample Senior Survey Item: “What Pathway Did You Follow?”

Top 10 Huskins and Dual Enrollment Classes Taken

Class                             N     %
Criminology
Criminal Law
Intro to Criminal Justice         38    8.1
Juvenile Justice                  34    7.2
Law Enforcement Operations        32    6.8
Ethics and Community Relations    26    5.5
Music Appreciation                20    4.3
Community Policing                18    3.8
Expository Writing                18    3.8
Organized Crime                   18    3.8

Receive Feedback about your CTE Work: Items from a Staff and Faculty Survey
 Do you believe that every student’s goal should be to attend a four-year university? (Yes 1.8%, No 97%, Not Sure 1.2%)
 Based on your experience, do you think CTE lowers the dropout rate and keeps students in school? (Yes 86.5%, No 4.1%, Not Sure 9.4%)
 Do you believe a community college education is adequate for some students? (Yes 98.8%, No 1.2%)

Open-ended Item Responses from Staff and Faculty Survey: “If CTE was removed from your curriculum, what impact do you think it would have on your school?”
 A lot more dropouts. Many students only stay in school in order to take our CTE courses. Our CTE courses are rigorous, but allow for hands-on application. Our students learn a great deal in CTE and accordingly must be motivated to succeed.
 I believe that removing the CTE program would leave a lot of students who lack the funding, family support and desire to attend a 4 year college lacking the skills they need to find gainful employment.
 I shudder to think that would ever happen. I have students in my classes who would not continue in school if they did not have tech classes to keep them motivated. They identify with the tech programs.

Receive Feedback that May Impact Policy Decisions
Accurate data presented to powerbrokers have the power to influence decisions.

Senior Survey Item: “Do you have a computer with internet access in your home?”

Senior Survey Item: “If you had had to complete two years of a foreign language, would you have completed your high school diploma?” (By Course of Study)

      CTP      CUP/CTP   CUP
Yes   39.3%    87.5%     96.9%
No    60.7%    12.5%     3.1%

A chi-square test found the differences between cells statistically significant at the .000 level.
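For reference, a chi-square statistic like the one reported can be computed directly from a contingency table of counts. The counts below are hypothetical (the slide reports only percentages); they are chosen to roughly mirror the Yes/No splits, so this is a sketch of the method, not a reproduction of the actual test:

```python
# Chi-square statistic for a 2x3 contingency table, computed by hand.
# Counts are hypothetical, chosen to roughly mirror the reported splits.
rows = {
    "Yes": [39, 88, 97],  # CTP, CUP/CTP, CUP (hypothetical counts)
    "No":  [61, 12, 3],
}

col_totals = [sum(col) for col in zip(*rows.values())]
row_totals = {label: sum(obs) for label, obs in rows.items()}
grand_total = sum(col_totals)

# Sum of (observed - expected)^2 / expected over all six cells,
# where expected assumes independence of answer and course of study.
chi_sq = 0.0
for label, observed in rows.items():
    for j, obs in enumerate(observed):
        expected = row_totals[label] * col_totals[j] / grand_total
        chi_sq += (obs - expected) ** 2 / expected

# With 2 degrees of freedom, a statistic this large has p far below .001.
print(f"chi-square = {chi_sq:.1f}")
```

A very large statistic relative to its degrees of freedom is what lets the presenters report significance at the .000 level, i.e., the Yes/No split clearly depends on course of study.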

When planning your evaluation, plan to collect data that are:
 Accurate
 Reliable
 Valid
 Useful for your consortium
 Useful for State and Federal reporting
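In practice, accurate and useful data start with simple record-level checks before anything is reported. A hypothetical sketch; the field names and the plausible-year range are invented for illustration:

```python
# Record-level data-quality checks sketch. Field names ("completer_code",
# "grad_year") and the year range are hypothetical.
VALID_CODES = frozenset({"CTP", "CUP/CTP", "CUP"})

def validate_record(record):
    """Return a list of problems found in one student record."""
    problems = []
    if record.get("completer_code") not in VALID_CODES:
        problems.append("unrecognized completer code")
    year = record.get("grad_year")
    if not isinstance(year, int) or not 1990 <= year <= 2100:
        problems.append("implausible graduation year")
    return problems

good = {"completer_code": "CTP", "grad_year": 2008}
bad = {"completer_code": "CPT", "grad_year": 8002}
print(validate_record(good))  # an empty list means no problems found
print(validate_record(bad))
```

Running checks like these on every record before submission catches the inconsistent coding and typos that otherwise surface later as "apples to oranges" comparisons.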

Thank You! Praxis Research, Inc (704) 523 – 2999 Website: