
Pilot STEM in OST Evaluation Preliminary Report March 2012 University of California, Irvine

Study Sample
- Selected 17 programs in 9 regions participating in the Pilot Evaluation Study (Regions 1, 2, 3, 4, 7, 8, 9, 10, 11)
- Total of 65 sites recruited: 52 elementary, 9 middle school, 4 K-8
- Criteria for selecting the study sample:
  - Range of STEM curriculum approaches
  - Range of student age groups (grades 3-12)
  - Diverse ethnic and socio-economic backgrounds, representative of students in the State of California
  - Internet access

Overview of Evaluation Activities
Selected study sites engage in the following activities:
- Administer pre (fall 2011) and post (spring 2012) online student surveys
- Administer online staff surveys at two time points: pre (fall 2011) and end-of-year (spring 2012)
- Document weekly STEM activities, recorded daily by site implementers on STEM Activity Documentation Forms
- Provide UC Irvine with copies of the program schedule and lesson plans

Pilot Study Launch, Fall 2011
Activities to date:
- Selection of pilot study sample sites
- Set-up of online links for the staff pre-survey and student pre-survey
- Communication with pilot study sites:
  1. Email sent informing Program Liaisons of the selected sites, with an overview of the evaluation activities to be carried out
  2. Individual emails sent to STEM Implementers at each study site, including:
     1) Individual staff IDs assigned and instructions distributed for staff surveys
     2) Site IDs assigned and instructions distributed for student surveys
     3) STEM Activity Documentation Form (electronic and hard copies) distributed with instructions and envelopes for returning forms to UC Irvine
  3. Packages sent to each study site or program liaison with:
     1) Hard copies of STEM Activity Documentation Forms and instructions
     2) Prepaid, addressed envelopes for returning completed forms to UC Irvine

Winter 2012 Evaluation Activities
- Follow-up communications and reminders:
  1. Program Liaisons sent a list of study sites and corresponding staff IDs and site IDs (Nov. 30/Dec. 1, 2011)
  2. Reminders sent to Program Liaisons and STEM Implementers:
     - Reminders to complete surveys and begin sending STEM Activity Documentation Forms
     - Accounting of data collected from their program sites
     - All site and staff codes and survey instructions resent by email
- Ongoing contact with Program Liaisons and STEM Implementers by phone and email

Data Collected to Date (March 2012)
- 90 staff pre-surveys have been completed
- 35 sites have completed 1,277 student pre-surveys
- Preliminary results of the staff and student pre-surveys are summarized in the slides that follow
- 103 STEM Activity Documentation Forms received, reporting on 310 individual STEM activities
- Items reported by STEM implementers about each activity include:
  - Date and duration of activity
  - Name of activity
  - STEM content area addressed
  - Number of students and grade level
  - 4-point ratings of:
    1. Level of student engagement
    2. Level of challenge
    3. Overall assessment of the success of the activity
- Data are being entered into the database for analysis and will be correlated with staff survey data and site-level student survey data (a site-level merge is sketched below)
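To illustrate how the activity-form ratings might be merged with site-level survey results, here is a minimal pandas sketch. The file names and column names (site_id, engagement, math_efficacy, and so on) are assumptions for illustration only; the report does not describe the project's actual database schema or analysis code.

```python
import pandas as pd

# Hypothetical file and column names -- illustrative only; the actual
# database schema is not described in the report.
activities = pd.read_csv("stem_activity_forms.csv")  # one row per documented activity
students = pd.read_csv("student_presurveys.csv")     # one row per student survey

# Average the 4-point activity ratings (engagement, challenge, success) per site.
site_activity = activities.groupby("site_id").agg(
    n_activities=("activity_name", "count"),
    mean_engagement=("engagement", "mean"),
    mean_challenge=("challenge", "mean"),
    mean_success=("success", "mean"),
)

# Average student outcome scales per site, then join the two site-level tables.
site_students = students.groupby("site_id")[["math_efficacy", "science_efficacy"]].mean()
merged = site_activity.join(site_students, how="inner")

# Correlate implementation ratings with site-level student outcomes.
print(merged.corr(numeric_only=True))
```

Aggregating to the site level before correlating is the natural choice here because the activity forms describe sites, not individual students.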

Summary: STAFF Pre-Survey Data (collected December 2011–March 2012)

Summary of Staff Pre-Survey Results
- 90 staff surveys collected between November 2011 and March 2012
- 73% of respondents are female
- One-third are between 18 and 25 years old
- 72% are 35 or under

Staff Survey: Level of Education Achieved
- Nearly half (48%) have attended college or have an AA degree
- 42% have B.A. degrees or higher

Staff Reflect the Diversity of Students and Communities Served

Staff Position in Afterschool Program
- 40% of staff have less than one year in their current position
- 33% have more than 5 years in position
- 27% have 1-3 years in position

Staff Experience in Other School Settings
- 62% have some experience as a classroom aide or TA: 23% 1-5 years, 14% more than 5 years
- 37% report having some classroom teacher experience: 15% 1-5 years, 10.5% more than 5 years
- 15% have some school administrative staff experience
- 8% have student support staff experience
- Few staff report experience as a school administrator (6%)

Professional Experience  | n  | <6 mo | 6-12 mo | 1-2 yrs | 2-5 yrs | 5-10 yrs | 10+ yrs
School Administrator     | 74 | 3%    | 0       | 0       | 3%      | 0        | 0
Student Support Staff    | 74 | 1%    | 0       | 3%      | 1%      | 0        | 3%
Administrative Staff     | 76 | 3%    | 3%      | 5%      | 4%      | 0        | 0
Classroom Teacher        | 79 | 2.5%  | 9%      | 4%      | 11%     | 2.5%     | 8%
Instructional Specialist | 75 | 4%    | 3%      | –       | 9%      | 3%       | –
Classroom Aide, TA       | 84 | 14%   | 11%     | 11%     | 12%     | 6%       | 8%

Staff Self-Identified Instructional Roles

Variety of STEM Activities Implemented Prompt: In your current position, do you implement any STEM activities with students? If so, please specify which types of activities (select all that apply):

Prior Experience & Time per Week Implementing STEM
Prompt: "In your current position, how much time per week do you spend implementing STEM activities with students?"
- 22% spend no time implementing STEM
- Nearly half (49%) spend 30 minutes to 2 hours per week on STEM
- 30% have no prior experience implementing STEM at another program

Staff Meetings Around STEM
Frequency of Staff Discussion of Program and STEM Issues (n = 90)
- 21% report never discussing STEM at staff meetings
- 55% discuss STEM in staff meetings at least once a month to once a week

Compensation for Staff Meetings (n = 89)
- 66% of staff report being compensated for all meetings
- 10% for most meetings
- 7% for some meetings
- 15% receive no compensation for meetings
- 2% report having no meetings

Training and Support (n = 90)
- A little over half (52%) of staff surveyed have had 1-4 sessions of STEM-related training in the past academic year
- 42% have had no STEM training

Staff Relations with Teachers
Prompt: "During the past academic year, how often have you discussed…"
- 32% never speak with classroom teachers about the STEM concepts taught in the classroom

Staff Relations with Parents (n = 89)
- 76% of staff report never holding STEM-related events for parents
- 36% say they never speak with parents about STEM activities

Staff Outcome Measures: Beliefs & Sense of Competency
Two staff outcome measures, each rated on a 5-point scale (strongly disagree, disagree, neither agree nor disagree, agree, strongly agree), with some items reverse coded (a scoring sketch follows this slide):
- Staff Beliefs about STEM in the afterschool program. Example items:
  "I think the students enjoy doing STEM activities."
  "In general, I think these students [in the afterschool program] are very capable of doing hands-on science activities."
  "I don't think there is enough time at the program for students to learn much about STEM."
- Staff Sense of Competency implementing STEM in the afterschool program. Example items:
  "I have a strong background in at least one area of STEM."
  "I do not know enough about Science, Technology, Engineering and/or Mathematics to teach any of them well."
  "I feel confident about teaching Science, Technology, Engineering and/or Mathematics in the afterschool program."
[Table: n, range, mean, SD, and alpha reported for each of the two scales]
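As a concrete illustration of how such Likert scales are typically scored, here is a minimal Python sketch of reverse coding and scale-mean computation. The item names and response values are invented for illustration; the report does not include the project's actual scoring code.

```python
import pandas as pd

# Toy responses on the 5-point scale (1 = strongly disagree ... 5 = strongly agree).
# Item names are hypothetical; "belief3_neg" stands in for a negatively worded item
# such as "I don't think there is enough time ... to learn much about STEM."
df = pd.DataFrame({
    "belief1": [4, 5, 3, 4],
    "belief2": [5, 4, 4, 3],
    "belief3_neg": [2, 1, 3, 2],
})

# Reverse-code the negatively worded item: on a 1-5 scale, recoded = 6 - original.
df["belief3"] = 6 - df["belief3_neg"]

# Each respondent's scale score is the mean of the (recoded) items.
df["staff_beliefs"] = df[["belief1", "belief2", "belief3"]].mean(axis=1)
print(df["staff_beliefs"].describe())  # n, mean, SD, range, as in the summary table
```

Reverse coding keeps all items pointed in the same direction, so a higher scale mean consistently indicates more positive beliefs.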

Summary: STUDENT Pre-Survey Data (collected December 2011–March 2012)

Student Survey Sample
- 1,277 students in the full sample
- 51% boys, 49% girls
[Chart: Grade and Gender of Student Respondents]

Time to Complete Survey
- 51% of students overall report taking 15 minutes or less to complete the survey
- 55% of middle school and 28% of elementary students report taking 9 minutes or less

Survey-Taking Experience (full sample: n = 1,200)
- The majority of students find the surveys easy to read, understand, and answer: 83-85% state "mostly true" or "really true"

Student Outcomes: Pre-Survey Results
1,277 surveys completed by students in grades 3-8. Eight scales:
1. Work Habits
2. Misconduct
3. Social Competencies
4. Math Efficacy
5. Science Efficacy
6. Interest and Engagement in STEM
7. Future Outlook
8. Science Career Aspirations

Work Habits
- Mean of 6 items, assessed on a 4-point scale (1 = not at all true, 4 = really true)
- Sample items: "I work well by myself"; "I finish my work on time."
- Full sample: n = 1,277, SD = .67, Alpha = .81 (reliability computation sketched below)
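The reported Alpha is Cronbach's alpha, a standard internal-consistency estimate. Below is a minimal sketch of how it can be computed for a 6-item scale like Work Habits; the response matrix is invented toy data, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items response matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 6 items on the 4-point scale
# (1 = not at all true ... 4 = really true). Values are invented;
# the study's real Work Habits data yielded alpha = .81.
responses = np.array([
    [4, 4, 3, 4, 4, 3],
    [2, 2, 2, 3, 2, 2],
    [3, 3, 4, 3, 3, 3],
    [1, 2, 1, 1, 2, 1],
    [4, 3, 4, 4, 3, 4],
], dtype=float)
print(round(cronbach_alpha(responses), 2))
```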

Misconduct
- Mean of 9 items, assessed on a 4-point scale (0 = never, 3 = more than once a week); a lower score means less misconduct
- Sample items: "I have gotten into a fight at school"; "I have taken something that belongs to someone else."

Social Competencies
- Mean of 7 items, assessed on a 4-point scale (1 = not at all true, 4 = really true)
- Sample items: "I work well with other kids"; "I can tell other kids what I think, even if they disagree with me."

Math Efficacy
- Mean of 4 items, assessed on a 4-point scale (1 = not at all true, 4 = really true)
- Sample items: "I expect to do well in math"; "I am interested in math."

Science Efficacy
- Mean of 4 items, assessed on a 4-point scale (1 = not at all true, 4 = really true)
- Sample items: "I expect to do well in science"; "I am interested in science."

Excited, Engaged and Interested Science Learner [PEAR-Harvard]
- Mean of 24 items, assessed on a 4-point scale (1 = strongly disagree, 4 = strongly agree); some items reverse coded
- Sample items: "Science is something I get excited about."; "I am curious to learn more about science, computers or technology."; "Science is boring"; "I pay attention when people talk about recycling to protect our environment."; "I enjoy visiting science museums or zoos."

Future Outlook
- Mean of 7 items, assessed on a 4-point scale (1 = strongly disagree, 4 = strongly agree)
- Sample items: "I will go to college."; "I will have a job that I enjoy doing."

Science Career Aspirations
- Mean of 4 items, assessed on a 4-point scale (1 = strongly disagree, 4 = strongly agree)
- Sample items: "I will have a career in Science, Technology, Engineering, or Mathematics."; "I will graduate with a college degree in a major area needed for a career in science."

Next Steps
- Initiate post-survey administration: April 15 - June 15, 2012
- Ensure staff have received the information, instructions, and forms for completing Activity Documentation Forms
- UC Irvine to request sample copies of site schedules and weekly lesson plans
- Final analysis of pre/post student and staff outcome measures and program implementation data (a minimal pre/post sketch follows)
- Refinement of measures for the STEM in OST evaluation study
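For the pre/post outcome analysis, a common starting point is a paired comparison of matched pre- and post-survey scale scores. This is a hedged sketch only: the file and column names are hypothetical, and the report does not specify which statistical models the final analysis will use.

```python
import pandas as pd
from scipy import stats

# Hypothetical file and column names; pre and post records are matched on a student ID.
pre = pd.read_csv("student_presurvey.csv").set_index("student_id")
post = pd.read_csv("student_postsurvey.csv").set_index("student_id")
matched = pre.join(post, lsuffix="_pre", rsuffix="_post", how="inner")

# Paired t-test on one outcome scale (e.g., the 4-point Science Efficacy mean).
t, p = stats.ttest_rel(matched["science_efficacy_pre"],
                       matched["science_efficacy_post"])
print(f"paired t = {t:.2f}, p = {p:.3f}")
```

Because students are nested within sites, a fuller analysis might also account for site-level clustering, but the paired comparison above conveys the basic pre/post design.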