Annual Performance Review (APR) and CREP Teacher & Student Survey
21st CCLC Spring Institute, March 16, 2016
Overview
- Review of first submission
- Importance of the terms
- Data collected
- Most common issues
- Preparing for the next submission
- Teacher survey overview
- Introduction to the student survey
Review of First Submission
Thank you for your hard work!
Importance of the Terms
The data had to be entered by terms. Why? GPRA indicators are intended to demonstrate the extent of improvement over time.
Levels of Access
- SEA Super User
- SEA User
- Grantee User (Subgrantee)
SEA Super User
- Only one person may be assigned to this level
- User management
- Manage profile data for grantees and centers
- Manage APR data
- Manage state award and competition data
- View visualized reports (unless delegated)
- Certify data
SEA User
- Two people may be assigned to this level
- User management (if assigned)
- Manage profile data for grantees and centers
- Manage APR data
- Manage state award and competition data (if assigned)
- View visualized reports (if assigned)
Grantee User (Subgrantee)
- Multiple people may be assigned to this level
- Manage profile data for their grant and associated centers
- Manage APR data for their centers
Data Collected
- Center information
- Whether Extended Learning Time is implemented
- Feeder school information
- Partner information
- Activities
- Staffing
- Participation
- Outcomes
Most Common Issues
- Log-in
- Personnel
- Center
- Activities
- Staffing
- Participation
- Outcomes
Issues Logging In
- Never received the e-mail from the Tactile Group with login information
- Password must be changed every 60 days
- Browser problems
- Response time was slower than stated by the Tactile Group
- Link was not posted to the VDOE website
- Too many requirements for the password
Suggestions for Improving Login Problems
- Post the link
- Allow the password to remain valid for a longer duration of time
- Quicker response, or a stated response period, for questions or requests for support from USED
- Reduce the number of characters required for the password
Issues with the Assigned Personnel
The person assigned by VDOE was not the person who needed to complete the survey.
Suggestions for Assigning Correct Personnel
- The department will add a section to the contact information form to collect this information
- The contact form will be updated yearly
Issues with Center Data
- Feeder schools
- The co-applicant should be considered a partner
- When checking summer data during the second round of checking in February, some data were missing
- As partners were added, they did not appear in the review
- Several times, data that had been entered had disappeared when reviewed again
- After entering the initial school information, it wasn't made clear that you had to click on the school name to proceed to enter specific data
Issues with Center Data
- Difficult to figure out how to input data, as the instructions were limited
- The process to get to the data entry was counterintuitive and difficult to remember each time I logged in
- Consistency across program sites
- The issue was not entering the data; the data that was asked for was the issue
- None, after realizing the little arrows at the top corners moved from page to page
Suggestions for Improving Center Data Problems
- Improve technical support
- Having to backtrack to collect the grades; this was a new data collection
- State-specific directions
- Add these instructions to the system
- Reduce the amount of clicking back and forth
Suggestions for Improving Center Data Problems
- It would be really helpful to be able to view fall data when entering spring data
- The data request is lengthy
- Make it more obvious how to get to the data input, instead of using a sidebar menu and having to click within that menu
- Due dates need to be highlighted at the beginning of the document so grantees know when data are due, instead of having to go through the entire document to find out
Issues with Activities Data
- The college and career readiness definition is too restrictive
- Hard to locate the activities section; it is hidden in the upper right-hand corner
- Difficult to find the weekly option
- Confusion about which kinds of activities fit in which category
- Difficult to determine whether data were "submitted" to VDOE
Suggestions for Improving Activities Data
- Allow printing or saving a copy of what was submitted, for our records
- List items straight down instead of placing them to the side at the top, which is confusing
- Add an "Other" category under Enrichment so that program offerings not listed can be recognized and accounted for
- Allow centers to add more than one entry per category, allowing further explanation of all the "wonderful" activities we do with our children
Suggestions for Improving Activities Data
- Many activities are both literacy and English language learner support; it would be helpful to have a box for "was ___________ also English language learner support"
- The data request is lengthy
Issues with Staffing Data
The system for reporting staffing data does not provide a means to account for staffing changes.
Suggestions for Improving Staff Data Entry
- Specify "School Day Teachers" as "After School Teachers," and clarify that it means those teachers working with the 21st CCLC program
- Provide a section for comments
Issues with Participant Data
- Numbers greater than 999 had to be corrected
- Very time consuming to enter data for the different groups
- Gathering the data is the challenge
- Race/ethnicity categories do not match those used in the school system, so many students were counted in the "other" category when in fact they identify with one particular race in our school data system
Issues with Participant Data
- Difficult to be consistent across seasons with student turnover
- Somewhat confusing
- Difficult to know whether data were "submitted" to VDOE; definitions of participants are always challenging to interpret
- Asks for the total number of participants for all grades, but state assessment scores can only be entered for SOL grades, causing an error; we may have had 30+ day students who did not take the SOL, so the data are skewed
Suggestions for Improving Participant Data Entry
- Advance notice of the data collection categories
- List it in the center of the page
- Tell the user when each section, and all sections, have been completed
- For student attendance, many students participated more than 25 days but fewer than 30; suggest lowering the minimum-day category to "less than 25"
Suggestions for Improving Participant Data Entry
- It would have been helpful to know that spring data were going to be cumulative before logging in to enter the data
- More explanation about what qualifies a student as "limited English language proficiency"
- Allow viewing fall data when entering spring data
Issues with Outcome Data
- Summer outcomes: none
- Fall outcomes: grades
- Spring outcomes: state assessments, grades, and teacher surveys
- "Needs improvement": if the student does not have an "A," reporting the student as needing improvement
- Definition of "proficient"
- Time it takes to analyze the data
Issues with Outcome Data
- Outcomes are ill-defined; outcomes are new and may not correspond with the grant profile; if data were not collected, we cannot respond
- Time consuming to enter data for different groups
- Data were collected by the center; however, the information was broken into these categories
- Asking the respondent to determine whether a student needs to improve in math or reading based on the prior year's SOL test is not valid
Issues with Outcome Data
- Vagueness of the questions and the subjective interpretation by the reporter
- Consistency of information from many sources; as a program, grade data are not collected because they do not reflect growth in the same way as standardized assessments
- Lengthy
Issues with Outcome Data
- Needed additional knowledge of which outcomes would be tracked before being asked to report them
- Did not know, when collecting data, that it would need to be broken down in the way it was requested
- Non-SOL-grade students are included in the participant data; however, there are no SOL scores to assess them, so the numbers do not tally
Issues with Outcome Data
- Outcomes for grades (report card marks) are confusing
- No parameters were offered about which specific grades (subject areas) to use or how to define needing progress versus making progress
- Some schools use different scales for K, 1-2, and 3-5, each with its own set of letter marks
- Each center defined "at risk" and "improvement" for itself
Issues with Outcome Data
- Asked to address improvement in "English grades"; this left the definition to each center, so it is not reflective of consistent progress for 21st CCLC participants
- We were not informed to collect report card marks last year, and researching and collecting the data was very time consuming and frustrating
- The state assessment area was unclear; we have students in grades K-5 in our after-school programming, so the SOL data reflect only a fraction of participants
Issues with Outcome Data
- Current 3rd graders did not have an SOL score from last year; to measure improvement on the SOL test we had to go back two years, which was frustrating to school partners who were asked to quickly gather data
- No outcomes were asked of grades K-2 for state assessments, even though other state assessments and county benchmarks are used to measure their progress
Issues with Outcome Data
- The teacher survey was unclear because it grouped homework completion and class participation together; some students improved in only one of the areas, and there was no way to document that
- Counterintuitive to enter the data by semester rather than by year
- State APR data, and most data collected, are entered for the full year rather than by semester
- Separating data into a semester format was frustrating and did not relate to program goals or correlate with data already collected
- Extra time was spent collecting the specific data and entering it during unusual time windows
Preparing for the Next Submission of APR Data
- The tentative date for submission of the 2015-2016 data is spring 2016
- The submission includes summer 2015, fall 2015, and spring 2016
- This will be the yearly timeframe for USED APR data collection
Teacher and Student Surveys
Teacher Survey
- School day teachers complete the survey for students with 30+ days of attendance
- Only one survey per student needs to be completed (the English teacher at the secondary level)
- The survey will be completed using the CREP survey system
- Teachers will receive a confirmation e-mail, sent to themselves or to the program coordinator/site director
- Teachers will have an opportunity to complete another survey without having to log back in
Teacher Survey
- Questions have been modified to correlate with the questions asked in the new USED APR system
- Coordinators will receive a report by grant, eliminating tally sheets
- The report can be sorted and used to input the 2015-2016 data into the new USED APR system
Teacher Survey
- The survey will be available from March 1, 2016, through March 31, 2016
- Program coordinators should have already received an e-mail with a username and password
- Forward the link, along with the letter explaining the purpose and importance of the survey, to school day teachers
Teacher Survey
No action is necessary if:
- Your teachers have already submitted surveys for students with fewer than 30 days of attendance; OR
- More than one teacher submitted a survey for a student
Student Survey
- Evaluation results do not tell the whole story
- Other ways to determine success:
  - Grades
  - Teacher survey (homework and participation, behavior)
  - Student survey (new)
Student Survey
- Purpose: to determine students' perception of, and benefit from, the 21st CCLC program
- Programs will receive a report of the results
- Results will NOT be used for the USED APR data system
- Results will be used to write the evaluation of the program in Virginia
Student Survey
- Online survey through CREP
- Questions for elementary and middle school
- Questions for high school
- Types of questions:
  - Short, easy-to-read statements
  - Select all that apply
  - Yes / sometimes / no
  - Agree / not sure / disagree
Student Survey
- Suggestions for student access to the survey:
  - Give teachers the link to write on the board or on a sheet of paper
  - Bookmark the webpage
  - Create a shortcut on the desktop
- Directions in the e-mail to coordinators:
  - Which students will complete the survey
  - Time frame to consider when deciding which students
  - Grade levels
Survey Monkey
https://www.surveymonkey.com/r/TMZ9X9S
When would be the best time to complete Virginia's Annual Performance Report?
- At the same time we are completing the USED APR (this spring)
- In the summer
- In the fall
- The following school year
Please complete by Friday, March 18, 2016, at 4 p.m.
Contact Information
Marsha Granderson, Education Specialist (USED APR System)
Marsha.Granderson@doe.virginia.gov, (804) 786-1993
Tiffany Frierson, Education Specialist (Teacher and Student Surveys)
Tiffany.Frierson@doe.virginia.gov, (804) 371-2682