Adolescent Literacy – Professional Development


1 Adolescent Literacy – Professional Development
Module 3: Assessment, Unit 3, Session 2

2 Guiding Questions
Session 2 Key Questions:
What is screening and why do we do it?
What is diagnostic assessment and why do we do it?
What kinds of screening and diagnostic assessments can we use?
Session 2 Objectives:
Participants will learn about the role screening and diagnostic assessment play in guiding instruction for struggling students and students with learning disabilities.
Participants will practice administering curriculum-based measurements for reading.

3 Activity
Spend some time at the beginning of the session reporting out on the After the Session Activities from the last meeting.

Assessment Vocabulary Probe: In Session 1, participants were asked to identify what assessments are used to measure reading fluency, comprehension, spoken language skills, and written language skills. As participants share examples, consider categorizing them on a whiteboard, smartboard, or poster board so they can be added to later and referenced at the end of the session, when participants will be asked to further investigate assessments.

The Assessment Vocabulary Probe is a sample of a curriculum-based vocabulary measure. These terms are selected from the Assessment Module. The Vocabulary Probe can be used in several ways—here it provides the facilitator with a sense of what terms may need review, and it provides participants with a focus on key assessment vocabulary. The list (two 10-item matching probes) is not exhaustive. If a measure such as this were used to monitor student progress within a content classroom, the teacher would prepare a master list of all terminology for the year and select items from it (10–20) to create matching probes to be administered on a regular basis. This is different from a traditional vocabulary test in that it does not "count" toward an achievement grade, but is used instead to guide teachers and students toward next instructional steps.

4 Screening
Screening is:
For all students
A quick assessment that gauges students' skill level
Ideally conducted at the beginning of the school year, and repeated periodically throughout the year
Usually focused on reading fluency
Screenings for listening, speaking, and writing can also be administered, though much of the instruction on informal screening is focused on reading.

5 Screening: Students At Risk
Screening helps identify students who may need extra or different instruction, or further evaluation. In addition to below-average performance on measures of reading fluency and/or reading comprehension, look at:
Scores below "Proficient" on the previous year's MCAS
Below-average performance on other standardized achievement tests
Teacher reports

Students who are reasonably fluent on measures of oral reading, but show difficulty on achievement tests (MCAS) and on other assessments of reading comprehension (e.g., maze passages), may have comprehension challenges that need to be addressed with an enriched program targeted at developing vocabulary and reading comprehension strategies.

Students who score below average on measures of oral reading and below average on achievement tests (MCAS) and on other assessments of reading comprehension are likely to need significant intervention in basic literacy skills. Further targeted assessment of this group is essential to ensure that instruction is appropriate to their needs. For example, some of these students will have excellent comprehension ability that is easily masked by their decoding weaknesses; these students will need targeted instruction to develop their decoding fluency. Other students in this group may struggle with both decoding and comprehension and require a significantly different intervention that addresses not only decoding, but also general receptive language skills and targeted reading comprehension skills.

6 Two Types of Screening
Curriculum-Based Measures
Teacher-created passages gleaned from grade-level texts
Scores interpreted in relation to school-created norms or, more commonly, the guides provided by the Florida Center for Reading Research
Standardized Measures
Commercially available
Normed on large groups
Note for CBM: Many high school texts are written significantly above grade level.

7 Activity CBM: Oral Reading Fluency
This is an example of a CBM (curriculum-based measurement) for ORF (oral reading fluency).
Assess each other. Reform into different pairs. Discuss the experience with a partner:
As a tester (What information do you get?)
As a reader (How did this assessment experience feel? How might it have felt if you had performed differently?)

Go over the instructions for CBM as a group and answer questions. Ask participants to pair up. Make sure that at least one member of each pair has a watch with a second hand, or that there is a clock with a second hand in the room. Have the pairs assess each other.

There are four reading passages in the handout, each with a reader form and a tester form. Participants should take turns being the reader and being the tester. When it is the tester's turn to become the reader, he or she should not read the passages just scored for the other person. The tester should administer the assessment, telling the reader he or she has 1 minute to read the passage quickly and accurately. Any missed words, substituted words, added words, and mispronunciations should be noted on the tester's sheet. The score is obtained by subtracting the number of errors from the total number of words read, yielding words correct per minute. Each tester should administer two ORF passages to the reader, marking errors as the reader reads.

Scoring: Note – While a tester would usually score the screening, it is recommended in this setting that testers return the marked passages to the readers so that the readers score themselves rather than be scored by one of their colleagues. This will prevent any discomfort on the part of those participants who may not be fluent readers themselves. Have participants follow the instructions for scoring their own oral reading fluency, and then look at the charts provided to see where they fall on the 12th-grade fluency levels. Have participants pair up with a different partner to discuss this experience.
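For facilitators who want to double-check hand scoring, the ORF scoring rule is simple arithmetic. The sketch below is illustrative only (the function name and the example numbers are not from the session materials); it assumes the standard words-correct-per-minute convention of total words read in one minute minus errors.

```python
def words_correct_per_minute(total_words_read: int, errors: int) -> int:
    """Score a one-minute CBM oral reading fluency probe.

    errors = missed, substituted, added, or mispronounced words
    marked on the tester's sheet during the one-minute reading.
    """
    return total_words_read - errors

# Hypothetical example: a reader reaches 148 words with 6 marked errors.
print(words_correct_per_minute(148, 6))  # 142 WCPM
```

The resulting WCPM score is what gets compared against the grade-level fluency charts provided in the handout.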

8 Activity CBM: Maze Comprehension
This is an example of CBM. Maze passages are selected and created from grade-level texts.
Assess each other. Reform into different pairs. Discuss the experience with a partner:
As a tester (What information do you get?)
As a reader (How did this assessment experience feel? How might it have felt if you had performed differently?)

In this sample screening, because of time constraints, participants will administer only one maze passage to each other, even though in an actual screening students would take two and their scores would be averaged. Go over the instructions for CBM maze passages as a group and answer questions. Ask participants to pair up. Make sure that at least one member of each pair has a watch with a second hand, or that there is a clock with a second hand in the room. Have the pairs assess each other.

Scoring: Note – It is recommended that the assessors provide the tester passage to the readers so that the readers score themselves with the answer key rather than be scored by one of their colleagues. This will prevent any discomfort on the part of those participants who may not be fluent readers themselves. Have participants follow the instructions for scoring their own maze passages, and then look at the charts provided to see where they fall on the 12th-grade maze chart. Ask participants to pair up with a different partner to discuss this experience.

9 Divide into Two At-Risk Groups
Group 1: Scores on CBM Oral Reading Fluency are average or slightly below. Does not meet standards on MCAS; scores on maze measures may be below average.
Group 2: Scores on CBM Oral Reading Fluency are significantly below average. Does not meet standards on MCAS; scores on maze measures below average.
Torgesen & Miller, 2009, p. 15.
Torgesen and Miller recommend that initial screenings guide initial intervention placement, with Group 1 receiving less intensive interventions focused primarily on vocabulary and comprehension skills, and Group 2 receiving very intensive interventions focused on decoding and comprehension.

10 Example Formal Measure
TOSWRF: Test of Silent Word Reading Fluency
Ages: 6–6 through 17–11
Testing Time: 3 minutes for a single form or 10 minutes for both forms
Administration: Group or individual
The Test of Silent Word Reading Fluency (TOSWRF) measures a student's ability to recognize printed words accurately and efficiently.

Note: There are a variety of screening measures available. While CBMs are both valid and cost-efficient, some schools may prefer to invest in a commercially available tool that comes with pre-made forms and instructions. The two examples in these slides are simply that—examples. This assessment is recommended as a convenient (i.e., group-administered) and accurate measure for screening students at risk.

The publisher notes: "The TOSWRF accurately identifies students who are struggling with reading. It can also be used for monitoring reading progress and as a research tool. Because the test can be administered easily and quickly in a group format, it is an efficient and cost-effective screening method. The TOSWRF is not intended to be the sole measure for making eligibility or placement decisions; rather, it is best used as an initial screening measure to identify poor readers. Once students with poor reading skills have been identified, a more detailed diagnostic assessment can help determine the factors contributing to reading difficulties and the goals for intervention."

11 Example Formal Measure
TOSCRF: Test of Silent Contextual Reading Fluency
Ages: 7–0 through 18–11
Testing Time: 10 minutes
Administration: Individual or group
The Test of Silent Contextual Reading Fluency (TOSCRF) measures a student's essential contextual reading abilities (i.e., word identification, word meaning, word building, sentence structure, comprehension, and fluency).

The publisher notes: "The Test of Silent Contextual Reading Fluency (TOSCRF) provides a quick and accurate method to assess reading ability in children ages 7–0 through 18–11 and features four equivalent forms. Passages, adapted from the Gray Oral Reading Test and Gray Silent Reading Test, become gradually more complex in their content, vocabulary, and grammar (embedded phrases, sequenced adjectives, affixes, etc.). Each passage is presented as rows of contextually related words printed in uppercase without any spaces or punctuation (e.g., AYELLOWBIRDWITHBLUEWINGS). For each passage, students draw lines to separate as many words as they can in three minutes (e.g., A/YELLOW/BIRD/WITH/BLUE/WINGS). To do well on the test, the student has to read the meaning of the text. The TOSCRF measures a student's essential contextual reading abilities and reliably identifies students who are struggling with reading. It can also be used to periodically monitor reading progress."

12 Know What Data the Instrument Provides
Ask participants to look at this chart and talk about what was assessed on the two sample assessments provided. What kinds of data are important to guide instruction but are NOT provided by screening?
Image captured 4/1/10 at coe.sfsu.edu/crlp/els.php. Also in the Participant's Resource Packet.

13 Targeted Screening/Diagnostic Information
Students whose scores on screenings put them at risk for poor academic performance must be further evaluated. The goal of screening is simply to highlight who is at risk; screening does not guide instruction. Close observation of student performance in class and analysis of student work samples should always be part of this next level of evaluation. Further, targeted screening of students at risk should be done to guide instruction:
Decoding single words, sight words, reading rate
Background knowledge, vocabulary, syntax
Understanding text structure, self-monitoring, making inferences

Targeted screening does not provide information about what is causing the difficulties, but it is important because it helps to focus instruction on the particular areas of need. A student who is struggling at the single-word decoding level, for example, needs either focused phonological processing instruction or structured, sequential, and individualized phonics instruction. Further diagnostic assessment will indicate these areas of need.

This and the next few slides emphasize the importance of gathering information about students' area(s) of need in order to inform instruction. While further assessment does take time, it ensures that instructional time is not wasted but is targeted specifically to students' areas of need. The recommended reading materials for participants include several analyses of literacy assessments that focus on reading. These slides provide examples of other formal assessments that may be used to guide instruction.

14 Diagnostic Information is Necessary
Reading difficulty can take a variety of forms. We must know where to begin instructing the student. Targeted assessment and close observation can provide the diagnostic information to guide instruction. What are the student's specific areas of need in reading fluency? In reading comprehension? In writing? In listening and speaking?

15 Analysis of Errors
Expertise in language components enables assessors to analyze patterns of student errors to determine targeted areas for instruction. For example:
Student skips words, phrases, or lines of text.
Student omits prefixes or suffixes from words.
Student misreads multisyllabic words.
Student reads with little inflection or lack of response to punctuation.
Language Components: Phonemes, Morphemes, Syntax, Lexicon, Semantics, Prosody, Discourse, Pragmatics

16 Activity Categorizing Reading Errors Form three groups.
Using the handout, categorize the oral reading errors from the example assessment.
Ask participants to form three groups. They should review the sample scored CBM of oral reading fluency and use the handout to categorize the oral reading errors. This categorization can assist in guiding reading instruction (not exclusively, but to focus on particular areas of fluency difficulty). These documents are in the Participant's Resource Packet.
Note: This is a sample activity that assumes a good store of linguistic knowledge. Participants who do not have a background in assessing reading skills should be encouraged to analyze the assessments as well as they can, and to join a group with at least one reading specialist or educator with reading instruction background. At the conclusion of the activity, ask the larger group to spend a few minutes talking about how this analysis might make important differences in instruction.

17 Informal Reading Inventory
Qualitative Reading Inventory – 5th edition
Informal assessment instrument
Similar to CBM but not CBM
The QRI provides an example of further informal assessment that can assist educators in efforts to pinpoint particular areas of need. For example, the new version (5) includes prompts for the assessor to:
Ask the student an open-ended question to assess the extent of background knowledge (a key piece of reading comprehension)
Stop the student during reading and ask what he or she is thinking about, to assess self-monitoring (another key process of reading comprehension)

Product Description: This easy-to-use, best-selling collection of reading materials effectively assesses reading ability from emergent through high school levels. Qualitative Reading Inventory–5 includes both narrative and expository passages at each grade level, questions to assess prior knowledge, and word lists. Instructors can measure comprehension through passage retellings, implicit and explicit questions, and other devices. Based on the latest reading research, this comprehensive inventory focuses assessment on specific questions regarding word identification, fluency, and comprehension. It also provides suggestions for intervention instruction, procedures for assessment of strategic reading, and inclusion of results in classroom portfolios (from the Allyn & Bacon website).

18 Receptive Language PPVT-4 Peabody Picture Vocabulary Test
Ages: 2–6 through 90+ years
Administration: The PPVT-4 takes about 10 to 15 minutes
The PPVT-4 assesses oral comprehension/vocabulary development.

19 Receptive/Expressive Language
CREVT-2: Comprehensive Receptive and Expressive Vocabulary Test – Second Edition
Ages: 4–0 through 89–11
Administration: Individual; takes 20 to 30 minutes
CREVT-2 measures receptive and expressive oral vocabulary.

There is also a longer and more comprehensive test of receptive/expressive language with which many participants may be familiar:
Test of Adolescent and Adult Language – Fourth Edition (TOAL-4)
Ages: 12–0 through 24–11
Administration: Individual/group; 1–3 hours
Qualification level: B-Level
The TOAL-4 is efficient, reliable, and valid. It was designed to measure the spoken and written language abilities of adolescents and young adults with varying degrees of knowledge of the English language. It was normed on 1,671 individuals in 35 states, all between the ages of 12–0 and 24–11. It is nonbiased in regard to gender, race, and ethnicity.

20 Phonological Processing
Lindamood Auditory Conceptualization Test (LAC-3)
Ages: 5–0 through 18–11
Administration: Individual
The LAC-3 measures an individual's ability to perceive and conceptualize speech sounds using a visual medium.

21 Phonological Processing
CTOPP: Comprehensive Test of Phonological Processing
Ages: 5–0 through 24–11
Administration: Individual; 30 minutes
CTOPP assesses phonological awareness, phonological memory, and rapid naming.

22 Written Language TOWL-4: Test of Written Language — Fourth Edition
Ages: 9–0 through 17–11
Administration: Individual or group; 60–90 minutes
TOWL-4 identifies students who write poorly, determines students' particular strengths and weaknesses in various writing abilities, and documents students' progress in special writing programs.

23 Written Expression
WPT: Writing Process Test
Ages: 8 through 19
Administration: Individual; 45 minutes
WPT is a direct measure of writing that requires the student to plan, write, and revise an original composition. The WPT assesses both the written product and the writing process.

24 Other Areas We Should Assess
Cognitive: Skills and strategies
Experiential: Prior knowledge and lives outside school
Affective: Motivations and attitudes
Terms from Afflerbach, P. (2008). "Meaningful assessment of struggling adolescent readers." In Lenski, S., & Lewis, J. (Eds.), Reading success for struggling adolescent learners (pp. 249–264). New York, NY: Guilford Press. p. 254.
Most school assessments deal with the cognitive bubble. We need to keep in mind, however, the importance of the affective and experiential bubbles as well, particularly when it comes to adolescent learners at risk.

25 On Literacy Assessment
Our assessments must inform us about student characteristics, which can help us provide the most appropriate reading instruction and experiences. The related cognitive, affective, and experiential domains "represent important and powerful aspects of student learning, and they must be addressed if we are to have any hope of meeting struggling adolescent readers' needs."
Afflerbach, P. (2008). "Meaningful assessment of struggling adolescent readers." In Lenski, S., & Lewis, J. (Eds.), Reading success for struggling adolescent learners (pp. 249–264). New York, NY: Guilford Press. p. 254.

26 The Affective Domain Motivation to read Reading self-concept
General motivational style
Reading interests
Pitcher, Albright, DeLaney, Walker, Seunarinesingh, Mogge, et al., 2007
Lavoie, 2007
Hildebrandt, 2001

In addition to screening and gathering diagnostic information about students' cognitive abilities and skills, we must not forget the other domains that profoundly influence academic performance—the affective and the experiential. Most participants will readily acknowledge that student engagement and motivation are essential to making progress, and that students' academic lives are shaped as much by the experiential domain as by the cognitive.

Adolescents' motivation to read and reading self-concept can be assessed informally using the forms in the following article: Pitcher, S., Albright, L., DeLaney, C., Walker, N., Seunarinesingh, K., Mogge, S., et al. (2007). Assessing adolescents' motivation to read. Journal of Adolescent & Adult Literacy, 50(5), 378–396.

Rick Lavoie, in his 2007 book The Motivation Breakthrough: 6 Secrets to Turning On the Tuned-Out Child, includes many inventories that can be used with students to assess their overall motivational styles. If there is interest in student motivation, Lavoie's DVD of the same title provides much fodder for teacher discussion. Hildebrandt's 2001 article, "But there's nothing good to read," addresses librarians but is informative in terms of adolescent reading interests.

Assessment of the experiential domain for adolescents involves a variety of factors, some of which cross over with the Motivation to Read Profile mentioned above and included in the handout.

27 The Experiential Domain
Learning styles and thinking styles
Past school experiences
Parental involvement
Cultural background and expectations
Mental and physical health

In addition, as part of an overall approach to improving academic performance, assessments of learning styles and thinking styles can provide a tremendous amount of information that can help guide instruction and give students the vocabulary they need to advocate for themselves in any learning situation. There are numerous websites that purport to offer inventories of learning and thinking styles, but they either charge a fee or do not reflect assessments that have been developed, piloted, and published by educational and psychological researchers. The best and most comprehensive online assessment of learning and thinking styles (about 104 questions) requires only an address to which results can be sent; it is available at the Learning Disabilities Resource Community site.

28 Activity Take the Motivation to Read Profile Survey.
Pair up with a partner and engage in the Motivation to Read Profile Conversational Interview.

This activity is subject to time availability in the session. If participants do not do the activity, they should receive the handout to read through in their own time, as it is a valuable process for gaining insight about their students.

Distribute the Motivation to Read handout. Ask participants to turn to pp. 381–2. As the facilitator, read the instructions on p. 387, and tell participants to follow your instructions. Ask participants to read the scoring instructions on pp. 389–90. If there is time available, ask participants to engage in the Conversational Interview. Read over the assessor instructions for the Conversational Interview on p. 388, and then look over the interview questions on pp. 383–6. Ask participants to pair up and choose who will be the assessor and who will be the student. Use part or all of the Conversational Interview.

29 For Next Time Read the assigned readings.
Consider taking the online thinking styles inventory at the Learning Disabilities Resource Community site.
Determine one aspect of your school assessment plan that needs work.
Get information about one of the assessments mentioned and be ready to share with the group what the assessment tests, what age group it is appropriate for, how long it takes, how much training is required, and the cost.
Create, administer, and score a CBM oral reading assessment and a maze assessment with a group of students, and come to the next session prepared to discuss what you learned.

