The Learning Behaviors Scale

Presentation transcript:

The Learning Behaviors Scale
P. A. McDermott, L. F. Green, J. M. Francis, & D. H. Stott

Description of the LBS I
- 29 items, each presenting a specific learning-related behavior
- The observer indicates whether the behavior Most often applies, Sometimes applies, or Does not apply
- Some items describe positive learning behaviors and others describe negative behaviors, to reduce response sets

LBS Description II
The 29 items provide 4 subscale scores:
- Competence Motivation (8 items)
- Attitude Toward Learning (9 items)
- Attention/Persistence (7 items)
- Strategy/Flexibility (7 items)
Subscales allow for targeted intervention. You can also obtain a global LBS score.

Context for Development I
- Knowing that a student is “bright” or “not as bright” does not fully explain performance
- Knowing a student’s intellectual capacity provides limited information for intervention
- What do you do when faced with a “bright” student who is not doing well?

Context for Development II
There are behaviors associated with learning. What behaviors do you think are associated with learning? Some of them are as follows:
- Listening attentively
- Participating in classroom activities
- Accepting correction
- Sticking to tasks until completed
- Working to please the teacher

Context for Development III
- There is evidence that these learning behaviors are teachable.
- Idiographic data are useful in this arena but time-consuming to gather.
- We did not have good nomothetic data on learning behaviors.
- However, we know that school teachers are relatively accurate, reliable, unobtrusive, cost-beneficial observers of classroom behavior when they have had ample opportunity to observe.

Context for Development IV
In light of the previous information, the authors of the LBS set out to develop a scale to measure learning behaviors reliably and validly in 5- to 17-year-olds, using teacher observation. Research on the LBS started in the mid-1980s, and the scale was published in 1999.

Preliminary Research Examined
- Reliability of subscale scores and the total score in small samples
- Validity of subscale and total scores in small samples
This preliminary work produced a scale that worked well.

US Standardization Sample
- 1,500 students: 750 males and 750 females, ages 5 to 17
- Used the 1992 U.S. Census to obtain demographics
- Blocking for sex, age, and grade in school
- Stratified random sampling by race, class, family structure, community size, and geographic region
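
For readers who want to see the sampling logic concretely, here is a minimal sketch of proportional stratified sampling in Python. The DataFrame `pool`, its column names, and the helper `stratified_sample` are hypothetical illustrations, not the authors' actual procedure.

```python
# Illustrative sketch only -- not the LBS authors' actual sampling procedure.
# `pool` is a hypothetical DataFrame of consented students with one column
# per stratification variable.
import pandas as pd

def stratified_sample(pool: pd.DataFrame, strata_cols, n_total, seed=0):
    """Draw a sample whose strata are represented in proportion to the pool."""
    frac = n_total / len(pool)
    return pool.groupby(strata_cols).sample(frac=frac, random_state=seed)

# Hypothetical usage: target n = 1,500, stratified by race, SES,
# family structure, community size, and geographic region.
# norm_sample = stratified_sample(
#     pool, ["race", "ses", "family_structure", "community_size", "region"], 1500)
```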

Final Norm Sample
- Race: 67.7% White, 15.9% Hispanic, 12.1% African American, 4.3% other groups
- Family: 76.6% with two parents or guardians, 21.3% single mother, 2.1% single father
- Representation by SES (based on parent education) and by exceptionality
- Final participants were selected at random from those who gave consent, restricted only by the stratification quotas and a limit of ≤ 2 students per teacher

Test-Retest Reliability (n = 77)
- Competence Motivation: .92
- Attitude Toward Learning: .91
- Attention/Persistence: .92
- Strategy/Flexibility: .93

Inter-Rater Reliability (n = 72)
- Competence Motivation: .83
- Attitude Toward Learning: .83
- Attention/Persistence: .83
- Strategy/Flexibility: .83
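
Both the test-retest and inter-rater estimates above are correlations between two sets of ratings of the same students (two occasions, or two teachers). A minimal sketch with hypothetical score arrays:

```python
# Minimal sketch: a test-retest (or inter-rater) reliability estimate is a
# correlation between two ratings of the same students. The score arrays
# below are hypothetical, generated only to make the example runnable.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
first_rating = rng.integers(0, 19, size=77)                              # e.g., SF raw scores, time 1
second_rating = np.clip(first_rating + rng.integers(-2, 3, 77), 0, 18)   # time 2 (or rater 2)

r, _ = pearsonr(first_rating, second_rating)
print(f"reliability estimate r = {r:.2f}")
```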

Internal Consistency I

Internal Consistency II
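
The internal consistency estimates referenced on these two slides are typically coefficient (Cronbach's) alpha values computed from the item responses. A minimal sketch, assuming a hypothetical students-by-items matrix of 0/1/2 ratings:

```python
# Minimal sketch of coefficient (Cronbach's) alpha for one subscale.
# `items` is a hypothetical (n_students x n_items) array of 0/1/2 ratings;
# real LBS data would of course yield higher values than random numbers.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
items = rng.integers(0, 3, size=(100, 8))   # e.g., 8 Competence Motivation items
print(f"alpha = {cronbach_alpha(items):.2f}")
```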

Validity Coefficients

Trinidad and Tobago (T & T) Sample
The LBS was completed on all 700 students in the sample, with no rater missing more than 2 items. The distribution was skewed toward the higher end; that is, most students were rated as having learning behaviors in the normal range.
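
The shape of the distribution can be checked directly from the total scores. A minimal sketch with hypothetical data (the actual T & T scores are not reproduced here):

```python
# Minimal sketch: quantifying the skew of the total-score distribution.
# A negative skew statistic means scores pile up at the high end, as
# described for the T & T sample. `total_scores` is hypothetical data.
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
total_scores = np.clip(rng.normal(loc=45, scale=8, size=700), 0, 54)

print(f"skewness = {skew(total_scores):.2f}")
```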

Factor Analyses
As recommended, we used factor analysis to examine the structural validity of the LBS in the T & T sample. We used multiple criteria to determine how many factors would work best. The goal was to find a factor structure that generalized across the whole sample as well as across the gender subgroups.

Factor Analyses 2
In the US, the LBS is made up of four factors. We ran five-factor, four-factor, three-factor, and two-factor models. The only structure that generalized from the whole sample across gender groups was the two-factor one (see p. 8 in the manual and note the pattern coefficients).
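
A minimal sketch of how competing factor solutions can be fit in Python is shown below; it illustrates exploratory factor analysis generally, with a hypothetical response matrix, and does not reproduce the estimation and rotation choices reported in the manual.

```python
# Illustrative sketch of fitting competing exploratory factor models
# (not the specific estimation/rotation methods used by the LBS authors).
# `X` is a hypothetical (n_students x 29 items) matrix of 0/1/2 ratings.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(700, 29)).astype(float)

for n_factors in (5, 4, 3, 2):
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(X)
    # fa.components_ holds the (n_factors x 29) loading pattern; the same
    # model would be refit within gender subgroups to check whether the
    # structure generalizes across the whole sample and the subgroups.
    print(f"{n_factors}-factor model, mean log-likelihood: {fa.score(X):.2f}")
```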

Factor Analyses 3
Factor I consists of 18 items and is labeled Attitude Toward Learning (AL). The items on this factor made up the Competence Motivation, Attention/Persistence, and Attitude Toward Learning subscales in the U.S. structure. Factor II, labeled Strategy Flexibility (SF), consists of the 7 SF items in the U.S. norming but of 9 items in Trinidad. The AL and SF factors have two items in common (items 10 & 14).

Attitude Toward Learning Items

Strategy Flexibility

Let’s look at reliability estimates for the T & T scores

Reliability and Validity Evidence
- Reliability estimates for total scale scores were consistently high across all subgroups.
- Reliability estimates for subscale scores were very high for Factor I and moderate for Factor II.
- No reliability estimates fell below .75.
- There were no statistically significant differences between genders, among ethnic groups, or among grade levels.

Administration
- Speak to a teacher who has seen the student for at least 6 school weeks or 30 days.
- Ask the teacher to rate the student as accurately as possible.
- Let the teacher know that this information will help in your assessment of the student.
- The teacher should respond to every item.
- The scale requires 5 to 10 minutes to complete.

Scoring
- Use the scoring template to obtain a raw score for each dimension.
- Raw scores for Factor I (AL) range from 0 to 36.
- Raw scores for Factor II (SF) range from 0 to 18.
- Raw scores on the Total Scale range from 0 to 54.
- Enter the raw scores in the boxes on the Score Summary sheet.
- Convert raw scores to percentiles using the table on p. 10.
- Always double-check your scoring.
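
A minimal sketch of the scoring arithmetic follows. The item-to-factor assignments and the percentile lookup values are hypothetical placeholders; the real assignments are on the scoring template and the real norms are in the table on p. 10.

```python
# Illustrative sketch of LBS-style scoring arithmetic. The item assignments
# and the percentile table below are HYPOTHETICAL placeholders; the real ones
# come from the scoring template and the norm table on p. 10. Items are rated
# 0, 1, or 2, so 18 AL items give 0-36 and 9 SF items give 0-18.

AL_ITEMS = range(1, 19)        # hypothetical: items 1-18 on Factor I (AL)
SF_ITEMS = range(19, 28)       # hypothetical: items 19-27 on Factor II (SF)

# hypothetical fragment of a total-raw-score -> percentile lookup
TOTAL_PERCENTILES = {54: 99, 50: 85, 45: 60, 40: 40, 35: 20, 30: 10, 0: 1}

def raw_scores(ratings: dict) -> dict:
    """Sum 0/1/2 item ratings into factor raw scores and a total raw score."""
    al = sum(ratings[i] for i in AL_ITEMS)
    sf = sum(ratings[i] for i in SF_ITEMS)
    return {"AL": al, "SF": sf, "Total": al + sf}

def to_percentile(total_raw: int) -> int:
    """Look up the highest tabled raw score at or below the obtained score."""
    tabled = max(k for k in TOTAL_PERCENTILES if k <= total_raw)
    return TOTAL_PERCENTILES[tabled]

# Hypothetical usage:
# scores = raw_scores({i: 2 for i in range(1, 28)})   # every item rated 2
# print(scores, to_percentile(scores["Total"]))
```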

Interpretation I
- Scores on the LBS should be only one part of a broader psychoeducational evaluation.
- Higher scores represent the presence of more learning behaviors.
- Students who obtain scores at or above the 40th percentile are displaying learning behaviors at or above the average range.

Interpretation II
- Students whose learning behaviors fall between the 20th and 40th percentiles may benefit from interventions aimed at increasing their learning behaviors in general.
- Students whose scores fall below the 20th percentile are manifesting deficits in learning behaviors and may benefit from immediate interventions.
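
These two interpretation slides amount to three percentile bands, which can be written down directly. A minimal sketch (cutoffs from the slides; the function name is mine):

```python
# Interpretive bands from the Interpretation I and II slides:
#   >= 40th percentile : learning behaviors at or above the average range
#   20th to 39th       : may benefit from general learning-behavior interventions
#   <  20th percentile : deficits; may benefit from immediate interventions
def interpret_lbs_percentile(pct: float) -> str:
    if pct >= 40:
        return "at or above the average range"
    if pct >= 20:
        return "may benefit from interventions to increase learning behaviors"
    return "deficits in learning behaviors; consider immediate interventions"

print(interpret_lbs_percentile(35))   # -> "may benefit from interventions ..."
```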

Let’s Practice