1 Technology Enhanced Items – Signal or Noise?
Are We Delivering on Our Promise of Better Measurement Fidelity with TEI?
Jon S. Twing, Ph.D., Senior Vice President, Psychometrics & Testing Services
2017 National Conference on Student Assessment, June 28–30, 2017, Austin, Texas

2 What WERE the claims that we made?
“Advances in technology … make it possible for us to obtain a richer, more intelligent, and more nuanced picture of what students know and can do than ever before.”
“To measure the intended constructs, the tests will likely need to use a range of tasks and stimulus materials, and will need to include more than traditional multiple-choice questions.”
— Lazer, S., et al. (2010)

3 Specific and Detailed “Stretch” Goals
Our task design should be guided by the general goal of measuring each construct as validly, effectively, and thoroughly as possible. These may include:
- scenario-based tasks
- long and short constructed-response tasks
- tasks that involve the exercise of technology skills, and simulations
- audiovisual stimuli; speaking and listening
— Lazer, S., et al. (2010)

4 What Did I Say? I took a simpler approach:
- Comparing technology-enhanced measurement with current standards of measurement sets the bar too low.
- We can’t assume the fidelity/validity of current measures is the standard against which we want to compare future measures.

5 Consider the “Water Cycle” Example

6 The Evidence Base
In my industry, our research is very applied and usually in direct service to our customers’ questions regarding implementation and support of policy. This limits, to some extent, the types of experimental designs we can implement. As such, much of the evidence currently comes from cognitive labs, surveys, observations, or usability investigations.

7 Put Up or Shut Up!
- For technology-enhanced items, the stronger the student’s content knowledge, the less technology mattered.
- Most usability issues were so easily recovered from, or so closely related to content-knowledge gaps, that disentangling the effect of one from the other was unachievable.
- Supporting technology tools might reduce the burden on working memory.

8 Put Up or Shut Up (Continued)!
- In general, and for most students, layout and overall formatting of items did not appear to be a significant factor in determining the usability of items.
- Language, directions, and consistency are important.
- Selection of the type of TEI used to measure the content matters.
- Similar items rendered in different TEI (QTI) types yielded different outcomes.

9 Put Up or Shut Up (Continued)!
- More issues with TEI-by-student interactions arose on tablets than on computers.
- Students used scratch paper despite online tools, and not just for math calculations.
- Scrolling reading passages was anticipated and seemed well understood by students.
- Highlighting was slightly more difficult on a tablet.

10 Summary and My Conclusions
- Bold claims were made about the value of TEI in the improvement of our measures.
- It seems that before we get too innovative with a “…richer, more intelligent, and more nuanced picture of what students know and can do…”, we need to revisit fundamentals, like sources of construct-irrelevant variance.

11 Summary and My Conclusions (Cont.)
- Content knowledge and technology interact in a manner similar to before TEI, but more likely so under TEI.
- Working memory might be a concern for TEI.
- Language, directions, formats, and item types do seem to matter, and their effects may be exacerbated under TEI.

12 But We Also Said…
“If we are to do something new and different, it is necessary that our items and tests be developed with an awareness of how students learn.”
“A test built around an understanding of available learning progressions is likely to be a better provider of information to formative components of the system.”
“Items that model good learning and instruction should make ‘teaching to the test’ less of a problem.”
— Lazer, S., et al. (2010)

13 An Example TEI
This simulation is an example of a high-school science measure intended to engage and evaluate students’ knowledge and understanding of enzyme–substrate interactions via experimentation.

14 Copyright © 2014 Pearson, Inc. or its affiliates. All rights reserved
Contacts
Jon S. Twing, Ph.D.
Senior Vice President, Pearson, School Assessments
(Iowa City) (San Antonio) (US Cell Phone) +44-(0) (UK Cell Phone)

15 References
Lazer, S., Mazzeo, J., Twing, J. S., Way, W. D., Camara, W. J., & Sweeney, K. (2010). Thoughts on an Assessment of Common Core Standards. ETS, Pearson, CEEB; published online.
Twing, J. S. (2011). The Power of Technology: The Water Cycle. Pearson Video Series; published online.
ACARA (2015). NAPLAN Online Research and Development. Published online, NAP.

16 References (continued)
Davis, L. L., Strain-Seymour, E., & Gay, H. (2013). Testing on Tablets: Part I of a Series of Usability Studies on the Use of Tablets for K–12 Assessment Programs. Pearson White Paper.
Davis, L. L., Strain-Seymour, E., & Gay, H. (2013). Testing on Tablets: Part II of a Series of Usability Studies on the Use of Tablets for K–12 Assessment Programs. Pearson White Paper.

17 References (continued)
Davis, L. L., Kong, X., & McBride, Y. (2015). Device Comparability of Tablets and Computers for Assessment Purposes. Paper presented at the national NCME conference.

18 Experiment Simulation Backup Slides

19 Experiment Simulation Backup Slides (Cont.)

20 Experiment Simulation Backup Slides (Cont.)

21 Experiment Simulation Backup Slides (Cont.)

22 Experiment Simulation Backup Slides (Cont.)

23 Experiment Simulation Backup Slides (Cont.)
