Kajal Shah, Karina Silva, and Natalya Koehler



Rationale for Development  The program was designed to test a recent prescriptive theory of inductive reasoning by testing a strategy that enables participants to solve any kind of inductive reasoning problem  It is accessible on the web site: 2/18/2016

Description of Prototype  The prototype consists of an Introduction, 6 lessons with quizzes, and 2 extra quizzes  Problems and quizzes increase in complexity  Quizzes help assess a student’s understanding of the material

Navigation Structure  The navigation combines linear and tree-model structures  At some points students have no choice but to follow the path constructed by the designer  At other points it is up to students to choose the direction of the instruction

Database for Data Collection  Collects data to track each student’s behavior and performance  A unique user ID is assigned when a student registers with the program  The administrator assigns class codes to different grades in different schools in Iowa  One can take the training independently under the class code GENERAL
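As a rough illustration of the kind of tracking database described above, the sketch below models students (with their registration-assigned user ID and class code) and logged events in SQLite. All table and column names here are assumptions for illustration, not the program's actual schema.

```python
import sqlite3

# Hypothetical sketch of a student-tracking database; the schema is
# illustrative, not the program's real one.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE students (
    user_id    INTEGER PRIMARY KEY,   -- unique ID assigned at registration
    class_code TEXT NOT NULL          -- e.g. 'GENERAL' for independent users
);
CREATE TABLE events (
    event_id  INTEGER PRIMARY KEY,
    user_id   INTEGER REFERENCES students(user_id),
    lesson    TEXT,                   -- lesson or quiz identifier
    action    TEXT,                   -- e.g. 'answer_submitted'
    correct   INTEGER                 -- 1/0 for quiz answers, NULL otherwise
);
""")

# Register a student under the GENERAL class code and log one quiz answer.
conn.execute("INSERT INTO students (user_id, class_code) VALUES (1, 'GENERAL')")
conn.execute(
    "INSERT INTO events (user_id, lesson, action, correct) VALUES (?, ?, ?, ?)",
    (1, "lesson1_quiz", "answer_submitted", 1),
)
row = conn.execute(
    "SELECT class_code, action FROM events JOIN students USING (user_id)"
).fetchone()
print(row)  # ('GENERAL', 'answer_submitted')
```

A design like this is what makes the "navigation report from the program database" used later in the evaluation possible: each logged action can be replayed per user.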

Target Audience  Ages: 10 and 11 years old  The program is designed for 4th-grade students in Iowa public schools  It can also be used as a remedial activity for students up to sixth grade who have difficulty in math.

Data Collection Process: Identifying Evaluation Questions  Divergent Phase:  Interviewing main stakeholders to gain an idea of their expectations and concerns about the product, learn what information they seek from the evaluation, and learn which aspects they consider key to meeting the program’s goals.  21 concerns were identified: concerns about content material and instruction (6), concerns about learning (4), and concerns about design and navigation (11)

Evaluation Questions  Convergent Phase:  #1 Are participants able to navigate the program?  Measures: ease of use, by error rate while performing performance-based tasks  #2 Are navigation instructions appropriate and sufficient to allow participants ease of navigation?  Measures: clarity of instructions for the target audience and navigation errors caused by lack of instructions
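Question #1 operationalizes ease of use as an error rate on performance-based tasks. A minimal sketch of how such a rate could be tallied from coded observations follows; the task names and outcome coding are illustrative, not the study's actual data.

```python
from collections import Counter

def error_rate(observations):
    """Per-task error rate from (task, outcome) pairs, outcome 'ok' or 'error'."""
    errors = Counter(task for task, outcome in observations if outcome == "error")
    attempts = Counter(task for task, _ in observations)
    return {task: errors[task] / attempts[task] for task in attempts}

# Illustrative coded observations from a test session.
obs = [
    ("register", "ok"), ("register", "ok"),
    ("login", "ok"), ("login", "error"),
    ("open_quiz", "ok"),
]
rates = error_rate(obs)
print(rates)  # {'register': 0.0, 'login': 0.5, 'open_quiz': 0.0}
```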

Evaluation Questions  #3 Do the vocabulary and language of the lesson instructions suit the target audience?  Measures: participants’ ability to comprehend lesson instructions  #4 How many of the performance tasks are participants able to complete successfully?  Measures: overall effectiveness of the program

Evaluation Questions  #5 Do participants find the visual design appealing, and does it meet their expectations?  Measures: likeability and preference  #6 How do participants feel about their experience with the program?  Measures: satisfaction, likeability, ease of use, and areas of difficulty

Evaluative Instruments and Matrix of Objectives  Think-aloud protocol  Performance-based tasks  Observation grid  Debriefing  Navigation report from the program database
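A matrix of objectives crosses each evaluation question with the instruments that supply data for it. The sketch below shows one way to encode such a matrix; the particular question-to-instrument mapping is an assumption for illustration, not the study's actual matrix.

```python
# Hypothetical matrix of objectives: instrument -> evaluation questions
# (numbered 1-6 as in the slides). The mapping shown is illustrative.
matrix = {
    "think-aloud protocol":       {1, 2, 3, 6},
    "performance-based tasks":    {1, 4},
    "observation grid":           {1, 2, 6},
    "debriefing":                 {3, 5, 6},
    "database navigation report": {1, 4},
}

def instruments_for(question):
    """Instruments that provide data for a given evaluation question."""
    return sorted(name for name, qs in matrix.items() if question in qs)

print(instruments_for(5))  # ['debriefing']
```

Encoding the matrix this way makes it easy to check that every question is covered by at least two instruments, which supports the triangulation step in the data analysis.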

Participants  Participants drawn from the target audience: two 4th graders, one 5th grader, and one 6th grader  Computer experience: use computers both at home and at school for different purposes  Language proficiency: near-native speakers of English.

Process: Challenges and Modifications  Think-aloud protocol: had participants work in pairs and discuss with each other  Microphone and video recording to gather subtle non-verbal information  Observation grid: a checklist of actions and behaviors  Time to get used to the testing area and equipment, with snacks  Background questionnaire using an interview protocol  Debriefing using an interview protocol and a walk-through of specific problem pages

Data Analysis Process  Data were typed and compiled to facilitate analysis.  The videos were watched and comments from the think-aloud were transcribed.  Data were summarized to address the evaluation questions.  A qualitative analysis of the data was conducted.  The findings were triangulated across instruments.

Findings  Program navigation: Participants were able to navigate the site with little to no help. ○ They could register and log in, understand error messages, and figure out solutions to the problems. ○ The set of buttons was intuitive. However… ○ They had problems with the Summary Page. ○ The 5th/6th graders had problems with the feedback on one quiz question. ○ They expected to interact more with the program.

Findings (2)  Navigation instructions: Sometimes participants did not know how to go about answering a question. Some of the instructions seemed too long or unclear (participants yawned). Top-down perception error: some of the examples given were too close to the answers, which confused the 4th graders.

Findings (3)  Vocabulary and language of instruction: Both pairs claimed to have had no problems reading and understanding the instructions. However… ○ The video recordings showed them having trouble reading some words, such as characteristics and appropriate. ○ Reading the instructions again more carefully, we found an example that did not make much sense even to us.

Findings (4)  Task completion: Both pairs completed all the tasks successfully.

Findings (5)  Visual design: All participants liked the program (“very very cool”). They also suggested changes: ○ Bigger and clearer pictures ○ Feedback that is easier to see ○ Back to Summary Page button (4th graders): use a brighter color and change the label to ‘Back to score page’ ○ 5th graders: give specific instructions to press the ‘Back to Summary Page’ button

Findings (6)  Participant experience: The questions made them think. They liked the instructions. The 4th graders thought the quizzes were hard; the 5th/6th graders liked them. Both pairs said they would like to see some drag and drop. The 4th graders suggested that the questions be categorized by level of difficulty. Non-verbal behavior showed the participants’ interest.

Recommendations  Instructions: Reduce the number of instructions and present them in an interactive way. Include a note on the first page telling users not to use the browser/keyboard buttons. Mention that it does not matter whether letters are capitalized on the quiz and login pages. Mention that some questions may require the user to work out the fractions before they can see the characteristics or relationships.

Recommendations  Navigation: Add an error message that gives users a chance to correct their navigation mistakes.  Language: Use simpler words (e.g., correct instead of appropriate). Reword some of the sentences on the instruction pages.  More interactivity: drag and drop, clickable items, and space for typing tentative answers.

Recommendations  Visual design: Make pictures bigger and clearer. Make feedback easier to see, or use a pop-up window. Make the previous/back buttons invisible on the Review Answers page.  Program structure: Make the program less linear and closed: allow users to go back in quizzes and change answers, or to retake a quiz under the same user name. This will allow them to demonstrate cognitive change and learning.

Limitations of the Usability Testing  We told participants not to use the back and forward buttons.  The observation grid could be improved.  We might have lost important data because we had to replace a disk.  The kids were highly motivated and computer savvy.  The database did not record some of the events.

Methodological Challenges of Working with Children  Children do not remember specific things.  Children may not understand terms such as ‘drag and drop’, ‘feedback’, and ‘interactivity’.  They might want to please the adult.  Non-verbal gestures might reveal more than specific debriefing questions.  Having them work in pairs is a great way to get their initial reactions.

Recommendations for Further Testing  Conduct the test separately with each pair of children.  Consider including more children at different levels of linguistic proficiency (e.g., ESL learners and remedial students).  Conduct a brief survey in schools to build a good user profile and gather feedback.