Development of a diagnostic system using a testing-based approach for strengthening student prior knowledge
Computers & Education (September 2011)
Yi-Chun Lin, Yen-Ting Lin, Yueh-Min Huang*
Department of Engineering Science, National Cheng Kung University

Introduction
 Goal: assist instructors in diagnosing and strengthening students' prior knowledge before new instruction, and enable students to attain greater learning motivation and improved learning performance
 A testing-based diagnosis system is proposed in this study to cope with these problems

Methodology
 To measure the strength of understanding of prior knowledge, a prior knowledge diagnosis (PKD) model is proposed
 Two data sources:
 Testing information assigned by teachers: represents the relationship between each concept and test item in a test, and the relationships among the concepts
 Testing information derived from students: represents the relationship between students' answers and the test items

Methodology
 A course specifies n concepts: $C_1, C_2, C_3, \dots, C_i, \dots, C_m, \dots, C_n$
 Prior knowledge of the subject is measured for r participating students: $S_1, S_2, S_3, \dots, S_l, \dots, S_r$
 The teacher selects k test items from the test item bank to form the pre-test: $T_1, T_2, T_3, \dots, T_j, \dots, T_k$

Methodology
 $X_{mj}$ indicates the degree of relevance between the m-th concept and the j-th test item; the matrix $X$ represents the degrees of relevance between each concept and test item
 $Z_{im}$ indicates the relationship between the i-th and the m-th concepts, ranging from 0 to 1; the matrix $Z$ represents the relationships between the concepts
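To make the later formulas concrete, here is a minimal Python sketch of the two teacher-assigned matrices. The numeric values are hypothetical placeholders, not the values from the paper's illustrative example:

```python
import numpy as np

# Hypothetical values for illustration only (NOT the paper's data).
# Z[i, m]: relationship between the i-th and m-th concepts, in [0, 1].
Z = np.array([
    [1.0, 0.0, 0.8, 0.0, 0.0],
    [0.0, 1.0, 0.5, 0.0, 0.7],
    [0.8, 0.5, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 1.0, 0.3],
    [0.0, 0.7, 0.0, 0.3, 1.0],
])

# X[m, j]: degree of relevance of the m-th concept to the j-th item, in [0, 1].
X = np.array([
    [0.6, 0.8, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.7, 0.5, 0.0],
    [0.0, 0.6, 0.0, 0.0, 0.4],
    [0.0, 0.0, 0.0, 0.3, 0.9],
    [0.0, 0.4, 0.0, 0.4, 0.0],
])
```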

Methodology - Strength of concept
 The strength of concept $C_i$ in the pre-test:
$$S(C_i) = \sum_{m=1}^{n} \sum_{j=1}^{k} Z_{im} X_{mj}, \qquad 0 \le S(C_i) \le nk$$
 $Z_{im}$ represents the relationship between the i-th and the m-th concepts, $0 \le Z_{im} \le 1$
 $X_{mj}$ indicates the degree of relevance between the m-th concept and the j-th test item, $0 \le X_{mj} \le 1$
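A direct transcription of this definition, reusing the hypothetical Z and X from the sketch above (the function name is ours, not the paper's):

```python
def concept_strength(Z, X):
    # S(C_i) = sum over m, j of Z[i, m] * X[m, j]; each value lies in [0, n*k].
    return (Z @ X).sum(axis=1)

S = concept_strength(Z, X)  # one strength value per concept, shape (n,)
```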

Methodology - Importance ratio of concept
 The importance ratio of concept $C_i$ in the pre-test:
$$IRP(C_i) = \frac{S(C_i)}{\sum_{i=1}^{n} S(C_i)}, \qquad 0 \le IRP(C_i) \le 1$$
 $Z_{im}$ represents the relationship between the i-th and m-th concepts, $0 \le Z_{im} \le 1$
 $X_{mj}$ indicates the relevance degree between the m-th concept and the j-th test item, $0 \le X_{mj} \le 1$
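The normalisation matches the worked example later in the slides (IRP(C_2) = 1.88 / 7.52 = 0.25), so a sketch is a one-line division:

```python
def importance_ratio(S):
    # IRP(C_i) = S(C_i) / sum_i S(C_i); the ratios sum to 1.
    return S / S.sum()

IRP = importance_ratio(S)
```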

Methodology - Understanding strength of the l-th student
 The understanding strength of the l-th student on the i-th concept:
$$US(S_l, C_i) = \sum_{j=1}^{k} \sum_{m=1}^{n} R_{lj} Z_{im} X_{mj}$$
 $R_{lj}$ indicates the answer of the l-th student on the j-th test item: if the student answers the test item correctly, $R_{lj}$ is 1; otherwise $R_{lj}$ is 0
 $Z_{im}$ represents the relationship between the i-th and the m-th concepts
 $X_{mj}$ indicates the degree of relevance between the m-th concept and the j-th test item
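The triple product sums over all items and concepts, which is exactly the (l, i) entry of the matrix product R(ZX)ᵀ (the term-by-term expansion appears in the illustrative example below). A sketch with a made-up answer matrix R:

```python
# Hypothetical answers: R[l, j] = 1 if student l answered item j correctly.
R = np.array([
    [1, 0, 1, 1, 0],
    [1, 1, 1, 0, 1],
    [0, 0, 1, 1, 1],
    [1, 1, 0, 1, 0],
    [0, 1, 1, 1, 1],
])

def understanding_strength(R, Z, X):
    # US(S_l, C_i) = sum over j, m of R[l, j] * Z[i, m] * X[m, j]
    return R @ (Z @ X).T  # shape: (students, concepts)

US = understanding_strength(R, Z, X)
```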

Methodology - Understanding strength of the concept
 A linear function translates the importance ratio of the concept into the threshold on the understanding strength of the concept:
$$t(C_i) = m \cdot IRP(C_i) + b$$
 $t(C_i)$ represents the threshold value of the i-th concept, $0 \le t(C_i) \le 1$
 $m$ indicates the gradient of the function, $m = 1$
 $IRP(C_i)$ represents the importance ratio of concept $C_i$ in the pre-test, $0 \le IRP(C_i) \le 1$
 $b$ is the point at which the line crosses the y-axis, $b = 0$
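A sketch of the threshold function. How a student's US value is normalised before being compared with t(C_i) is not spelled out on the slide, so the ratio below (each student's understanding strengths divided by their total across concepts, mirroring how IRP normalises S) is our assumption:

```python
def concept_threshold(IRP, m=1.0, b=0.0):
    # t(C_i) = m * IRP(C_i) + b; the slides fix m = 1 and b = 0.
    return m * IRP + b

def weak_concepts(US, t):
    # ASSUMPTION: normalise each student's understanding strengths so they
    # sum to 1, making them comparable with the thresholds t(C_i).
    ratio = US / US.sum(axis=1, keepdims=True)
    return ratio < t  # True where prior knowledge should be strengthened

weak = weak_concepts(US, concept_threshold(IRP))
```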

Illustrative example
[Table: illustrative degrees of relevance $X_{mj}$ between test items T1–T5 and concepts C1–C5]

Illustrative example
[Table: illustrative relationships $Z_{im}$ among concepts C1–C5]

Strength of Concept
 The strength of the first concept ($C_1$): $S(C_1) = Z_{11}X_{11} + Z_{11}X_{12} + Z_{13}X_{32} = 1.44$
 The strength of the second concept ($C_2$): $S(C_2) = Z_{21}X_{23} + Z_{21}X_{24} + Z_{23}X_{32} + Z_{25}X_{52} + Z_{25}X_{54} = 1.88$

Importance Ratio of Concept
 The importance ratio of the second concept ($C_2$): $IRP(C_2) = 1.88 / 7.52 = 0.25$
[Table: importance ratios $IRP(C_i)$ for concepts C1–C5]

Illustrative example
[Table: illustrative answers $R_{lj}$ of students S1–S5 on test items T1–T5]

Illustrative example
 Combining the relationship between students' answers and test items (R), the relationships between concepts (Z), and the relationships between test items and concepts (X) yields the relationships between students' understanding strengths and concepts (USS)
 For example, the understanding strength of student $S_4$ on concept $C_5$ expands to $US(S_4, C_5) = \sum_{j=1}^{5} \sum_{m=1}^{5} R_{4j} Z_{5m} X_{mj}$
[Tables: R, Z, X, and the resulting USS matrix for students S1–S5 and concepts C1–C5]
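In matrix form the whole grid of products collapses to one expression. A sketch reusing the hypothetical R, Z, and X from the earlier snippets (0-based indices, so student S4 is row 3 and concept C5 is column 4):

```python
import numpy as np

USS = R @ (Z @ X).T  # understanding strengths for every student and concept

# The same entry written out term by term, matching the expansion above:
us_45 = sum(R[3, j] * Z[4, m] * X[m, j]
            for j in range(X.shape[1])
            for m in range(Z.shape[0]))
assert np.isclose(USS[3, 4], us_45)
```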

Illustrative example
[Table: threshold values $t(C_i)$ for concepts C1–C5]
* $m = 1$, $b = 0$ for the threshold function in this case

PKT&D System Architecture

Experiment
 Participants:
 A course instructor
 80 university students
 Course: bioinformatics
 Groups:
 Experimental group: 40 students (used the PKT&D system)
 Control group: 40 students (did not use the PKT&D system)

Experiment
 Subject: sequence analysis approaches and tools
 Concepts in prior knowledge: sequence characteristics and structures, statistical hypothesis testing, and formula expression format
 Instruction units (30 min each):
 Understanding the importance of similarity: guided questions (5), slide presentation (15), discussions (10)
 Introduction to the most popular data-mining tool, BLAST: guided questions (5), slide presentation (15), practice (10)
 BLASTing protein sequences: guided questions (5), slide presentation (10), practice (15)
 Understanding BLAST output: slide presentation (15), discussions (15)
 BLASTing DNA sequences: guided questions (5), slide presentation (10), practice (15)
 The BLAST way of doing things: slide presentation (15), practice (15)

Experiment Process

The learning motivation post-test scores of the two groups
 To measure the students' learning motivation, the Motivated Strategies for Learning Questionnaire (MSLQ) was adopted in this study
 Nine questionnaire items and a seven-point Likert scale were used
[Table: mean, S.D., adjusted mean, F(1, 77), and p-value for the experimental group (N = 40) and control group (N = 40); * marks a significant group difference]
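The F(1, 77) statistic with adjusted means indicates an ANCOVA with the pre-test motivation score as covariate. A sketch with statsmodels on made-up data (the column names and scores are hypothetical, not the study's):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical MSLQ scores (seven-point scale), 40 students per group.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "group": ["experimental"] * 40 + ["control"] * 40,
    "pre": rng.uniform(3.0, 6.0, 80),
})
df["post"] = (df["pre"] + rng.normal(0.5, 0.5, 80)
              + np.where(df["group"] == "experimental", 0.4, 0.0))

# ANCOVA: post-test motivation by group, adjusting for the pre-test score.
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # group effect tested with F(1, 77)
```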

The paired t-test results of learning motivation for the two groups of students
[Table: pre-test vs. post-test motivation mean, S.D., and t(39) for each group (N = 40); * marks a significant pre-to-post difference in both groups]
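A matching sketch of the within-group comparison with SciPy, again on made-up scores:

```python
import numpy as np
from scipy import stats

# Hypothetical pre/post motivation scores for one group of 40 students.
rng = np.random.default_rng(1)
pre = rng.uniform(3.0, 6.0, 40)
post = pre + rng.normal(0.5, 0.5, 40)

t, p = stats.ttest_rel(post, pre)  # paired t-test, df = 39 as on the slide
print(f"t(39) = {t:.2f}, p = {p:.4f}")
```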

Students' attitudes towards bioinformatics learning
1. I like learning bioinformatics
2. The bioinformatics learning activities are helpful
3. I like to practice using the software in the bioinformatics learning
4. I had enough ability to learn the bioinformatics material *
5. I can meet the instructor's requirements during the bioinformatics learning process *
6. I can understand the bioinformatics material taught by the instructor *
[Table: mean and S.D. per group with t-values; * marks a significant group difference on items 4–6]

Experiment group students' perceptions of using the PKT&D system
1. Using the PKT&D system in learning bioinformatics would enable me to diagnose and strengthen prior knowledge more effectively
2. Using the PKT&D system would improve my bioinformatics learning performance
3. Using the PKT&D system in learning bioinformatics would increase my learning comprehension productivity
4. Using the PKT&D system would make it easier to learn bioinformatics
5. I would find the PKT&D system useful in the bioinformatics class
[Table: percentage distribution across the seven response options and mean per item]
Note: EU: Extremely Unlikely; QU: Quite Unlikely; SU: Slightly Unlikely; SL: Slightly Likely; QL: Quite Likely; EL: Extremely Likely.

Independent pre-test on knowledge of bioinformatics of the two groups
[Table: pre-test N, mean, and S.D. for the experimental and control groups, with t-value]

The paired t-test results of the learning improvement for the two groups
[Table: pre-test vs. post-test achievement mean, S.D., and t-value for each group; * marks a significant pre-to-post improvement in both groups]
* p < 0.05.

Independent post-test on knowledge of bioinformatics of the two groups
[Table: post-test number of students, mean, and S.D. for the experimental and control groups, with t-value; * marks a significant group difference]

Example interview comments about the three topics
Instruction
 Realization of students (IN): I felt that the students in the experiment group better met my requirements during the course.
 Achievement of students (IN): I felt that the students in the experiment group demonstrated high performance in each learning activity.
 Progress of instruction (IN): I had much more time to teach the concepts in more detail and interact with the students of the experiment group.
 Learning situation (IN): Only one third of the students from the control group could fully follow the activities and understand my instructions.
 Learning situation (SC & SE): Learning about bioinformatics software through practicing using it was interesting. I would have preferred more direct instruction from the course instructor.
 Learning situation (SC): I felt some of the concepts were difficult to grasp, which led to obstacles when I used the software during the course.
Interaction
 Discussion willingness (IN): The students in the experiment group had better discussions than the control group.
 Responsiveness of students (IN): The students in the experiment group often gave feedback and asked questions.
Technology
 System usefulness (SE): I felt the user interface of the PKT&D system was clear, straightforward, and convenient to use. I clearly saw the diagnostic results and learning suggestions.
 Auxiliary components (IN & SE): I felt that the PKT&D system served as a guide that helps the students to diagnose the weaknesses in their concepts. I can learn more prior knowledge using the PKT&D system.

Evaluation of correctness rate results for the three-concept diagnosis
[Table: correctness rates of the diagnoses per student ID, as judged by Expert 1 and Expert 2; individual rates range from 33.3% to 100%]

Evaluation of correctness rate results for the five-concept diagnosis
[Table: correctness rates of the diagnoses per student ID, as judged by Expert 1 and Expert 2; individual rates range from 40% to 100%]
Note: Correctness rates were obtained by comparing the diagnoses of the experts with those obtained using the proposed approach in an artificial intelligence course. The average correctness rates are … and 90% for the students.
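One plausible reading of how the per-student percentages arise: each diagnosis labels every concept as weak or strong, and the correctness rate is the fraction of concepts on which the system agrees with the expert (e.g. 4 of 5 concepts gives 80%). A sketch under that assumption, with made-up labels:

```python
import numpy as np

def correctness_rate(system, expert):
    # Percentage of concepts per student where the system's weak/strong
    # diagnosis matches the expert's judgement.
    system, expert = np.asarray(system), np.asarray(expert)
    return (system == expert).mean(axis=1) * 100

# Hypothetical weak(1)/strong(0) labels for two students on five concepts.
system = np.array([[1, 0, 1, 1, 0],
                   [0, 0, 1, 1, 1]])
expert = np.array([[1, 0, 1, 0, 0],
                   [0, 0, 1, 1, 1]])
print(correctness_rate(system, expert))  # -> [ 80. 100.]
```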

Conclusion & Discussion
 This study proposes a testing-based approach to diagnose the strength of individual students' prior knowledge of concepts, and then provides them with appropriate materials to strengthen it
 The system helps instructors undertake their teaching more smoothly
 Educators can use the proposed system in different educational contexts

Limitation
 The two variables of the linear threshold function (the gradient m and the intercept b) have to be adjusted based on instructors' expertise in different educational contexts

Future Work
 The number of test items in the item bank should be continually increased to address various subject objectives and instructors' needs
 Students' learning portfolios can be integrated into the proposed system to develop more appropriate diagnosis mechanisms