Beyond Computational Thinking

Presentation transcript:

Beyond Computational Thinking: a modern problem-solving approach and its application. 29 November 2017

Background: a new world – ICT/STEM
- Hybrid jobs in science, business, design, … – not only computer science
- Programming and 'complex problem solver' skills
- ICT jobs grow faster than other jobs; for ICT jobs, demand > supply (USA, EU, OECD)

Background: a new world – ICT/STEM → Computational Thinking

Computational Thinking features – a problem-solving approach:
- Decompose the problem
- Abstraction: neglecting information
- Abstraction: pattern recognition
- Algorithmic design

Computational Thinking – the answer? Research questions:
- How is Computational Thinking applied in solving a programming problem?
- What is the relationship between Computational Thinking ability and the ability to solve programming problems?

Method: design, measurements, participants

Method, Phase 1 – CT measurement
- Coding-independent CT measurement: the Bebras task (20 items)
- International informatics contest since 2004
- Logical problems without coding elements
- 8 easy (2 pts) + 7 medium-hard (3 pts) + 5 hard (4 pts) = 20 items (57 pts max)
- Covariates: gender, age, IQ, …
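The item mix and maximum score above can be checked with a small sketch; the item counts and point values come straight from the slide, everything else is illustrative:

```python
# Bebras item mix from the slide: (number of items, points per item).
items = {"easy": (8, 2), "medium": (7, 3), "hard": (5, 4)}

total_items = sum(n for n, _ in items.values())
max_score = sum(n * pts for n, pts in items.values())

print(total_items, max_score)  # 20 items, 57 points
```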

Method, Phase 2 – design & Scratch
- In pairs (matched on Bebras score): "Program a story or a game where a hero has to overcome a challenge in order to defeat the villain(s)" in Scratch.
- While participants solved the programming task, they were videotaped, voice-recorded, and screen-captured.

Method, Phase 2 – computational thinking behaviour scheme
Inter-rater reliability (5 videos, 2 raters): .61 < κ < .69
CT feature – example behaviours:
- Decompose the problem: asking what the next step is; breaking the problem into smaller problems; discussing if-then relations of the story/game.
- Abstraction, neglecting information: focusing on x while actively neglecting y; simplifying anything (problem, code, tasks, …); rephrasing the meaning of anything (code, functions, …).
- Abstraction, pattern recognition: identifying similar structures (problems, code, …); aha-moments (related to a prior event); any copy-paste behaviour.
- Algorithmic design: putting code chunks together; testing and judging the code script (i.e., clicking "run"); debugging and adjusting the code script.
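The reported κ values are Cohen's kappa between two raters coding the same video segments. A minimal sketch of how such agreement is computed, assuming categorical codes per segment; the category labels and codings below are made up for illustration, not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical codings of the same events."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal category proportions.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Illustrative (made-up) codings of ten video segments by two raters:
a = ["decompose", "algorithm", "pattern", "algorithm", "neglect",
     "decompose", "algorithm", "pattern", "algorithm", "neglect"]
b = ["decompose", "algorithm", "pattern", "decompose", "neglect",
     "decompose", "algorithm", "neglect", "algorithm", "neglect"]
print(round(cohens_kappa(a, b), 2))
```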

Method, Phase 2 – computational thinking behaviour scheme (interactive slide)

Method, Phase 2 – assessment of programming skills
- Richness: "What and how much is happening in their code?"
- Variety: "How many different code elements are they using?"
- Organisation: "How messy/clean does their workspace look?"
- Functionality: "How well is their code working?"
- Efficiency: "How well developed is their control flow? Many repetitions?"
- Overall score = weighted mean of the five dimensions.
- Reliability: .93 < ICC (95% CI) < .98
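The overall programming score is a weighted mean of the five rated dimensions. The slide does not give the weights, so this sketch uses illustrative placeholder weights and ratings:

```python
# Hypothetical weights and ratings; only the dimension names come from the slide.
weights = {"richness": 0.25, "variety": 0.20, "organisation": 0.15,
           "functionality": 0.25, "efficiency": 0.15}
ratings = {"richness": 4.0, "variety": 3.5, "organisation": 3.0,
           "functionality": 4.5, "efficiency": 3.0}

# Weighted mean: weight each rating, then normalise by the weight total.
score = sum(weights[d] * ratings[d] for d in weights) / sum(weights.values())
print(score)
```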

Method – participants
- Programming pairs: expected N ≈ 50 (≈ 25 pairs); actual N = 127 (27 pairs)
- Age: x̄ = 24.29, SD = 5.78
- Gender ratio: 62% female, 36% male
- IQ (paired): x̄ = 114.74, SD = 12.98
- Bebras score (paired): x̄ = 59%, SD = 17%

Results – explorative
How is computational thinking applied in solving a programming problem? A rather explorative question → explorative data analysis.
Out of 40 min, participants spent (mean; min – max):
- Decompose the problem: 3 min 06 s (0 min 24 s – 9 min 03 s)
- Abstraction, neglecting information: –
- Abstraction, pattern recognition: 0 min 34 s (pairs = 17; 0 min 04 s – 1 min 30 s)
- Algorithmic design: 14 min 59 s (4 min 09 s – 24 min 25 s)
- CT behavioural (total): 18 min 28 s (6 min 18 s – 28 min 10 s)

Results – hypothesis testing
What is the relationship between computational thinking ability and the ability to solve programming problems?
H1: There is a positive correlation between the Bebras score and programming ability.
Correlations (r) of Bebras score (CT without coding elements) with programming ability:
- Richness: .39
- Variety: .26
- Organisation: .05
- Functionality: .29
- Efficiency: .24
- Weighted mean: .30
Bold: p < .05
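The r values above are bivariate Pearson correlations. A minimal sketch of the computation on made-up pair-level data; the scores below are illustrative, not the study's:

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up pair-level scores: Bebras percentage vs. a richness rating.
bebras = [45, 52, 60, 63, 70, 74, 80]
richness = [2.5, 3.0, 2.8, 3.6, 3.4, 4.1, 4.0]
print(round(pearson_r(bebras, richness), 2))
```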

Results – hypothesis testing
H2: There is a positive correlation between the frequency of computational thinking behavioural components and programming ability.
Correlations (r) of CT behaviour (with coding elements) with the programming-ability weighted mean:
- Decompose the problem: .24
- Abstraction, neglecting information: –
- Abstraction, pattern recognition: .12
- Algorithmic design: .63
- CT behavioural (total): .62
Bold: p < .05

Results – hypothesis testing
Control analysis. H3: There is a positive correlation between the Bebras score (CT without coding elements) and the frequency of computational thinking behavioural components (with coding elements).
Correlations (r) with Bebras score:
- Decompose the problem: .29
- Abstraction, neglecting information: –
- Abstraction, pattern recognition: .15
- Algorithmic design: .34
- CT behavioural (total): .39
Bold: p < .05

Results – hypothesis testing
Regression analysis. H4: CT without coding elements and CT with coding elements can predict programming ability (while controlling for IQ).
Outcome = programming-ability weighted mean; standardised β:
- IQ: 0.36
- Bebras score: −0.41
- CT behavioural: 0.74
Model statistics: F(3, 20) = 6.602, adjusted R² = .42
Bold: p < .05
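Standardised β values like those above come from regressing z-scored variables on each other. A sketch with simulated data; the predictor roles (IQ, Bebras, CT behaviour) and the generating coefficients are illustrative, not the study's:

```python
import numpy as np

def std_betas(X, y):
    """Standardised regression coefficients: z-score all variables, then OLS."""
    z = lambda a: (a - a.mean(axis=0)) / a.std(axis=0, ddof=1)
    Xz, yz = z(X), z(y)
    beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return beta

# Simulated data: 24 cases, three predictors standing in for IQ, Bebras
# score, and CT behaviour; the outcome is driven mostly by the third.
rng = np.random.default_rng(0)
X = rng.normal(size=(24, 3))
y = 0.3 * X[:, 0] + 0.7 * X[:, 2] + rng.normal(scale=0.5, size=24)
print(np.round(std_betas(X, y), 2))
```

Because every variable is z-scored first, the coefficients are directly comparable in size, which is what lets the slide call CT behaviour (β = 0.74) the strongest predictor.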

Results – in summary
- CT instruments: satisfactorily reliable.
- CT without programming aspects (Bebras): already well developed.
- CT with programming aspects (CT behaviour scheme): reliability estimates "high enough".
- Programming assessment: satisfactorily reliable.
- Participants spent a fair amount of time on CT in general, but not all features are equally important.
- In general: a medium positive relationship between CT (without coding) and programming skills; a large positive relationship between CT (with coding) and programming skills.
- "The best predictor for programming skills are hands-on computational thinking elements (when controlling for intelligence and theoretical CT)."
- However, it really depends!

Obstacles – suggestions?
- CT (Bebras) ≠ CT (behavioural); what does this mean?
- What to do with abstraction?
- Pairing not perfect and …