Simulation-driven Enterprise Modelling: Why?
Architecture and Modelling of Information Systems (D0I71A)
Prof. dr. Monique Snoeck
Why Model Simulation?
(Comic strip dialogue:)
"I'll need to know your requirements before I start to design the software."
"First of all, what are you trying to accomplish?"
"I'm trying to make you design my software."
"I mean, what are you trying to accomplish with the software?"
"I won't know what I can accomplish until you tell me what the software can do."
"Try to get this concept through your thick skull: the software can do whatever I design it to do."
"Can you design it to tell you my requirements?"
Why model simulation?
Understanding a 'plan' (e.g. for validating it against the requirements) requires the capability of "picturing" the resulting artefact.
Conceptual modelling ... is a complex learning task
Requirements analysis and design of information systems through conceptual modelling is a cognitive task: analysing business requirements and constructing a semantically correct conceptual model that reflects the structural and dynamic views of a given domain description.
Problem: skill transferability
- Mentally picturing the resulting software from models requires a high level of expertise.
- Teaching "experience" (domain-specific knowledge) is difficult.
- Some aspects cannot be acquired through reading and lecturing alone, e.g. the dynamic representation of a system-to-be.
- Novices lack technical insight.
(Figure: novice-to-expert continuum.)
Simulation of requirements (prototyping)
An effective instrument for achieving complex learning goals by:
- visualizing design choices
- learning by experiencing
- successful transfer of the skills to real-world environments
Problems:
- imprecise execution semantics
- conceptual model simulation is complex and time-consuming to achieve with current standards
- simulation results are sometimes difficult to interpret
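As an illustration of what "learning by experiencing" through simulation can look like, the sketch below replays events against a lifecycle finite state machine and rejects an out-of-sequence event. This is a minimal Python sketch, not JMermaid's implementation; the Order lifecycle, its states, and its events are invented.

```python
# Minimal sketch of simulating event-sequence constraints as a finite
# state machine. The Order lifecycle below is invented for illustration.

class LifecycleFSM:
    """Allowed event sequences for one object type."""

    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions  # {(state, event): next_state}

    def fire(self, event):
        """Fire an event; reject it if the current state does not accept it."""
        key = (self.state, event)
        if key not in self.transitions:
            raise ValueError(f"event '{event}' not allowed in state '{self.state}'")
        self.state = self.transitions[key]


# Hypothetical Order lifecycle: created -> paid -> shipped -> ended.
order = LifecycleFSM(
    initial="created",
    transitions={
        ("created", "pay"): "paid",
        ("paid", "ship"): "shipped",
        ("shipped", "archive"): "ended",
    },
)

order.fire("pay")              # accepted: created -> paid
try:
    order.fire("archive")      # rejected: 'archive' is not allowed in 'paid'
except ValueError as err:
    print("Simulation feedback:", err)
```

Firing events one by one makes a sequencing defect visible immediately, instead of leaving it implicit in the diagram.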
Feedback in JMermaid
Observed problem: students cannot understand their own model.
Solutions:
- a model-to-text feature
- augmenting the generated prototype with a feedback feature that links the results of a model test to their causes in the corresponding part of the model
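A hedged sketch of what a model-to-text feature could do: verbalizing dependency relations of a class diagram as plain sentences that a student can check against the requirements. The data format, class names, and phrasing below are invented for illustration; this is not JMermaid's actual feature.

```python
# Sketch of "model to text": turning existence-dependency relations into
# natural-language sentences. Tuples: (dependent class, master class,
# cardinality of dependents per master). All names are invented.

DEPENDENCIES = [
    ("OrderLine", "Order", "one or more"),
    ("Order", "Customer", "zero or more"),
]

def verbalize(dependent, master, cardinality):
    return (f"Each {dependent} exists for exactly one {master}; "
            f"a {master} can have {cardinality} {dependent}s.")

for dep in DEPENDENCIES:
    print(verbalize(*dep))
```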
MERODE Prototyping
(Slide series: screenshots of the prototype generated from a MERODE model.)
Requirements Engineering with Executable Models
Requirements engineering with JMermaid: a fully functional prototype is generated from a single conceptual model with a single click.
Cycle: model -> generate -> simulate -> test (early detection of defects) -> reflect -> revisit/refine the model.
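One kind of early defect detection that a single integrated model enables is a completeness check on the object-event table (OET): in MERODE, every object type needs at least one creating and one ending event. The sketch below applies that rule to an invented OET; the data structure and names are illustrative, not JMermaid's.

```python
# Sketch of an OET completeness check. The OET maps each object type to
# its events, marked "C" (create), "M" (modify) or "E" (end).
# Object types and events are invented for illustration.

OET = {
    "Customer": {"register": "C", "update": "M"},          # no ending event!
    "Order":    {"place": "C", "pay": "M", "close": "E"},
}

def check_lifecycles(oet):
    """Flag object types that can never be created or never be ended."""
    for obj, events in oet.items():
        markings = set(events.values())
        for required, meaning in (("C", "creating"), ("E", "ending")):
            if required not in markings:
                print(f"Defect: '{obj}' has no {meaning} event")

check_lifecycles(OET)   # flags that a Customer can never be ended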
Scientific research: Question 1
Does feedback-enabled prototyping (simulation) improve the modeling knowledge of a novice modeler, in terms of his/her capability to assess a model's semantic quality?
Empirical evaluation: assessing the effects on learning outcomes by comparing test scores with and without the use of the proposed simulation technique, for:
- understanding of the structural model
- understanding of the behavioural model
- understanding of the interplay between structure and behaviour
- understanding of inheritance
Experimental studies
5 studies, with the participation of 169 final-year master-level students from the Leuven and Brussels campuses, spanning 3 academic years (2012-2013, 2013-2014, 2014-2015).
Dependent variable = model quality.
Experimental studies
H1: Feedback-enabled simulation significantly improves the model validation capabilities of a novice business analyst.
Confirmed: magnitude of the effect = 2.33-4 (out of 8).
H2: The use of the prototype has a persisting learning effect on students' test scores when it is no longer used.
Confirmed.
H3: The test scores are not influenced by any particular personal characteristics of the users.
Confirmed.
Q1: Does simulation help?
Conclusion: simulation positively affects a student's understanding of a conceptual model.
Scientific research: Question 2
Does the process of modelling matter? How can we adapt teaching guidance to achieve process-oriented guidance (how to do it right?) based on observations of the learning process, as opposed to outcome feedback (is the solution correct? why (not)?)
Observing learner behavior
JMermaid logs the behavior of the modeler. Modeling behavior (activity) data were collected through logging while students were working on a group project:
- 2 studies, 165 students (86 in the original and 79 in the replicated experiment), randomly assigned to 39 groups (20 and 19 respectively)
- exploratory, 3-dimensional analysis
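For illustration, such activity logging could look like the sketch below: one timestamped record per modeling activity, per group and per model view. The field names and activity vocabulary are invented; JMermaid's actual log format may differ.

```python
# Sketch of a tool log for observing modeling behavior. Field names and
# activity names are invented for illustration.

import csv
from datetime import datetime

def log_activity(path, group, view, activity):
    """Append one timestamped modeling activity to a CSV event log."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [group, datetime.now().isoformat(), view, activity]
        )

# Example entries: which group did what, in which model view, and when.
log_activity("events.csv", "group-07", "EDG", "add_class")
log_activity("events.csv", "group-07", "OET", "link_event")
log_activity("events.csv", "group-07", "FSM", "simulate")
```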
A bird's eye view on the modeling process
Worst groups: sequential.
Best groups: more activities, simulate more, simulate behaviour.
(Figures: top-level process maps of the modelling process for the 5 worst and 5 best performing groups; replication experiments 2013 and 2014.)
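Process maps like those summarized above are essentially directly-follows graphs mined from the activity log. The sketch below counts directly-follows relations in two invented traces; a real analysis would use a process-mining tool, and the activity names are illustrative only.

```python
# Sketch of deriving a process map from an event log: counting how often
# one modeling activity is immediately followed by another. Traces and
# activity names are invented for illustration.

from collections import Counter

def directly_follows(traces):
    """Count (a, b) pairs where activity a is immediately followed by b."""
    dfg = Counter()
    for trace in traces:
        dfg.update(zip(trace, trace[1:]))
    return dfg

# Two invented traces (one per group): sequences of modeling activities.
traces = [
    ["edit_EDG", "edit_OET", "edit_FSM", "simulate", "edit_FSM", "simulate"],
    ["edit_EDG", "edit_EDG", "edit_OET", "edit_FSM"],  # sequential, no simulation
]

for (a, b), n in directly_follows(traces).most_common():
    print(f"{a} -> {b}: {n}")
```

The edge frequencies are what the process maps render as arrows of varying thickness.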
Global process analysis (EDG/OET/FSM)
Best groups: iterate, integrate, simulate more, and for all perspectives.
Worst groups: sequential.
(Figures: top-level process maps of modeling activities for the class diagram (EDG), business events (OET) and event sequence constraints (FSM), for the 5 worst and 5 best performing groups; original study 2013 and replication 2014.)
Distribution of modeling effort over time
Best groups: activities distributed over the semester, simulate more often, react to intermediate feedback.
Worst groups: deadline-driven, no reaction to feedback, less simulation.
(Figures: distribution of validation effort and of modeling effort over time, with context information; replication study 2014.)
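To make "effort distribution over time" concrete, the sketch below buckets logged activities per semester week, reusing the invented log format from the earlier logging sketch; the semester start date is an assumption.

```python
# Sketch of measuring effort distribution over time: counting logged
# activities per semester week and per group. The log columns match the
# invented format used earlier; the start date is an assumption.

import csv
from collections import Counter
from datetime import datetime

def weekly_effort(path, semester_start):
    """Count logged activities per (group, semester week)."""
    effort = Counter()
    with open(path, newline="") as f:
        for group, timestamp, _view, _activity in csv.reader(f):
            week = (datetime.fromisoformat(timestamp)
                    - semester_start).days // 7 + 1
            effort[(group, week)] += 1
    return effort

# Hypothetical semester start; a deadline-driven group shows a late spike.
for (group, week), n in sorted(
        weekly_effort("events.csv", datetime(2014, 9, 22)).items()):
    print(f"{group} week {week}: {n} activities")
```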
Distribution of relative modeling effort per event type (early vs. late sessions)
Worst groups: the first solution is not good enough and is fixed through "plumbing".
Best groups: start with a good solution and stabilise it.
(Figures: worst / satisfactory / best performing groups; original experiment 2013 and replication experiment 2014.)
Findings: generalized modeling process patterns
Behavioural patterns associated with worse vs. better learning outcomes:
- Pattern 1, modelling approach: sequential vs. iterative modeling.
- Pattern 2, validation approach: partial and/or disconnected vs. cross-validation oriented with broader test coverage.
- Pattern 3, validation intention: global testing vs. verifying recent changes.
- Pattern 4, engagement in modeling activities: deadline-oriented vs. earlier and systematic engagement.
- Pattern 5, effort distribution over time: decreasing vs. continuous and increasing effort.
- Pattern 6, effort distribution within modeling tasks: unstructured vs. priority-oriented approach to modeling.
Q2: Does the modelling process matter?
Conclusions:
- The modelling process does matter!
- Process mining techniques can be highly practical for monitoring and analyzing (cognitive) learning processes, also serving as a useful instrument for identifying and suggesting feedback needs.
- We will pursue our research by investigating the logs of submitted solutions; only the outcome is subject to final evaluation.