Interactive Goal Model Analysis Applied - Systematic Procedures versus Ad hoc Analysis
Jennifer Horkoff 1, Eric Yu 2, Arup Ghose 1
1 Department of Computer Science, 2 Faculty of Information, University of Toronto
November 10, 2010, PoEM'10

Goal Modeling
- Used as a tool for system analysis and design in an enterprise
- Captures the socially-driven goals which motivate design or redesign
- Forms the first sub-model of the Enterprise Knowledge Development (EKD) method
- Used in several Requirements Engineering frameworks: i* (Yu, 97), Tropos (Bresciani et al., 04), GBRAM (Antón et al., 98), KAOS (Dardenne & van Lamsweerde, 93), GRL (Liu & Yu, 03), etc.

Goal Model Analysis
- Previous work has argued that more utility can be gained from goal models by applying systematic analysis
- Many different types of analysis procedures have been introduced (metrics, model checking, simulation, planning, satisfaction propagation)
- Most work on goal model analysis focuses on the analytical power and mechanisms of the procedures
- What are the benefits of goal model analysis?
- Do these benefits apply only to systematic procedures, or also to ad hoc analysis (analysis with no systematic procedure)?
- Focus: interactive satisfaction propagation

Hypotheses: Benefits of Systematic, Interactive Goal Model Analysis
- Previous work by the authors introduced interactive, qualitative goal model analysis aimed at early enterprise analysis (CAiSE'09 Forum, PoEM'09, IJISMD)
- Hypotheses concerning the benefits of interactive analysis were developed through several case studies (PoEM'09, PST'06, REFSQ'08, HICSS'07, RE'05):
  - Analysis: aids in finding non-obvious answers to domain analysis questions
  - Model Iteration: prompts improvements in the model
  - Elicitation: leads to further elicitation of information in the domain
  - Domain Knowledge: leads to a better understanding of the domain
- In this work we design and administer studies to test these hypotheses

Background: i* Models
- We use i* as an example goal modeling framework

"Real" Example: inflo Case Study

Background: Interactive Satisfaction Analysis
- Forward: a question/scenario/alternative ("What if...?") is placed on the model and its effects are propagated "forward" through model links
- Interactive: user input (human judgment) is used to decide on partial or conflicting evidence ("What is the resulting value?")
  - Publications: CAiSE'09 Forum, PoEM'09, IJISMD
- An additional procedure supports "backward" analysis, allowing "Is this possible?" questions
  - Publications: istar'08, ER'10
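To make the forward procedure concrete, below is a minimal Python sketch of interactive forward label propagation over contribution links. The label abbreviations, link types, propagation rules, and function names are simplifying assumptions for illustration only; they do not reproduce the exact rules of the published procedure or of OpenOME (for example, decomposition and means-ends links are omitted).

# Qualitative evaluation labels (abbreviations assumed for this sketch).
SATISFIED, PARTIALLY_SATISFIED, UNKNOWN, PARTIALLY_DENIED, DENIED = "S", "PS", "U", "PD", "D"

# Simplified propagation across contribution links:
# (source label, link type) -> evidence arriving at the target.
CONTRIBUTION_RULES = {
    ("S", "make"): "S",   ("S", "help"): "PS",  ("S", "hurt"): "PD",  ("S", "break"): "D",
    ("D", "make"): "D",   ("D", "help"): "PD",  ("D", "hurt"): "PS",  ("D", "break"): "S",
    ("PS", "make"): "PS", ("PS", "help"): "PS", ("PS", "hurt"): "PD", ("PS", "break"): "PD",
    ("PD", "make"): "PD", ("PD", "help"): "PD", ("PD", "hurt"): "PS", ("PD", "break"): "PS",
}

def propagate_forward(links, initial_labels, ask_human):
    """links: list of (source, link_type, target) triples.
    initial_labels: labels placed on 'leaf' intentions to pose a what-if question.
    ask_human(target, evidence): resolves partial or conflicting evidence and
    returns a single label (stands in for interactive human judgment)."""
    labels = dict(initial_labels)
    judgments = {}                        # cache so the same judgment is asked only once
    targets = {t for (_, _, t) in links}
    changed = True
    while changed:                        # iterate to a fixpoint (assumes an acyclic model)
        changed = False
        for target in targets:
            evidence = tuple(sorted(CONTRIBUTION_RULES.get((labels[s], lt), UNKNOWN)
                                    for (s, lt, t) in links
                                    if t == target and s in labels))
            if not evidence:
                continue                  # no labelled sources yet
            if len(set(evidence)) == 1 and evidence[0] in (SATISFIED, DENIED):
                new_label = evidence[0]   # full, agreeing evidence: propagate automatically
            else:                         # partial or conflicting evidence: human judgment
                key = (target, evidence)
                if key not in judgments:
                    judgments[key] = ask_human(target, list(evidence))
                new_label = judgments[key]
            if labels.get(target) != new_label:
                labels[target] = new_label
                changed = True
    return labels

A what-if question such as "all leaf tasks are performed" then becomes an initial assignment of "S" to every leaf, and ask_human stands in for the modeler supplying a judgment wherever the incoming evidence is partial or conflicting.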

Case Study Design
- One group study involving "inflo", a "back-of-the-envelope" calculation and modeling tool (case = group)
  - Four grad students, 1 professor, and 1 facilitator
  - Three two-hour modeling sessions plus a one-hour analysis session
  - Most of each session was devoted to developing the model and discussion, with analysis at the end of each session
- Ten two-hour sessions with an individual and a facilitator (case = individual)
  - Five participants used systematic forward and backward analysis implemented in OpenOME
  - Five were allowed to analyze the models as they liked (ad hoc)
- The individual study design was modified midway through, dividing it into Round 1 and Round 2
- The studies were both exploratory and confirmatory

Individual Studies (Round 1)
- Participants: students with i* experience from system analysis courses or i*-related projects
- Purposive selection: we wanted subjects with some i* knowledge but not much analysis experience
- Training:
  - Participants were given 10 minutes of i* training (including analysis labels)
  - Systematic participants were given a further 15 minutes of analysis training using the tool
- Model Domain: ICSE Greening models, medium to large models created by others
- Analysis Questions: 12 questions provided (2 for each analysis direction (forward, backward) per model, over 3 models)

ICSE Greening Example: Conference Experience Chair

Individual Studies
- Intermediate (Round 1) results:
  - Models were too complicated
  - Too many analysis questions
  - Participants were unfamiliar with the domain
  - Participants didn't "care" about judgment decisions
  - Participants made very few changes to the models (too afraid to change others' work? too intimidated to change complex models?)

Individual Studies (Round 2)
- Round 2 changes (last 4 of 10 participants):
  - Model Domain: participants were asked to create their own models describing student life
  - The group case study showed that participants had trouble finding analysis questions over their own model
  - Created an Analysis Methodology to help guide the analysis:
    - Extreme test conditions (all alternatives/targets satisfied/denied)
    - Analyze likely alternatives/targets
    - Analyze domain-driven questions

Data Capture
- Analysis: captured answers to analysis questions
- Model Iteration: quantitative counts of model changes for each stage in the studies
- Elicitation: captured lists of questions asked about the domain in each stage
- Domain Knowledge: follow-up questions about experience
- Recorded and analyzed other interesting qualitative findings

Results

Analysis
- Hypothesis - Analysis: aids in finding non-obvious answers to domain analysis questions
- Some participants gave explicit answers; others had difficulty producing answers
- Some referred to analysis labels in the model as answers to the question
- Only some participants were able to interpret analysis results in the context of the domain
- Generally, participants had difficulty mapping the model to the domain
- Conclusion: knowledge of i* and of the domain may have a significant effect on the ability to apply and interpret analysis

Model Iteration & Elicitation  Model Iteration: prompts improvements in the model  Elicitation: leads to further elicitation of information in the domain  Few changes, few differences between ad hoc & systematic, familiar and unfamiliar domain, forward backward Interactive Goal Model Analysis Applied - Horkoff, Yu, Ghose 16 # Model Changes# Questions Asked TreatmentPartic. Forward Questions Backward Questions Forward Questions Backward QuestionsRound Ad-hoc P P40010 P P P Systematic P P30020 P60351 P P

Model Iteration & Elicitation  Conflicts with previous results (PoEM’09, PST’06, etc.), Why?  Underlying theory: interactive analysis prompts users to notice differences between mental domain model and physical model Evaluation did not reveal differences between the mental and physical model, or these differences existed, but were not used to modify the model Interactive Goal Model Analysis Applied - Horkoff, Yu, Ghose 17

Model Iteration & Elicitation  Previous studies were conducted by i*/modeling “experts” who had commitment to the project  Conclusion: Model iteration and elicitation relies on: More extensive knowledge of syntax and analysis procedures More extensive knowledge of the domain “buy-in”/caring about a real problem Interactive Goal Model Analysis Applied - Horkoff, Yu, Ghose 18

Domain Knowledge
- Hypothesis - Domain Knowledge: leads to a better understanding of the domain
- Follow-up question: "Do you feel that you have a better understanding of the model and the domain after this exercise?"
- 7/10 participants said yes (a mix of ad hoc and systematic participants)
- Conclusion: both ad hoc and systematic analysis can help improve domain knowledge

Additional Findings
- Promoted Discussion in Group Setting: human judgment caused discussion among participants
  - Example: "What is meant by Flexibility?"
- Model Interpretation Consistency
  - i* syntax leaves room for interpretation
  - Results show a variety of interpretations when propagating analysis labels with ad hoc analysis
  - Conclusion: systematic analysis provokes a more consistent interpretation of the model
- Coverage of Model Analysis
  - Results show significant differences in the coverage of analysis across the model with systematic vs. ad hoc analysis
- Model Completeness and Analysis
  - Analysis may not be useful until the model is sufficiently complete
  - Some participants noticed incompleteness in the model(s) only after applying analysis

Conclusions and Future Work
- Designed and administered studies to test the perceived benefits of interactive goal model analysis
- Initial hypotheses: Analysis, Model Iteration, Elicitation, Domain Knowledge
  - These benefits are dependent on:
    - Knowledge of i* and i* evaluation
    - Presence of an experienced facilitator
    - Domain expertise/buy-in
    - The presence of a real motivating problem
- Discovered benefits: Interpretation Consistency, Coverage of Model Analysis, Model Completeness
- Several threats to validity (construct, internal, external, reliability) are described in the paper
- Future Work:
  - More realistic action-research-type studies
  - Better tool support - make the tool the expert?

Thank you
Questions?
OpenOME:

Outline
- Goal Modeling
- Goal Model Analysis
- Hypotheses: Benefits of Systematic, Interactive Goal Model Analysis
- Background: i* Syntax
- Background: Interactive Goal Model Analysis
- Case Study Design
  - Group study
  - Individual Studies
- Results
- Threats to Validity
- Conclusions and Future Work

Goal Model Analysis
- Previous work has argued that more utility can be gained from goal models by applying systematic analysis
- Many different types of analysis procedures have been introduced:
  - Metrics (Franch, 06) (Kaiya, 02)
  - Model checking (Fuxman et al., 03) (Giorgini et al., 04)
  - Simulation (Gans et al., 03) (Wang & Lesperance, 01)
  - Planning (Bryl et al., 06) (Asnar et al., 07)
  - Satisfaction propagation (Chung et al., 00) (Giorgini et al., 05)
- Most of this work focuses on the analytical power and mechanisms of the procedures
- What are the benefits of goal model analysis?
- Do these benefits apply only to systematic procedures, or also to ad hoc analysis (analysis with no systematic procedure)?

inflo (Group) Case Study
- inflo: a "back-of-the-envelope" calculation and modeling tool
  - Aims to support informed debate over issues like carbon footprint calculations
- Four grad students, 1 professor, and 1 facilitator
- Three two-hour modeling sessions plus a one-hour analysis session
- Most of each session was devoted to developing the model and discussion
- Systematic model analysis was used at the end of each session

Individual Studies (Round 1)
- Analysis Questions: 12 questions provided
  - 4 per model (3 models), 2 for each analysis direction (forward, backward) per model
- Example (forward): "If every task of the Sustainability Chair and Local Chair is performed, will goals related to sustainability be sufficiently satisfied?"
- Example (backward): "What must be done in order to Encourage informal and spontaneous introductions and Make conference participation fun?"
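As an illustration, a forward question like the example above can be posed by placing satisfied labels on the relevant chair tasks and propagating. This is a hypothetical encoding that reuses the propagate_forward sketch from the "Background: Interactive Satisfaction Analysis" slide; the task, goal, and link names are invented stand-ins for elements of the real ICSE Greening model.

# Invented fragment standing in for part of the ICSE Greening model.
links = [
    ("Book green venue", "help", "Goals related to sustainability"),       # Local Chair task (invented)
    ("Promote carbon offsets", "help", "Goals related to sustainability"), # Sustainability Chair task (invented)
]
chair_tasks = ["Book green venue", "Promote carbon offsets"]

# "If every task of the Sustainability Chair and Local Chair is performed..."
initial = {task: "S" for task in chair_tasks}
result = propagate_forward(links, initial,
                           ask_human=lambda target, evidence: evidence[0])  # stand-in judgment
# "...will goals related to sustainability be sufficiently satisfied?"
print(result["Goals related to sustainability"])   # prints "PS" under these assumed rules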

Analysis Methodology
1. Alternative Effects (Forward Analysis) (see the sketch after this list)
   a) Implement as much as possible: all leaves are satisfied
   b) Implement as little as possible: all leaves are denied
   c) Reasonable implementation alternatives: evaluate likely alternatives
2. Achievement Possibilities (Backward Analysis)
   a) Maximum targets: all roots must be fully satisfied. Is this possible? How?
   b) Minimum targets: lowest permissible values for the roots. Is this possible? How?
   c) Iteration over minimum targets: gradually increase the targets in order to find the maximum targets which still allow a solution
3. Domain-Driven Analysis (Mixed)
   a) Use the model to answer interesting domain-driven questions
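The sketch below shows how step 1 of the methodology (forward "alternative effects" analysis) could be driven in code, reusing the propagate_forward sketch from the earlier slide. The function name, model fragment, and judgment callback are illustrative assumptions; backward analysis (step 2) would additionally require a search- or SAT-style procedure and is not sketched here.

def alternative_effects(links, leaves, likely_alternatives, ask_human):
    """Run the three forward sub-steps (1a-1c) and return a result per condition."""
    results = {}
    # 1a. Implement as much as possible: every leaf satisfied.
    results["all leaves satisfied"] = propagate_forward(
        links, {leaf: "S" for leaf in leaves}, ask_human)
    # 1b. Implement as little as possible: every leaf denied.
    results["all leaves denied"] = propagate_forward(
        links, {leaf: "D" for leaf in leaves}, ask_human)
    # 1c. Reasonable implementation alternatives chosen by the analyst.
    for name, initial_labels in likely_alternatives.items():
        results[name] = propagate_forward(links, initial_labels, ask_human)
    return results

# Illustrative use on a tiny fragment of a student-life model (all names invented):
links = [("Study at night", "help", "Good grades"),
         ("Study at night", "hurt", "Get enough sleep")]
result = alternative_effects(
    links,
    leaves=["Study at night"],
    likely_alternatives={"study during the day instead": {"Study at night": "D"}},
    ask_human=lambda target, evidence: evidence[0])   # stand-in for interactive judgment

The extreme conditions (1a and 1b) bound the space of outcomes, while the analyst-supplied alternatives in 1c correspond to the "reasonable implementation alternatives" named in the methodology.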

Threats to Validity
- Construct Validity
  - Model changes may not be beneficial
- Internal Validity
  - Presence of a facilitator
  - Think-aloud protocol
  - Choice of model domain
- External Validity
  - Used students as participants
  - Used i* - do results generalize to other goal model frameworks?
- Reliability
  - The facilitator was an i* & evaluation expert