
1

2 ▪ Evaluation delivers evidence that a developed solution achieves the purpose for which it was designed. ▪ The purpose of evaluation is to demonstrate the utility, quality, and efficacy of a design artifact using rigorous evaluation methods. ▪ The evaluation phase provides essential feedback to the construction phase on the quality of the design process and of the design product under development. ▪ A design artifact is complete and effective when it satisfies the user requirements.

3 ▪ Rigor in DSR should be approached from two directions. ◦ One is to establish whether the artifact causes an observed improvement, i.e. its efficacy. ◦ The second is to establish whether the artifact works in a real situation, i.e. its effectiveness.

4 ▪ We can distinguish two types of artifacts: products and processes. ▪ Products are tools, diagrams, or software that people use to solve a problem. ▪ Processes are methods or procedures that guide someone in what to do to solve a problem; a person must therefore interact with the artifact for it to provide utility. ▪ All of these properties of an artifact contribute in some way to its utility, and also act as candidate criteria for evaluation in determining its overall utility.

5 ▪ The business environment establishes the requirements upon which the evaluation of the artifact is based. ▪ This environment includes the technical infrastructure, which is itself incrementally built by the implementation of new IT artifacts. ▪ Evaluation should therefore ensure that the artifact integrates with the technical infrastructure of the business environment.

6 ▪ IT artifacts can be evaluated in terms of functionality, completeness, consistency, accuracy, performance, reliability, usability, fit with the organization, and other relevant quality attributes. ▪ All of these attributes require the definition of appropriate metrics and the gathering and analysis of appropriate data.
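As a minimal illustration of defining a metric and gathering data, the Python sketch below measures response time for a hypothetical artifact operation; artifact_query, the workload size, and the number of runs are invented placeholders, not part of the original material.

import statistics
import time

def artifact_query(x):
    # Hypothetical stand-in for the artifact operation being evaluated.
    return sum(i * i for i in range(x))

def measure_response_time(workload, runs=30):
    # Gather response-time samples (in seconds) and summarize them.
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        artifact_query(workload)
        samples.append(time.perf_counter() - start)
    samples.sort()
    return {
        "mean_s": statistics.mean(samples),
        "p95_s": samples[int(0.95 * (len(samples) - 1))],
    }

print(measure_response_time(workload=10_000))

Comparable sketches could be written for the other attributes, e.g. accuracy against a reference dataset or usability via task-completion measures.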

7 ▪ Hevner et al. (2004) suggested five evaluation methods: observational, analytical, experimental, testing, and descriptive.

8 ▪ Observational
◦ Case study: study the artifact in depth in its business environment
◦ Field study: monitor use of the artifact in multiple projects
▪ Analytical
◦ Static analysis: examine the structure of the artifact for static qualities (e.g. complexity)
◦ Architecture analysis: study the fit of the artifact into the technical IS architecture
◦ Optimization: demonstrate inherent optimal properties of the artifact or provide optimality bounds on artifact behavior
▪ Experimental
◦ Controlled experiment: study the artifact in a controlled environment for qualities (e.g. usability)
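To make the static-analysis entry above concrete, here is a rough Python sketch that approximates the cyclomatic complexity of each function in a piece of source code; the counting rule is a simplification, and the sample module is invented for the example.

import ast

# Node types that introduce a decision point (a rough cyclomatic-complexity proxy).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler, ast.IfExp)

def complexity_per_function(source_code):
    # Return an approximate cyclomatic complexity for each function in the source.
    tree = ast.parse(source_code)
    results = {}
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            decisions = sum(isinstance(n, DECISION_NODES) for n in ast.walk(node))
            results[node.name] = 1 + decisions  # one base path plus each branch
    return results

sample = "def f(x):\n    if x > 0:\n        return x\n    return -x\n"
print(complexity_per_function(sample))  # {'f': 2}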

9 ▪ Testing
◦ Functional (black box) testing: execute artifact interfaces to discover failures and identify defects
◦ Structural (white box) testing: perform coverage testing of some metric (e.g. execution paths) in the artifact implementation
▪ Descriptive
◦ Informed argument: use information from the knowledge base (e.g. relevant literature) to build a convincing argument for the artifact's quality
◦ Scenarios: construct detailed scenarios around the artifact to demonstrate its utility
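As a small illustration of functional (black-box) testing, the sketch below exercises a hypothetical artifact interface, sort_records, purely through its inputs and outputs; both the function and the test cases are made up for the example.

import unittest

def sort_records(records):
    # Hypothetical artifact interface under test.
    return sorted(records, key=lambda r: r["id"])

class FunctionalTests(unittest.TestCase):
    def test_orders_by_id(self):
        out = sort_records([{"id": 3}, {"id": 1}, {"id": 2}])
        self.assertEqual([r["id"] for r in out], [1, 2, 3])

    def test_empty_input(self):
        # Boundary case: an empty input should yield an empty output.
        self.assertEqual(sort_records([]), [])

if __name__ == "__main__":
    unittest.main()

Structural (white-box) testing of the same artifact would instead instrument its implementation, for example by measuring which execution paths the test suite covers.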

10 ▪ Venable (2006) classified evaluation approaches as 1. artificial and 2. naturalistic evaluation.

11 ▪ Artificial evaluation may be empirical or non-empirical. ▪ It is positivist and reductionist, and is used to test design hypotheses. ▪ It includes laboratory experiments, field experiments, simulations, criteria-based analysis, theoretical arguments, and mathematical proofs.
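As one illustration of an artificial, positivist evaluation, the sketch below simulates task-completion times for a hypothetical artifact against a baseline on purely synthetic tasks; the distributions and the assumed improvement are invented for the illustration and carry no empirical weight.

import random

random.seed(42)  # fixed seed so the simulated experiment is repeatable

def simulate_completion_time(tool):
    # Synthetic (unreal) task: draw a completion time in minutes.
    base = 10.0 if tool == "baseline" else 8.0  # assumed effect size for the sketch
    return random.gauss(base, 2.0)

def run_simulation(n_tasks=100):
    baseline = [simulate_completion_time("baseline") for _ in range(n_tasks)]
    artifact = [simulate_completion_time("artifact") for _ in range(n_tasks)]
    mean = lambda xs: sum(xs) / len(xs)
    return mean(baseline), mean(artifact)

b, a = run_simulation()
print(f"baseline mean: {b:.2f} min, artifact mean: {a:.2f} min")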

12 ▪ Artificial evaluation is unreal in one or more ways: ◦ unreal users, ◦ unreal systems, and ◦ especially unreal problems (problems not actually held by the users and/or not real tasks, etc.)

13 ▪ Naturalistic evaluation is undertaken in a real environment (real people, real systems (artifacts), and real settings) and embraces all of the complexities of human practice in real organizations. ▪ It is always empirical and may be interpretivist, positivist, and/or critical. ▪ It includes case studies, field studies, surveys, ethnography, and action research.

14 ▪ Naturalistic evaluation may be affected by confounding variables or misinterpretation, and ▪ its results may not be precise or even truthful about an artifact's utility or efficacy in real use.

15 ▪ Naturalistic evaluation is expensive, while artificial evaluation has the advantage of saving cost if it is properly managed. ▪ There is substantial tension between positivism and interpretivism in evaluation. ▪ The human determination of value is central to this tension, drawing in social, cultural, psychological, and ethical considerations that escape a purely technical rationality.

16 ▪ The selection of evaluation methods must be matched appropriately to the designed artifact and the selected evaluation metrics. ▪ Example: ◦ Descriptive methods of evaluation should be used only for especially innovative artifacts for which other forms of evaluation may not be feasible.

17 ▪ Distributed database design algorithms can be evaluated using expected operating cost or average response time for a given characterization of information-processing requirements. ▪ Search algorithms can be evaluated using information retrieval metrics such as precision and recall.
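A minimal sketch of the precision and recall metrics mentioned for search algorithms; the retrieved and relevant document sets are made-up examples.

def precision_recall(retrieved, relevant):
    # Precision: fraction of retrieved documents that are relevant.
    # Recall: fraction of relevant documents that were retrieved.
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# 3 of the 4 retrieved documents are relevant, out of 5 relevant documents overall.
p, r = precision_recall(retrieved=["d1", "d2", "d3", "d4"],
                        relevant=["d1", "d2", "d3", "d5", "d6"])
print(f"precision={p:.2f}, recall={r:.2f}")  # precision=0.75, recall=0.60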

