1
Evaluating quality: the MILE method applied to museum Web sites
Franca Garzotto - HOC Hypermedia Open Center, Politecnico di Milano
Maria Pia Guermandi - Istituto Beni Culturali, Regione Emilia-Romagna
2
Outline
– The need for measuring
– What is MILE
– MILE applied to museum web sites
3
The need for measuring quality
Once quality criteria are defined (see the Minerva Quality Framework), we need a measurement method to evaluate quality:
– an evaluation procedure or process
– a metric
4
What is MILE?
Developed at Politecnico di Milano (Hypermedia Open Centre) and at the University of Lugano, Communication Sciences (TEC-lab)
A systematic method to evaluate the quality in use of web applications, i.e. USABILITY
– Usability is the effectiveness, efficiency and satisfaction with which specified users can achieve specified goals in particular environments (ISO 9241-11)
Combines structured inspection with empirical testing
Scenario-driven
Applied so far to museum web sites (see next), educational and e-commerce web applications
5
MILE principles
The usability of an application can be analysed at two levels:
1: general (for any web application)
2: domain-specific (e.g. museum web sites)
6
MILE principles (cont.)
Separating different levels of analysis:
– CONTENT
– COGNITIVE ASPECTS
– NAVIGATION and INTERACTION
– GRAPHICAL DESIGN
– TECHNOLOGY (PERFORMANCE)
7
MiLE key concepts 1: ABSTRACT TASK
An abstract task (AT) describes a pattern of inspection activity:
– it captures usability experience
– it identifies the application features on which it is important to focus inspection
– it describes the actions the inspector should perform during the evaluation
8
MiLE key concepts 1: ABSTRACT TASK (cont.)
Abstract tasks:
– enforce standardisation and uniformity
– reduce the time and cost needed for inspection
So far, ATs have been defined for:
– level 1 (general): navigation, interaction, multiple-media dynamics
– level 2 (domain-specific): e.g. museum web sites (see next)
9
MiLE key concepts 2: SCENARIOS (1)
For evaluating domain-specific aspects, MILE provides SCENARIOS.
A scenario is:
– a story about use (Carroll, 2002)
– the description of a concrete episode of use of the application (Cato, 2001)
In MILE, a scenario is composed of:
– an abstract task
– a user profile: the description of a stereotyped user, capturing the relevant features of typical stakeholders
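Purely as an illustration, a scenario can be pictured as a small data structure pairing an abstract task with a user profile; the Python sketch below uses illustrative class and field names, not identifiers defined by MILE.

```python
from dataclasses import dataclass

@dataclass
class AbstractTask:
    # A reusable pattern of inspection activity,
    # e.g. "find all the works of a given artist"
    description: str
    relevant_attributes: list[str]   # usability attributes to score during inspection

@dataclass
class UserProfile:
    # A stereotyped user capturing the relevant features of typical stakeholders
    description: str

@dataclass
class Scenario:
    # In MILE, a scenario combines an abstract task with a user profile
    task: AbstractTask
    profile: UserProfile
```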
10
MiLE key concepts 2: SCENARIOS (2)
The definition of scenarios is based on user-requirements investigation and analysis carried out with domain experts and user experts.
Scenarios are needed to weigh the relevance of the usability violations detected during inspection (see process).
11
MiLE: a 5-phase process
1. Modeling the application under inspection: identify the critical areas of the application relevant for a usability evaluation
2. Performing abstract tasks: execute the actions described by the ATs
3. Scoring: measure the accomplishment of the abstract tasks through usability attributes
4. Weighting: balance the scores with relevance weights according to user profiles and communication goals (a sketch of scoring and weighting follows below)
5. Empirical testing: user testing
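A minimal sketch of the scoring and weighting phases, assuming inspectors rate each relevant attribute on a numeric scale and that each scenario assigns relevance weights to those attributes; the weighted average shown is one plausible aggregation, not the normative MILE metric.

```python
def weighted_usability_score(scores: dict[str, float],
                             weights: dict[str, float]) -> float:
    """Combine raw attribute scores (scoring phase) with scenario-specific
    relevance weights (weighting phase) into a single figure.

    scores  -- inspector ratings per attribute, e.g. on a 0-10 scale
    weights -- relevance of each attribute for the scenario, e.g. in [0, 1]
    """
    total_weight = sum(weights.values())
    if total_weight == 0:
        raise ValueError("at least one attribute must have a non-zero weight")
    return sum(scores[attr] * w for attr, w in weights.items()) / total_weight
```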
12
Bologna Group
13
Contents Survey Schema for museum Web sites
1. Site presentation: general information about the Web-site structure
2. The real museum: contents and functions referring to a physical museum
– museum presentation
– how to reach
– how to visit
– information about museum services
– information about the museum educational department
– promotion and fidelization policies
– information about museum activities and events
– information about museum publishing
3. The virtual museum: contents and functions exploiting the communicative strength of the medium
– collections on-line
– single-item descriptions
– educational web activities
– on-line promotion and fidelization policies
– on-line events
– on-line publishing
14
Abstract task dimensions
15
Abstract task attributes
– Efficiency: the action can be performed successfully and quickly
– Authority: the author is competent in relation to the subject
– Currency: the time scope of the content's validity is clearly stated; the information is up to date
– Consistency: similar pieces of information are dealt with in a similar fashion
– Structure effectiveness: the organization of the content pieces is not disorienting
– Accessibility: the information is easily and intuitively accessible
– Completeness: the user can find all the information required
– Richness: the information required is rich (many examples, data, …)
– Clarity: the information is easy to understand
– Multilevel: the information is given at different levels of understanding
– Multimediality: different media are used to convey the information
– Multilinguisticity: the information is given in more than one language
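Purely as an illustration, the checklist above can be kept as a machine-readable catalogue that inspectors score against during the scoring phase; the dictionary below paraphrases the attribute definitions and is a hypothetical representation, not part of the method itself.

```python
# Usability attributes used to score abstract tasks (descriptions paraphrased
# from the list above); an inspector attaches a score to each relevant one.
ATTRIBUTES = {
    "efficiency":              "the action can be performed successfully and quickly",
    "authority":               "the author is competent in relation to the subject",
    "currency":                "the validity period is stated and the information is up to date",
    "consistency":             "similar pieces of information are handled in a similar fashion",
    "structure effectiveness": "the organization of the content pieces is not disorienting",
    "accessibility":           "the information is easily and intuitively accessible",
    "completeness":            "the user can find all the information required",
    "richness":                "the information is rich (many examples, data, ...)",
    "clarity":                 "the information is easy to understand",
    "multilevel":              "the information is given at different levels of understanding",
    "multimediality":          "different media are used to convey the information",
    "multilinguisticity":      "the information is given in more than one language",
}
```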
16
User scenarios
17
ABSTRACT TASK: find the educational activities occurring on a range of dates in a real museum
USER PROFILE: a family with two sons aged 5-10, living in the town where the real museum is located
RELEVANT ATTRIBUTES:
(A1) currency of the information
(A2) quality of the organization of the information
(A3) completeness of the information
(A4) richness of the information provided
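To show how the weighting step uses such a scenario, the snippet below pairs hypothetical relevance weights for this family profile with hypothetical inspector scores for attributes A1-A4 and computes a weighted average; the numbers are placeholders for illustration only, not the results reported for the inspected sites.

```python
# Hypothetical relevance weights for the family profile (currency of event
# information matters most when planning a visit) and hypothetical inspector
# scores on a 0-10 scale for attributes A1-A4.
weights = {"A1 currency": 1.0, "A2 organization": 0.6,
           "A3 completeness": 0.8, "A4 richness": 0.4}
scores  = {"A1 currency": 7.0, "A2 organization": 5.0,
           "A3 completeness": 8.0, "A4 richness": 6.0}

weighted = sum(scores[a] * w for a, w in weights.items()) / sum(weights.values())
print(round(weighted, 1))  # 6.7 with these placeholder values
```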
18
Musée du Louvre – www.louvre.fr
National Gallery – www.nationalgallery.org.uk
31
SCORING / WEIGHTING
32
ABSTRACT TASK: find all the works of a given artist
USER PROFILE: an art historian looking for information about a topic he/she is currently researching
RELEVANT ATTRIBUTES:
(A1) effectiveness of the information
(A2) completeness of the information
(A3) richness of the information
(A4) navigation organization
33
Hermitage Museum – www.hermitagemuseum.org
Metropolitan Museum – www.metmuseum.org
51
SCORING / WEIGHTING