Technology Acceptance Model (TAM) Evaluation for an Academic Information System (Case Study: Ma Chung University). Soetam Rizky Wicaksono, Audrey Amelia. Ma Chung University.


CONTACT: Universitas Ma Chung, Villa Puncak Tidar N-01, Malang 65155, Indonesia.

REFERENCES

[1] R. Pressman, Software Engineering, 5th ed., McGraw-Hill.
[2] H. C. Chan and H.-H. Teo, "Evaluating the boundary conditions of the technology acceptance model: An exploratory investigation," ACM Transactions on Computer-Human Interaction, vol. 14, no. 2.
[3] E. Frokjaer and K. Hornbaek, "Metaphors of human thinking for usability inspection and design," ACM Transactions on Computer-Human Interaction, vol. 14, no. 4.
[4] B. P. Bailey and S. T. Iqbal, "Understanding changes in mental workload during execution," ACM Transactions on Computer-Human Interaction, vol. 14, no. 4.
[5] A. Burton-Jones and G. S. Hubona, "The mediation of external variables in the technology acceptance model," Information and Management, vol. 43.
[6] F. D. Davis, "A technology acceptance model for empirically testing new end-user information systems: theory and results."

Evaluation of an implemented information system (IS) is often a stage forgotten by software developers. In the spiral model, evaluation is considered the last stage [1]; it also serves as the starting stage of the next development cycle. However, once an IS has been in use for a while, it is strongly advisable to direct the evaluation toward usability, and the evaluation should produce recommendations and revisions for a better IS. This evaluation uses the technology acceptance model (TAM) [2, 3]. In general, the evaluation can be obtained from (1) questionnaires based on user responses, or (2) questions that are not disclosed and would otherwise be missed when users interact directly with the software [3].
Another way is to measure the rate of errors made by the user and treat the results as disruptions or interruptions, which ultimately have a negative effect referred to as the cost of interruptions [4].

Background

The early stage of the evaluation is done by analyzing the TAM results against the error rates produced by users, which call for revision and refinement directly from the software developer. The results are also cross-checked against the availability of guidelines from the software developer. This yields Perceived Usefulness (PU) and Perceived Ease of Use (PEOU) scores that describe user acceptance of the information system and form the basis for evaluation at a later stage [2]. The TAM evaluation results are expected to raise user awareness of the importance of disclosing PU and PEOU as simple, honest feedback to the software developers. Within the scope of this study, the TAM evaluation is expected to act as a silent interaction between user and developer, considering that think-aloud evaluation is still very difficult to carry out in practice at an institution that maintains a high level of manners (especially in Indonesia).

ABSTRACT

Evaluation of information technology is an activity that is often forgotten by developers. Moreover, if an application system has been running long enough, it is strongly recommended to evaluate it in terms of usability. One suitable evaluation method in this case is the Technology Acceptance Model (TAM), which uses two types of evaluation: (1) Perceived Usefulness and (2) Perceived Ease of Use. Observation of the error rate that occurs while users operate the system can also support the TAM evaluation results, and frequency of use together with interview results on the availability of guidelines provides additional evaluation material for TAM.
All of the data collection results and analysis are summarized to improve the effectiveness of performance, even though the system still requires periodic development to meet the needs of users.

Keywords: Technology Acceptance Model, Software Engineering, Academic Information System, Usability

Result: Perceived Usefulness

The evaluation of Perceived Usefulness covers four items: (1) increasing work effectiveness, (2) improving overall performance, (3) faster usage, and (4) menu usability.

No  Question                 Yes (%)   No (%)
1   Faster task completion   94.44     5.56
2   Work effectiveness       94.44     5.56
3   Overall performance      83.33     16.67
4   Menu acceptance          77.78     22.22

The summary of the PU results shows that more than 90% of respondents said the system helps them complete work more quickly and can increase work effectiveness. In addition, more than 80% of respondents stated that using the academic information system can improve overall performance and found it in accordance with the modules already completed. In summary, the system enables jobs to be completed faster, but the existing modules are still incomplete, so overall performance cannot yet improve.

Result: Perceived Ease of Use

The Perceived Ease of Use results consist of four main items: (1) ease of use without training, (2) completion of everyday tasks, (3) ease of interaction with the system, and (4) learnability of new menus.

No  Question                      Yes (%)   No (%)
1   Usage without training        83.33     16.67
2   Ease of daily tasks           77.78     22.22
3   Understandable interaction    83.33     16.67
4   New menus are easy to learn   83.33     16.67

The summary of these results indicates that respondents still show some resistance to the system under test. The drop in PU percentages compared with the PEOU results is a reaction from respondents implicitly indicating that they still want a system that is subjectively better for each individual.
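The yes/no percentages above can be reproduced with a short tally script. As a minimal sketch: the poster does not state the number of respondents, but the reported figures are consistent with an assumed n = 18 (e.g. 94.44% is 17/18), so the counts below are inferred, not taken from the source.

```python
# Sketch: turning yes/no questionnaire counts into TAM acceptance percentages.
# ASSUMPTION: 18 respondents, inferred from the reported percentages
# (94.44% = 17/18, 83.33% = 15/18, 77.78% = 14/18); not stated in the poster.

def acceptance_rate(yes_count: int, total: int) -> float:
    """Percentage of respondents answering 'yes' to one TAM item."""
    return round(100 * yes_count / total, 2)

N = 18  # assumed respondent count

# Inferred 'yes' counts for the Perceived Usefulness items.
pu_items = {
    "Faster task completion": 17,
    "Work effectiveness": 17,
    "Overall performance": 15,
    "Menu acceptance": 14,
}

for item, yes in pu_items.items():
    print(f"{item}: yes={acceptance_rate(yes, N)}%  "
          f"no={acceptance_rate(N - yes, N)}%")
```

The same tally applied to the PEOU counts (15, 14, 15, 15 out of 18) reproduces that table as well.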
In addition, the level of understanding and usage of each respondent has not been measured in detail, so further research with qualitative methods is still required to obtain optimal results.

CONCLUSIONS

The linkage between the PU and PEOU evaluation results, the observed error rate, and the average frequency of use of the Academic Information System shows how the system is actually used, and indicates usability even though errors still occur during use. The average daily use of the application also determines whether users accept it and employ it as a means of supporting improvements in their performance. These results answer whether this information system should be developed in a better direction or instead needs an overall revamp.