Profiling/Tracing Method and Tool Evaluation Strategy Summary Slides Hung-Hsun Su UPC Group, HCS lab 1/25/2005

Profiling/Tracing Method

Experimental performance measurement
Instrumentation – insertion of instrumentation code into the program (in general)
Measurement – the actual measuring stage
Analysis – filtering, aggregation, and analysis of the gathered data
Presentation – display of the analyzed data to the user; the only phase that deals directly with the user
Optimization – the process of resolving the identified bottlenecks
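The minimal C sketch below (not taken from the slides; compute_kernel and the trivial reporting are illustrative assumptions) shows how the instrumentation, measurement, and presentation phases map onto actual code: a probe is inserted around a region of interest, the elapsed time is measured, and the result is displayed to the user.

```c
/* Minimal sketch (not from the slides) of the instrumentation and
 * measurement phases: timing probes are inserted around a region of
 * interest, and the raw measurement is reported to the user.
 * compute_kernel() is a hypothetical user routine. */
#include <stdio.h>
#include <time.h>

static double elapsed_s(struct timespec a, struct timespec b) {
    return (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) * 1e-9;
}

void compute_kernel(void) { /* application work being measured */ }

int main(void) {
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);   /* instrumentation: start probe */
    compute_kernel();                      /* measurement happens here     */
    clock_gettime(CLOCK_MONOTONIC, &t1);   /* instrumentation: stop probe  */
    /* presentation: display the (trivially analyzed) data to the user */
    printf("compute_kernel took %.6f s\n", elapsed_s(t0, t1));
    return 0;
}
```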

Instrumentation (1)
Overhead
  Manual – the amount of work required from the user
  Performance – the overhead the tool adds to the program
Profiling / Tracing
  Profiling – collection of statistical event data; generally refers to filtering and aggregating a subset of event data after the program terminates
  Tracing – records as many events as possible in logical order (generally with timestamps); can be used to reconstruct accurate program behavior, but requires a large amount of storage. Two ways to lower tracing cost: (1) a compact trace file format, (2) a smart tracing system that turns tracing on and off
Manual vs. Automatic – whether the user or the tool is responsible for instrumenting the original code; a categorization of which events are better suited to which method is desirable
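The following C sketch (hypothetical event and file names, not any real tool's API) contrasts the two collection styles for a single event type: the profiling path keeps only a count and an accumulated time, while the tracing path writes one timestamped record per occurrence and therefore needs far more storage.

```c
/* Sketch contrasting profiling vs. tracing for one event type, "barrier".
 * Profiling keeps only aggregate statistics; tracing appends one
 * timestamped record per occurrence. Names and data are hypothetical. */
#include <stdio.h>

static unsigned long barrier_count = 0;    /* profile: running count     */
static double        barrier_time  = 0.0;  /* profile: accumulated time  */
static FILE         *trace_fp      = NULL; /* trace: per-event log file  */

void record_barrier(double start_s, double end_s) {
    /* profiling: aggregate into two numbers, reported after the run */
    barrier_count++;
    barrier_time += end_s - start_s;

    /* tracing: one timestamped record per event, kept in logical order */
    if (trace_fp)
        fprintf(trace_fp, "BARRIER %.9f %.9f\n", start_s, end_s);
}

int main(void) {
    trace_fp = fopen("trace.log", "w");    /* hypothetical trace file */
    record_barrier(0.000100, 0.000150);    /* fake measurements       */
    record_barrier(0.000300, 0.000420);
    if (trace_fp) fclose(trace_fp);
    printf("barriers=%lu total=%.9f s\n", barrier_count, barrier_time);
    return 0;
}
```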

Instrumentation (2)
Number of passes – the number of times a program must be executed to obtain performance data. One pass is desirable for long-running programs, but multiple passes can provide more accurate data (e.g., first pass = profiling, later passes = tracing, using the profiling data to turn tracing on and off). Hybrid methods exist but may not be as accurate as multi-pass.
Levels – a tool needs at least source- and binary-level support to be useful (some events are better suited to the source level, others to the binary level)
  Source level – manual, pre-compiler, instrumentation language
  System level – library or compiler
  Operating system level
  Binary level – static or dynamic
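As an illustration of the multi-pass idea, the C sketch below (file and event names are assumptions) shows how a second, tracing pass might consult the profile produced by the first pass and enable tracing only for events the profile flagged, keeping trace files small.

```c
/* Sketch of the multi-pass idea: pass 1 writes a profile listing "hot"
 * events; pass 2 reads it and enables tracing only for those events.
 * File name and event names are hypothetical. */
#include <stdio.h>
#include <string.h>

#define MAX_HOT 64

static char hot[MAX_HOT][64];
static int  n_hot = 0;

/* pass 2 startup: load the list of hot events produced by pass 1 */
void load_profile(const char *path) {
    FILE *fp = fopen(path, "r");
    if (!fp) return;                       /* no profile: trace nothing */
    while (n_hot < MAX_HOT && fscanf(fp, "%63s", hot[n_hot]) == 1)
        n_hot++;
    fclose(fp);
}

/* called from instrumentation probes: trace only profiled-hot events */
int tracing_enabled(const char *event) {
    for (int i = 0; i < n_hot; i++)
        if (strcmp(event, hot[i]) == 0)
            return 1;
    return 0;
}

int main(void) {
    load_profile("pass1_profile.txt");     /* hypothetical output of pass 1 */
    printf("trace MPI_Barrier? %d\n", tracing_enabled("MPI_Barrier"));
    return 0;
}
```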

Tool Evaluation Strategy

Feature (section), description, information to gather, categories, and importance rating for each evaluation criterion:

Available metrics ( ) – Kinds of metrics/events the tool can track (e.g., function, hardware, synchronization). Gather: which metrics it can provide (function, hardware, …). Categories: Productivity. Importance: Critical.

Cost (9.1.1) – Cost of obtaining the software, licenses, etc. Gather: how much. Categories: Miscellaneous. Importance: Average.

Documentation quality (9.3.2) – Helpfulness of the documentation for understanding the tool's design and its usage (usage is more important). Gather: is the documentation clear and helpful? Categories: Miscellaneous. Importance: Minor.

Extendibility (9.3.1) – Ease of (1) adding new metrics and (2) extending to new languages, particularly UPC/SHMEM. Gather: (1) estimate of how easy it is to extend to UPC/SHMEM; (2) how easy it is to add new metrics. Categories: Miscellaneous. Importance: Critical.

Filtering and aggregation ( ) – Filtering is the elimination of "noise" data; aggregation is the combining of data into a single meaningful event (a small sketch of both steps follows these tables). Gather: does it provide filtering? Aggregation? To what degree? Categories: Productivity, Scalability. Importance: Critical.

Hardware support (9.1.4) – Hardware platforms the tool supports. Gather: which platforms? Categories: Usability, Portability. Importance: Critical.

Heterogeneity support (9.1.5) – Ability to run the tool on a system where nodes have different HW/SW configurations. Gather: does it support running in a heterogeneous environment? Categories: Miscellaneous. Importance: Minor.

Installation (9.1.2) – Ease of installing the tool. Gather: (1) how to get the software; (2) how hard it is to install; (3) components needed; (4) estimated number of hours needed for installation. Categories: Usability. Importance: Minor.

Interoperability ( ) – Ease of viewing the tool's results with other tools, using other tools in conjunction with this tool, etc. Gather: list of other tools that can be used with it. Categories: Portability. Importance: Average.

Learning curve (9.1.6) – Learning time required to use the tool. Gather: estimated learning time for the basic set of features and for the complete set of features. Categories: Usability, Productivity. Importance: Critical.

Manual overhead ( ) – Amount of work required from the user to instrument their program. Gather: (1) method for manual instrumentation (source code, instrumentation language, etc.); (2) automatic instrumentation support. Categories: Usability, Productivity. Importance: Average.

Measurement accuracy ( ) – Accuracy level of the measurement. Gather: evaluation of the measuring method. Categories: Productivity, Portability. Importance: Critical.

Multiple analyses ( ) – Amount of post-measurement analysis the tool provides; it is generally good to have different analyses for the same set of data. Gather: does it provide multiple analyses? Are they useful? Categories: Usability. Importance: Average.

Multiple executions (9.3.5) – Tool support for executing multiple programs at once. Gather: does it support multiple executions? Categories: Productivity. Importance: Minor to Average.

Multiple views ( ) – Tool's ability to provide different views/presentations of the same set of data. Gather: does it provide multiple views? Are the views intuitive? Categories: Usability, Productivity. Importance: Critical.

Performance bottleneck identification ( ) – Tool's ability to identify the location of a performance bottleneck and to help resolve the problem. Gather: does it support automatic bottleneck identification? How? Categories: Productivity. Importance: Minor to Average.

Profiling/tracing support ( ) – Profiling/tracing method the tool utilizes. Gather: (1) profiling? tracing? (2) trace format; (3) trace strategy; (4) mechanism for turning tracing on and off. Categories: Productivity, Portability, Scalability. Importance: Critical.

Response time (9.2.6) – Amount of time before any useful information is fed back to the user after program execution. Gather: how long does it take to get back useful information? Categories: Productivity. Importance: Average.

Searching (9.3.6) – Tool support for searching for a particular event or set of events. Gather: does it support data searching? Categories: Productivity. Importance: Minor.

Software support (9.1.3) – Software the tool supports. Gather: (1) libraries it supports; (2) languages it supports. Categories: Usability, Productivity. Importance: Critical.

Source code correlation ( ) – Tool's ability to correlate event data back to the source code. Gather: can it correlate performance data to source code? Categories: Usability, Productivity. Importance: Critical.

System stability (9.3.3) – Stability of the tool. Gather: crash rate. Categories: Usability, Productivity. Importance: Average.

Technical support (9.3.4) – Responsiveness of the tool developers. Gather: (1) time to get a response from the developers; (2) quality/usefulness of system messages. Categories: Usability. Importance: Minor to Average.
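As a concrete illustration of the filtering and aggregation criteria above, the C sketch below (event names, durations, and the noise cutoff are all assumptions, not part of the evaluation plan) drops sub-microsecond "noise" events and then aggregates the surviving trace records into a per-event-name count and total time.

```c
/* Sketch of the filtering and aggregation steps: short "noise" events
 * are filtered out, and the remaining events are aggregated per event
 * name into a single count/total-time summary. Data is hypothetical. */
#include <stdio.h>
#include <string.h>

struct event { const char *name; double duration_s; };

#define MAX_NAMES 32

int main(void) {
    struct event trace[] = {            /* hypothetical raw trace data */
        {"barrier", 0.0420}, {"put", 0.0000003},
        {"barrier", 0.0310}, {"put", 0.0150},
    };
    const double noise_cutoff_s = 1e-6; /* filtering threshold */

    const char   *names[MAX_NAMES];
    unsigned long counts[MAX_NAMES];
    double        totals[MAX_NAMES];
    int n = 0;

    for (size_t i = 0; i < sizeof trace / sizeof trace[0]; i++) {
        if (trace[i].duration_s < noise_cutoff_s)
            continue;                   /* filtering: drop noise events  */
        int j;
        for (j = 0; j < n; j++)         /* aggregation: merge by name    */
            if (strcmp(names[j], trace[i].name) == 0)
                break;
        if (j == n) {
            if (n == MAX_NAMES) continue;   /* table full: skip (sketch only) */
            names[n] = trace[i].name; counts[n] = 0; totals[n] = 0.0; n++;
        }
        counts[j]++;
        totals[j] += trace[i].duration_s;
    }
    for (int j = 0; j < n; j++)         /* present the aggregated summary */
        printf("%-8s count=%lu total=%.6f s\n", names[j], counts[j], totals[j]);
    return 0;
}
```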