METRICS: A System Architecture for Design Process Optimization
Andrew B. Kahng and Stefanus Mantik

Abstract

We describe the architecture and prototype implementation of METRICS, a system aimed at improving design productivity through new infrastructure for design process optimization. A key precept for METRICS is that measuring a design process is a prerequisite to learning how to optimize that design process and continuously achieve maximum productivity. METRICS therefore (i) gathers characteristics of design artifacts, the design process, and communication during the system development effort, and (ii) analyzes and compares that data to analogous data from prior efforts. The METRICS infrastructure consists of three parts: (i) a standard metrics schema, along with metrics transmittal capabilities embedded directly into EDA tools or into wrappers around tools; (ii) a metrics data warehouse and an API for metrics retrieval; and (iii) data mining and visualization capabilities for project prediction, tracking, and diagnosis.

Motivations

- The value of CAD tool improvements is not clear: value is well-defined only in the context of the overall design process.
- The design process includes aspects not captured by any "flow/methodology" bubble chart: we must measure to diagnose, and diagnose to improve.
- There are many possibilities for what to measure. Solution: record everything, then mine the data.
- There is an unlimited range of possible diagnoses, e.g.:
  - the user performs the same operation repeatedly with nearly identical inputs: the tool is not acting as expected, or solution quality is poor and knobs are being twiddled;
  - the on-line docs are always open to a particular page: a command or option is unclear.

METRICS Reporting

- Web-based: platform independent, accessible from anywhere.
- Example: correlation plots created on-the-fly, to understand the relation between two metrics and to find the importance of certain metrics to the flow; always up-to-date.

Salient aspects of METRICS include the following.
First, a standard metrics schema, along with standard naming and semantics, allows a metric from one tool to have the same meaning as the same metric from another tool from a different vendor. Second, transmittal APIs that are easily embeddable within tools allow freedom from the "log files" that currently provide only limited visibility into EDA tools. With appropriate security and access restrictions, these APIs can prevent loss of proprietary information while still enabling detailed tracking of the design process. Third, at the heart of METRICS is a centralized data warehouse that stores metrics information. Several means of data retrieval and visualization (e.g., web-based project tracking and prediction) afford user flexibility. Finally, industry-standard components and protocols (HTTP, XML, Java, Oracle8i, etc.) are used to create a robust, reliable system prototype.

METRICS System Architecture

[Figure: METRICS system architecture. EDA tools, either with an embedded METRICS API or a tool wrapper, send XML over the inter/intra-net to a Java servlet backed by an Oracle8i database (SQL). Reports are requested from a web browser; plots are produced by a local graphing tool (GNUPlot), with third-party graphing tools (Excel, Lotus) via a wrapper as a future implementation.]

- METRICS Transmitter
  - No functional change to the tool: it uses the API to send the available metrics.
  - Low overhead. Example: a standard-cell placer using the METRICS API incurs less than 2% runtime overhead, and even less with buffering.
  - Transmittal failure won't break the tool: a child process handles transmission while the parent process continues its job.

[Figure: Capo/Cadence flow. From LEF/DEF input, the Capo Placer produces a placed DEF; QP ECO legalizes it (legal DEF); CongestionAnalysis produces a congestion map; WRoute produces a routed DEF; an incremental WRoute pass produces the final DEF. Each step transmits metrics to METRICS.]
METRICS Standards

- Standard metrics naming across tools: the same name has the same meaning, independent of tool supplier.
- Both generic metrics and tool-specific metrics.
- No more ad hoc, incomparable log files.
- Standard schema for the metrics database.

[Figure: tools with embedded or wrapper transmitters send metrics over the inter/intra-net to the Metrics Data Warehouse; a reporting server with data mining serves Java applets in web browsers.]

[Figure: example reports — aborts by task, congestion vs. wirelength, LVS convergence.]

Conclusions and Ongoing Work

- Completion of the METRICS server with Oracle8i, Java servlets, and an XML parser.
- Initial transmittal API in C++.
- METRICS wrapper for Cadence P&R tools.
- Simple reporting scheme for correlations.
- Work with the EDA and designer communities to establish standards:
  - tool users: the list of metrics needed for design process optimization;
  - tool vendors: implementation of the requested metrics with the standardized naming.
- Improve the transmitter: add message buffering and a "recovery" system for network/server failures.
- Extend the METRICS system to include project management tools, email communications, etc.
- Additional reports and data mining.

Example of METRICS XML, API and Wrapper

(The XML example's markup was lost in transcription; the recorded values were: TOOL 173 9 32 173 9 P32 93762541300 TOOL_NAME CongestionAnalysis.)

/** API Example **/
int main(int argc, char *argv[]) {
  ...
  toolID = initToolRun(projectID, flowID);
  ...
  printf("Hello World\n");
  sendMetric(projectID, flowID, toolID, "TOOL_NAME", "Sample");
  sendMetric(projectID, flowID, toolID, "TOOL_VERSION", "1.0");
  ...
  terminateToolRun(projectID, flowID, toolID);
  return 0;
}

## Wrapper example
($File, $PID, $FID) = @ARGV;
$TID = `initToolRun $PID $FID`;
open(IN, "< $File");
while (<IN>) {
  if (/Begin\s+(\S+)\s+on\s+(\S+.*)/) {
    system "sendMetrics $PID $FID $TID TOOL_NAME $1";
    system "sendMetrics $PID $FID $TID START_TIME $2";
  }
  ...
}
system "terminateToolRun $PID $FID $TID";

Capo/Cadence Flow

Observations from experience with a previous prototype

- Implemented by OxSigen LLC (Fenstermaker, George, Thielges) in the Siemens Semicustom Highway flow.
- The METRICS system must be non-intrusive; the best choice is for it to be embedded in the tools.
- "Big brother" issues must be spelled out clearly at the beginning, and buy-off from user advocates must be considered. All data must be anonymized; any attempt to profile or quantify individual performance on a project is dangerous (though useful).
- There is still a very big problem with flows. Ideally, the flow should be standardized, with a "Makefile"-type build environment for batch chip creation. There is no obvious common way to handle interactive tools yet, so we must be able to metricize flows in a standard way (which requires standard flows).
- The CAD/design community must get together to standardize (or better educate people on) flow terminology, especially now that so many new hybrid tools are emerging which combine traditional flow steps. With a standard set of agreed-upon milestones in the lifecycle of a design, we could begin accurate and more worthwhile benchmarking and prediction.
- There is still a very big problem with standardized data management (version control): lots of custom code is written to work around source code control systems in real-world environments.
- Project management tools need to be more standardized and widely used. They act as metrics transmitters for project-related information, such as the time allotted to certain tasks. This is critical for predicting project-related details (how long to completion from this point, etc.).

