1
Data analysis in I2U2
I2U2 all-hands meeting
Michael Wilde
Argonne MCS / University of Chicago Computation Institute
12 Dec 2005
2
Scaling up Social Science: Parallel Citation Network Analysis
Work of James Evans, University of Chicago, Department of Sociology
3
Scaling up the analysis
- Database queries of 25+ million citations
- Work started on small workstations
- Queries grew to month-long duration
- With the database distributed across the U of Chicago TeraPort cluster:
  - 50 (faster) CPUs gave a 100X speedup
  - Many more methods and hypotheses can be tested!
- Grid enables deeper analysis and wider access
4
Grids Provide Global Resources To Enable e-Science
5
Why Grids? eScience is the Initial Motivator…
- New approaches to inquiry based on:
  - Deep analysis of huge quantities of data
  - Interdisciplinary collaboration
  - Large-scale simulation and analysis
  - Smart instrumentation
  - Dynamically assembling the resources to tackle a new scale of problem
- Enabled by access to resources & services without regard for location & other barriers
…but eBusiness is catching up rapidly, and this will benefit both domains
6
Technology that enables the Grid
- Directory to locate grid sites and services
- Uniform interface to computing sites
- Fast and secure data set mover
- Directory to track where datasets live
- Security to control access
- Toolkits to create application services
  - Globus, Condor, VDT, many more
7
Virtual Data and Workflows
- The next challenge is managing and organizing the vast computing and storage capabilities provided by Grids
- Workflow expresses computations in a form that can be readily mapped to Grids
- Virtual data keeps accurate track of data derivation methods and provenance
- Grid tools virtualize location and caching of data, and recovery from failures
8
Virtual Data Process
- Describe data derivation or analysis steps in a high-level workflow language (VDL)
- VDL is cataloged in a database for sharing by the community
- Workflows for the Grid are generated automatically from VDL
- Provenance of derived results goes back into the catalog for assessment or verification
9
Virtual Data Lifecycle
- Describe
  - Record the processing and analysis steps applied to the data
  - Document the devices and methods used to measure the data
- Discover
  - I have some subject images – what analyses are available? Which can be applied to this format?
  - I'm a new team member – what are the methods and protocols of my colleagues?
- Reuse
  - I want to apply an image registration program to thousands of objects. If the results already exist, I'll save weeks of computation.
- Validate
  - I've come across some interesting data, but I need to understand the nature of the preprocessing applied when it was constructed before I can trust it for my purposes.
10
Virtual Data Workflow Abstracts Grid Details
11
Workflow - the next programming model?
12
Virtual Data Example: Galaxy Cluster Search (Sloan Data)
[Figures: galaxy cluster search DAG and galaxy cluster size distribution]
Work of Jim Annis, Steve Kent, Vijay Sehkri (Fermilab) and Michael Milligan, Yong Zhao (University of Chicago)
13
A virtual data glossary
- Virtual data
  - Defining data by the logical workflow needed to create it virtualizes it with respect to location, existence, failure, and representation
- VDS – Virtual Data System
  - The tools to define, store, manipulate and execute virtual data workflows
- VDT – Virtual Data Toolkit
  - A larger set of tools, based on NMI; VDT provides the Grid environment in which VDL workflows run
- VDL – Virtual Data Language
  - A language (text and XML) that defines the functions and function calls of a virtual data workflow
- VDC – Virtual Data Catalog
  - The database and schema that store VDL definitions
14
What must we "virtualize" to compute on the Grid?
- Location-independent computing: represent all workflow in abstract terms
- Declarations not tied to specific entities:
  - sites
  - file systems
  - schedulers
- Failures – automated retry when data servers and execution sites are unavailable (see the retry sketch below)
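As a rough illustration of the automated-retry idea above (this is not VDS code; submit_with_retry and the transfer call in the usage comment are invented for illustration), a minimal Python sketch of retrying a flaky grid operation with exponential backoff:

import random
import time

def submit_with_retry(submit, max_attempts=3, base_delay=30.0):
    """Retry a transient-failure-prone grid operation (data transfer, job
    submission). `submit` is any zero-argument callable that raises on
    failure -- a stand-in for a real grid client call."""
    for attempt in range(1, max_attempts + 1):
        try:
            return submit()
        except Exception:
            if attempt == max_attempts:
                raise  # site or data server still unavailable: give up
            # back off before retrying, with jitter to avoid synchronized retries
            time.sleep(base_delay * (2 ** (attempt - 1)) * (1 + random.random()))

# Hypothetical usage, wrapping an imaginary transfer client:
# submit_with_retry(lambda: transfer("gsiftp://site-a/data/f1", "/scratch/f1"))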
15
Expressing Workflow in VDL

TR grep (in a1, out a2) {
  argument stdin = ${a1};
  argument stdout = ${a2};
}
TR sort (in a1, out a2) {
  argument stdin = ${a1};
  argument stdout = ${a2};
}

DV grep (a1=@{in:file1}, a2=@{out:file2});
DV sort (a1=@{in:file2}, a2=@{out:file3});

Dataflow: file1 → grep → file2 → sort → file3
16
Expressing Workflow in VDL (annotated)
The same example, with the slide's callouts mapped to the code above:
- TR grep (in a1, out a2) { … } defines a "function" wrapper for an application and its "formal arguments"
- The argument lines bind the application's stdin/stdout to those formal arguments
- DV grep (a1=@{in:file1}, a2=@{out:file2}); defines a "call" to invoke the application, providing "actual" argument values for the invocation
- Applications are connected via output-to-input dependencies: file2 is grep's output and sort's input
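To make the output-to-input dependency idea concrete, here is a minimal Python sketch (not part of VDS; the grep search pattern "the" is invented, and a real run would be planned onto Grid sites rather than executed locally) that runs the same grep-then-sort pipeline in dependency order, redirecting stdin/stdout as the TR definitions declare:

import subprocess

# Each entry mirrors a DV call: an application plus its input and output files.
derivations = [
    {"app": ["sort"], "stdin": "file2", "stdout": "file3"},
    {"app": ["grep", "the"], "stdin": "file1", "stdout": "file2"},  # search pattern invented for illustration
]

def run_workflow(derivations, preexisting=frozenset({"file1"})):
    """Run derivations in dependency order: a step runs only once its input file exists."""
    produced, pending = set(), list(derivations)
    while pending:
        # pick any step whose input has been produced (or existed beforehand)
        step = next(d for d in pending if d["stdin"] in produced or d["stdin"] in preexisting)
        with open(step["stdin"]) as fin, open(step["stdout"], "w") as fout:
            subprocess.run(step["app"], stdin=fin, stdout=fout, check=True)
        produced.add(step["stdout"])
        pending.remove(step)

# run_workflow(derivations)  # requires a local "file1" to exist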
17
Essence of VDL
- Elevates specification of computation to a logical, location-independent level
- Acts as an "interface definition language" at the shell/application level
- Can express composition of functions
- Codable in textual and XML form
- Often machine-generated to provide ease of use and higher-level features
- A preprocessor provides iteration and variables (see the sketch below)
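Because VDL is often machine-generated, the iteration bullet can be illustrated with a few lines of Python that emit one DV call per input dataset. This is a toy stand-in for the VDL preprocessor, not its actual implementation; the dataset names are placeholders, and the reorient call follows the fMRI example later in the deck:

# Emit one DV call per BOLD sequence, in the style of the fMRI example later
# in this deck. Not the real VDL preprocessor -- just an illustration of why
# VDL is usually generated rather than written by hand.
bold_sequences = ["bold1", "bold2", "bold3"]  # placeholder dataset names

def dv_reorient(seq):
    return (
        f'DV reorient (\n'
        f'  input  = [ @{{in: "{seq}.img"}}, @{{in: "{seq}.hdr"}} ],\n'
        f'  output = [ @{{out: "$CWD/FUNCTIONAL/r{seq}.img"}},\n'
        f'             @{{out: "$CWD/FUNCTIONAL/r{seq}.hdr"}} ],\n'
        f'  direction = "y",\n'
        f');'
    )

print("\n\n".join(dv_reorient(seq) for seq in bold_sequences))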
18
Using VDL
- Generated directly for low-volume usage
- Generated by scripts for production use
- Generated by application tool builders as wrappers around scripts provided for community use
- Generated transparently in an application-specific portal (e.g. quarknet.fnal.gov/grid)
- Generated by drag-and-drop workflow design tools such as Triana
19
Basic VDL Toolkit
- Convert between text and XML representations
- Insert, update, and remove definitions from a virtual data catalog
- Attach metadata annotations to definitions
- Search for definitions
- Generate an abstract workflow for a data derivation request
- Multiple interface levels provided:
  - Java API, command line, web service
20
Representing Workflow
- Specifies a set of activities and control flow
- Sequences information transfer between activities
- VDS uses an XML-based notation called the "DAG in XML" (DAX) format
- The VDC represents a wide range of workflow possibilities
- A DAX document represents the steps to create a specific data product
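For flavor only, here is a short Python sketch that emits a DAG-in-XML-style description of the grep/sort example from the earlier VDL slides. The element and attribute names below are placeholders, not the actual DAX schema defined by the VDS tools:

import xml.etree.ElementTree as ET

# Illustrative layout only -- the real DAX schema differs.
dag = ET.Element("dag", name="grep-sort-example")

j1 = ET.SubElement(dag, "job", id="ID1", transformation="grep")
ET.SubElement(j1, "uses", file="file1", link="input")
ET.SubElement(j1, "uses", file="file2", link="output")

j2 = ET.SubElement(dag, "job", id="ID2", transformation="sort")
ET.SubElement(j2, "uses", file="file2", link="input")
ET.SubElement(j2, "uses", file="file3", link="output")

# Control-flow edge: the sort job depends on the grep job (file2 must exist first).
child = ET.SubElement(dag, "child", ref="ID2")
ET.SubElement(child, "parent", ref="ID1")

print(ET.tostring(dag, encoding="unicode"))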
21
Executing VDL Workflows
[Diagram: execution pipeline] A VDL program and the Virtual Data Catalog feed the Virtual Data Workflow Generator, which produces a workflow spec / abstract workflow; the "create execution plan" stage (job planner, job cleanup) yields a locally planned DAGman DAG, a statically partitioned DAG, or a dynamically planned DAG; DAGman and Condor-G then perform Grid workflow execution.
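As a hypothetical illustration of the "create execution plan" stage (none of this is VDS code; the replica and transformation catalogs are plain dictionaries with invented contents), the sketch below shows what concretizing an abstract workflow involves: resolving logical files to physical locations and binding each logical transformation to an executable on a chosen site:

# Toy planner: turn abstract jobs (logical transformations and logical files)
# into concrete jobs bound to a site and physical file locations.
replica_catalog = {  # logical file name -> physical URL (invented)
    "file1": "gsiftp://site-a.example.org/storage/file1",
}
transformation_catalog = {  # (site, logical transformation) -> executable path (invented)
    ("site-a", "grep"): "/usr/bin/grep",
    ("site-a", "sort"): "/usr/bin/sort",
}
abstract_workflow = [
    {"id": "ID1", "tr": "grep", "inputs": ["file1"], "outputs": ["file2"]},
    {"id": "ID2", "tr": "sort", "inputs": ["file2"], "outputs": ["file3"]},
]

def plan(workflow, site="site-a"):
    """Produce a concrete plan: stage-in transfers for files already registered
    in the replica catalog, plus one site-bound job per abstract job."""
    concrete = []
    for job in workflow:
        concrete.append({
            "id": job["id"],
            "site": site,
            "executable": transformation_catalog[(site, job["tr"])],
            "stage_in": [replica_catalog[f] for f in job["inputs"] if f in replica_catalog],
            "register": job["outputs"],  # outputs to register back into the replica catalog
        })
    return concrete

for step in plan(abstract_workflow):
    print(step)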
22
OSG: The "target chip" for VDS Workflows
Supported by the National Science Foundation and the Department of Energy.
23
VDS Applications

Application                           Jobs / workflow                        Levels   Status
ATLAS: HEP event simulation           500K                                   1        In use
LIGO: inspiral/pulsar                 ~700                                   2-5      Inspiral in use
NVO/NASA: Montage/morphology          1000s                                  7        Both in use
GADU: genomics (BLAST, …)             40K                                    1        In use
fMRI DBIC: AIRSN image processing     100s                                   12       In devel
QuarkNet: cosmic-ray science          <10                                    3-6      In use
SDSS: coadd; cluster search           40K; 500K                              2; 8     In devel / CS research
FOAM: ocean/atmosphere model          2000 (core app runs 250 8-CPU jobs)    3        In use
GTOMO: image processing               1000s                                  1        In devel
SCEC: earthquake simulation           1000s                                           In use
24
A Case Study – Functional MRI
- Problem: "spatial normalization" of images to prepare data from fMRI studies for analysis
- Target community is approximately 60 users at the Dartmouth Brain Imaging Center
- They wish to share data and methods across the country with researchers at Berkeley
- Process data from arbitrary user and archival directories in the center's AFS space; bring data back to the same directories
- The Grid needs to be transparent to the users: literally, "Grid as a Workstation"
25
A Case Study – Functional MRI (2)
- Based the workflow on a shell script that performs a 12-stage process on a local workstation
- Adopted a replica naming convention for moving users' data to Grid sites
- Created a VDL preprocessor to iterate transformations over datasets
- Utilized resources across two distinct grids – Grid3 and the Dartmouth Green Grid
26
Functional MRI Analysis
Workflow courtesy of James Dobson, Dartmouth Brain Imaging Center
27
Spatial normalization of a functional run
[Figures: dataset-level workflow and expanded (10-volume) workflow]
28
Conclusion: Motivation for the Grid
- Provide flexible, cost-effective supercomputing
  - Federate computing resources
  - Organize storage resources and make them universally available
  - Link them on networks fast enough to achieve federation
- Create usable supercomputing
  - Shield users from heterogeneity
  - Organize and locate widely distributed resources
  - Automate policy mechanisms for resource sharing
- Provide ubiquitous access while protecting valuable data and resources
29
Grid Opportunities
- Vastly expanded computing and storage
- Reduced effort as needs scale up
- Improved resource utilization, lower costs
- Facilities and models for collaboration
- Sharing of tools, data, procedures, and protocols
- Recording, discovery, review, and reuse of complex tasks
- Make high-end computing more readily available
30
fMRI Dataset processing

FOREACH BOLDSEQ
  DV reorient (    # Process Blood O2 Level Dependent Sequence
    input  = [ @{in: "$BOLDSEQ.img"}, @{in: "$BOLDSEQ.hdr"} ],
    output = [ @{out: "$CWD/FUNCTIONAL/r$BOLDSEQ.img"},
               @{out: "$CWD/FUNCTIONAL/r$BOLDSEQ.hdr"} ],
    direction = "y",
  );
END

DV softmean (
  input = [ FOREACH BOLDSEQ
              @{in: "$CWD/FUNCTIONAL/har$BOLDSEQ.img"}
            END ],
  mean  = [ @{out: "$CWD/FUNCTIONAL/mean"} ]
);
31
fMRI Virtual Data Queries

Which transformations can process a "subject image"?
  Q: xsearchvdc -q tr_meta dataType subject_image input
  A: fMRIDC.AIR::align_warp

List anonymized subject images for young subjects:
  Q: xsearchvdc -q lfn_meta dataType subject_image privacy anonymized subjectType young
  A: 3472-4_anonymized.img

Show files that were derived from patient image 3472-3:
  Q: xsearchvdc -q lfn_tree 3472-3_anonymized.img
  A: 3472-3_anonymized.img
     3472-3_anonymized.sliced.hdr
     atlas.hdr
     atlas.img
     …
     atlas_z.jpg
     3472-3_anonymized.sliced.img
32
Blasting for Protein Knowledge
Blasting the complete nr file for sequence similarity and function characterization.
Knowledge base: PUMA is an interface that lets researchers find information about a specific protein after it has been analyzed against the complete set of sequenced genomes (the nr file contains approximately 2 million sequences).
Analysis on the Grid: the analysis of the protein sequences occurs in the background in the Grid environment. Millions of processes are started, since several tools are run to analyze each sequence, such as finding protein similarities (BLAST), protein family domain searches (BLOCKS), and structural characteristics of the protein.
33
FOAM: Fast Ocean/Atmosphere Model
250-Member Ensemble Run on TeraGrid under VDS
[Diagram] For each ensemble member 1…N: remote directory creation → FOAM run → atmosphere, ocean, and coupler postprocessing → results transferred to archival storage.
Work of Rob Jacob (FOAM) and Veronica Nefedova (workflow design and execution).
34
FOAM: TeraGrid/VDS Benefits
[Figure: comparison of FOAM runs on a climate supercomputer vs. TeraGrid with NMI and VDS]
Visualization courtesy of Pat Behling and Yun Liu, UW Madison
35
Small Montage Workflow
~1200-node workflow, 7 levels
Mosaic of M42 created on the TeraGrid using Pegasus
36
LIGO Inspiral Search Application
- Describe…
The inspiral workflow application is the work of Duncan Brown (Caltech), Scott Koranda (UW Milwaukee), and the LSC Inspiral group.
37
US-ATLAS Data Challenge 2
[Chart: CPU-days of event generation using Virtual Data, mid July through Sep 10]
38
Provenance for DC2
How much compute time was delivered?

  CPU-years   Month   Year
  0.45        6       2004
  20          7       2004
  34          8       2004
  40          9       2004
  15          10      2004
  15          11      2004
  8.9         12      2004

Selected statistics for one of these jobs:
  start: 2004-09-30 18:33:56
  duration: 76103.33
  pid: 6123
  exitcode: 0
  args: 8.0.5 JobTransforms-08-00-05-09/share/dc2.g4sim.filter.trf CPE_6785_556... -6 6 2000 4000 8923 dc2_B4_filter_frag.txt
  utime: 75335.86
  stime: 28.88
  minflt: 862341
  majflt: 96386

Other provenance questions: Which Linux kernel releases were used? How many jobs were run on a Linux 2.4.28 kernel?
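The month-by-month table above is the output of a provenance query; as a hedged sketch of the underlying arithmetic (the job records below are invented, and the real provenance data lives in the catalog's database rather than a Python list), per-month CPU-years can be computed from recorded utime/stime values like this:

from collections import defaultdict

SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Invented job records, in the spirit of the per-job statistics shown above.
jobs = [
    {"month": "2004-09", "utime": 75335.86, "stime": 28.88},
    {"month": "2004-09", "utime": 61200.00, "stime": 40.10},
    {"month": "2004-10", "utime": 82000.00, "stime": 55.00},
]

def cpu_years_by_month(jobs):
    """Sum user+system CPU seconds per month and convert to CPU-years,
    mirroring the "how much compute time was delivered?" query above."""
    totals = defaultdict(float)
    for job in jobs:
        totals[job["month"]] += job["utime"] + job["stime"]
    return {month: secs / SECONDS_PER_YEAR for month, secs in totals.items()}

for month, years in sorted(cpu_years_by_month(jobs).items()):
    print(f"{month}: {years:.4f} CPU-years")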