The Grid Effort at UF Presented by Craig Prescott

Overview of UF Grid Activities
–Leadership
–Funded Grid Projects / Activities
–Middleware R&D
–Application Integration

Grid Members
Professors
–Avery
–Ranka
Researchers
–Bourilkov
–Cavanaugh
–Fu
–Kim
–Prescott
–Rodriguez
Ph.D. Students
–Chitnis (Sphinx)
–In (Sphinx)
–Kulkarni (Sphinx)
Master's Student
–Khandelwal (CODESH)
Former Master's Students
–Katageri (now at Linux Labs)
–Arbree (now at Cornell)
–Padala (now at Michigan)

Grid Leadership
Avery
–PI of GriPhyN ($11 M ITR Project)
–PI of iVDGL ($13 M ITR Project)
–Co-PI of CHEPREO
–Co-PI of UltraLight
–President of SESAPS
Ranka
–PI of Data Mining & Exploration Middleware for Grid and Distributed Computing ($1.5 M ITR Project)
–Project Co-Lead for Sphinx
–Senior Personnel on CHEPREO
–PI of MRI
Bourilkov
–Project Lead for CAVES and CODESH
Cavanaugh
–Project Coordinator for UltraLight
–Deputy Coordinator for GriPhyN
–Project Co-Lead for Sphinx
Prescott
–Co-organised the Boston OSG Technical Workshop
–US-CMS Production Manager
Rodriguez
–Deputy Coordinator for iVDGL
–Deployment Board Co-chair for OSG
Kim
–Project Lead for GridCAT

GriPhyN / iVDGL
–Develop the technologies & tools needed to exploit a distributed cyberinfrastructure
–Apply and evaluate those technologies & tools in challenging scientific problems
–Develop the technologies & procedures to support a persistent cyberinfrastructure
–Create and operate a persistent cyberinfrastructure in support of diverse discipline goals

Sphinx
Scheduling on a grid has unique requirements
–Information
–System
Decisions based on global views providing a Quality of Service are important
–Particularly in a resource-limited environment
Sphinx is an extensible, flexible grid middleware which
–Already implements many required features for effective global scheduling
–Provides an excellent “workbench” for future activities!
[Diagram: VDT Client → Sphinx Recommendation Engine → VDT Server]
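To give a rough idea of what "decisions based on global views providing a Quality of Service" can look like, here is a minimal Python sketch of a policy-aware site-ranking step. The SiteState fields, the scoring formula, and the example sites are assumptions for illustration only, not the actual Sphinx algorithm or data model.

# Hypothetical sketch of a global, policy-aware ranking step such as a
# Sphinx-like recommendation engine might perform; fields and weights are
# illustrative assumptions, not the real Sphinx implementation.
from dataclasses import dataclass

@dataclass
class SiteState:
    name: str
    free_cpus: int        # snapshot from grid monitoring (the "global view")
    queued_jobs: int
    policy_share: float   # fraction of the site promised to this VO by policy

def rank_sites(sites, job_cpus):
    """Return candidate sites ordered by a simple QoS score."""
    def score(s):
        if s.free_cpus < job_cpus:
            return float("-inf")  # cannot satisfy the request at all
        load_penalty = s.queued_jobs / max(s.free_cpus, 1)
        return s.policy_share * s.free_cpus - load_penalty
    return sorted((s for s in sites if score(s) > float("-inf")),
                  key=score, reverse=True)

if __name__ == "__main__":
    grid = [SiteState("UF", 120, 30, 0.4),      # example numbers only
            SiteState("Caltech", 60, 5, 0.2),
            SiteState("FNAL", 400, 600, 0.1)]
    for site in rank_sites(grid, job_cpus=8):
        print(site.name)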

CAVES & CODESH
–Concentrate on the interactions between scientists collaborating over extended periods of time
–Seamlessly log, exchange and reproduce results and the corresponding methods, algorithms and programs
–Automatic and complete logging and reuse of work or analysis sessions (between checkpoints)
–Extend the power of users working or performing analyses in their habitual way, giving them virtual data capabilities
–Build functioning collaboration suites (stay close to users!)
–First prototypes use popular tools: Python, ROOT and CVS; e.g. all ROOT commands and CAVES commands are available
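To make the "log, exchange and reproduce" idea concrete, here is a minimal Python sketch of session logging between checkpoints in the spirit of CODESH. The SessionLog class, its on-disk layout, and the checkpoint naming are hypothetical stand-ins; the real prototypes log into CVS and integrate with ROOT.

# Minimal sketch of CODESH-style session logging: record each command and its
# output between checkpoints, then write the log under a tag so a collaborator
# can replay it. Class names and file layout are assumptions, not the actual
# CODESH implementation.
import datetime
import pathlib
import subprocess

class SessionLog:
    def __init__(self, logdir="codesh_logs"):
        self.dir = pathlib.Path(logdir)
        self.dir.mkdir(exist_ok=True)
        self.entries = []

    def run(self, command):
        """Execute a shell command and keep its text and output for replay."""
        result = subprocess.run(command, shell=True, capture_output=True, text=True)
        self.entries.append((command, result.stdout))
        return result.stdout

    def checkpoint(self, tag):
        """Write the accumulated commands under a tag (a stand-in for a CVS commit)."""
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        path = self.dir / f"{tag}-{stamp}.log"
        path.write_text("\n".join(cmd for cmd, _ in self.entries))
        self.entries.clear()
        return path

if __name__ == "__main__":
    log = SessionLog()
    log.run("echo analysing run 1234")
    log.checkpoint("demo-session")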

Grid-enabled Analysis Environment

Grid3
Task Force
–Rodriguez member
Operations Group
–Online expert consultants
–Prescott, Kim, Rodriguez
Site Verification
–Validates the grid middleware installation on a grid site (see the sketch below)
–Prescott led development
Monitoring
–Kim, Prescott members of the Grid3 Monitoring Group
–Also members of the OSG Monitoring Technical Group
Grid3-Dev
–Fu, Rodriguez, Prescott
–Prescott maintains the Grid3-Dev deployment at UF
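As a rough illustration of what site verification involves, the sketch below checks that a few middleware commands and environment variables are present on a gatekeeper. The specific names checked (globus-job-run, globus-url-copy, VDT_LOCATION) are assumed examples, not the actual Grid3 test list.

# Rough sketch of a site-verification pass: confirm that expected middleware
# commands and environment variables exist on the node. The specific names
# below are assumptions for illustration, not the real Grid3 checks.
import os
import shutil

REQUIRED_COMMANDS = ["globus-job-run", "globus-url-copy"]   # assumed examples
REQUIRED_ENV_VARS = ["VDT_LOCATION"]                        # assumed example

def verify_site():
    problems = []
    for cmd in REQUIRED_COMMANDS:
        if shutil.which(cmd) is None:
            problems.append(f"missing command: {cmd}")
    for var in REQUIRED_ENV_VARS:
        if var not in os.environ:
            problems.append(f"missing environment variable: {var}")
    return problems

if __name__ == "__main__":
    issues = verify_site()
    print("site OK" if not issues else "\n".join(issues))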

GridCAT

Open Science Grid
Avery (senior member)
–Governance Board
–Steering Committee?
Rodriguez
–Deployment Board Co-chair
Prescott, Kim
–Monitoring Technical Group members
OSG Integration Group:
–Kim, Prescott, Rodriguez members
–Prescott, Rodriguez: SRM Server Integration activity

CHEPREO & UltraLight

GEMS

In-VIGO

FLR

HPC and UF Campus Grid

US-CMS Grid Testbed

CMS Computing
UF coordinates the US-CMS production effort
Over the past year, Prescott oversaw and was responsible for
–40% of the global CMS detector simulation
–25% of the global CMS event digitisation
Prescott also assists in publishing the MC data to the FNAL Tier-1 and the UF Tier-2 sites
Effort started in 2001 (pre-grid) with Bourilkov and Rodriguez
–Produced all phases, including PU

Workshops
Digital Divide Workshop, Brazil
–Avery
OSG, Boston
–Prescott
PNPA GGF, Berlin
–Cavanaugh
UTB Grid Summer School
–Rodriguez, Padala

CMS Application Integration
–Production and Virtual Data
–Analysis and Virtual Data
–ORCA on Grid3 in analysis mode

Outreach
FIU Grid3 Cluster
–Rodriguez
University of Chicago US-ATLAS Tier-2 Facility
–Rodriguez
Brazil
–Rodriguez
Korea
–Kim

Publications
CHEP
–VD in CMS Analysis
–VD in CMS Production
–UF Proto T2 Facility
–GridCAT
–Sphinx
The GRID II
–Federated Analysis for HEP
IPDPS
–Policy Based Scheduling

Plans
Distribution of user MC production data on Grid3
–Kim developing a web portal…

Conclusion