National Computational Science Alliance PACS Meeting – April, 2000 Boston University FY 00 Q2 Status Report.


VMR – Computational Grid
AAB/NRAC allocations:
– 16 projects
– >300,000 SUs allocated (200K committed)
– >23,000 SUs delivered in April
– >122,000 SUs delivered Q1–Q2/FY00
Startup allocations (<10K SUs each):
– External (non-BU PI): 36 projects, >57,000 SUs allocated
– Internal (BU PI): 90 projects, >122,000 SUs allocated
Personnel:
– Glenn Bresnahan, Claudio Rebbi, Aaron Fuegi, Kadin Tseng, Doug Sondak, Mike Dugan, Erik Brisson

VMR – Account Management
Full usage reporting (November 1998)
Development of transactions model
Implementation of transaction system
– Beta testing (within the week)
Development of user Web interface
Participation in Grid Forum Accounting Working Group (with NCSA)
Personnel:
– Glenn Bresnahan, Aaron Fuegi, Jennifer Teig von Hoffman, Mike Dugan
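The transactions model above can be pictured as an append-only ledger of charges and refunds against a project's service-unit (SU) allocation. The sketch below is a hypothetical illustration of that idea; the class and method names (`Allocation`, `charge`, `refund`) are invented for this example and are not the actual Alliance accounting system.

```python
# Hypothetical sketch of a transaction-style SU allocation ledger.
# All names here are illustrative, not the real accounting software.

class InsufficientAllocation(Exception):
    """Raised when a charge would overdraw the project's allocation."""
    pass

class Allocation:
    def __init__(self, project, su_allocated):
        self.project = project
        self.su_allocated = su_allocated
        self.transactions = []  # append-only log of (kind, SUs)

    def charge(self, sus):
        """Debit SUs for a completed job; refuse overdrafts."""
        if sus > self.balance():
            raise InsufficientAllocation(self.project)
        self.transactions.append(("charge", sus))

    def refund(self, sus):
        """Credit SUs back, e.g. for a job that failed to run."""
        self.transactions.append(("refund", sus))

    def balance(self):
        """Remaining SUs: allocation minus charges plus refunds."""
        charged = sum(s for kind, s in self.transactions if kind == "charge")
        refunded = sum(s for kind, s in self.transactions if kind == "refund")
        return self.su_allocated - charged + refunded

alloc = Allocation("bu-demo", 10_000)
alloc.charge(2_500)
alloc.refund(500)
print(alloc.balance())  # 8000
```

Keeping the raw transactions rather than a single running total makes full usage reporting a simple replay of the log.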

VMR – Security
Deployment of all latest GSI/Globus releases
– Currently
Maintenance of Globus mapping file
– Approx. 18 users
Participation in VMR demonstration at SC99
Participation in CPM
Personnel:
– Eric Jones, Mike Dugan, Russ Wolf, Glenn Bresnahan
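The Globus mapping file maintained here (the grid-mapfile) maps GSI certificate subject names onto local Unix accounts, one entry per line. The entries below are a hypothetical illustration of the format only; the subject names and usernames are invented, not taken from the actual BU file.

```
# grid-mapfile: "certificate subject DN"  local-account
"/O=Grid/O=Globus/OU=bu.edu/CN=Jane Doe"  jdoe
"/O=Grid/O=Globus/OU=ncsa.uiuc.edu/CN=John Smith"  jsmith
```

When a user authenticates with a GSI proxy certificate, the gatekeeper looks up the certificate's subject in this file to decide which local account the job runs under.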

VMR – Storage
Participation in storage group
Deployed and tested GASS “archive” script
Personnel:
– Eric Jones, Mike Dugan

VMR – Scheduling
Remote job submission (globus_submit)
– Interface to local queuing system (LSF)
Condor glide-in
Participation in Condor demonstration at SC99
Personnel:
– Eric Jones, Mike Dugan
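Remote submission through GRAM to a local LSF queue looks roughly like the command below, which uses the Globus 1.x `globusrun` client and an RSL job description. This is a sketch: the hostname is invented, and the local `globus_submit` wrapper mentioned above presumably hides these details from users.

```
# Hypothetical example; "grid.bu.edu" is a placeholder hostname.
# The jobmanager-lsf service hands the job to the local LSF queue.
globusrun -o -r grid.bu.edu/jobmanager-lsf \
    '&(executable=/bin/hostname)(count=1)'
```

Condor glide-in works the other way around: Condor daemons are themselves submitted through GRAM onto the remote nodes, which then join the local Condor pool as temporary resources.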

VMR Services
Trouble ticket working group
Origin Repository
– Ongoing
Personnel:
– Jennifer Teig von Hoffman, Kadin Tseng, Mike Dugan

VMR Services – Training
PACS MPI course development
– Completed 2 chapters
– Participating in peer review
– Recruiting beta testers/focus group
Personnel:
– Kadin Tseng, Jennifer Teig von Hoffman

VMR Services – Training
Delivery of on-site and Web-based materials
Enhancement of Web-based materials, esp. OpenMP
Instrumentation and reporting of Web usage
Participation in AG-delivered training
– FFTW, Globus (coming)
Personnel:
– Kadin Tseng, Jennifer Teig von Hoffman, Doug Sondak, Robert Putnam, Russ Wolf

VMR Services/Communities – Training
HPC training
– 2-day workshop (December)
– 26 attendees, 50% non-BU
– Grants for EPSCoR participants
Personnel:
– Kadin Tseng, Doug Sondak, Robert Putnam, Ariella Rebbi

Access Grid Deployment
AG classroom
– Acquired and renovated space ($$$$ – cost shared)
– Seating for 20+
– Rear-projected, 1x3 projector array
– Upgrading to 2x3 projector array
Participation in numerous AG events
– Demonstrations: regional to international visitors
Proposal for enhanced AG features
Multicast infrastructure deployment
– Local (BU) and regional (NoX)
Personnel:
– Jennifer Teig von Hoffman, Russ Wolf, Robert Putnam, Wayne Gilmore, Eric Jones, Erik Brisson, Ariella Rebbi, Laura Giannitrapani, Chuck von Lichtenberg, Glenn Bresnahan

Chautauqua
Participating in national planning
AG & Chautauqua workshop
Active participation in OSC and KU events
– Remote audiences for all events
– On-site AG support (OSC)
Corresponding local events
– Open house/technology demos, dinner
Personnel:
– Jennifer Teig von Hoffman, Russ Wolf, Glenn Bresnahan, Roscoe Giles, Raquell Holmes, Ariella Rebbi, Wayne Gilmore, Robert Putnam, Laura Giannitrapani

Effort

Name                   Total Effort   BostonU Cost Sharing %
G. Bresnahan           33%            50%
R. Giles               8%             35%
J. Teig von Hoffman    100%           35%
K. Tseng               100%           35%
E. Jones               90%            35%
A. Rebbi               50%            35%
R. Wolf                33%            80%
M. Dugan               20%            100%
W. Gilmore             10%            100%
R. Putnam              25%            80%
E. Brisson             10%            80%
D. Sondak              25%            100%