José D. Zamora, Sean R. Morriss and Manuela Campanelli.

GriPhyN Education and Outreach

Research: LIGO data analysis (LDAS/LAL), numerical relativity source simulations (Cactus), and now grid computing (VDT).
Outreach: UTB is classified as a Minority-Serving Institution.
UTB is a GriPhyN and iVDGL Tier 3 center.
Experiments: LIGO, ATLAS/CMS, SDSS.

GriPhyN/iVDGL map: a worldwide, multi-institutional collaboration between computer science research groups and data-intensive physics experiments.

UTB Linux cluster: Lobizon. Lobizon is a 96-node Linux cluster used for research in LIGO data analysis, gravitational-wave source simulation, and grid computing. It is currently administered by José Zamora. Grid enabling: José Zamora, Sean Morriss, and Santiago Peña.

UTB Linux Cluster `Lobizon'
[Diagram: Internet → router → firewall/router → master server → matrix switch → DHCP/image server → 96-node cluster]
The master (Lobizon) is connected to the Internet through a firewall; the 96 nodes communicate with the master. Each node is an Intel Pentium III (812 MHz) with 512 MB RAM and a 40 GB disk, running Red Hat Linux 7.1.

The Virtual Data Toolkit
The Virtual Data Toolkit (VDT) is a collection of software, developed by GriPhyN researchers, for constructing the first global petascale virtual grid for data-intensive science. The aim is to unify the view of data in a distributed grid environment. We use Condor to implement high-throughput computing on a large, on-site collection of distributively owned computing resources. Site-to-site connectivity is achieved through the common protocols and security services provided by Globus.
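As an illustration of the Condor usage described above, a minimal submit description file might look like the following sketch; the executable and file names are hypothetical placeholders, not taken from the actual UTB setup.

```
# Hypothetical Condor submit description: queue 96 independent
# analysis jobs, one per cluster node. All names here are
# illustrative placeholders, not the actual UTB configuration.
universe   = vanilla
executable = analyze_segment        # placeholder analysis program
arguments  = $(Process)             # each job receives its own index, 0..95
output     = seg_$(Process).out
error      = seg_$(Process).err
log        = analysis.log
queue 96
```

Submitting such a file with `condor_submit` places 96 jobs in the Condor queue, and Condor matches them to idle machines in the pool.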

Grid Enabling the Cluster
April 2002: we installed the first release, VDT 1.0.
May 2002: we successfully tested the VDT locally.
June 2002: we worked with the VDT team to improve their documentation.
July 2002: we updated to the development version of the VDT.
August 2002: we visited Fermilab and Madison to learn about Globus and Condor-G.
September 2002: we successfully tested Globus on the grid.

GriPhyN-LIGO SC2002 Grid Demo (SC2001 demo slide from Scott Koranda/Mike Wilde): Caltech, NCSA, USC/ISI, UWM, and UTB.