Middleware Camp: NMI (NSF Middleware Initiative)
Alan Blatecky, Program Director, Advanced Networking Infrastructure and Research


NMI Agreements/Awards
- 3 cooperative agreements were executed to establish NMI
- Service Integrator and Service Provider functions were combined into a common approach and effort
- 9 middleware research projects awarded

NMI Organization
Purpose of NMI: to design, develop, deploy, and support a reusable, expandable set of middleware functions and services that benefit applications in a networked environment.
Two teams:
- GRIDS Center (ISI, NCSA, UC, UCSD & UW)
- EDIT Team (Enterprise and Desktop Integration Technologies: EDUCAUSE, Internet2 & SURA)

What Does Middleware Do?
The purpose of middleware is:
1. To allow scientists and engineers to transparently use and share distributed resources, such as computers, data, and instruments;
2. To develop effective collaboration and communication tools, such as Grid technologies, desktop video, and other advanced services, to expedite research and education; and
3. To develop a working architecture and approach that can be extended to Internet users around the world.
Middleware is the stuff that makes "transparently use" happen, providing consistency, security, privacy, and capability.
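The "transparently use" idea above can be sketched in a few lines: a toy middleware layer that gives every client one uniform call while hiding each resource's native interface. This is an illustrative sketch only; all class and method names are hypothetical, not part of any NMI release.

```python
# Illustrative sketch: middleware as a uniform front end over heterogeneous
# resources. Each backend has its own native interface; the middleware hides
# that difference behind a single submit() call.

class ClusterBackend:
    def qsub(self, script):          # batch-scheduler-style native interface
        return f"cluster ran {script}"

class InstrumentBackend:
    def acquire(self, command):      # instrument-specific native interface
        return f"instrument ran {command}"

class Middleware:
    """Clients call submit(); they never see the backend differences."""
    def __init__(self):
        self.backends = {"cluster": ClusterBackend(),
                         "instrument": InstrumentBackend()}

    def submit(self, resource, task):
        backend = self.backends[resource]
        if isinstance(backend, ClusterBackend):
            return backend.qsub(task)
        return backend.acquire(task)

mw = Middleware()
print(mw.submit("cluster", "analysis.sh"))    # cluster ran analysis.sh
print(mw.submit("instrument", "scan"))        # instrument ran scan
```

The point of the sketch is the shape, not the code: the caller names a resource and a task, and the middleware supplies the consistency, security, and capability glue underneath.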

NMI Goals
- Facilitate scientific productivity
- Increase research collaboration through shared data, computing, facilities, and applications
- Support the education enterprise, from early adoption to deployment
- Establish a level of persistence and availability for users and developers
- Encourage the participation of industry partners, government labs, and agencies
- Encourage and support the development of standards and open-source approaches
- Enable scaling and sustainability to support the larger research community and beyond

NMI Process
- Experimental software & research applications
- Middleware deployment
- Consensus: disciplines, communities, public & private
- Early implementations: directories, Grid services, authentication, etc.
- Middleware testbeds: experimental, beta, scaling & "hardening"
- Early adopters: GriPhyN, NEES, campuses, etc.
- Dissemination & support
- International research & education

First-Year Objectives
- Develop and release a version of Grid/middleware
  - NMI Release 1 scheduled for April
  - NMI Release 2 probably in July/August
- Develop security and directory architectures, and best practices for campus integration
- Establish associated support and training mechanisms
- Develop partnerships with external groups and partners
- Develop a communication and outreach plan
- Develop a repository of NMI software and best practices

Major Grid Projects and Efforts
- DOE Science Grid (SciDAC)
- NASA Information Power Grid
- GriPhyN (Grid Physics Network)
- TeraGrid (Distributed Terascale Facility)
- iVDGL (International Virtual Data Grid Laboratory)
- BIRN (Biomedical Imaging Research Network)
- NEES (Network for Earthquake Engineering Simulation)
- Earth Systems Grid
- Space Grid
- PPDG (Particle Physics Data Grid)
- DataTAG (Trans-Atlantic Grid Testbed)

More Projects/Efforts
- UK Grid Support Center
- National Fusion Grid
- CEOS (Committee on Earth Observation Satellites)
- Astronomy Virtual Observatory
- European Data Grid
- ALMA (Atacama Large Millimeter Array)
- LHC Computing Grid (Large Hadron Collider)
- LIGO (Laser Interferometer Gravitational-Wave Observatory)
- NEON (National Ecological Observatory Network)
- SDSS (Sloan Digital Sky Survey)
- Open Grid Consortium
- Global Grid Forum

NMI Release v.1
Software components:
- Globus Toolkit (GRAM 1.5, MDS 2.2, GPT, GridFTP)
- Condor-G
- Network Weather Service
- KCA 1.0, CPM 2.0, KX
- eduPerson 1.5, eduOrg 1.0
Compliance, testing, and packaging
Best practices and policies (suite of directory and services)
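Condor-G, listed above, submits jobs to Globus GRAM gatekeepers through an ordinary Condor submit description file. A minimal sketch in the Condor-G syntax of that era might look like the following; the gatekeeper contact string and file names are placeholders, not taken from any NMI release.

```
# Hypothetical Condor-G submit description file (sketch only).
# The gatekeeper contact string and file names below are placeholders.
universe        = globus
globusscheduler = gatekeeper.example.edu/jobmanager-pbs
executable      = analysis
output          = analysis.out
error           = analysis.err
log             = analysis.log
queue
```

A user would hand this file to condor_submit, and Condor-G would manage the remote GRAM job on the user's behalf, including restarts and logging.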

NMI Testbed Program
- SURA call for proposals (responses due March 8)
- Testing to include:
  - integration and distribution to the desktop
  - interaction with common campus infrastructure
  - vertical integration with communities of users
  - component scalability and consistency
- 4 sponsored testbeds (plus ~4 unsponsored)
- Mary Fran Yafchak, project manager

MAGIC: Middleware And Grid Infrastructure Coordination
- Coordinates interagency Grid and middleware efforts
- Enhances and encourages interoperable Grid and middleware domains
- Promotes usable, widely deployed middleware tools and services
- Provides a federal voice for effective international coordination of Grid and middleware technologies

MAGIC Status
- Established by the Large Scale Networking Committee on January 8, 2002
- Representatives and structure being discussed
  - working charter being developed
  - DARPA, DOE, NSF, NASA, NIH, NIST
- Next steps:
  - identification of immediate concerns and issues
  - establishment of a MAGIC engineering team
  - establishment of a meeting schedule

2nd-Year Program of NMI
- Program announcement: March 1, 2002
- Proposals due

NMI Emphasis Areas for the 2nd-Year Program
- Distributed authorization and management tools
  - resource schedulers and reservation, especially across multiple domains
  - resource accounting and monitoring
  - predictive services, including Grid and network prediction tools
  - directories and certificate authorities
  - peer-to-peer middleware resources
- SIP-enabling collaboration tools
- Mobility: public-space 802.1x authentication infrastructure and performance-improvement tools
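The first emphasis area above, distributed authorization, can be illustrated with a minimal attribute-based access check of the kind such tools perform. This is a generic sketch under assumed names; the policy format and attribute values are hypothetical, not from any NMI component.

```python
# Illustrative sketch: attribute-based authorization. A user presents
# attributes (e.g. from a campus directory); the policy maps each resource
# to the attribute values that grant access.

POLICY = {
    # resource -> set of attribute values that grant access (hypothetical)
    "compute-cluster": {"faculty", "staff"},
    "telescope":       {"faculty"},
}

def authorize(user_attributes, resource):
    """Grant access if any user attribute satisfies the resource's policy."""
    allowed = POLICY.get(resource, set())
    return bool(allowed & set(user_attributes))

print(authorize(["faculty", "member"], "telescope"))   # True
print(authorize(["student"], "compute-cluster"))       # False
```

In a multi-domain Grid the policy and the attribute source would live in different administrative domains, which is exactly what makes distributed authorization harder than this local check.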

Additional Areas of Emphasis
- Security for operating systems and middleware software
  - user privacy management tools
  - user data integrity and authentication
  - authorization tools
  - peer-to-peer middleware resources
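"User data integrity and authentication" above is commonly built on a keyed hash (HMAC): the sender attaches a tag computed over the data, and the receiver recomputes it with the shared key. A minimal sketch, with placeholder key and message values:

```python
# Illustrative sketch: data integrity and origin authentication via HMAC.
# The key and messages below are placeholders for this example only.

import hashlib
import hmac

key = b"shared-secret-key"            # hypothetical shared key
message = b"experiment results v1"

# Sender: compute a tag over the message with the shared key.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# Receiver: recompute the tag and compare in constant time.
def verify(key, message, tag):
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))               # True: intact and authentic
print(verify(key, b"tampered results", tag))   # False: tag no longer matches
```

A tag mismatch tells the receiver either the data was altered or the sender did not hold the key, covering both integrity and authentication in one check.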