Summit 2009 Building a New User Community for Very High Quality Media Applications On Very High Speed Networks October 7, 2009 Jim DeRoest ResearchChannel

What is CineGrid?  CineGrid is a non-profit international membership organization.  CineGrid's mission is to build an interdisciplinary community focused on the research, development, and demonstration of networked collaborative tools to enable the production, use and exchange of very high-quality digital media over high-speed photonic networks.  Members of CineGrid are a mix of media arts schools, research universities, scientific laboratories, post-production facilities and hardware/software developers around the world, connected by 1 Gigabit Ethernet and 10 Gigabit Ethernet networks used for research and education.

CineGrid Founding Members  Cisco Systems  Keio University DMC  Lucasfilm Ltd.  NTT Network Innovation Laboratories  Pacific Interface Inc.  Ryerson University/Rogers Communications Centre  San Francisco State University/INGI  Sony Electronics America  University of Amsterdam  University of California San Diego/Calit2/CRCA  University of Illinois at Urbana-Champaign/NCSA  University of Illinois Chicago/EVL  University of Southern California, School of Cinematic Arts  University of Washington/ResearchChannel

CineGrid Institutional Members  Academy of Motion Picture Arts and Sciences, STC  California Academy of Sciences  Cinepost, ACE Prague  Dark Strand  i2CAT  JVC America  Korea Advanced Institute of Science and Technology (KAIST)  Louisiana State University, Center for Computation & Technology  Mechdyne  Meyer Sound Laboratories  Nortel Networks  Northwestern University, iCAIR  Naval Postgraduate School  Renaissance Computing Institute (RENCI)  Royal Swedish Institute of Technology  SARA  Sharp Corporation Japan  Sharp Labs USA  Tohoku University/Kawamata Lab  University of Manitoba, Experimental Media Centre  Waag Society

CineGrid Network/Exchange Members  AMPATH  CANARIE  CENIC  CESNET  CzechLight  Internet2  JA.NET  Japan Gigabit Network 2  National LambdaRail  NetherLight  NORDUnet  Pacific Wave  Pacific Northwest Gigapop  PIONIER  RNP  Southern Light  StarLight  SURFnet  WIDE

Historic Convergence Motivates CineGrid  The state of the art in media applications has historically been driven by three communities:  Entertainment, media, art and culture  Science, medicine, education and research  Military, intelligence, security and police  Adoption of digital media means their requirements are converging:  Fast networking with similar profiles  Access to shared instruments, specialized computers and massive storage  Collaboration tools for distributed, remote teams  Robust security for their intellectual property  Higher media quality, greater speed, more distributed applications  A next generation of trained professionals

1981 Francis Ford Coppola with Dr. Takashi Fujio “First Look” at HDTV Electronic Cinema

1999 ResearchChannel iHDTV: First HDTV Over Internet

2001 NTT Network Innovation Laboratories “First Look” at 4K Digital Cinema

2004 “First Look” at 100 Mpixel OptIPortal Scientific Visualization and Remote Collaboration

CineGrid Projects: “Learning by Doing” iGrid 2005 AES 2006 GLIF Holland Festival 2007

CineGrid: A Scalable Approach
 Consumer HD: HDV x 24/25/30/60, ~20 Mbps
 HDTV: HDTV x 24/25/30/60
 Stereo HD: HD 2 x 24/25/30, 200 Mbps - 3 Gbps
 SHD (Quad HD): SHD x 24/25/30, 250 Mbps - 6 Gbps
 Digital Cinema: 2K x 24 and 2K 2 x 24, from 250 Mbps; 4K x 24, from 500 Mbps
 Stereo 4K (future): 4K 2 x 24/30
 UHDTV (far future): 8K x 60
 More: Tiled Displays, Camera Arrays
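The range tops scale with pixel count, frame rate and bit depth; the low ends reflect compressed delivery. A minimal sketch of the uncompressed arithmetic, assuming typical frame dimensions for each class (the dimensions themselves are not on the slide):

```python
# Uncompressed bitrate = width x height x colors x bits per color x frames per second.
# Frame dimensions below are common values for each class, assumed for illustration.

def uncompressed_gbps(width, height, fps, bits_per_color=10, colors=3):
    """Raw RGB video bitrate in gigabits per second."""
    return width * height * colors * bits_per_color * fps / 1e9

formats = {
    "HDTV (1080 x 24)": (1920, 1080, 24),
    "2K DC x 24":       (2048, 1080, 24),
    "4K DC x 24":       (4096, 2160, 24),
    "8K UHDTV x 60":    (7680, 4320, 60),
}
for name, (w, h, fps) in formats.items():
    print(f"{name}: ~{uncompressed_gbps(w, h, fps):.1f} Gbps uncompressed")
```

This reproduces the familiar anchors: ~1.5 Gbps for uncompressed HDTV, ~6.4 Gbps for uncompressed 4K at 24 fps, and tens of Gbps for 8K at 60 fps.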

Need to Move Big Data Objects Globally
 Digital Motion Picture Audio Post-Production: 1 TV episode dubbing reference ~ 1 GB; 1 theatrical 5.1 final mix ~ 8 GB; 1 theatrical feature dubbing reference ~ 30 GB
 Digital Motion Picture Acquisition: 4K RGB x 24 FPS x 10 bit/color ~ 48 MB/frame uncompressed (ideal); 6:1 ~ 20:1 shooting ratios => 48 TB ~ 160 TB of digital camera originals
 Digital Dailies: HD compressed, 25 ~ 50 Mb/s
 Digital Post-Production and Visual Effects: gigabytes to terabytes moved to select sites, depending on project
 Digital Motion Picture Distribution: film printing in regions (features ~ 8 TB, trailers ~ 200 GB); Digital Cinema Packages to theatres (features ~ … GB per DCP, trailers ~ 2 - 4 GB per DCP); web download to consumers (features ~ 1.3 GB, TV shows ~ 600 MB)
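The acquisition numbers follow from per-frame arithmetic. A short sketch below, assuming a full-aperture 4K film scan (4096 x 3112) and a 2-hour finished feature, neither of which is stated on the slide; it lands near the quoted 48 TB ~ 160 TB range:

```python
# ~48 MB/frame matches a full-aperture 4K scan, RGB at 10 bits per color
# (frame dimensions and the 2-hour runtime are illustrative assumptions).
width, height = 4096, 3112
frame_mb = width * height * 3 * 10 / 8 / 1e6           # bytes per frame, in MB
print(f"~{frame_mb:.0f} MB/frame")                     # ~48 MB

fps, runtime_s = 24, 2 * 60 * 60                       # 24 fps, 2-hour feature
finished_tb = frame_mb * fps * runtime_s / 1e6         # MB -> TB for the cut feature
for ratio in (6, 20):                                  # 6:1 ~ 20:1 shooting ratios
    print(f"{ratio}:1 ratio -> ~{finished_tb * ratio:.0f} TB of camera originals")
```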

CineGrid Project Run over the Global Lambda Integrated Facility (GLIF) Backbone 2008 GLIF Visualization by Bob Patterson, NCSA/UIUC

CineGrid Exchange  CineGrid faces a growing need to store and distribute its own collection of digital media assets. The terabytes are piling up, and members want access to the materials for their experiments and demonstrations.  Pondering “The Digital Dilemma,” published by AMPAS in 2007, we studied the lessons learned by NDIIPP and NARA, as well as the pioneering distributed storage research at Stanford (LOCKSS) and at UCSD (SRB and iRODS).  The CineGrid Exchange was established to handle CineGrid’s own practical requirements AND to create a global-scale testbed with enough media assets at high enough quality, connected by fast enough networks, to enable exploration of strategic issues in digital archiving and digital library distribution for cinema, scientific visualization, medical imaging, etc.

CineGrid Exchange 2008 Geographically Distributed Repositories Linked by Fast Networks  UCSD/Calit2, San Diego: 40 TB with 10 Gbps connectivity  UvA, Amsterdam: 30 TB with 10 Gbps connectivity  Keio/DMC, Tokyo: 6 TB with 10 Gbps connectivity  Total storage = 76 TB. The CineGrid Exchange holds high-quality digital media assets, including 4K, 2K, HD, mono & stereo, still & motion pictures, plus audio.

CineGrid Exchange 2009  96 TB repository added by Ryerson in Toronto  48 TB repository added by CESNET in Prague  10 TB repository added by UIC/EVL in Chicago  10 TB repository to be added by AMPAS in Hollywood  16 TB repository to be added by NPS in Monterey  By end of 2009, global capacity of CineGrid Exchange will be 256 TB connected via 10 GigE cyberinfrastructure  Initiated CineGrid Exchange Project (CXP 2009) to implement multi-layer open-source asset management and user access framework for distributed digital media repository  Funding for CXP 2009 from AMPAS STC  Working Group: AMPAS, PII, Ryerson, UCSD, NPS, UW, UvA, Keio, NTT, CESNET, UIC
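The 256 TB figure is simply the 2008 total plus the 2009 additions listed above; a quick arithmetic check:

```python
# 2008 base (UCSD 40 TB + UvA 30 TB + Keio 6 TB) plus the 2009 additions.
base_2008 = 40 + 30 + 6                       # 76 TB
added_2009 = {"Ryerson": 96, "CESNET": 48, "UIC/EVL": 10, "AMPAS": 10, "NPS": 16}
print(base_2008 + sum(added_2009.values()), "TB")   # 256 TB
```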

CineGrid Exchange Architecture: Multi-Layer Open Software Stack Concept
 User Interface & Access: CineGrid Exchange Access Portal, for CineGrid by CineGrid
 Open Source Application for Collections Management & On-Line Access: CollectiveAccess, extended by Whirl-i-Gig for AMPAS
 Open Source Digital Repository Framework
 Open Source Middleware for Rule-Based Management of Distributed Asset & Storage Resources: iRODS by DICE, for CineGrid by Calit2
 Testbed Infrastructure of Distributed Storage and Network Links: described with the Resource Description Framework, for GLIF and CineGrid by UvA
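In this stack, the middleware layer's job is rule-based placement and replication of assets across the distributed repositories. As a toy illustration only (the real system delegates this to iRODS rules; the policy and numbers here are invented), a sketch of one such placement rule:

```python
# Toy placement rule: replicate an asset to N distinct sites with enough free
# space, preferring the sites with the most headroom. Illustrative only.
from dataclasses import dataclass

@dataclass
class Repository:
    site: str
    free_tb: float

def place_replicas(asset_tb, repos, copies=3):
    """Choose distinct sites with room for the asset, largest free space first."""
    candidates = sorted((r for r in repos if r.free_tb >= asset_tb),
                        key=lambda r: r.free_tb, reverse=True)
    if len(candidates) < copies:
        raise RuntimeError("replication policy cannot be satisfied")
    chosen = candidates[:copies]
    for r in chosen:
        r.free_tb -= asset_tb   # reserve space at each chosen site
    return [r.site for r in chosen]

# 2008-era repository sizes from the slides above, used here as free capacities.
repos = [Repository("UCSD/Calit2", 40), Repository("UvA", 30),
         Repository("Keio/DMC", 6), Repository("Ryerson", 96),
         Repository("CESNET", 48)]
print(place_replicas(2.0, repos))   # ['Ryerson', 'CESNET', 'UCSD/Calit2']
```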

CineGrid International Workshop UCSD/Calit2 in San Diego Save the Date: December 6-9, 2009