
Computing Update
User Group Board of Directors Meeting
Amber Boehnlein, IT Division Director

Outline
- IT Division News
- Other News

IT News
- 100% Internet uptime
- IT service survey: calls for more engagement/collaboration between IT & Physics. Last survey was in 2016; time for another one.
- Made improvements to User Audit and other user-management tools. We are always open to suggestions.
- PAC proposal submission improved, and this year we are deploying Unicheck.
- ORCID: now 'members'; rollout in process.
- Publications: 85%+ of manuscripts logged to DOE. We are consistently the best in the DOE complex.

2019 IT
- First phase of migration to Office 365 (cloud version) and other cloud services complete.
- Replacing the CCPR system with ServiceNow, which will allow dashboards and better quantification of tickets. Rolling out in March.
- Hiring students for a 'Cyber Operations Lab'.
- 2019 is largely an administrative year: many projects benefit Jefferson Lab but are of less interest to the user community.

SciComp News: Proposed Resources

Existing hardware at JLab:
- SciPhi cluster, LQCD-Ext (2016): 17,152 Xeon Phi KNL cores, plus 12,500 Xeon Phi cores as of last week
- 14,520 conventional Intel cores: 5,000 Physics Farm + 6,000 'DNR' nodes + 3,520 cores added in 2018

Proposed resources:

                                Current   FY19   FY20
  CPU (M-core-hours/year)          37      87     90
  Scratch & cache disk (PB)        0.65    1.1    2
  Tape (GB/s)                      3       5      7
  WAN bandwidth (Gbps)             10

SciComp News
- Developed a 3-year plan for computing resources for the experimental program.
- The 2018 S&C review endorsed the approach of a mix of resources; provisioning to peaks is not affordable.
- GlueX is running MC jobs offsite using OSG and (mostly) collaboration resources at universities.
- GlueX is also now running reconstruction at NERSC. We have allocations for GlueX and CLAS12 at NERSC for 2019.
- Network bottlenecks observed on our current 10G link to ESnet; some tuning required on our part. ESnet will light a second 10G link March/April 2018.
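The "tuning required on our part" for a 10G WAN path usually means host-level TCP settings. A minimal sketch along the lines of common ESnet guidance follows; the values are illustrative assumptions, not the lab's actual configuration:

```shell
# Illustrative TCP tuning for data-transfer hosts on a 10G path
# (values typical of ESnet guidance; not the settings actually deployed).
# Append to /etc/sysctl.conf and apply with `sysctl -p`.
net.core.rmem_max = 67108864               # max socket receive buffer (64 MB)
net.core.wmem_max = 67108864               # max socket send buffer (64 MB)
net.ipv4.tcp_rmem = 4096 87380 33554432    # min/default/max TCP receive buffer
net.ipv4.tcp_wmem = 4096 65536 33554432    # min/default/max TCP send buffer
net.ipv4.tcp_mtu_probing = 1               # robust path-MTU discovery on long paths
```

Larger buffers matter because the bandwidth-delay product of a 10 Gbps cross-country path is tens of megabytes; default kernel limits cap throughput well below line rate.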

2019 SciComp
- IT and Physics continue to host a Computing Round Table to discuss common topics. Speakers often comment on what a nice series it is (which makes it easier to get good speakers); however, it is getting harder to maintain.
- Waiting for the S&C review report; the closeout was positive.
- Some technical projects this year, such as phasing out PBS on the farm.
- Machine learning will be a lab focal area for the year: IT will be hiring, establishing working group(s) based on a grassroots effort, and working with university CS and math groups as well.
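For farm users, phasing out PBS mostly means translating batch-script directives to the successor scheduler's syntax. A hypothetical sketch of that mapping, assuming a Slurm-style replacement (the slide does not name the successor, and the specific directives here are illustrative):

```python
# Hypothetical sketch: rewriting common PBS directives to Slurm equivalents.
# Assumes Slurm is the farm's replacement scheduler, which the slide does
# not state; job names, queues, and limits below are illustrative only.

PBS_TO_SLURM = {
    "#PBS -N myjob":             "#SBATCH --job-name=myjob",
    "#PBS -l nodes=1:ppn=8":     "#SBATCH --nodes=1 --ntasks-per-node=8",
    "#PBS -l walltime=24:00:00": "#SBATCH --time=24:00:00",
    "#PBS -q production":        "#SBATCH --partition=production",
    "#PBS -o out.log":           "#SBATCH --output=out.log",
}

def translate(script_lines):
    """Rewrite known PBS directive lines to Slurm; pass all other lines through."""
    return [PBS_TO_SLURM.get(line, line) for line in script_lines]

pbs_script = [
    "#!/bin/bash",
    "#PBS -N myjob",
    "#PBS -l walltime=24:00:00",
    "cd $PBS_O_WORKDIR",   # environment variables also change (PBS_* -> SLURM_*)
]
print("\n".join(translate(pbs_script)))
```

In practice the environment variables (`$PBS_O_WORKDIR` vs `$SLURM_SUBMIT_DIR`) and the submit command (`qsub` vs `sbatch`) change as well, which is why such transitions take a documented migration period.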

Open Data & Data Cataloging
- Open Data was discussed at last summer's meeting. Many considerations: technical, infrastructure, policy, cost.
- Some initiatives:
  - ORCID is the de facto standard.
  - Digital Object Identifiers: USQCD has done some work.
  - Software citations: Dan Katz of UIUC is spearheading this.
  - CERN Open Data.
- We can schedule a Round Table on this topic.