Computing and IT Update
Jefferson Lab User Group
Roy Whitney, CIO & CTO
10 June 2009

Sci Comp – Physics Farm
Moving 2 racks of 6n nodes (bought by base funds) from LQCD to the farm (3rd upgrade funded by LQCD)
Adding a new cluster of 10 nodes (see the sizing sketch below)
–2.8 GHz dual Nehalem (much faster than current farm nodes)
–64-bit OS: CentOS 5.3
–Desktop CUE Level 1/2 machines at 64-bit RHEL will be supported
–Primary user (and funder) is the muon beam simulation project, but the farm will end up getting lots of cycles; also a testbed for 64-bit
Decommissioning the oldest nodes (not worth the electricity anymore)
Planning for FY2010 upgrades
–Cache disk capacity & performance
–More new Nehalem nodes (64-bit)
–Some new work disk
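
A rough sizing note, not from the slide: if "dual Nehalem" means two quad-core sockets per node (the slide gives only the clock speed, so the per-socket core count is an assumption), the 10-node addition works out to about 80 cores.

```python
# Back-of-envelope sizing for the new farm addition (illustrative only).
# Assumption: "dual Nehalem" = two sockets per node; 4 cores/socket is a
# guess (e.g. a Xeon 5500-series part), not a figure from the slide.
NODES = 10
SOCKETS_PER_NODE = 2
CORES_PER_SOCKET = 4
CLOCK_GHZ = 2.8  # from the slide

cores = NODES * SOCKETS_PER_NODE * CORES_PER_SOCKET
print(f"New cluster: {cores} cores at {CLOCK_GHZ} GHz")  # -> 80 cores
```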

Sci Comp – Tape Library
Almost finished moving 3 PB of data from the old silos to the new library (see the rough timing sketch below)
–When finished, it will have 3x the bandwidth of the old system
–During the past year, 25%-50% of the bandwidth was used for the transfer
–The higher performance revealed problems in slow cache nodes (performance mismatch)
Capacity is limited!
–High tape usage on a limited budget means that, from now on, not all data will fit in the tape library
–The oldest tapes are now being ejected and put into storage; re-mounting them will take up to a week
–The capacity upgrade will come in 2010, but… the intention is to hold a sliding window of N years of data (N TBD)
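
For scale, a back-of-envelope estimate of the migration time. The slide gives no absolute bandwidth figure, so the 1 GB/s aggregate rate below is a hypothetical placeholder; the 25%-50% share devoted to the copy is from the slide (midpoint used).

```python
# Rough migration-time estimate for the 3 PB silo-to-library copy.
# BANDWIDTH_GBPS is an assumed placeholder, not a figure from the slide.
DATA_PB = 3
BANDWIDTH_GBPS = 1.0           # assumed aggregate drive bandwidth, GB/s
FRACTION_FOR_MIGRATION = 0.35  # slide says 25%-50% of bandwidth; midpoint

seconds = DATA_PB * 1e15 / (BANDWIDTH_GBPS * 1e9 * FRACTION_FOR_MIGRATION)
print(f"~{seconds / 86400:.0f} days")  # ~99 days at these numbers
```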

Lattice QCD / High Performance Computing
JLab currently runs ~660 nodes with ~3700 cores for LQCD computing, as part of the USQCD collaboration
JLab will host a new $5M project for LQCD computing, funded by ARRA through the Nuclear Physics office
–$3.2M for computing (6x-12x capacity gain at JLab; see the rough arithmetic below); GPUs will be used as compute accelerators for part or all of the cluster
–JLab will again be on the Top100 list of the fastest computers in the world
–~$0.3M for disk (over 250 TBytes)
–2 phases: the first to be installed in November, the second in January
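
To put the 6x-12x figure in perspective, a quick illustrative calculation against the current installation. LQCD capacity is really benchmarked in sustained flops on QCD kernels, so core counts here are only a stand-in, not the project's actual metric.

```python
# Illustrative scale of a "6x - 12x capacity gain" over the current
# ~660-node / ~3700-core LQCD installation (core counts as a proxy only).
current_nodes, current_cores = 660, 3700
print(f"avg cores/node today: {current_cores / current_nodes:.1f}")  # ~5.6

for gain in (6, 12):
    print(f"{gain}x gain ~= {gain * current_cores:,} core-equivalents")
```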

Computing and Networking Infrastructure
Helpdesk: summer hours are 8-12, 1-4:30
Network
–Registration of computers and automatic port configuration (sketched below)
–Wireless changes coming: make it function like wired, and put all unmanaged laptops on the guest network
Email list management will be moving from Majordomo to Mailman
Telecom: testing VoIP
Cyber Security
–Managed desktops have resulted in fewer vulnerabilities found by scanning, and faster remediation
–Phishing/spear phishing is currently the attack of choice
–We do not ask for passwords or personal information in email
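
A minimal sketch of the idea behind registration-driven port configuration: a port is put on the internal network only if the attached machine's hardware address is registered; anything unknown lands on the guest side. All names, addresses, and VLAN numbers below are invented for illustration and are not JLab's actual configuration.

```python
# Hypothetical illustration of registration-based VLAN assignment.
# The registry, MAC addresses, and VLAN IDs are invented examples.
REGISTERED = {"00:1a:2b:3c:4d:5e"}  # MACs of registered machines
INTERNAL_VLAN, GUEST_VLAN = 100, 300

def vlan_for(mac: str) -> int:
    """VLAN a switch port should be configured with for this machine."""
    return INTERNAL_VLAN if mac.lower() in REGISTERED else GUEST_VLAN

print(vlan_for("00:1A:2B:3C:4D:5E"))  # 100 (registered -> internal)
print(vlan_for("aa:bb:cc:dd:ee:ff"))  # 300 (unregistered -> guest)
```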

Management Information Systems
The online user registration form is undergoing improvements.
Remember: submit all publications related to JLab research to the publications database!

Power Outage
As part of preventing/managing potential power outages/brownouts, and as we get our power under the Commonwealth of Virginia's contract, we are participating in a power management program.
–There will be a test midday tomorrow, 11 June, during which the farm and LQCD clusters will go offline for several hours. This should have no effect on the network, servers, desktops, email, file access, etc. Some lighting, HVAC, etc. will be off during the test. Please support the test; success will lower our power bill!
–If there are severe electrical issues later in the summer in the Eastern Mid-Atlantic, we may be asked to drop power up to 12 times. We are not required to do so! It is not anticipated that this will be needed.