Initial Performance Measurements With DataTAG PCs: Gigabit Ethernet NICs (Work in progress, Oct 02)
DataTAG, CERN, Oct 2002 - R. Hughes-Jones, The University of Manchester

UDP Latency: Intel onboard NIC, b2b (back-to-back)
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Latency high: interrupt coalescence
Latency NOT well behaved
Slope:   Expect: PCI + GigE 0.008 + PCI us/byte
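
The latency numbers on these slides come from UDP request-response tests in the udpmon style: send a datagram of a given size to a peer that echoes it back, time the round trip, repeat many times, then step through packet sizes so the slope of latency versus size can be compared with the expected per-byte costs of the PCI buses and the GigE wire. The following is only a minimal sketch of such a probe, not the actual udpmon code; the command-line arguments and the assumption of an echoing peer are illustrative.

/* Minimal sketch of a UDP request-response latency probe (NOT the actual
 * udpmon code).  The peer at host:port is assumed to echo each datagram
 * back unchanged.  Hypothetical usage: ./udp_latency <host> <port> <size> <trials> */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(int argc, char **argv)
{
    if (argc != 5) {
        fprintf(stderr, "usage: %s host port size trials\n", argv[0]);
        return 1;
    }
    int size = atoi(argv[3]), trials = atoi(argv[4]);
    char *buf = calloc(1, size);

    int s = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in peer = { 0 };
    peer.sin_family = AF_INET;
    peer.sin_port   = htons(atoi(argv[2]));
    inet_pton(AF_INET, argv[1], &peer.sin_addr);

    double sum_us = 0.0;
    for (int i = 0; i < trials; i++) {
        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        sendto(s, buf, size, 0, (struct sockaddr *)&peer, sizeof(peer));
        recv(s, buf, size, 0);                        /* wait for the echoed packet */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        sum_us += (t1.tv_sec - t0.tv_sec) * 1e6 + (t1.tv_nsec - t0.tv_nsec) / 1e3;
    }
    /* Mean round-trip time for this packet size; repeating over a range of
     * sizes and fitting latency vs. size gives a slope in us/byte. */
    printf("size %d bytes  mean RTT %.2f us\n", size, sum_us / trials);
    close(s);
    free(buf);
    return 0;
}

Fitting a straight line to mean round-trip time against packet size gives the slope quoted on the slides in us/byte; the intercept is roughly the fixed per-packet overhead (interrupts, protocol stack, any switch forwarding).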

UDP Throughput: Intel onboard NIC, b2b
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Max throughput 980 Mbit/s
No packet loss
30% CPU utilisation
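
The throughput figures are measured by streaming UDP datagrams of a fixed size with a chosen inter-packet wait and recording what rate the receiver actually sees. Below is a minimal sketch of such a sender, again an illustration rather than udpmon itself; the sequence number placed in the payload and the busy-wait pacing loop are assumptions of this sketch.

/* Minimal sketch of a UDP throughput sender: stream npkts datagrams of a
 * fixed size, paced by a busy-wait of wait_us between sends, with a sequence
 * number in the payload so the receiver can count loss.  A sketch only, not
 * udpmon itself.  Hypothetical usage: ./udp_send <host> <port> <size> <npkts> <wait_us> */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>
#include <time.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

static double now_us(void)
{
    struct timespec t;
    clock_gettime(CLOCK_MONOTONIC, &t);
    return t.tv_sec * 1e6 + t.tv_nsec / 1e3;
}

int main(int argc, char **argv)
{
    if (argc != 6) {
        fprintf(stderr, "usage: %s host port size npkts wait_us\n", argv[0]);
        return 1;
    }
    int size = atoi(argv[3]), npkts = atoi(argv[4]);
    double wait_us = atof(argv[5]);
    char *buf = calloc(1, size);

    int s = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in dst = { 0 };
    dst.sin_family = AF_INET;
    dst.sin_port   = htons(atoi(argv[2]));
    inet_pton(AF_INET, argv[1], &dst.sin_addr);

    double start = now_us(), next = start;
    for (uint32_t seq = 0; seq < (uint32_t)npkts; seq++) {
        memcpy(buf, &seq, sizeof(seq));               /* sequence number for loss counting */
        sendto(s, buf, size, 0, (struct sockaddr *)&dst, sizeof(dst));
        next += wait_us;
        while (now_us() < next)                       /* busy-wait to pace the stream */
            ;
    }
    double elapsed = now_us() - start;
    printf("offered rate %.1f Mbit/s\n", 8.0 * npkts * size / elapsed);
    close(s);
    free(buf);
    return 0;
}

Sweeping the inter-packet wait from large to small traces out throughput-versus-offered-load curves like the ones behind the 980 Mbit/s plateau quoted here.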

UDP Latency: SysKonnect, b2b
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Latency low: interrupt on every packet
Latency well behaved
Slope:   us/byte   Expect: PCI + GigE 0.008 + PCI us/byte
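
The "Expect" line is the sum of the per-byte transfer times of the elements the packet has to cross. Assuming one pass over the sending PCI bus, the GigE wire and the receiving PCI bus (an assumption of this back-of-envelope check), and taking the slide's 0.008 us/byte for GigE (1 Gbit/s = 125 Mbyte/s) together with 64 bit x 66 MHz = 528 Mbyte/s for the PCI bus:

/* Back-of-envelope check of the expected latency-vs-size slope.  Assumes the
 * packet crosses the sending PCI bus, the GigE wire and the receiving PCI bus
 * once each (an assumption; any store-and-forward element adds its own term).
 * GigE: 1 Gbit/s = 125 Mbyte/s -> 0.008 us/byte (the figure on the slide).
 * PCI:  64 bit x 66 MHz = 528 Mbyte/s -> ~0.0019 us/byte.                    */
#include <stdio.h>

int main(void)
{
    double gige_us_per_byte = 1.0 / 125.0;   /* 0.008   us/byte */
    double pci_us_per_byte  = 1.0 / 528.0;   /* 0.00189 us/byte */
    double slope = pci_us_per_byte + gige_us_per_byte + pci_us_per_byte;
    printf("expected slope ~ %.4f us/byte\n", slope);   /* ~0.0118 us/byte */
    return 0;
}

A measured slope close to this sum suggests the NIC and driver add little per-byte cost on top of the raw bus and wire times.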

UDP Throughput: SysKonnect, b2b
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Max throughput 980 Mbit/s
No packet loss
30% utilisation on the sender after wire speed

UDP Latency: SysKonnect, b2b with switch
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Latency low: interrupt on every packet
Latency well behaved
Slope:   us/byte   Expect: PCI + GigE 0.008 + PCI us/byte

UDP Throughput: SysKonnect, b2b with switch
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Max throughput 980 Mbit/s
No packet loss
30% utilisation on the sender
~30% utilisation on the receiver

UDP Throughput: SysKonnect, Chicago - CERN
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Max throughput 980 Mbit/s
No packet loss below wire rate
Packet loss when sending > wire rate
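
Loss on the Chicago - CERN path is determined at the receive end by counting the datagrams that arrive against the sequence numbers carried in the payload. A matching receiver for the sender sketched earlier might look like the following; the same caveats apply, and a real tool would use a receive timeout in case the final packet itself is lost.

/* Minimal sketch of the matching receive side: bind a UDP port, count the
 * datagrams that arrive and the sequence number of the last one, and report
 * achieved rate and loss.  Same caveats as the sender sketch above.        */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>
#include <time.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main(int argc, char **argv)
{
    if (argc != 3) {
        fprintf(stderr, "usage: %s port npkts_expected\n", argv[0]);
        return 1;
    }
    int expected = atoi(argv[2]);
    char buf[65536];

    int s = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in me = { 0 };
    me.sin_family      = AF_INET;
    me.sin_addr.s_addr = htonl(INADDR_ANY);
    me.sin_port        = htons(atoi(argv[1]));
    bind(s, (struct sockaddr *)&me, sizeof(me));

    long received = 0, bytes = 0;
    uint32_t seq = 0;
    struct timespec t0 = { 0 }, t1;
    ssize_t n;
    while ((n = recv(s, buf, sizeof(buf), 0)) > 0) {
        if (received == 0)
            clock_gettime(CLOCK_MONOTONIC, &t0);      /* start timing at the first packet */
        received++;
        bytes += n;
        memcpy(&seq, buf, sizeof(seq));
        if (seq == (uint32_t)(expected - 1))          /* last sequence number seen */
            break;
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    double us = (t1.tv_sec - t0.tv_sec) * 1e6 + (t1.tv_nsec - t0.tv_nsec) / 1e3;
    printf("achieved %.1f Mbit/s of user data, lost %ld of %d packets\n",
           8.0 * bytes / us, expected - received, expected);
    close(s);
    return 0;
}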

UDP Throughput: SysKonnect, Chicago - CERN
Motherboard: SuperMicro P4DP8   Chipset:   CPU: P4 2.2 GHz   PCI: 64 bit 66 MHz   RedHat 7.3   Kernel
Max throughput 980 Mbit/s
No packet loss
30% utilisation on the sender
~30% utilisation on the receiver