

AMS-02 POCC & SOC
MSFC, 9-Apr-2009
Mike Capell, Avionics & Operations Lead, Senior Research Scientist
ISS: 108x80 m, 420 t, 86 kW, 400 km altitude. AMS: 5x4x3 m, 7 t, 2.3+ kW, 3+ years.

AMS-02 Electrical Interfaces on ISS

Power: VDC, ~2.3 kW.
LRDL (1553B bus) for commanding & monitoring: 1 kbit/s in, 10 kbit/s out; Critical Health Data (CHD) at 10 B/s.
HRDL for event data: TAXI fiber optics (on STS: RS-422).
Data link duty cycles: ~70%; contingency Critical Health Data duty cycle > 90%.
[Diagram: on-orbit S-Band and TDRS links carrying CHD, commanding, monitoring and science data to Earth; LRDL, HRDL, UMA and EVA interfaces between AMS, ACOP and the International Space Station; data, M&C and power paths to the AMS POCC.]
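As a rough cross-check of the low-rate link figures above, the daily data volumes implied by the quoted rates and duty cycles can be estimated in a few lines (a sketch using the numbers from the slide; not flight or ground software):

```python
# Rough daily-volume estimate for the AMS-02 low-rate links,
# using the rates and duty cycles quoted on the slide.

SECONDS_PER_DAY = 86_400

def daily_volume_bits(rate_bit_s: float, duty_cycle: float) -> float:
    """Bits per day delivered at a given line rate and link duty cycle."""
    return rate_bit_s * duty_cycle * SECONDS_PER_DAY

# LRDL monitoring: 10 kbit/s out at ~70% duty cycle
lrdl_out = daily_volume_bits(10_000, 0.70)
# Critical Health Data: 10 B/s at > 90% duty cycle
chd = daily_volume_bits(10 * 8, 0.90)

print(f"LRDL monitoring: ~{lrdl_out / 8 / 1e6:.0f} MB/day")
print(f"CHD:             ~{chd / 8 / 1e3:.0f} kB/day")
```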

Subdetector Requirements: Summary

Subdetector    Requirements   Channels       Raw Kbits
U: TRD         Gas gain       5,248          84
S: ToF+ACC     100 ps         48*4*8         49
T: Tracker     few fC         196,608        3,146
R: RICH        Single γ       680*16*2       348
E: ECAL        1:60,000       324*(4*2+1)    47

Raw Kbits/event: 3,674; x event rate ≤ 2 kHz ⇒ total raw data rate ~7 Gbit/s.
7 Gbit/s >> 4 Mbit/s ⇒ restrict rate & size.
Specify, design, develop and produce high-speed, high-capacity, low-power, low-weight, reliable signal & data processing, ON ORBIT!
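The ~7 Gbit/s figure follows directly from the table; a minimal sketch of the arithmetic (illustrative only):

```python
# Reproduce the slide's raw-data-rate arithmetic.

# Raw kbits per event for each subdetector, from the table above.
raw_kbits = {
    "TRD": 84,
    "ToF+ACC": 49,
    "Tracker": 3146,
    "RICH": 348,
    "ECAL": 47,
}

event_size_kbit = sum(raw_kbits.values())    # 3,674 kbit per event
event_rate_hz = 2_000                        # event rate <= 2 kHz
raw_rate_gbit_s = event_size_kbit * 1e3 * event_rate_hz / 1e9

print(f"{event_size_kbit} kbit/event x {event_rate_hz} Hz ~ "
      f"{raw_rate_gbit_s:.1f} Gbit/s")       # far above the ~4 Mbit/s downlink
```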

Examples: Data Reduction (UDR2, TDR2) Boards
70 board types, 454 boards in total.

Electronics on mounting jigs, 4 Aug CAB

Electronics cabled on mounting jigs, 5 Nov

AMS Operational Locations
1. Pre-integration (done) and integration in the AMS Clean Room, B867. Operations Center in B.
2. Beam test at CERN, EHN1. Operations Center colocated or in B.
3. Thermal-vacuum and electromagnetic compatibility tests at ESA ESTEC, Noordwijk, NL. Operations Center at ESTEC.
4. Interface verification testing at the SSPF building, Kennedy Space Center (KSC), FL. Operations Center at KSC.
5. End-to-end testing on the launch pad. Operations Centers at KSC and then at Johnson Space Center (JSC), Houston, TX.
6. Inside the Space Shuttle en route to the Station. Operations at JSC.
7. Installation, activation and full operations on the Space Station. Operations at JSC (~3 months), then shifting to CERN B892.
7'. Backup Operations Center in the US (U. Maryland).

AMS Data Flow (ISS)

High-rate system: the HRDL (F/O) carries AMS science data and Monitoring Data A; the Ku-Band downlink carries science data and Monitoring Data A+B via TDRS and White Sands, NM, over NASA networks to the POIC, MSFC, AL.
Low-rate system: the LRDL (1553B) carries commands up and Monitoring Data B plus Critical Health Data down; S-Band carries commands, Monitoring Data B, ISS ancillary data and Critical Health Data.
At the POIC, the Payload Data Service System provides real-time data service with short-term and long-term storage; "The Wall" marks the boundary between NASA-provided and AMS-provided elements.
Ground: AMS GSC-N (nominal) and AMS GSC-R (redundant) each receive all data (UDP) and pass it over the Internet (TCP/IP, FTP, X) to the Regional Centers; commands flow back via GSC-N or GSC-R, alongside playback file transfer and the voice loop.
POCC: Payload Operations Control Center, a.k.a. "online". SOC: Science Operations Center, a.k.a. "offline".
"All Data" = science data, Monitoring Data A+B, ISS ancillary data and Critical Health Data. (M. Capell, Oct 2008)
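In the nominal path the ground support computer relays all data onward as UDP datagrams. A toy loopback illustration of such a frame forward (the addresses, frame layout and sequence number are invented for the example; this is not the actual AMS ground software):

```python
# Toy illustration of forwarding one data frame over UDP, as in the
# nominal GSC -> Regional Center path sketched above. Loopback only;
# the frame format ("AMS" magic + 4-byte sequence number) is invented.
import socket

# "Receiver" side: listen on a loopback port chosen by the OS
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
rx.settimeout(2.0)
addr = rx.getsockname()

# "GSC" side: send one frame
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = b"AMS" + (1234).to_bytes(4, "big") + b"payload"
tx.sendto(frame, addr)

data, _ = rx.recvfrom(2048)
print("received frame, seq =", int.from_bytes(data[3:7], "big"))
tx.close()
rx.close()
```

Since UDP gives no delivery guarantee, a real deployment would also detect sequence gaps and recover lost frames, which is one role the playback file-transfer path in the diagram can serve.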

Ground Support Computers (as of today)

Ground Support Computers in the Clean Room

Machines: PCGSC00 (commanding & recording), PCGSC01 (commanding & recording), PCGSC03 (commanding & recording), PCGSC02 (commanding & monitoring), PCGSC04 (recording; backup stream).
Interfaces: PCI-HRDL/PMC-HRDL cards with an HRDL splitter; USB-422 and RS-422; PCMCIA ACE 1553 cards for the ISS 1553 and STS 1553 buses; EPP box for the AMS CAN buses; RS232-USB links to the JMDC terminals.
Services: NFS servers for HRDL/422, ISS 1553 and STS 1553 frames; command servers for HRDL/422, ISS 1553 and STS 1553.

AMS POCC at CERN for ISS

POCC stations (each a PC + 2 screens + headset + laptop location):
0. Management
1. Shift Leader (alt. Commander)
2. Commander
3. DAQ + Trigger + Run Control
4. Magnet + Cryocoolers
5. Tracker + Laser Alignment + Cooling
6. TRD + TRD Gas
7. TOF + ACC
8. RICH
9. ECAL
10. PDS + GPS + Star Tracker
plus Thermal.

Functional data flow in the POCC (in reality, everything is done over Ethernet, with the switch at the center of a star configuration):
Voice loop: an open-source voice-loop server handles internal communications and mixes in the voice channel to/from NASA.
Commands: one specific station can actually send commands; the others can prepare them.
Video: live NASA video feeds are just another network service.
"All Data" (science data, Monitoring Data A+B, ISS ancillary data and Critical Health Data) is stored, processed and redistributed from the Storage Server (+10 TByte), with extra processing power provided by the Production Node.
Links to the POIC, MSFC, AL: all data from AMS GSC-N (nominal) and AMS GSC-R (redundant), commands to GSC-N/GSC-R, and the voice loop.

POCC Commanding and Monitoring Consoles (as of today): PCAAL12, PCAAL13, PCAAL14.

AMS SOC at CERN

All members of the cluster are connected to two Fibre Channel (FC) switches and have direct, redundant access to all disk arrays using GFS software. The arrays are also exported via NFS to the production nodes.
Components (some already existing, some planned for 2009, some for 2010-2012): AMS Gateway; DB Servers 1-2; File Servers 1-4; FC Switches 1-2; RAID 5 arrays (40 TB and 10 TB, GFS); RAID 6 arrays (126 TB each, GFS); Production Nodes 1-6, attached over Ethernet (NFS).

SOC Computers (as of today): DB Server 1, DB Server 2, AMS Gateway, RAID array & FC Switch 1, File Server 1, File Server 2, FC Switch 2, RAID array, UPSes.

SOC Software

Currently:
SLC4 (64-bit) operating system on all servers.
Oracle 10g (64-bit) installed; database migrated.
LSF batch system installed and in use.
GFS open-source shared file system in use on the disk arrays, with a few glitches.

Future: these will be updated following CERN IT best practices.

Preintegration Cosmic-Ray Data Rerun

To verify the concept of data reruns at a Remote Center, a trial rerun was done at CNAF/Milano.
A few initial problems were resolved; e.g., the data transfer / data management software had to be modified to cope with differences between data and MC (it has been used since 2005 for MC and has transferred 20 TByte).
The complete rerun took 30 days, mainly because of CNAF/Castor maintenance (in preparation for the LHC).
Lessons learned:
Larger bandwidth to the Remote Center may be needed.
It is essential to have the raw data locally for reruns.