ALMA AIPS++ Audit
Steven T. Myers (NRAO)
ALMA Computing IDR, December 2002, Garching

Slide 2: Process
SSR Requirements
– Offline Data Processing Requirements (SW-18)
– Not package-specific; any package must fulfill the requirements
ALMA AIPS++ Audit
– AIPS++ is the baseline plan for ALMA
– First step is to audit AIPS++ against the SW-18 requirements
– If too many requirements are unmet, we may want to rethink the baseline plan
– Most important is where AIPS++ will be in 2007
– Includes performance benchmarking and user testing
Progress monitoring
– "delta" progress must be monitored
– Cycle timescales TBD, e.g. months? At milestones?

Slide 3: Process (2)
Feedback to the requirements process
– Requirements may need to be modified or refined
– Take input from the ALMA project, ASAC, etc.
Input to AIPS++ development planning
– Identify milestones (e.g. ALMA Level 2 and 3)
– Costing and delivery for unfulfilled Priority 1 & 2 items
– Iterate with ALMA Computing and the SSR
Eventually move toward acceptance testing
– Carried out by the SSR?
– Official procedure?

Slide 4: Current Status
ALMA SSR "Offline Data Processing" requirements (SW-18)
– Completed and reviewed Apr 2002
– e2e SSR May 2001, revised Nov 2002
Audit
– Started Jul 2002; drafts Sep 2002 & Dec 2002
– Based mostly on documentation, with only minor testing
– Ready for SSR comment; passed to the AIPS++ project
Next
– Performance benchmarking and testing (a leader is needed)
– Revise priorities, add timescales (e.g. based on milestones)
– Lead up to the next audit in 2003 or 2004

Slide 5: Grading (1)
First pass (2002)
– Audit the state of AIPS++ as of September 2002 against the ALMA requirements (SW-18)
– Identify whether functionality is present, based on documentation and the auditors' inventory of the package
– Some testing (e.g. on existing images or data)
– Fold in the AIPS++/IRAM tests where possible (e.g. the iramcalibrater module), which appeared in builds at the end of the audit
Procedure
– Identify the AIPS++ tools, functions, and documents relevant to each requirement
– Grade based on functionality, usability, and/or documentation (depending on the specific requirement)

Slide 6: Grading (2)
Priorities (from the SSR SW-18 requirements)
– 1 = Critical (all Priority 1 features should be present)
– 2 = Important (90% of Priority 2 items should be fulfilled)
– 3 = Desirable (enhancements and future development)
Grades (from the audit)
– A = acceptable
– A/E = acceptable, but enhancements desired
– I = inadequate
– N = not available
– U = unable to grade (e.g. ALMA definitions needed)

Slide 7: Grading (3)
Severity (for I and N grades)
– low
– medium
– high
Grading procedure
– 3 principal auditors (Myers, Viallefond, Morita)
– Plus deputy auditors (Brogan, Coulais, Caillat)
– Input from others (Lucas, Glendenning, Cornwell, Brouw)
– Myers audited all requirements, and tried to have overlap among auditors on most others
– In cases with disparate grades (~14% of requirements), Myers drafted a unified grade
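
This grading scheme maps onto a small data model. The sketch below is a purely illustrative Python rendering (the audit's actual bookkeeping is not described in these slides, and all names here are hypothetical) of how per-requirement grades of this form could be tallied into the A / A/E / I/N / U breakdowns reported on the Results slides that follow:

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical record for one audited requirement.
@dataclass
class Requirement:
    req_id: str         # e.g. a SW-18 requirement identifier
    priority: int       # 1 = Critical, 2 = Important, 3 = Desirable
    grade: str          # "A", "A/E", "I", "N", or "U"
    severity: str = ""  # "low"/"medium"/"high", only for I and N grades

def breakdown(reqs):
    """Tally grades into the four reported bins: A, A/E, I/N (combined), U."""
    bins = Counter()
    for r in reqs:
        bins["I/N" if r.grade in ("I", "N") else r.grade] += 1
    total = len(reqs) or 1  # guard against an empty group
    return {k: 100.0 * bins[k] / total for k in ("A", "A/E", "I/N", "U")}

def report(reqs):
    """Print one line per group, as on the Results slides."""
    groups = [("All", reqs)] + [
        (f"Priority {p}", [r for r in reqs if r.priority == p]) for p in (1, 2, 3)
    ]
    for name, group in groups:
        b = breakdown(group)
        print(f"{name} ({len(group)}): "
              + " / ".join(f"{b[k]:.0f}%" for k in ("A", "A/E", "I/N", "U")))
```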

Slide 8: Results – Chart Guide
[Annotated example chart; explanatory, not results. Its annotations read: "Work to be done by ALMA", "These should be 0 (in ~2007)", and "These should be <10% of the total".]

Slide 9: Results – Overall
Breakdown (A / A/E / I/N / U):
– All (489): 52% / 9% / 33% / 6%
– Priority 1 (293): 60% / 7% / 29% / 4%
– Priority 2 (135): 49% / 12% / 32% / 7%
– Priority 3 (61): 23% / 13% / 54% / 10%

Slide 10: Results – 1 General
Breakdown (A / A/E / I/N / U):
– All (23): 52% / 4% / 17% / 26%
– Priority 1 (12): 42% / 8% / 33% / 17%
– Priority 2 (9): 67% / 0% / 11% / 22%
– Priority 3 (2): 50% / 0% / 50% / 0%

Slide 11: Results – 2 Interface
Breakdown (A / A/E / I/N / U):
– All (53): 58% / 11% / 26% / 4%
– Priority 1 (26): 65% / 4% / 27% / 4%
– Priority 2 (22): 55% / 23% / 18% / 5%
– Priority 3 (5): 40% / 0% / 60% / 0%

Slide 12: Results – 3 Data Handling
Breakdown (A / A/E / I/N / U):
– All (127): 81% / 1% / 16% / 1%
– Priority 1 (91): 86% / 1% / 12% / 1%
– Priority 2 (24): 88% / 0% / 4% / 8%
– Priority 3 (12): 33% / 0% / 67% / 0%

Slide 13: Results – 4 Calibration & Editing
Breakdown (A / A/E / I/N / U):
– All (76): 22% / 4% / 63% / 11%
– Priority 1 (51): 27% / 4% / 63% / 6%
– Priority 2 (15): 20% / 0% / 67% / 13%
– Priority 3 (10): 0% / 10% / 60% / 30%

Slide 14: Results – 5 Imaging
Breakdown (A / A/E / I/N / U):
– All (38): 39% / 24% / 29% / 8%
– Priority 1 (24): 38% / 29% / 21% / 13%
– Priority 2 (13): 46% / 15% / 38% / 0%
– Priority 3 (1): 0% / 0% / 100% / 0%

Slide 15: Results – 6 Data Analysis
Breakdown (A / A/E / I/N / U):
– All (91): 56% / 19% / 24% / 1%
– Priority 1 (45): 73% / 9% / 16% / 2%
– Priority 2 (29): 45% / 28% / 28% / 0%
– Priority 3 (17): 29% / 29% / 41% / 0%

Slide 16: Results – 7 Visualization
Breakdown (A / A/E / I/N / U):
– All (56): 32% / 13% / 46% / 9%
– Priority 1 (36): 39% / 11% / 50% / 0%
– Priority 2 (11): 27% / 9% / 45% / 0%
– Priority 3 (9): 11% / 22% / 33% / 33%

Slide 17: Results – 8 Special Features
Breakdown (A / A/E / I/N / U):
– All (25): 32% / 0% / 64% / 4%
– Priority 1 (8): 63% / 0% / 38% / 0%
– Priority 2 (12): 17% / 0% / 75% / 8%
– Priority 3 (5): 20% / 0% / 80% / 0%

Slide 18: Audit Summary
Requirements and sub(sub)requirements weighted "equally":
– 61% of all requirements, and 67% of Priority 1, graded A or A/E
– High-severity defects (I or N): 6% of all requirements, 9% of Priority 1
– 29% of Priority 1 requirements graded I or N (the target is 0)
– 60% of all requirements are classified Priority 1 – we should fix this!
Problem areas:
– Calibration & Editing (63% I or N for Priority 1)
– Visualization (50% I or N for Priority 1)
– Imaging (needs ALMA input and algorithm development)
– Interface (performance and look-and-feel deemed inadequate)
Cost to complete (Kumar): ~26 FTE?
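
(For reference, the headline figures follow directly from the overall breakdown on slide 9: 52% + 9% = 61% of all 489 requirements graded A or A/E, and 60% + 7% = 67% of the 293 Priority 1 requirements.)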

Slide 19: Next – Benchmarking
Goals:
– Quantify AIPS++ performance on ALMA-sized representative datasets
– Compare with other packages
– Locate problem areas in the package
– Basis for assay and regression testing (see the sketch after this slide)
Test datasets:
– Representative of ALMA data (e.g. in size)
– Real and simulated data
– Should cover the major modes
– SSR: define the needed sets as soon as possible
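
On the "assay and regression testing" point: a regression benchmark reruns a fixed script on a fixed dataset and checks the output against a stored reference within a numerical tolerance. A minimal sketch in Python, assuming NumPy-readable results (the AIPS++ assay module itself was Glish-based; every name below is hypothetical):

```python
import numpy as np

def regression_check(result: np.ndarray, reference_file: str,
                     rtol: float = 1e-5, atol: float = 1e-8) -> bool:
    """Compare a freshly computed image/array against a stored reference.

    A run passes only if every value agrees with the archived known-good
    result within the given tolerances, so silent numerical regressions
    in gridding, calibration, etc. are caught as soon as they appear.
    """
    reference = np.load(reference_file)  # archived known-good result
    if result.shape != reference.shape:
        return False
    return bool(np.allclose(result, reference, rtol=rtol, atol=atol))

# Example usage (names hypothetical):
#   image = run_imaging_script("alma_sim_12m.ms")
#   assert regression_check(image, "alma_sim_12m_reference.npy")
```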

Slide 20: Procedure
Identify test datasets
– SSR defined or provided (e.g. IRAM, BIMA, simulated)
– AIPS++ provided (e.g. simulated)
Build and run scripts (a timing sketch follows this slide)
– AIPS++ provides scripts (Rusk, Jan 2003)
– SSR involvement (a new SSR hire?)
– IRAM PdB Phase II and III?
Migrate into assay module(s)
– Build into the alma package
– Use the benchmarking tools
Compare versus other packages
– SSR-led, with AIPS++ input
– Must compare "apples" with "apples"
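
The "build and run scripts" step reduces to a wall-clock harness around each processing task. A minimal sketch, assuming the task is callable from Python (the actual 2003 scripts were Glish; the task and dataset names below are invented):

```python
import time

def benchmark(task, *args, repeats: int = 3, **kwargs):
    """Run a processing task several times and report wall-clock timings.

    Repeating the run and keeping the best time reduces the influence of
    caching and machine load, which matters when comparing packages
    "apples with apples".
    """
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        task(*args, **kwargs)
        timings.append(time.perf_counter() - start)
    best, mean = min(timings), sum(timings) / len(timings)
    print(f"{task.__name__}: best {best:.2f}s, mean {mean:.2f}s over {repeats} runs")
    return best

# Example usage (hypothetical task and dataset):
#   benchmark(clean_image, "alma_sim_mosaic.ms", niter=1000)
```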

Slide 21: Outcome
Identify problem areas
– Determine the cause of each problem:
  - augmentation or a change of technology required
  - an algorithm issue
  - a size-of-problem issue (e.g. pure flops)
  - the effectiveness of this step depends on how carefully the benchmarking was done!
– Profiling of code (see the sketch after this slide)
– Fix where necessary (cost, fit into the development plan)
– Priorities for development
Build the benchmarking process into the auditing and development cycles
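
For the "profiling of code" bullet, a sketch of driving Python's standard-library profiler over a single task run (purely illustrative; the hot code at the time was C++ under Glish, where a native profiler would play this role):

```python
import cProfile
import pstats

def profile_task(task, *args, top: int = 15, **kwargs):
    """Profile one task run and print the functions with the most cumulative time.

    This helps separate "algorithm issue" hot spots from raw size-of-problem
    cost: a few dominant callees suggest an algorithmic fix, while time
    spread evenly over the numeric kernels points at pure flops.
    """
    profiler = cProfile.Profile()
    profiler.enable()
    task(*args, **kwargs)
    profiler.disable()
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(top)

# Example usage (hypothetical task):
#   profile_task(grid_visibilities, "alma_sim_12m.ms")
```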

Slide 22: Example – Hot Topics
Imaging performance
– Comparisons versus MIRIAD, GILDAS
– Gridding procedure (e.g. frequency-independent): comparison with AIPS
– Joint Stokes deconvolution
Interface performance and presentation
– Interface speed: event rate, a Glish issue? Change technology?
– GUI look and feel: development issues (need a GUI expert? user desires?); technology choice (e.g. is Python our savior?)
Measurement sets and fillers

Slide 23: Upcoming Deadlines
AIPS++ Technical Review
– Will address many of these tough questions
– Tentatively scheduled for late Jan 2003 or Feb 2003
In time for the PDR
– Need to have some VLA benchmarks in place; there will be significant AIPS++ and NAUG work in this area
– Would like to have some first ALMA benchmarks as well – ALMA is AIPS++'s top customer! (Rusk begins Jan 2003)
Timescales
– Need to have some information by the PDR in March/April 2003
– Will have the Technical Review in hand; could scale from it if necessary
– Who will work on this? That will determine the delivery date…

Slide 24: AIPS++ Reorganization
New NRAO Director – Fred Lo
– Critical reviews of all NRAO projects
New roles in the project
– Joe McMullin (Project Manager)
– Steve Myers (Project Scientist)
– Kumar Golap (Deputy Project Manager)
– George Moellenbrock (Operations Manager)
ALMA subsystem interim leads
– Tim Cornwell (Pipeline) → Lindsey Davis
– Kumar Golap (Offline)
Upcoming reviews
– DM Review (late Jan 2003)
– Technical Review (late Jan 2003 or Feb 2003)

Slide 25: Other AIPS++ Developments
More user input into AIPS++
– NRAO AIPS++ User Group (NAUG): auditing and testing, subsystem scientists
– VLA audit, testing, and benchmarking (2003 Q1)
– NRAO-wide requirements and audit (EVLA, GBT), based on ALMA with some changes
– Viewer focus group (May 2002)
– User Interface focus group (Jan 2003)
More ALMA input into AIPS++
– ALMA is a high-profile customer for AIPS++!
– ALMA has substantial influence on AIPS++ development:
  - e.g. through the Project Scientist
  - through the SSR and subsystem requirements