Slide 1: ALMA AIPS++ Audit
ALMA Computing IDR, December 2002, Garching
Steven T. Myers (NRAO)

Slide 2: Process

SSR Requirements
– Offline Data Processing Requirements (SW-18)
– Not package specific: any package must fulfill the requirements

ALMA AIPS++ Audit
– AIPS++ is the baseline plan for ALMA
– First step is to audit AIPS++ against the SW-18 requirements
– If too many requirements are unmet, the baseline plan may need to be rethought
– Most important is where AIPS++ will be in 2007
– Includes performance benchmarking and user testing

Progress monitoring
– "delta" progress must be monitored
– Cycle timescales TBD, e.g. 12–18 months? At milestones?

Slide 3: Process (2)

Feedback to the requirements process
– Requirements may need to be modified or refined
– Take input from the ALMA project, ASAC, etc.

Input to AIPS++ development planning
– Identify milestones (e.g. ALMA Level 2 and 3)
– Costing and delivery for unfulfilled Priority 1 and 2 items
– Iterate with ALMA Computing and the SSR

Eventually move toward acceptance testing
– Carried out by the SSR?
– Official procedure?

Slide 4: Current Status

ALMA SSR for "Offline Data Processing"
– SW-18 completed and reviewed Apr 2002
– e2e SSR May 2001, revised Nov 2002

Audit
– Started Jul 2002; drafts Sep 2002 and Dec 2002
– Based mostly on documentation, with only minor testing
– Ready for SSR comment; passed to the AIPS++ project

Next
– Performance benchmarking and testing (a leader is needed)
– Revise priorities and add timescales (e.g. based on milestones)
– Lead up to the next audit in 2003 or 2004

Slide 5: Grading (1)

First pass (2002)
– Audit the state of AIPS++ as of September 2002 in meeting the ALMA requirements (SW-18)
– Identify whether functionality is present, based on documentation and the auditors' inventory of the package
– Some testing (e.g. on existing images or data)
– Fold in the AIPS++/IRAM tests where possible (e.g. the iramcalibrater module), which appeared in builds at the end of the audit

Procedure
– Identify the AIPS++ tools, functions, and documents relevant to each requirement
– Grade based on functionality, usability, and/or documentation (depending on the specific requirement)

Slide 6: Grading (2)

Priorities (from the SSR SW-18 requirements)
– 1 = Critical (all Priority 1 features should be present)
– 2 = Important (90% of Priority 2 items should be fulfilled)
– 3 = Desirable (enhancements and future development)

Grades (from the audit)
– A = acceptable
– A/E = acceptable, but enhancements desired
– I = inadequate
– N = not available
– U = unable to grade (e.g. ALMA definitions needed)

Slide 7: Grading (3)

Severity (assigned for I and N grades)
– low
– medium
– high

Grading procedure
– 3 principal auditors (Myers, Viallefond, Morita)
– Plus deputy auditors (Brogan, Coulais, Caillat)
– Input from others (Lucas, Glendenning, Cornwell, Brouw)
– Myers audited all requirements and tried to have overlap on most others
– Where auditors gave disparate grades (~14% of requirements), Myers drafted a unified grade
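To make the scheme of the last two slides concrete, here is a minimal sketch of the per-requirement record behind each grade. The names (`Requirement`, `req_id`, and so on) are hypothetical illustrations, not part of the actual audit tooling:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Priority(Enum):
    CRITICAL = 1    # all Priority 1 features should be present
    IMPORTANT = 2   # 90% of Priority 2 items should be fulfilled
    DESIRABLE = 3   # enhancements and future development

class Grade(Enum):
    A = "acceptable"
    AE = "acceptable, but enhancements desired"
    I = "inadequate"
    N = "not available"
    U = "unable to grade"

@dataclass
class Requirement:
    req_id: str                     # e.g. an SW-18 requirement identifier (hypothetical)
    priority: Priority
    grade: Grade
    severity: Optional[str] = None  # "low"/"medium"/"high"; assigned only for I and N grades
```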

Slide 8: Results – Chart Guide

(Explanatory chart legend, not results; the chart itself is not reproduced in this transcript.)
– Work to be done by ALMA
– These should be 0 (in ~2007)
– These should be <10% of the total

Slide 9: Results – Overall

Breakdown: A / A/E / I/N / U
– All (489): 52% / 9% / 33% / 6%
– Priority 1 (293): 60% / 7% / 29% / 4%
– Priority 2 (135): 49% / 12% / 32% / 7%
– Priority 3 (61): 23% / 13% / 54% / 10%
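For reference, the breakdowns on this and the following slides are simple bucket counts. A minimal sketch, assuming per-requirement records like the hypothetical `Requirement` above (a per-priority row is the same computation restricted to that priority):

```python
from collections import Counter

def breakdown(reqs):
    """Percent of requirements per grade bucket (A, A/E, I/N, U).

    I and N are pooled into a single I/N bucket, as in the slides;
    percentages are rounded, so a row may not sum to exactly 100.
    """
    buckets = Counter()
    for r in reqs:
        name = r.grade.name          # "A", "AE", "I", "N", or "U"
        if name in ("I", "N"):
            name = "I/N"
        elif name == "AE":
            name = "A/E"
        buckets[name] += 1
    total = len(reqs)
    return {k: round(100 * v / total) for k, v in buckets.items()}
```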

Slide 10: Results – 1 General

Breakdown: A / A/E / I/N / U
– All (23): 52% / 4% / 17% / 26%
– Priority 1 (12): 42% / 8% / 33% / 17%
– Priority 2 (9): 67% / 0% / 11% / 22%
– Priority 3 (2): 50% / 0% / 50% / 0%

Slide 11: Results – 2 Interface

Breakdown: A / A/E / I/N / U
– All (53): 58% / 11% / 26% / 4%
– Priority 1 (26): 65% / 4% / 27% / 4%
– Priority 2 (22): 55% / 23% / 18% / 5%
– Priority 3 (5): 40% / 0% / 60% / 0%

Slide 12: Results – 3 Data Handling

Breakdown: A / A/E / I/N / U
– All (127): 81% / 1% / 16% / 1%
– Priority 1 (91): 86% / 1% / 12% / 1%
– Priority 2 (24): 88% / 0% / 4% / 8%
– Priority 3 (12): 33% / 0% / 67% / 0%

Slide 13: Results – 4 Calibration & Editing

Breakdown: A / A/E / I/N / U
– All (76): 22% / 4% / 63% / 11%
– Priority 1 (51): 27% / 4% / 63% / 6%
– Priority 2 (15): 20% / 0% / 67% / 13%
– Priority 3 (10): 0% / 10% / 60% / 30%

Slide 14: Results – 5 Imaging

Breakdown: A / A/E / I/N / U
– All (38): 39% / 24% / 29% / 8%
– Priority 1 (24): 38% / 29% / 21% / 13%
– Priority 2 (13): 46% / 15% / 38% / 0%
– Priority 3 (1): 0% / 0% / 100% / 0%

Slide 15: Results – 6 Data Analysis

Breakdown: A / A/E / I/N / U
– All (91): 56% / 19% / 24% / 1%
– Priority 1 (45): 73% / 9% / 16% / 2%
– Priority 2 (29): 45% / 28% / 28% / 0%
– Priority 3 (17): 29% / 29% / 41% / 0%

Slide 16: Results – 7 Visualization

Breakdown: A / A/E / I/N / U
– All (56): 32% / 13% / 46% / 9%
– Priority 1 (36): 39% / 11% / 50% / 0%
– Priority 2 (11): 27% / 9% / 45% / 0%
– Priority 3 (9): 11% / 22% / 33% / 33%

Slide 17: Results – 8 Special Features

Breakdown: A / A/E / I/N / U
– All (25): 32% / 0% / 64% / 4%
– Priority 1 (8): 63% / 0% / 38% / 0%
– Priority 2 (12): 17% / 0% / 75% / 8%
– Priority 3 (5): 20% / 0% / 80% / 0%

Slide 18: Audit Summary

Requirements and sub(sub)requirements weighted equally:
– 61% of all requirements, and 67% of Priority 1, graded A or A/E
– High-severity defects (I or N): 6% of all requirements, 9% of Priority 1
– 29% of Priority 1 requirements graded I or N (the target is 0)
– 60% of all requirements are classified Priority 1 – we should fix this!

Problem areas:
– Calibration & Editing (63% I or N for Priority 1)
– Visualization (50% I or N for Priority 1)
– Imaging (needs ALMA input and algorithm development)
– Interface (performance and look-and-feel deemed inadequate)

Cost to complete (Kumar): ~26 FTE?

Slide 19: Next – Benchmarking

Goals:
– Quantify AIPS++ performance on representative ALMA-sized datasets
– Compare with other packages
– Locate problem areas in the package
– Provide a basis for assay and regression testing

Test datasets:
– Representative of ALMA data (e.g. in size)
– Real and simulated data
– Should cover the major modes
– SSR: define the needed datasets as soon as possible
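By way of illustration, the timing harness for such benchmarks can start out very simple. This is a sketch only, not AIPS++ code; `task` and `dataset` are placeholder names for a wrapped processing step and its input:

```python
import time

def run_benchmark(task, dataset, repeats=3):
    """Time one processing step on one dataset; report the best of N runs.

    `task` is any callable taking a dataset path, e.g. a wrapper that
    invokes a fill, calibration, or imaging step in the package under
    test. Taking the minimum damps interference from other system load.
    """
    timings = []
    for _ in range(repeats):
        start = time.perf_counter()
        task(dataset)
        timings.append(time.perf_counter() - start)
    return min(timings)
```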

Slide 20: Procedure

Identify test datasets
– SSR defined or provided (e.g. IRAM, BIMA, simulated)
– AIPS++ provided (e.g. simulated)

Build and run scripts
– AIPS++ provides scripts (Rusk, Jan 2003)
– SSR involvement (a new SSR hire?)
– IRAM PdB Phase II and III?

Migrate into assay module(s)
– Build into the alma package
– Use the benchmarking tools

Compare against other packages
– SSR led, with AIPS++ input
– Must compare "apples" with "apples"
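Once benchmarks live in an assay module, each run can be checked against a stored baseline. A minimal sketch, with hypothetical names and a tolerance chosen purely for illustration:

```python
def check_regression(name, measured, baseline, tolerance=0.2):
    """Flag a benchmark whose runtime grew by more than `tolerance`
    (as a fraction) over the stored baseline timing."""
    growth = (measured - baseline) / baseline
    ok = growth <= tolerance
    print(f"{name}: {measured:.1f}s vs baseline {baseline:.1f}s "
          f"({growth:+.0%}) -> {'OK' if ok else 'REGRESSION'}")
    return ok
```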

Slide 21: Outcome

Identify problem areas
– Determine the cause of each problem:
  – augmentation or a change of technology required
  – an algorithm issue
  – a size-of-problem issue (e.g. pure flops)
  – (The effectiveness of this step depends on how carefully the benchmarking was done!)
– Profile the code
– Fix where necessary (cost, fit into the development plan)
– Set priorities for development

Build the benchmarking process into the auditing and development cycles
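For the profiling step, Python's standard profiler gives a quick first cut at separating algorithmic hot spots from raw problem size. A minimal sketch; the `task` callable is again a placeholder:

```python
import cProfile
import pstats

def profile_task(task, *args):
    """Run `task` under the profiler and print the ten most
    expensive calls ranked by cumulative time."""
    profiler = cProfile.Profile()
    profiler.enable()
    task(*args)
    profiler.disable()
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```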

Slide 22: Example – Hot Topics

Imaging performance
– Comparisons versus MIRIAD and GILDAS
– Gridding procedure (e.g. frequency independent); comparison with AIPS (see the toy sketch below)
– Joint Stokes deconvolution

Interface performance and presentation
– Interface speed: event rate – a glish issue? Change technology?
– GUI look and feel: development issues (need a GUI expert? user desires?); technology choice (e.g. is Python our savior?)

Measurement sets and fillers
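To make the gridding discussion concrete, here is a purely illustrative toy gridder, emphatically not the AIPS++ implementation: a production gridder convolves each sample onto the grid with an anti-aliasing kernel, and a frequency-independent scheme reuses one uv-cell mapping across channels instead of recomputing it per frequency.

```python
import numpy as np

def grid_visibilities(u, v, vis, n, cell):
    """Toy nearest-cell gridding of visibilities onto an n x n uv grid.

    u, v : arrays of uv coordinates (wavelengths)
    vis  : complex visibility samples
    cell : uv cell size (wavelengths)
    """
    grid = np.zeros((n, n), dtype=complex)
    iu = np.round(u / cell).astype(int) + n // 2
    iv = np.round(v / cell).astype(int) + n // 2
    np.add.at(grid, (iv, iu), vis)  # accumulate; handles samples landing in the same cell
    return grid
```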

Slide 23: Upcoming Deadlines

AIPS++ Technical Review
– Will address many of these tough questions
– Tentatively scheduled for late Jan 2003 or Feb 2003, in time for the PDR
– Need to have some VLA benchmarks in place for this; there will be significant AIPS++ and NAUG work in this area
– Would like to have some first ALMA benchmarks as well – ALMA is AIPS++'s top customer! (Rusk begins Jan 2003)

Timescales
– Need to have some information by the PDR in March/April 2003
– The Technical Review results will be in hand; we could scale from those if necessary
– Who will work on this? That will determine the delivery date…

Slide 24: AIPS++ Reorganization

New NRAO Director – Fred Lo
– Critical reviews of all NRAO projects

New roles in the project
– Joe McMullin (Project Manager)
– Steve Myers (Project Scientist)
– Kumar Golap (Deputy Project Manager)
– George Moellenbrock (Operations Manager)

ALMA subsystem interim leads
– Tim Cornwell (Pipeline) → Lindsey Davis
– Kumar Golap (Offline)

Upcoming reviews
– DM Review (late Jan 2003)
– Technical Review (late Jan 2003 or Feb 2003)

Slide 25: Other AIPS++ Developments

More user input into AIPS++
– NRAO AIPS++ User Group (NAUG): auditing and testing, subsystem scientists
– VLA audit, testing, and benchmarking (2003 Q1)
– NRAO-wide requirements and audit (EVLA, GBT), based on the ALMA requirements with some changes
– Viewer focus group (May 2002)
– User Interface focus group (Jan 2003)

More ALMA input into AIPS++
– ALMA is a high-profile customer for AIPS++!
– ALMA has substantial influence on AIPS++ development, e.g. through the Project Scientist, and through the SSR and subsystem requirements

