EVLA Advisory Committee Meeting, May 8-9, 2006

Software Testing
Michael P. Rupen, EVLA Project Scientist for Software
Challenge #1: Connecting Scientists and Software
Dynamic, evolving project – requirements cannot be static
– changes in schedule (e.g., WIDAR) and staffing
– better appreciation of the challenges (e.g., e2e) associated with a truly new instrument
Transition from one working instrument to another
– lots of trade-offs
Connecting Scientists and Software
Software is both complex and important
– defines the telescope (e.g., the achievable rms noise)
– programmers are professionals – one cannot assume astronomical knowledge or point of view (e.g., a declination of -00:00:02)
– still learning (e.g., user interface; implications of dynamic-range goals)
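The negative-zero declination above is the classic example of why astronomical point of view matters. A minimal Python sketch (these helper functions are hypothetical, not from any EVLA code) shows how a naive sexagesimal parser silently drops the sign when the degrees field is "-00":

```python
# Hypothetical illustration of the "-00:00:02" pitfall: int("-00") == 0,
# so a parser that applies the sign via int(degrees) loses it whenever
# the degrees field is negative zero.
def dec_naive(s):
    d, m, sec = s.split(":")
    return int(d) + int(m) / 60.0 + float(sec) / 3600.0  # sign lost for "-00"

def dec_correct(s):
    # Read the sign off the string itself, then parse unsigned fields.
    sign = -1.0 if s.lstrip().startswith("-") else 1.0
    d, m, sec = s.lstrip("+- \t").split(":")
    return sign * (int(d) + int(m) / 60.0 + float(sec) / 3600.0)

assert dec_naive("-00:00:02") > 0    # wrong hemisphere!
assert dec_correct("-00:00:02") < 0  # just south of the equator
```

A programmer without the astronomical background has no reason to suspect the naive version is wrong, since it passes every test with a nonzero degrees field.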
Connecting Scientists and Software
– organizational split: much higher walls between scientists and implementation
  - sheer size of the project
  - accounting requirements
  - history of aips++
Requires close connections between astronomers and programmers at all levels, from overall priorities to tests of well-defined sections of code
Connecting Scientists and Software: The EVLA Approach
EVLA Project Scientist for Software: Rupen
– responsible for requirements, timescales, the project book, inter-project interactions, …
EVLA Software Science Requirements Committee
– Rupen, Butler, Chandler, Clark, McKinnon, Myers, Owen, Perley, (Brogan, Fomalont, Hibbard)
– consultation on scientific requirements
– source group for more targeted work
Connecting Scientists and Software: The EVLA Approach
Subsystem Scientists
– scientific guidance for individual subsystems
– day-to-day contacts for programmers
– interpret scientific requirements for programmers
– oversee (and are heavily involved in) tests
– heavily involved in user documentation
– consult with other scientific staff
– duties vary with subsystem development
Connecting Scientists and Software: The EVLA Approach
Subsystem Scientists (cont'd) – currently:
– Proposal Submission Tool: Wrobel
– ObsPrep: Chandler ("daughter of JOBSERVE")
– WIDAR: Rupen/Romney
– Post-processing: Rupen/Owen
– TBD: Scheduler, ObsStatus Screens, Archive
Connecting Scientists and Software: The EVLA Approach
Less formal, more direct contacts
– e.g., Executor (reference-pointing priorities and testing)
– e.g., Greisen developing plotting programs based on Perley's hardware testing
Testers
– on-going tests: small in-house group (fast turn-around, very focused)
– pre-release tests: larger group of staff across sites and projects
– external tests: staff + outside users
Challenge #2: Overlapping Projects
Overlapping requirements (esp. ALMA, but also the GBT)
Socorro operations: VLA/VLBA + transition
ALMA has much greater resources, but different priorities and timescales
– leverage the ALMA effort, but don't let EVLA priorities get lost
– ensure the EVLA effort is useful for other projects
– e2e is intended to help here
Overlapping Projects: The EVLA Approach
Regular discussion + collaboration
– not so good in the past, but the trend is positive
– ALMA: APDM, ASDM discussions
– Rupen/Shepherd and Rupen/Glendenning meet weekly (first concrete result: a joint plan for helping CASA move into the user domain)
– setting up monthly Rupen/Shepherd/Butler/Glendenning meetings
Overlapping Projects: The EVLA Approach
Direct comparison of requirements
– Proposal Submission Tool (GBT, VLA)
– ALMA: cross-project evaluations (ALMA checks the PST; the EVLA checks Phase 2 (ObsPrep))
– EVLA post-processing requirements draw heavily on ALMA's
Overlapping Projects: The EVLA Approach
Joint testing
– PST tests involve the GBT, VLA, e2e
– EVLA participates in ALMA CASA tests (Brisken, Owen, Rupen, …)
Review panels
– Butler for ALMA (BRYAN: IS THIS SO?)
– an ALMA person for the fall EVLA software review
Science Domain Tests
– Proposal Submission
– ObsPrep
– Archive
– ObsMon
Science Domain Tests: Proposal Submission Tool
Subsystem scientist: Frail, Rupen, Wrobel
– heavily involved in design/implementation discussions
Programmers
– management: Butler, van Moorsel
– in the trenches: Loveland, Butler, Witz, (Waters)
Releases set by
– scientific requirements (support the VLA fully for the Jun 2006 deadline)
– software realities (re-factor code to match the high-level architecture)
Science Domain Tests: Proposal Submission Tool
Testing timetable for this release:
– Feb 10: post-mortem based on user comments
– Mar 1: internal test I (van Moorsel, Butler, Rupen)
– Mar 20: internal test II (+ Frail)
– Apr 18: NRAO-wide test (14, incl. CV/GB)
– May 3: internal test III
– May 7: public release
Science Domain Tests: Proposal Submission Tool
– rest of May: fix up & test the post-submission tools
– early June: gather user comments
– June 7: post-mortem, set next goals
Post-Processing (CASA)
– Pipelines
– Interactive processing
Post-Processing: On-going Interactions
NAUG/NAWG
– each meets once a month
– track and test current progress & algorithmic developments
– scientific-staff members: Myers, Brisken, Brogan, Chandler, Fomalont, Greisen, Hibbard, Owen, Rupen, Shepherd, Whysong
Post-Processing: On-going Interactions
ALMA formal tests
– ALMA testers themselves often have VLA experience (e.g., Testi)
– ALMA2006.01-4 included Brisken, Owen, Rupen; first look at Python
– EVLA goals: gain experience; learn how ALMA tests work; influence the interface
Post-Processing: On-going Interactions
User Interface Working Group (Apr 2006)
– goal: refine requirements for the user interface
– Myers, Brogan, Brisken, Greisen, Hibbard, Owen, Rupen, Shepherd, Whysong
– draft results on the Web
EVLA CASA Tests: Philosophy
Planned opportunities for collaboration
– testing for mutual benefit, not an opportunity to fail
  - important to repair the aips++ programmer/scientist rift
– allows us to reserve the time of very useful, very busy people (Brisken, Owen, …)
Set priorities
Check current approaches and progress
EVLA CASA Tests: Philosophy
Provide real deadlines for moving software over to the users
Prepare for scientific and hardware deadlines (e.g., WIDAR)
– joint decisions on topics
– feedback going both ways (e.g., which requirements are most useful [cf. DR])
EVLA CASA Tests: Philosophy
Less formal than ALMA's
– avoid duplication
– lesser resources
– EVLA-specific work is often either research, or requires much programmer/scientist interaction (e.g., how to deal with RFI)
– the learning curve for CASA is still steep; ALMA's formal tests involve e.g. re-writing the cookbook, which is likely beyond our resources
EVLA CASA Tests: Current Status
May-July 2005 test: w-projection
– first step in the Big Imaging Problem
– generally good performance (speed, robustness), but lots of ease-of-use issues
– Myers, Brisken, Butler, Fomalont, van Moorsel, Owen
Currently concentrating on
– long lead-time items (e.g., high dynamic-range imaging)
– hardware-driven items (e.g., the prototype correlator)
– the shift to a user-oriented system
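For context on why the w term is the first step in the "Big Imaging Problem": a baseline with component w (in wavelengths) along the line of sight imprints an extra phase on each visibility, which w-projection removes during gridding. A minimal numpy sketch of that phase factor (illustrative only, not CASA's implementation):

```python
import numpy as np

def w_phase(l, m, w):
    """Extra phase a visibility picks up from the w term, for a source at
    direction cosines (l, m) and baseline component w in wavelengths:
    exp(-2*pi*i * w * (sqrt(1 - l^2 - m^2) - 1)).
    w-projection corrects for this factor when gridding visibilities."""
    n = np.sqrt(1.0 - l * l - m * m)
    return np.exp(-2j * np.pi * w * (n - 1.0))

# At the phase centre (l = m = 0) the term is exactly 1: no error.
assert abs(w_phase(0.0, 0.0, 1000.0) - 1.0) < 1e-12
# Away from the centre it is a pure phase: unit magnitude, nonzero angle.
assert abs(abs(w_phase(0.05, 0.0, 1000.0)) - 1.0) < 1e-12
```

The error grows with both w and the offset from the phase centre, which is why wide fields observed with non-coplanar baselines (precisely the EVLA's low-frequency regime) made this the natural first test.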
EVLA CASA Tests: Current Status
Feb/Mar 2006 ALMA test
– first with the Python interface
– in-line help
March 2006 User Interface Working Group
– tasks
– documentation
– scripting vs. "ordinary" use
EVLA CASA Tests: Fall 2006
User interface
– revised UI (tasks etc.)
– revamped module organization
– new documentation system
Reading and writing the SDM
– ASDM ↔ CASA MS ↔ UVFITS ↔ AIPS
Basic calibration, incl. calibration of weights
EVLA CASA Tests: Fall 2006
Data examination
– basic plots (mostly a survey of existing code)
– first look at the stand-alone viewer (qtview)
Imaging
– wide-field imaging (w term, multiple fields)
– full-polarization imaging
– pointing self-cal "for pundits"
Some of this may be NAUG-tested instead
EVLA CASA Tests: 2007
Test scheduled for early 2007
– probably focused on calibration and data examination
Driven by the need to support initial basic modes (e.g., big spectral-line cubes), and to learn from the new hardware (e.g., WIDAR + wide-band feeds)
Currently working on the associated requirements
Involving Outside Users
Science domain
– fast turn-around and active discussion make this a fair commitment
– reluctance to show an undocumented, unfinished product
– current involvement is mostly at the end of the testing process
Involving Outside Users
CASA
– focus has been on functionality, reliability, and speed
  - major advances, but documentation & the UI have lagged
– learning CASA for a test is a little painful, esp. since the UI is evolving rapidly
Involving Outside Users
– algorithmic development is special:
  - many public documents & talks
  - interactions with other developers
  - student involvement: Urvashi; two from Baum
– the focus is shifting to user interactions; as this happens, we will involve more non-NRAO astronomers