CCSM Testing Status
Tony Craig, Lawrence Buja, Wei Yu
CCSM SEWG Meeting, Feb 5, 2003
Outline
- Testing: when, who, why, what
- What we do now
- Where do we go next
Test Strategy: When-Who-Why

WHEN           WHO                         WHY
pre-commit     developer                   validate changes
post-commit    test engineer, automated?   verify commits
pre-release    release team                verify model is ready for release
post-release   test engineer, automated?   verify platform is not changing
Test Strategy: What
- Exact restart
- Bit-for-bit, round-off, other (see the sketch below)
- Different platforms
- Different resolutions
- Serial, multitasked, threaded, hybrid
- Physics packages, dynamical cores
- Scientific validation - long climate runs
- Performance
- I/O, data
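The bit-for-bit versus round-off distinction is the core comparison step when checking one run against another. The following is a minimal sketch of that check, assuming the two runs' history fields are available as NumPy arrays; the field names, values, and the round-off tolerance are illustrative only and are not taken from the actual CCSM test suite.

```python
# Minimal sketch of the "bit-for-bit vs. round-off" comparison idea.
# Assumes two runs have written the same fields as NumPy arrays; the
# field names and the round-off tolerance below are illustrative only.
import numpy as np

def compare_fields(fields_a, fields_b, roundoff_tol=1e-13):
    """Classify each field as bit-for-bit, round-off, or different."""
    results = {}
    for name, a in fields_a.items():
        b = fields_b[name]
        if np.array_equal(a, b):
            results[name] = "bit-for-bit"
        elif np.allclose(a, b, rtol=roundoff_tol, atol=0.0):
            results[name] = "round-off"
        else:
            results[name] = "DIFFERENT"
    return results

if __name__ == "__main__":
    # Fabricated example data standing in for model history output.
    base = {"TS": np.full((4, 8), 288.0), "PS": np.full((4, 8), 1.0e5)}
    test = {"TS": base["TS"] + 1.0e-14 * base["TS"], "PS": base["PS"].copy()}
    for field, verdict in compare_fields(base, test).items():
        print(f"{field}: {verdict}")
```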
Testing Process for each Component
(components: ccsm, atm, lnd, ocn, ice, cpl, data)

WHEN           WHO
pre-commit     component developer / gate-keeper / ??
post-commit    test engineer / gate-keeper / ??
pre-release    release team / N/A / ?
post-release   test engineer / ?? / N/A / ?
What do we do now
- CAM test-model
  – used by developers
- CAM dev branch testing
  – automated testing after each commit
- CCSM beta tag tests
  – manual testing of the entire system periodically
- CCSM release tests
  – automated testing of release versions regularly
CAM test-model
- Script that runs many CAM cases automatically
- Runs on many platforms (SGI, IBM, CPQ, PC/Linux)
- Many automated tests
  – dynamical cores
  – tasks and threads
  – error growth
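A driver like test-model essentially loops over configurations and reports pass/fail for each case. The sketch below shows that shape in Python under assumed names: the cam_build_and_run command, the dycore list, and the task/thread combinations are placeholders, not the real CAM script.

```python
# Hedged sketch of a test-model-style driver: loop over configurations
# (dycore, tasks, threads), run each case, and collect pass/fail.
# The "cam_build_and_run" command and the config lists are placeholders.
import itertools
import subprocess

DYCORES = ["eul", "fv"]                 # illustrative dynamical cores
TASK_THREAD_COMBOS = [(4, 1), (2, 2)]   # (MPI tasks, OpenMP threads)

def run_case(dycore, tasks, threads):
    """Build and run one case; return True on success (placeholder command)."""
    cmd = ["./cam_build_and_run", "-dyn", dycore,
           "-ntasks", str(tasks), "-nthreads", str(threads)]
    try:
        return subprocess.run(cmd, check=True).returncode == 0
    except (subprocess.CalledProcessError, FileNotFoundError):
        return False

def main():
    failures = []
    for dycore, (tasks, threads) in itertools.product(DYCORES, TASK_THREAD_COMBOS):
        ok = run_case(dycore, tasks, threads)
        print(f"{dycore} {tasks}x{threads}: {'PASS' if ok else 'FAIL'}")
        if not ok:
            failures.append((dycore, tasks, threads))
    return 1 if failures else 0

if __name__ == "__main__":
    raise SystemExit(main())
```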
CAM dev branch testing
- Automatic testing of CAM every night after a commit on the primary development branch
- Uses test-model
- Runs on chinook, blackforest, anchorage
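The nightly trigger amounts to: run the suite only when the development branch has moved since the last run. A minimal sketch, assuming a marker file stands in for the repository query (CVS at the time); the file names and the ./test-model command line are hypothetical.

```python
# Hedged sketch of the nightly "test after a commit" idea: only launch
# the test suite when the branch has moved since the last run.
import subprocess
from pathlib import Path

STATE_FILE = Path("last_tested_revision.txt")

def current_revision():
    # Placeholder: in practice this would query the repository
    # (CVS at the time). Here we just read a marker file if present.
    marker = Path("branch_revision.txt")
    return marker.read_text().strip() if marker.exists() else "unknown"

def main():
    rev = current_revision()
    last = STATE_FILE.read_text().strip() if STATE_FILE.exists() else ""
    if rev == last:
        print("No new commit since last nightly run; nothing to do.")
        return
    print(f"New revision {rev}; launching test suite.")
    try:
        subprocess.run(["./test-model"], check=False)  # placeholder command
    except FileNotFoundError:
        print("test-model not found (placeholder command)")
    STATE_FILE.write_text(rev)

if __name__ == "__main__":
    main()
```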
CCSM beta tag tests
- Exact restart testing (see the sketch below)
  – various configurations
  – various resolutions
- Comparison with previous beta tag
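A sketch of the exact-restart check itself: a straight-through run must match, bit for bit, a run that stops halfway, writes a restart, and continues. The toy timestep below stands in for a real component run and is illustrative only.

```python
# Minimal sketch of an exact-restart test: a straight-through run must be
# bit-for-bit identical to a run that stops halfway, writes a restart,
# and continues. The toy "model" below stands in for CCSM.
import numpy as np

def step(state):
    """One toy model timestep (placeholder for a real component)."""
    return state * 1.000001 + np.sin(state)

def run(state, nsteps):
    for _ in range(nsteps):
        state = step(state)
    return state

def exact_restart_test(nsteps=10):
    initial = np.linspace(0.0, 1.0, 16)

    # Straight-through run.
    straight = run(initial.copy(), nsteps)

    # Run halfway, "write" and "read" a restart, then continue.
    half = run(initial.copy(), nsteps // 2)
    restart = half.copy()                 # stands in for restart-file I/O
    continued = run(restart, nsteps - nsteps // 2)

    assert np.array_equal(straight, continued), "restart is not bit-for-bit"
    print("exact restart test: PASS (bit-for-bit)")

if __name__ == "__main__":
    exact_restart_test()
```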
CCSM Release Tests
- Weekly testing of the CCSM2.0 and CCSM2.0.1 releases on
  – chinook
  – blackforest
  – bluesky
  – seaborg
- Includes some patches
CCSM performance testing
- Carry out a large suite of timing tests
- Comparison with previous versions
- Determine appropriate load balance (sketched below)
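Determining a load balance from the timing suite comes down to finding which component paces the run under a given processor layout. A minimal sketch with fabricated timings; the layouts, component times, and the max/mean imbalance metric are assumptions for illustration, not measured CCSM data.

```python
# Hedged sketch of a load-balance check: given per-component run times
# for a candidate processor layout, a well-balanced layout keeps the
# slowest (pacing) component close to the average. Numbers are fabricated.
def load_balance(component_times):
    """Return (pacing component, imbalance ratio max/mean)."""
    mean_t = sum(component_times.values()) / len(component_times)
    pacing = max(component_times, key=component_times.get)
    return pacing, component_times[pacing] / mean_t

if __name__ == "__main__":
    layouts = {
        "layout A": {"atm": 620.0, "ocn": 480.0, "lnd": 120.0, "ice": 150.0, "cpl": 90.0},
        "layout B": {"atm": 510.0, "ocn": 470.0, "lnd": 160.0, "ice": 180.0, "cpl": 95.0},
    }
    for name, times in layouts.items():
        pacing, ratio = load_balance(times)
        print(f"{name}: pacing component = {pacing}, imbalance = {ratio:.2f}")
```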
What next
- Develop a formal test plan document
- Determine who, when, why, and what
- Establish formal processes for each component and for CCSM overall
- Consider the resources required (both people and computer time)
- Unit tests (see the sketch below)
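On unit tests, a minimal illustration of the kind of test meant here, written against a small physics-style routine. The routine, its formula, and the tolerances are hypothetical and are not an actual CCSM interface.

```python
# Minimal illustration of a unit test for a small physics-style routine.
# The saturation-vapor-pressure formula and tolerances are illustrative;
# this is not an actual CCSM routine.
import math
import unittest

def sat_vapor_pressure(t_kelvin):
    """Approximate saturation vapor pressure (Pa) over water (Magnus-type)."""
    t_c = t_kelvin - 273.15
    return 610.94 * math.exp(17.625 * t_c / (t_c + 243.04))

class TestSatVaporPressure(unittest.TestCase):
    def test_reference_point(self):
        # At 0 degC the formula should return roughly 611 Pa.
        self.assertAlmostEqual(sat_vapor_pressure(273.15), 610.94, places=2)

    def test_monotonic_in_temperature(self):
        self.assertGreater(sat_vapor_pressure(300.0), sat_vapor_pressure(280.0))

if __name__ == "__main__":
    unittest.main()
```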
Notes
- Test test-model
- Test configurations thoroughly; what's supported?
  – configuration = dycores, large components, physics, machine dependencies
- Chunking testing requirements
- test-model lite
- Add tests - cost?
- Difference between dev tests, nightly tests, and release testing; test-model is the same for all
- Community testing
Notes - 2
- pop - coarse and high resolution tests cover most of the physics space, also different pe configurations
- test-model compares to the previous version; very useful
- test-model for land exists
- Make consistency between stand-alone and ccsm versions of the model; unify compiler options?
- ccsm requirements vs. component requirements
- Specs for the makefile, specs in general?
- Unit tests - physics wrapper
Notes - 3
- Decide where tests take place and who is responsible for what parts
- Recommendation that working groups test coupling aspects, as well as verification of make in the coupled system and debug-flag testing
- Gap in the test process with respect to make
- Track make fixes/changes through the bug tracker
- Go to working groups, get test requirements
Notes - 4
- Performance requirement
- Science requirement - is it just the "control run"?
- Get developers' input on what should be in the test suite
- Library issues, internal libraries (ESMF, MCT), mass
- Test cost for test-model: chinook (1 hour, 16 pes), blackforest (30-40 minutes, 32 pes), babyblue (30 minutes), anchorage (30 minutes); lots of time spent building
- Regular testing, automated testing, different frequencies
Notes - 5
- Have components provide a test suite to CSEG
- Different levels of testing
Recommendations
- Component working groups test coupling; cseg liaisons coordinate
- Develop specs for the makefile
- Go to working groups, find out test requirements, then decide who does what
- Develop ccsm test requirements
- Revisit benchmarking requirements
Open Discussion
- The community has a difficult time keeping up with what's happening at NCAR
- Could have more forward planning: IPCC, resolutions, chemistry, bgc, SE and science status
- Should we make component development work more visible? CRBs, adaptation, software practices
- Need more background on decisions made
- Code review - need? time?
- Design walkthroughs and code reviews are working for the ocean model
- Code reviews help educate others on the code
Open Discussion - 2
- SEWG encourages the use of code reviews; SEWG will explain what we encourage
- Recommend a brown bag; recommend a code walkthrough of cpl6
- Discuss the status of code review plans at the June workshop, internal to SEWG
- ccsm-progs, ccsm-sci, wg notes to be distributed to ?
- Get the mailing lists under control
- Who do we make recommendations to? SSC, SE dev guide, word of mouth
Open Discussion - 3
- Get input from many developers on code review plans