ATML on LM-STAR®
Michelle Harris, Alicia Helton, Steven O’Donnell
Contact: Michelle Harris, 407-306-6693, Michelle.L.Harris@lmco.com
Introduction
Implemented a set of ATML schemas on LM-STAR®.
Schemas used:
- TestDescription ML (draft 5.0)
- TestResults ML (version 0.15)
- Diagnostic ML: Bayes and Common Element Model (CEM)
- Dynamic Context Model (DCM, version 0.07)
Task Definition
- Convert a legacy CASS ATLAS TPS into ATML TestDescription.
- Use the TestDescription as input to the SELEX TPS Wizard™ and generate TestStand™ sequences.
- Execute the TestStand™ sequences on the LM-STAR®.
- Collect measured values using ATML TestResults (see the sketch after this list).
- Interface with a diagnostic reasoner to isolate the fault more quickly and more accurately.
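A minimal sketch of the "collect measured values" step, using Python's standard library. The element and attribute names here are illustrative assumptions, not the exact TestResults ML 0.15 schema, and the limits are invented for the example.

# Hypothetical sketch: records one measurement in a TestResults-like XML layout.
# Element/attribute names are illustrative, not the exact TestResults ML 0.15 schema.
import xml.etree.ElementTree as ET

def record_result(path, test_id, value, units, low, high):
    outcome = "Passed" if low <= value <= high else "Failed"
    root = ET.Element("TestResults")
    result = ET.SubElement(root, "TestResult", {"testId": test_id, "outcome": outcome})
    ET.SubElement(result, "MeasuredValue", {"units": units}).text = str(value)
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)
    return outcome

# Example use with made-up limits
print(record_result("results.xml", "2000", 24.7, "Ohm", 0, 25))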
Initial Approach
- Use an externally developed tool to convert ATLAS to Intermediate XML.
- Use XML tools to transform the Intermediate XML to TestDescription (a transform sketch follows this list).
- TestDescription provides the “what” to do information for the TPS.
- Use the TPS Wizard™ to generate TestStand™ sequence files capable of being run on LM-STAR®.

Flow: ATLAS → Intermediate XML → ATML TestDescription → TestStand Sequence Files
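A minimal sketch of the "XML tools" transform step, assuming the lxml library and hypothetical file names (intermediate.xml, to_testdescription.xsl); the actual stylesheet and mapping rules used in the task are not shown in the presentation.

# Hypothetical sketch of transforming Intermediate XML into TestDescription with XSLT.
# File names and the stylesheet itself are assumptions, not the project's actual artifacts.
from lxml import etree

intermediate = etree.parse("intermediate.xml")                   # ATLAS-derived Intermediate XML
stylesheet = etree.XSLT(etree.parse("to_testdescription.xsl"))   # mapping rules to TestDescription
test_description = stylesheet(intermediate)

with open("TestDescription.xml", "w", encoding="utf-8") as out:
    out.write(str(test_description))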
Issues
- The legacy ATLAS TPS was not designed to maximize portability.
- The Intermediate XML generated from the ATLAS was very flat, making it difficult to understand the test flow and translate it into TestDescription.
- The legacy ATLAS TPS did not adhere to a style guide that would have enforced specific design rules.
- Multiple fault-callout permutations, based on data evaluations made without test numbers, created problems in the diagnostic model development.
Revised Approach
- An application was developed to extract the “what” to do information from the ATLAS and save it to a spreadsheet.
- Human intervention verified the information and added missing values.
- An application was written to convert the spreadsheet to TestDescription (a converter sketch follows the example row below).

Example spreadsheet row:
  Test: 2000
  Test Group: 1
  Next on Fail: DIAGNOSTIC1
  Next on Pass: 2010
  Callout on Fail: A4
  High Limit: 25
  Low Limit: NA
  Comparison: EQ
  Units: Ohm
  Entry Point: No
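A minimal sketch of the spreadsheet-to-TestDescription converter, assuming the verified spreadsheet has been exported to CSV with the column names shown above. The element and attribute names loosely follow the TestDescription excerpt on the next slide and are illustrative, not the exact schema.

# Hypothetical sketch of converting the verified spreadsheet into TestDescription steps.
# Column names mirror the example row above; element names are illustrative only.
import csv
import xml.etree.ElementTree as ET

def rows_to_steps(csv_path):
    root = ET.Element("Steps")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            step = ET.SubElement(root, "Step", {"ID": f"Step_{row['Test']}",
                                                "testId": row["Test"]})
            results = ET.SubElement(step, "Results")
            passed = ET.SubElement(results, "Result", {"value": "Passed"})
            ET.SubElement(passed, "NextStep", {"stepId": f"Step_{row['Next on Pass']}"})
            failed = ET.SubElement(results, "Result", {"value": "Failed"})
            ET.SubElement(failed, "NextStep", {"stepId": f"Step_{row['Next on Fail']}"})
            if row.get("Callout on Fail", "NA") != "NA":
                ET.SubElement(failed, "ReplaceComponent",
                              {"uutComponentId": row["Callout on Fail"]})
    return ET.ElementTree(root)

rows_to_steps("verified_tests.csv").write("TestDescription.xml",
                                          encoding="utf-8", xml_declaration=True)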
TestDescription Sample
<Outcomes>
  <Outcome ID="0_1" value="Passed"/>
  <Outcome ID="0_2" value="Failed"/>
  <Outcome ID="DIAGN1" value="Failed">
    <ReplaceComponents>
      <ReplaceComponent uutComponentId="UUT-0"/>
    </ReplaceComponents>
  </Outcome>
  <ReplaceComponent uutComponentId="UUT-1"/>
  ...snipped...
<Step xsi:type="Step_Test" ID="Step_2" testId="2000">
  <Results>
    <Result xsi:type="Result_Test" testOutcomeId="2000A">
      <NextStep stepId="Step_3"/>
      <!-- ... -->
    </Result>

Using the information from the TestDescription, the Selex TPS Wizard™ builds the frame of the new TPS with initialized variables, test criteria, simulation mode, pre- and post-conditions, and calls to “how-to” sequences.
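A minimal sketch of reading the test flow that a tool like the TPS Wizard™ consumes from a TestDescription instance. It assumes the un-namespaced layout of the excerpt above; a real instance document would need the proper ATML namespaces.

# Hypothetical sketch: walk the Step elements and print the pass/fail flow.
# Assumes the simplified, un-namespaced layout of the excerpt above.
import xml.etree.ElementTree as ET

tree = ET.parse("TestDescription.xml")
for step in tree.iter("Step"):
    print(f"Step {step.get('ID')} runs test {step.get('testId')}")
    for result in step.iter("Result"):
        next_step = result.find("NextStep")
        if next_step is not None:
            print(f"  outcome {result.get('testOutcomeId')} -> {next_step.get('stepId')}")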
TestDescription to LM-STAR®
- Needed to create the “how-to” TestStand™ sequences
- A highly intensive manual task
- Simplified through the use of Custom Steps (graphical interface to LM-STAR® system software)
Diagnostic Model
Description:
- The model is based on the Bayesian and Common Element Models from the AI-ESTATE standard (illustrated with a toy example below).
- It is stored in an XML format derived from the AI-ESTATE models.
Development of the Model:
- Start with the fault tree of the TPS.
- Use historical test results and maintenance data to add more intelligence to the model.
- Learning algorithms are used to continuously feed back newly discovered test results (in TestResults ML format) and maintenance data.
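A toy illustration of the Bayesian reasoning such a model supports: given priors over candidate faults and the probability that a test fails under each fault, a failed test updates the fault probabilities by Bayes' rule. The priors, detection probabilities, and fault names here are invented for the example; the actual model is the AI-ESTATE Bayesian/CEM representation, not this hand-rolled table.

# Hypothetical illustration of Bayesian fault isolation, not the AI-ESTATE model itself.
# Priors and detection probabilities are invented for the example.
priors = {"A4": 0.02, "A7": 0.03, "no_fault": 0.95}        # P(fault) before testing
p_fail_given = {"A4": 0.90, "A7": 0.40, "no_fault": 0.01}  # P(test 2000 fails | fault)

def posterior_after_failure(priors, likelihoods):
    joint = {f: priors[f] * likelihoods[f] for f in priors}
    total = sum(joint.values())
    return {f: p / total for f, p in joint.items()}

post = posterior_after_failure(priors, p_fail_given)
for fault, p in sorted(post.items(), key=lambda kv: -kv[1]):
    print(f"P({fault} | test 2000 failed) = {p:.3f}")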
Diagnostic Reasoner
- Provides the run-time environment for using the diagnostic models
- Implements the AI-ESTATE interface to the diagnostic models
- Uses the Dynamic Context Model to track session information
  - Allows back-tracking through a session
  - Allows restart of a session from a previous stopping point
- Provides a set of “higher-order” interface functions to minimize the calls required to access model/reasoner data
- Web-service-based interface (using WSDL) (see the sketch after this list)
- Utilizes a Bayesian network analyzer called SMILE, by the Decision Systems Laboratory, University of Pittsburgh
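A minimal sketch of what calling a WSDL-described reasoner service from a test executive could look like, assuming the zeep SOAP library; the service URL and operation names are assumptions, since the presentation does not give the actual service contract.

# Hypothetical sketch: endpoint and operation names are assumptions,
# not the reasoner's actual WSDL interface.
from zeep import Client

client = Client("http://reasoner.example.com/diagnostic?wsdl")

# Start a diagnostic session and report an observed outcome; the reasoner
# would answer with its current best fault candidates.
session_id = client.service.StartSession(modelName="UUT_Model")
candidates = client.service.ReportOutcome(sessionId=session_id,
                                          testId="2000",
                                          outcome="Failed")
print(candidates)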
Lessons Learned
- Our current process is still heavily dependent on manual intervention, which is very time-consuming.
- Current legacy TPSs are implemented with tight coupling, making it difficult to separate the “what” and “how” information.
- Other ATML schemas, such as UUT Description and TestAdapter, could aid in the porting process, but they were not mature enough at the time the task started.
- It would be more cost effective to implement UUT test requirements on new systems than to re-host the application.
- There is not always a one-to-one test mapping from the TPS to the diagnostic model.
Conclusion
- Industry needs tools that can generate and consume ATML and export it to C, ATLAS, etc.
- Using IEEE-1641 for signal and test definition appears promising, and further study by Lockheed Martin is planned.
- Lockheed Martin is embracing ATML: TestResults ML is deployed on LM-STAR® systems supporting the JSF program.
- As ATML matures, Lockheed Martin is prepared to implement this technology in our legacy and future programs.
DEMONSTRATION
Questions?
Michelle Harris 407-306-6693
Alicia Helton 407-306-1592
Steven O’Donnell