
Digital Information Technology Testbed: Conformity Assessment Activities


Slide 1: Digital Information Technology Testbed: Conformity Assessment Activities
http://ditt.org
US Army, Center for Army Lessons Learned (CALL), Ft. Leavenworth, KS
http://call.army.mil
Presentation by Frank Farance, Farance Inc., +1 212 486 4700, frank@farance.com
2000-03-16, ©2000 Farance Inc.

Slide 2: The Standards Process
Accreditation affords a consistent process
Committees don't reinvent the wheel
Choosing a "process" can itself take years
An accredited process is well-tested and "off the shelf"
(Life-cycle diagram: Development → Consensus Building → Maintenance → Review)

Slide 3: Goals of Standards Process
Standards that are well-defined:
– Consistent implementations, high interoperability
– Coherent functionality
Commercial viability:
– Standards allow a range of implementations
– Commercial products are possible: all conforming products interoperate, yet offer different "bells and whistles"
– Consumers can choose from a range of conforming systems
– Wide acceptance and adoption
Few bugs:
– Consistent requests for interpretation (RFIs)
– Low number of defect reports (DRs)

Slide 4: Standards Are Developed in Working Groups
Developing standards:
– Source: "from scratch" or "base documents"
– Create "standards wording" (normative and informative) and rationale for decisions
– Technical committee: in-person or electronic collaboration
(Life-cycle phase: Development)

Slide 5: Development: Technical Specification [1/4]
Assertions:
– Sentences using "shall"
– Weaker assertions: "should", "may"
Inquiries/Ranges (see the sketch below):
– Implementations have varying limitations
– Interoperability: tolerances vs. minimums
– Allows implementation-defined and unspecified behavior
Negotiations:
– Adaptation to conformance levels
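To make the assertion/inquiry distinction concrete, here is a minimal sketch in C against POSIX (POSIX appears later in this deck among the language-related bindings; the framing in the comments is ours, not standards text). The standard asserts a guaranteed minimum for a limit, and a portable program inquires about the implementation's actual value instead of hard-coding one:

```c
#include <stdio.h>
#include <unistd.h>   /* POSIX inquiry interface: sysconf() */

int main(void)
{
    /* Inquiry/range: the standard does not fix this limit to one
     * value; it guarantees a minimum and lets each implementation
     * choose its own.  A portable program asks instead of assuming. */
    long open_max = sysconf(_SC_OPEN_MAX);

    if (open_max == -1)
        printf("open-file limit is indeterminate here\n");
    else
        printf("this implementation allows %ld open files\n", open_max);
    return 0;
}
```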

Slide 6: Development: Technical Specification [2/4]
Conformance:
– Complies with all assertions
– Performs in the range allowed by inquiries/ranges and negotiations
– Minimizes implementation-defined, unspecified, and undefined behavior

Slide 7: Development: Technical Specification [3/4]
Applications and standards use:
– Strictly conforming: uses only features that exist in all conforming implementations (see the sketch below)
– Conforming: uses features available in some conforming implementations
– Conformance levels vs. interoperability
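These terms come straight out of ISO C usage (C is among the bindings named later in the deck), so a C sketch fits; this is an illustration, not normative wording. A strictly conforming program sticks to ISO C and behaves identically on every hosted implementation; the program below is merely conforming, because it calls the POSIX extension getpid(), which a bare ISO C implementation need not provide:

```c
/* A strictly conforming program would include only ISO C headers
 * such as <stdio.h>.  This one is merely *conforming*: it also uses
 * <unistd.h> and getpid(), a POSIX extension, trading a measure of
 * portability (interoperability) for extra functionality. */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    printf("my process id is %ld\n", (long)getpid());
    return 0;
}
```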

Slide 8: Development: Technical Specification [4/4]
Rationale:
– Explanation of important committee decisions
– Features considered/rejected
– Allows for change in membership
– Helps with Requests for Interpretation

Slide 9: Development: Engineering Process [1/5]
Identify scope
Use an engineering methodology
Preference for existing commercial practice
Don't standardize implementations; standardize specifications

Slide 10: Development: Engineering Process [2/5]
Can new technology be incorporated?
Technology horizon:
– Which technology to incorporate? Feasible/commercial, up to 1-2 years from now
– How long should the standard be useful? At least 5-10 years from now

Slide 11: Development: Engineering Process [3/5]
Risk management: scope, prior art, schedule
Determining requirements
Asking the right questions (1-3 years?)
Document organization and phasing
Base documents -- starting points
Proposals and papers -- additions

Slide 12: Development: Engineering Process [4/5]
Developing standards wording...
– Step #0: Requirements identification (optional)
– Step #1: Functionality: what it does
– Step #2: Conceptual model: how it works
– Step #3: Semantics: precise description of features
– Step #4: Bindings: transform into a target, e.g., application programming interfaces (APIs), syntax, commands, protocol definitions, file formats, codings (see the sketch below)
– Step #5: Encodings: bit/octet formats (optional)
– Step #6: Standards words: "legal" wording
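To illustrate steps #3-#5, the sketch below binds one semantic ("store a value under a name") as a C API and, in a comment, as a line-oriented protocol command. Every name here (PutValue, the PUT command, the reply code) is hypothetical, invented for this example; only the layering itself comes from the slide:

```c
#include <stdio.h>
#include <string.h>

/* Step #3, semantics: "store a value under a name, replacing any
 * previous value".  Step #4 binds it; all names are hypothetical. */

/* Binding as a C API (a toy single-slot store): */
static char stored_value[64];

int PutValue(const char *name, const char *value)
{
    (void)name;  /* toy store ignores the name; one slot only */
    strncpy(stored_value, value, sizeof stored_value - 1);
    return 0;    /* 0 on success */
}

/* The same semantic bound as a protocol command might read:
 *   C: PUT name value\r\n
 *   S: 200 OK\r\n
 * Step #5 (encodings) would then fix octet-level details such as
 * the character set, line endings, and maximum line length. */

int main(void)
{
    PutValue("color", "blue");
    printf("stored: %s\n", stored_value);
    return 0;
}
```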

Slide 13: Development: Engineering Process [5/5]
Let it bake -- probably 1-2 years
– Baking is very important
– Shakes out subtle bugs
– Greatly improves quality!
– Usually, vendors fight "baking"... they want to announce "conforming" products ASAP

Slide 14: Formal and Informal Consensus-Building Among Standards and Specification Development Organizations
Consensus-building steps:
Collaboration, harmonization with other organizations
Public reviews as soon as possible
Public comments
Resolution of comments
Approval stages (refinement):
– Working Draft
– Committee Draft
– Draft Standard
– Approved Standard
(Life-cycle phases: Development → Consensus Building)

Slide 15: Relationship to Semantics: Detailed Meaning of Operations/Transactions
Assertions: sentences using shall, should, or may
Inquiries: range of values
Negotiations: heuristics
Examples:
– Array: list of elements of the same type in a regular structure
– PutValue: replaces the old object, in place (see the sketch below)
(Layering diagram: Requirements → Functionality → Conceptual Model → Semantics → Bindings (APIs, Codings, Protocols) → Encodings (Data Formats, Calling Conventions, Communication Layers))
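"Replaces the old object, in place" is exactly the kind of sentence the semantics layer must make unambiguous: the name keeps denoting the same storage, and the old contents are gone. A minimal sketch of that reading, with every name hypothetical:

```c
#include <assert.h>
#include <string.h>

/* Hypothetical in-place PutValue over a fixed slot: the slot's
 * address never changes; only its contents are replaced. */
struct slot { char value[32]; };

void PutValue(struct slot *s, const char *v)
{
    memset(s->value, 0, sizeof s->value);      /* old contents gone   */
    strncpy(s->value, v, sizeof s->value - 1); /* new contents stored */
}

int main(void)
{
    struct slot s;
    struct slot *before = &s;

    PutValue(&s, "old");
    PutValue(&s, "new");

    assert(before == &s);                 /* "in place": same object */
    assert(strcmp(s.value, "new") == 0);  /* old value was replaced  */
    return 0;
}
```

A competing semantic (replace by allocating a fresh object and rebinding the name) would pass different tests, which is why the wording has to choose one reading explicitly.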

Slide 16: Some Strategies for Standardizing Data Models
Partition into "application areas"
Build standards in several steps, for example:
– Year 1: Create a minimal, widely adoptable standard
– Year 3: Create an amendment that represents best and widely implemented practices
– Year 5: Revise the standard, incorporate improvements
Support extension mechanisms (see the sketch below):
– Permits user/vendor/institutional extensions
– Widely implemented extensions become the basis for new standards amendments/revisions
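One common shape for such an extension mechanism is a record whose standardized fields are fixed while extensions travel in a namespaced side list that conforming readers may ignore. The deck does not prescribe any format; the layout and names below are our own illustration:

```c
#include <stdio.h>

/* Hypothetical data-model record: standardized core fields, plus
 * namespaced extensions that a conforming reader can skip safely. */
struct extension {
    const char *ns;      /* who defined it, e.g. a vendor name */
    const char *name;
    const char *value;
};

struct record {
    const char *id;      /* standardized field */
    const char *title;   /* standardized field */
    const struct extension *exts;
    int ext_count;
};

int main(void)
{
    struct extension e[] = { { "vendorx", "color-profile", "srgb" } };
    struct record r = { "rec-1", "Example record", e, 1 };

    /* Core processing touches only the standardized fields... */
    printf("%s: %s\n", r.id, r.title);

    /* ...extensions are carried along but never required, which is
     * what lets widely adopted ones graduate into later revisions. */
    for (int i = 0; i < r.ext_count; i++)
        printf("  ext %s:%s = %s\n", e[i].ns, e[i].name, e[i].value);
    return 0;
}
```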

Slide 17: Building Standards in Several Steps
(Diagram: the "Standard" cycles through Development, Consensus Building, Maintenance, and Review. Amendments: 2-3 years; revisions: 4-5 years. User/vendor/institutional "extensions" that prove market-relevant and widely adopted become input to the next revision of the standard.)

Slide 18: Strictly Conforming vs. Conforming Implementations
Strictly conforming implementation: maximum interoperability, minimum functionality
Conforming implementations: less interoperability, more functionality
Many conforming implementations are possible: interoperability may vary
Extensions → a conformance issue
(Life-cycle phase: Consensus Building)

Slide 19: Semantics: Sources and Influences
Language-related bindings (APIs, codings): C, C++, SQL, LISP, Java, Javascript, POSIX, Tcl, APL
Protocol-related bindings: FTP, HTTP, MIME, RFC 822, CORBA, W3C P3P
Related standards groups: ISO-IEC/JTC1/SC22/WG11, ISO-IEC/JTC1/SC32/WG2, NCITS L8, NCITS T2
"Data model semantics are based on the conceptual model, protocols, programming languages, and related standards/specifications."
Conceptual Model → Semantics → APIs, Codings, Protocols
(Layering diagram: Requirements → Functionality → Conceptual Model → Semantics → Bindings (APIs, Codings, Protocols) → Encodings (Data Formats, Calling Conventions, Communication Layers))

Slide 20: Life Cycle: Maintenance Phase
Maintenance of standards:
Requests for Interpretation (RFIs)
Defect Reports (DRs) and Records of Responses (RRs)
Amendments (AMs) and Technical Corrigenda (TCs)
(Life-cycle diagram: Development → Consensus Building → Maintenance)

Slide 21: Maintenance [1/3]
Requests for interpretation (RFIs):
– Questions on ambiguous wording
– Response time: 3-18 months
– Interpret questions based on actual standards wording, not what is desired
– Like the U.S. Supreme Court
– Consistent responses — highly desirable

Slide 22: Maintenance [2/3]
Defect reports:
– Bug reports
– Fix features, as necessary
– Many defect reports → weak standard
– Weak standard → little acceptance/use
– Little acceptance/use → failure
– Failure: only recognized years later

Slide 23: Maintenance [3/3]
Other deliverables:
– Records of responses (to RFIs)
– Technical corrigenda (for defect reports)
– Amendments
Next revision:
– Reaffirm
– Revise
– Withdraw

Slide 24: Success Attributes [1/2]
Participants:
– expect to be involved 2-5 years
Schedule:
– don't get rushed, don't get late
New technology:
– be conservative
Scope:
– stick to it!

Slide 25: Success Attributes [2/2]
Conformance:
– need to measure it
– should have a working definition ASAP
Target audience:
– commercial systems and users
Quality:
– fix bugs now
Process:
– have faith in the standards process — it works!

Slide 26: Failure Attributes [1/2]
Incorporate new/untried technology
– Why waste committee time?
Ignore commercial interests
– Who will implement the standard?
Ignore public comments
– Who will buy standardized products?
Creeping featurism
– The schedule killer!

Slide 27: Failure Attributes [2/2]
Poor time estimates
– Progress is made over quarters, not weeks
Leave bugs for later
– Expensive to fix later, as in software
Weak tests of conformance
– Standard-conforming, but lacking interoperability
Too much implementation-defined behavior

Slide 28: End of Standards Life Cycle
Review cycle: revise, reaffirm, or withdraw
– Revise: new work item, incorporate new technology
– Reaffirm: no changes, stable technology
– Withdraw: little use, obsolete technology
Typically: 5-year cycle
(Life-cycle diagram: Development → Consensus Building → Maintenance → Review)

Slide 29: Typical Timeline of Standards
Summary of the standards process:
Development phase: 12-48 months
Consensus phase: 9-24 months
Maintenance phase: 3-6 years
Review cycle: revise, reaffirm, or withdraw — every 5 years
Typical time from committee formation to approved standard: 18-48 months
Realistic schedule → good results
(Life-cycle diagram: Development → Consensus Building → Maintenance → Review)

Slide 30: Conformance Testing
Testing implementations
Relationship to standards/specifications
Relationship to standard/specification development organizations
Relationship to testing organizations

Slide 31: Testing Implementations — A Generic Approach
(Diagram: a Test Driver exercises the Implementation Under Test through a Test Jig and produces Test Results Reports. A sketch follows below.)
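A minimal sketch of that harness shape in C, with every name hypothetical: the driver walks a table of test cases, each of which exercises the implementation under test through a small jig, and reports pass/fail results:

```c
#include <stdio.h>
#include <string.h>

/* Implementation under test (stand-in for a real product): */
static const char *greet(void) { return "hello"; }

/* Test jig: isolates the driver from how the implementation is
 * invoked (in a real harness: process spawn, IPC, network, ...). */
static const char *jig_call_greet(void) { return greet(); }

/* Test driver machinery: a table of named test cases. */
struct test_case {
    const char *name;
    int (*run)(void);    /* returns 1 on pass, 0 on fail */
};

static int test_nonempty(void) { return jig_call_greet()[0] != '\0'; }
static int test_value(void)    { return strcmp(jig_call_greet(), "hello") == 0; }

int main(void)
{
    const struct test_case tests[] = {
        { "greet returns a non-empty string", test_nonempty },
        { "greet returns the required value", test_value },
    };
    int failures = 0;

    for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++) {
        int ok = tests[i].run();
        printf("%-36s %s\n", tests[i].name, ok ? "PASS" : "FAIL");
        failures += !ok;
    }
    printf("%d failure(s)\n", failures);   /* the test results report */
    return failures != 0;
}
```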

Slide 32: Assertion, Inquiry, Negotiation, and Testing (AINT): Documents That Transform Standards Words Into Specific Tests
(Diagram, steps 1-4: standards (specifications) from users, vendors, and other SSDOs are interpreted into a public AINT document, from which a test developer creates the test suite; phases: development, consensus, interpretation, test creation.)
"AINT documents are critical for uniform and thorough testing"
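To make the AINT idea concrete (the deck does not show the actual document format, so the entry layout, clause number, and assertion text below are invented), each testable assertion gets an identifier and a derived test, which is what lets a suite's coverage be audited uniformly against the standard's wording:

```c
#include <stdio.h>

/* Hypothetical AINT-style traceability table: each entry ties one
 * "shall" sentence from the standards wording to the test derived
 * from it, so document-to-suite coverage can be audited. */
struct aint_entry {
    const char *assertion_id;    /* e.g., a clause number */
    const char *assertion_text;  /* the "shall" sentence  */
    int (*test)(void);           /* 1 = pass, 0 = fail    */
};

/* Invented assertion: "the checksum of an empty message shall be 0." */
static unsigned checksum(const unsigned char *p, int n)
{
    unsigned sum = 0;
    while (n-- > 0) sum += *p++;
    return sum;
}
static int test_4_2_1(void) { return checksum(NULL, 0) == 0; }

int main(void)
{
    const struct aint_entry table[] = {
        { "4.2.1", "Checksum of an empty message shall be 0", test_4_2_1 },
    };
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
        printf("[%s] %s: %s\n", table[i].assertion_id,
               table[i].assertion_text,
               table[i].test() ? "PASS" : "FAIL");
    return 0;
}
```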

Slide 33: The DITT Testing Process: Advice, Certification, Testing, and Branding
(Diagram, steps 1-6: the SSDO grants branding authorization; NIST accredits the test lab; IEEE-ISTO administers the process; DITT, as test developer, produces the test suite, which is licensed to vendors; the test lab (DITT) tests vendor products and brands the results; a vendor advisory board provides dispute resolution.)

Slide 34: Various Stakeholders' Perspectives
User:
– some person or organization that cares about interoperability
Vendor:
– develops products/services that conform to standards/specifications
Small developers:
– develop products, services, content; small testing resources
Institutions:
– strong need to reduce business risk
– need interoperability

Slide 35: Testing vs. Certification
Users/vendors may "self-test" via publicly available test suites
Certification implies:
– Controlled testing
– A certified lab
– Issuing a "certificate" or "branding"

Slide 36: Initial Technical Scope of DITT Conformance Testing
IEEE 1484.2 (PAPI Learner)
IEEE 1484.12 (Learning Objects Metadata)
ISO/IEC 11179 (Metadata Registries)

Slide 37: Conformance Testing Ensures Interoperability
DITT: conformance testing for approved standards
Note: conformance testing involves:
(1) interpretation of standards
(2) development of test suites
(3) testing vendor products/services
(Diagram: related organizations include LTSC (P1484), ISO/IEC JTC1/SC36, GEM, PROMETEUS, and GESTALT.)

Slide 38: Possible Follow-On Scope of Conformance Testing
IEEE 1484.6 (Content Sequencing)
IEEE 1484.10 (Portable Content)
IEEE 1484.11 (Computer Managed Instruction)
IEEE 1484.13 (Simple Identifiers)
IEEE 1484.17 (Content Packaging)
IEEE 1484.14 (Semi-Structured Data)
IEEE 1484.18 (Platform/Browser/Media Profiles)

Slide 39: Conformance Testing of Institutional Extensions
Supporting "institutional" extensions:
– Learning objects metadata
– Institutional extensions
– Data repositories
– Data registries

Slide 40: Related Conformance Testing in Healthcare
ANSI HISB
X12
HL7
Patient identifiers
Patient records systems
Security

Slide 41: More Information
Contact:
– Roy Carroll, +1 913 684 5992, carrollr@leavenworth.army.mil
– Frank Farance, +1 212 486 4700, frank@farance.com
Upcoming events:
– Demonstration at the 2000-05 SC32/WG2 meeting
– Test suites available 2000Q2-Q3

