Digital Information Technology Testbed
Conformity Assessment Activities
US Army, Center for Army Lessons Learned (CALL), Ft. Leavenworth, KS
Presentation by Frank Farance, Farance Inc.
The Standards Process

- Accreditation affords a consistent process
- Committees don't reinvent the wheel
- Choosing a "process" can itself take years
- An accredited process is well-tested and "off the shelf"

[Figure: standards life cycle: Development → Consensus Building → Maintenance → Review]
Goals of Standards Process

- Standards that are well-defined:
  – Consistent implementations, high interoperability
  – Coherent functionality
- Commercial viability:
  – Standards allow a range of implementations
  – Commercial products are possible: all conforming products interoperate, yet offer different "bells and whistles"
  – Consumers can choose from a range of conforming systems
  – Wide acceptance and adoption
- Few bugs:
  – Consistent requests for interpretation (RFIs)
  – Low number of defect reports (DRs)
Standards Are Developed in Working Groups

Developing standards:
- Source: "from scratch" or "base documents"
- Create "standards wording" (normative and informative) and rationale for decisions
- Technical committee: in-person or electronic collaboration

[Figure: life cycle, Development phase highlighted]
Development: Technical Specification [1/4]

- Assertions:
  – Sentences using "shall"
  – Weaker assertions: "should", "may"
- Inquiries/Ranges (see the sketch below):
  – Implementations have varying limitations
  – Interoperability: tolerances vs. minimums
  – Allows implementation-defined and unspecified behavior
- Negotiations:
  – Adaptation to conformance levels
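To make assertions and inquiries/ranges concrete, here is a minimal C sketch: the "shall" guarantees a minimum limit, while the actual value is implementation-defined anywhere at or above it. The limits shown are the C standard's own documented minimums; framing the check as an "inquiry" test is this sketch's assumption.

    /* Assertion ("shall"): every conforming implementation shall
     * provide at least these ranges.  Inquiry: the actual values are
     * implementation-defined -- portable programs rely only on the
     * minimums; interoperable ones may probe the actual values. */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        printf("INT_MAX  = %d  (standard minimum: 32767)\n", INT_MAX);
        printf("LONG_MAX = %ld (standard minimum: 2147483647)\n", LONG_MAX);
        return (INT_MAX >= 32767 && LONG_MAX >= 2147483647L) ? 0 : 1;
    }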
Development: Technical Specification [2/4]

- Conformance:
  – Complies with all assertions
  – Performs within the ranges allowed by inquiries/ranges and negotiations
  – Minimizes implementation-defined, unspecified, and undefined behavior
Development: Technical Specification [3/4]

- Applications and standards use:
  – Strictly conforming: uses only features that exist in all conforming implementations
  – Conforming: uses features found in some conforming implementations
  – Conformance levels vs. interoperability
Development: Technical Specification [4/4]

- Rationale:
  – Explanation of important committee decisions
  – Features considered and rejected
  – Allows for change in membership
  – Helps with requests for interpretation
Development: Engineering Process [1/5]

- Identify scope
- Use an engineering methodology
- Prefer existing commercial practice
- Don't standardize implementations; standardize specifications
Development: Engineering Process [2/5]

- Can new technology be incorporated?
- Technology horizon:
  – Which technology is incorporated? Technology that is feasible/commercial up to 1-2 years from now
  – How long should the standard be useful? At least several years from now
Development: Engineering Process [3/5]

- Risk management: scope, prior art, schedule
- Determining requirements
- Asking the right questions (1-3 years?)
- Document organization and phasing
- Base documents: starting points
- Proposals and papers: additions
Development: Engineering Process [4/5]

Developing standards wording (a sketch of Steps #2-#5 for one feature follows):
- Step #0: Requirements identification (optional)
- Step #1: Functionality: what it does
- Step #2: Conceptual model: how it works
- Step #3: Semantics: precise description of features
- Step #4: Bindings: transform into a target, e.g., application programming interfaces (APIs), syntax, commands, protocol definitions, file formats, codings
- Step #5: Encodings: bit/octet formats (optional)
- Step #6: Standards words: "legal" wording
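As a hedged illustration of Steps #2 through #5, the sketch below carries one hypothetical feature of a name/value store from conceptual model through semantics, a C API binding, and an octet encoding. The PutValue name echoes the semantics example later in this deck; the signature and wire layout are invented for illustration.

    /* Step #2, conceptual model: a registry maps unique names to values.
     * Step #3, semantics:  PutValue(name, value) shall associate `value`
     *          with `name`; if `name` is already bound, the old value is
     *          replaced in place and the number of bindings is unchanged.
     * Step #4, binding (a C API): */
    typedef struct Registry Registry;   /* opaque handle */
    int PutValue(Registry *r, const char *name,
                 const void *value, unsigned len);
    /* returns 0 on success, nonzero on failure */

    /* Step #5, encoding (hypothetical wire format, big-endian lengths):
     *   +----------+-------------+----------+--------------+
     *   | nameLen  | name octets | valLen   | value octets |
     *   | 2 octets | nameLen     | 4 octets | valLen       |
     *   +----------+-------------+----------+--------------+
     * Other bindings (SQL, Java, a protocol PDU) would all realize the
     * same Step #3 semantics. */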
Development: Engineering Process [5/5]

- Let bake -- probably 1-2 years:
  – Baking is very important
  – Shakes out subtle bugs
  – Greatly improves quality!
  – Usually, vendors fight "baking"... they want to announce "conforming" products ASAP
Formal and Informal Consensus Building Among Standards and Specification Development Organizations

Consensus-building steps:
- Collaboration, harmonization with other organizations
- Public reviews as soon as possible
- Public comments
- Resolution of comments
- Approval stages (refinement):
  – Working Draft
  – Committee Draft
  – Draft Standard
  – Approved Standard

[Figure: life cycle: Development → Consensus Building]
Relationship to Semantics: Detailed Meaning of Operations/Transactions

- Assertions: sentences using shall, should, or may
- Inquiries: range of values
- Negotiations: heuristics
- Examples (see the sketch below):
  – Array: list of elements of the same type in a regular structure
  – PutValue: replaces the old object, in place

[Figure: layered stack: Requirements → Functionality → Conceptual Model → Semantics → Bindings (APIs, Codings, Protocols) → Encodings (Data Formats, Calling Conventions, Various Communication Layers)]
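A toy model of the PutValue semantics, assuming a fixed-capacity registry and this API shape (both are illustrative, not from any standard): replacing an existing binding happens in place, so the number of bindings does not grow.

    #include <stdio.h>
    #include <string.h>

    #define CAP 8
    static struct { const char *name; int value; } reg[CAP];
    static int nbindings = 0;

    /* PutValue: bind name -> value; an existing binding is replaced
     * in place, so the number of bindings is unchanged. */
    int PutValue(const char *name, int value)
    {
        for (int i = 0; i < nbindings; i++) {
            if (strcmp(reg[i].name, name) == 0) {
                reg[i].value = value;   /* replace old object in place */
                return 0;
            }
        }
        if (nbindings == CAP)
            return -1;   /* inquiry/range: capacity is implementation-defined */
        reg[nbindings].name = name;
        reg[nbindings].value = value;
        nbindings++;
        return 0;
    }

    int main(void)
    {
        PutValue("x", 1);
        PutValue("x", 2);                        /* replaces, does not append */
        printf("bindings = %d\n", nbindings);    /* prints 1 */
        return 0;
    }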
Some Strategies for Standardizing Data Models

- Partition into "application areas"
- Build standards in several steps, for example:
  – Year 1: Create a minimal, widely adoptable standard
  – Year 3: Create an amendment that represents best and widely implemented practices
  – Year 5: Revise the standard, incorporating the improvements
- Support extension mechanisms (see the sketch below):
  – Permits user/vendor/institutional extensions
  – Widely implemented extensions become the basis for new standards amendments/revisions
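One possible shape for such an extension mechanism, sketched in C: the core record carries only standardized fields plus an open-ended list of name/value extensions. The field names are illustrative, loosely in the spirit of learning-object metadata, and not from any standard.

    /* Core record: standardized fields plus extension list. */
    typedef struct Extension {
        char *name;               /* e.g. "x-army-call:lesson-id" --
                                     a hypothetical institutional extension */
        char *value;
        struct Extension *next;
    } Extension;

    typedef struct MetadataRecord {
        char *title;              /* standardized core fields ... */
        char *identifier;
        Extension *extensions;    /* user/vendor/institutional extensions;
                                     widely implemented ones can feed the
                                     next amendment or revision */
    } MetadataRecord;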
Building Standards in Several Steps

[Figure: life cycle with feedback: the "Standard" moves through Development, Consensus Building, Maintenance, and Review; amendments take 2-3 years, revisions 4-5 years; user/vendor/institutional "extensions" that prove market-relevant and widely adopted become input to the next revision of the standard]
Strictly Conforming vs. Conforming Implementations

- Strictly conforming implementation: maximum interoperability, minimum functionality
- Conforming implementations: less interoperability, more functionality
- Many conforming implementations are possible; interoperability may vary
- Extensions are a conformance issue

A sketch of the trade-off appears below.
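The trade-off is easiest to see in C standard terms, where this vocabulary originates; the code below is illustrative. A strictly conforming program relies only on guaranteed behavior and runs identically on every conforming implementation; a merely conforming program may lean on implementation-defined behavior, gaining functionality at the cost of interoperability.

    /* Strictly conforming: relies only on behavior the standard
     * guarantees -- maximum interoperability, minimum functionality. */
    #include <stdio.h>
    int main(void)
    {
        printf("portable everywhere\n");
        return 0;
    }

    /* Merely conforming: still legal, but depends on implementation-
     * defined behavior (here, that `int` holds values above 32767),
     * so results may vary across conforming implementations -- more
     * functionality, less interoperability:
     *
     *     int big = 100000;      // beyond the guaranteed int range
     *     printf("%d\n", big);
     */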
Semantics: Sources and Influences

"Data model semantics are based on the conceptual model, protocols, programming languages, and related standards/specifications."

[Figure: influences on semantics: language-related bindings (C, C++, SQL, LISP, Java, Javascript, POSIX, Tcl, APL); protocols (FTP, HTTP, MIME, RFC 822, CORBA, W3C P3P); related standards groups (ISO-IEC/JTC1/SC22/WG11, ISO-IEC/JTC1/SC32/WG2, NCITS L8, NCITS T2); all feeding the conceptual model, the semantics, and the API/coding/protocol bindings]
Life Cycle: Maintenance Phase

Maintenance of standards:
- Requests for Interpretation (RFIs)
- Defect Reports (DRs) and Records of Responses (RRs)
- Amendments (AMs) and Technical Corrigenda (TCs)

[Figure: life cycle: Development → Consensus Building → Maintenance]
Maintenance [1/3]

- Requests for interpretation (RFIs):
  – Questions on ambiguous wording
  – Response time: 3-18 months
  – Interpret questions based on the actual standards wording, not what is desired (like the U.S. Supreme Court)
  – Consistent responses are highly desirable
Maintenance [2/3]

- Defect reports:
  – Bug reports
  – Fix features, as necessary
  – Many defect reports → weak standard
  – Weak standard → little acceptance/use
  – Little acceptance/use → failure
  – Failure: only recognized years later
Maintenance [3/3]

- Other deliverables:
  – Records of responses (to RFIs)
  – Technical corrigenda (for defect reports)
  – Amendments
- Next revision:
  – Reaffirm
  – Revise
  – Withdraw
Success Attributes [1/2]

- Participants: expect to be involved 2-5 years
- Schedule: don't get rushed, don't get late
- New technology: be conservative
- Scope: stick to it!
Success Attributes [2/2]

- Conformance: need to measure it; should have a working definition ASAP
- Target audience: commercial systems and users
- Quality: fix bugs now
- Process: have faith in the standards process; it works!
Failure Attributes [1/2]

- Incorporate new/untried technology: why waste committee time?
- Ignore commercial interests: who will implement the standard?
- Ignore public comments: who will buy standardized products?
- Creeping featurism: the schedule killer!
Failure Attributes [2/2]

- Poor time estimates: progress is made over quarters, not weeks
- Leaving bugs for later: expensive to fix later, as with software
- Weak tests of conformance: standard-conforming, but lacking interoperability
- Too much implementation-defined behavior
End of Standards Life Cycle

Review cycle: revise, reaffirm, or withdraw
- Revise: new work item, incorporate new technology
- Reaffirm: no changes, stable technology
- Withdraw: little use, obsolete technology
- Typically a 5-year cycle

[Figure: life cycle: Development → Consensus Building → Maintenance → Review]
Typical Timeline of Standards

Summary of the standards process:
- Development Phase: months
- Consensus Phase: 9-24 months
- Maintenance Phase: 3-6 years
- Review Cycle: revise, reaffirm, or withdraw, every 5 years
- Typical time from committee formation to approved standard: months
- Realistic schedule ==> good results

[Figure: life cycle: Development → Consensus Building → Maintenance → Review]
Conformance Testing

- Testing implementations
- Relationship to standards/specifications
- Relationship to standard/specification development organizations
- Relationship to testing organizations
Testing Implementations — A Generic Approach

[Figure: test jig: a test driver exercises the implementation under test and produces test results reports]

A minimal code sketch of this arrangement appears below.
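A minimal sketch of the generic approach, with an assumed entry point standing in for a vendor product: the test driver walks a table of identified test cases against the implementation under test (IUT) and emits a results report. All names here are hypothetical.

    #include <stdio.h>

    /* Stand-in for the IUT; in real use this symbol comes from the
     * vendor's product under test. */
    static int iut_put_value(const char *name, int value)
    {
        (void)name; (void)value;
        return 0;
    }

    struct TestCase {
        const char *id;       /* traceable test identifier */
        int (*run)(void);     /* returns 1 = pass, 0 = fail */
    };

    static int test_put_returns_zero(void)
    {
        return iut_put_value("x", 1) == 0;
    }

    static const struct TestCase cases[] = {
        { "PUT-001", test_put_returns_zero },
    };

    int main(void)   /* the test driver */
    {
        int pass = 0, total = 0;
        for (unsigned i = 0; i < sizeof cases / sizeof cases[0]; i++) {
            int ok = cases[i].run();
            printf("%s: %s\n", cases[i].id, ok ? "PASS" : "FAIL");
            pass += ok;
            total++;
        }
        printf("report: %d/%d passed\n", pass, total);   /* results report */
        return pass == total ? 0 : 1;
    }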
Assertion, Inquiry, Negotiation, and Testing (AINT): Documents That Transform Standards Words to Specific Tests

"AINT documents are critical for uniform and thorough testing."

A sketch of an assertion-to-test mapping follows.

[Figure: flow: Standards (specifications) → AINT Document → Test Developer → Test Suite, with input from users, vendors, and other SSDOs (public); phases: Development → Consensus → Interpretation → Test Creation]
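One way an AINT document might drive test creation, sketched below: each "shall" sentence in the standard gets an assertion identifier, and each test case in the suite is keyed to the assertion it exercises. The assertion text and identifiers are invented for illustration; PUT-001 refers to the harness sketch above.

    /* AINT mapping: every "shall" has at least one test, and every
     * test traces back to a specific assertion -- this is what makes
     * the testing uniform and thorough. */
    struct AintEntry {
        const char *assertion_id;   /* e.g. "A.4.2.1" in the AINT document */
        const char *shall_text;     /* the "shall" sentence being tested */
        const char *test_id;        /* test case covering the assertion */
    };

    static const struct AintEntry aint_map[] = {
        { "A.4.2.1",
          "PutValue shall replace an existing binding in place.",
          "PUT-001" },
        { "A.4.2.2",
          "PutValue shall return zero on success.",
          "PUT-002" },
    };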
The DITT Testing Process: Advice, Certification, Testing, and Branding

[Figure: DITT testing process: the SSDO provides branding authorization and the test suite basis; NIST provides lab accreditation; IEEE-ISTO provides procedures administration; DITT acts as test developer and test lab, brands the test suite, and licenses it to vendors; vendors submit products and receive results; an advisory board provides dispute resolution]
Various Stakeholders' Perspectives

- User: some person or organization that cares about interoperability
- Vendor: develops products/services that conform to standards/specifications
- Small developers: develop products, services, and content; have small testing resources
- Institutions: strong need to reduce business risk; need interoperability
Testing vs. Certification

- Users/vendors may "self-test" via publicly available test suites
- Certification implies:
  – Controlled testing
  – A certified lab
  – Issuing a "certificate" or "branding"
Initial Technical Scope of DITT Conformance Testing

- IEEE (PAPI Learner)
- IEEE (Learning Objects Metadata)
- ISO/IEC (Metadata Registries)
Conformance Testing: Ensures Interoperability

DITT: conformance testing for approved standards

Note: conformance testing involves:
(1) interpretation of standards
(2) development of test suites
(3) testing vendor products/services

[Figure: related organizations feeding DITT conformance testing: LTSC (P1484), ISO/IEC JTC1/SC36, GEM, PROMETEUS, GESTALT]
Possible Follow-On Scope of Conformance Testing

- IEEE (Content Sequencing)
- IEEE (Portable Content)
- IEEE (Computer Managed Instruction)
- IEEE (Simple Identifiers)
- IEEE (Content Packaging)
- IEEE (Semi-Structured Data)
- IEEE (Platform/Browser/Media Profiles)
Conformance Testing of Institutional Extensions

Supporting "institutional" extensions:
- Learning objects metadata
- Institutional extensions
- Data repositories
- Data registries
Related Conformance Testing in Healthcare

- ANSI HISB
- X12
- HL7
- Patient identifiers
- Patient records systems
- Security
More Information

Contacts:
- Roy Carroll
- Frank Farance

Upcoming events:
- Demonstration at SC32/WG2 meeting
- Test suites available 2000Q2-3