
1 Other Topics? IDV VVSG current draft Human Factors Core Requirements and Testing

2 IDV

3 Topics
IDV Review
IDV in the EAC VVSG
IDV Research Issues
Next Steps

4 Review: What is IDV?
IDV = Independent Dual Verification
Voting systems that create one or more records of ballot choices that are:
– Also verified by the voter (along with the voting system's electronic record)
– Once verified, cannot be changed by the voting system
– Useful in efficient comparisons with the voting system's electronic record
– Efficiently handled, resistant to damage, accessible
VVPAT is one example, using paper as the 2nd record

5 Why is IDV important?
2nd record essential for meaningful audits and recounts
2nd record essential because voting systems are computers, which can:
– Be very difficult to assess and test for accuracy
– Fail for a myriad of unknown reasons
– Be compromised or attacked

6 EAC VVSG and IDV
The VVPAT requirements in the EAC VVSG are an instantiation of IDV
IDV discussion contained in appendix D
Begins with core IDV definitions (characteristics)
Lists more specific definitions for:
– Split process
– Witness
– Cryptographic
– Op scan

7 IDV in the marketplace now
2+ witness systems available
4+ VVPAT systems (25 states now require a verified paper trail)
Some ballot marking/op scan systems are split-process
1+ cryptographic systems available
None of these systems meets all IDV definitions, but most fall largely within IDV

8 IDV Issues
Usability of multiple representations in comparisons and audits
Usability for both voters and election officials
Accessibility of multiple representations
Interoperability of record formats to facilitate 3rd-party audits and IDV add-ons (e.g., witness devices)

9 Core Requirements and Testing

10 Profiles
Profile: specialization of a standard for a particular context, with constraints and extensions that are specific to that context. (Glossary def. 2)
Formalize profiles for "categories"
Add profiles for supported voting variations and optional functions
Types of independent dual verification
Defined in conformance clause (Sec. 4.2)
Extensible (e.g., this state's profile)

11 Compliance points
Identified, testable requirements
Getting there:
– Extricate compound requirements from one another; make separate compliance points
– Clarify general requirements via sub-requirements that are profile- and/or activity-specific
– Refactor repeating, overlapping requirements

12 Implementation statement
Vendor identifies profiles to which the system is believed to conform
Applicable test cases are identified by profiles
Certification is to those profiles only

13 Issues
Sorting out will take time
Identification, referencing, and indexing of compliance points require a robust document production system
Versioning of the standard

14 Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…) (section 4.2.2)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

15 Definition
Profile:
(1) Subset of a standard for a particular constituency that identifies the features, options, parameters, and implementation requirements necessary for meeting a particular set of requirements. [ISO 8632 (Computer Graphics Metafile)]
(2) Specialization of a standard for a particular context, with constraints and extensions that are specific to that context.

16 Profiles in implementation statement
An implementation statement shall identify:
– Exactly 1 profile from group 1
– Exactly 1 profile from group 2
– All applicable profiles from group 3
Profile group 1: product classes, taxonomy 'A'
– Precinct count
– Central count
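As a rough sketch of the group rules above, an implementation statement's profile claims could be checked mechanically. The group contents below (beyond group 1's precinct/central count) and all function names are illustrative assumptions, not taken from the draft:

```python
# Sketch (not from the draft): validating an implementation statement's
# profile claims. Group 1 is from the slide; groups 2 and 3 are assumed.

GROUPS = {
    1: {"Precinct count", "Central count"},          # taxonomy 'A': exactly one
    2: {"DRE", "Marksense", "Punchcard"},            # assumed group 2: exactly one
    3: {"VVPAT", "Witness", "Ranked order voting"},  # optional: any subset
}

def validate_statement(claimed: set) -> list:
    """Return a list of rule violations (empty list = statement is valid)."""
    errors = []
    for group_id in (1, 2):
        hits = claimed & GROUPS[group_id]
        if len(hits) != 1:
            errors.append(f"group {group_id}: expected exactly 1 profile, got {len(hits)}")
    unknown = claimed - set().union(*GROUPS.values())
    if unknown:
        errors.append(f"unknown profiles: {sorted(unknown)}")
    return errors
```

For example, `validate_statement({"Precinct count", "DRE", "VVPAT"})` returns an empty list, while claiming both precinct and central count from group 1 is flagged.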

17 Structure of profiles
Profiles form a classification hierarchy
Profiles may subsume other profiles (e.g., paper-based subsumes marksense and punchcard, and "all systems" subsumes everything)
Profiles may be declared to be mutually exclusive, or not, as appropriate (e.g., paper-based and electronic overlap in the case of marksense)
Constraint: in all cases, that which conforms to a subprofile also conforms to the subsuming profile
Anything conforming to a profile of a standard conforms to that standard
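The subsumption constraint can be made concrete with a small sketch: conformance to a subprofile propagates transitively up the hierarchy. The hierarchy fragment below is a simplified assumption based on the examples on this slide:

```python
# Sketch of the subsumption constraint: claiming a subprofile implies
# conformance to every subsuming profile, transitively. The hierarchy
# fragment is assumed from the slide's examples (note marksense overlaps
# the paper-based and electronic branches).

SUBSUMES = {  # parent profile -> its direct subprofiles
    "all systems": {"paper-based", "electronic"},
    "paper-based": {"marksense", "punchcard"},
    "electronic":  {"DRE", "marksense"},
}

def subsuming_profiles(profile: str) -> set:
    """All profiles a system conforms to by claiming `profile` (transitive)."""
    result = {profile}
    for parent, children in SUBSUMES.items():
        if profile in children:
            result |= subsuming_profiles(parent)
    return result
```

A marksense claim thus implies conformance to paper-based, electronic, and "all systems" at once, which is exactly the overlap case the slide mentions.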

18 The hierarchy so far
Voting systems
– Precinct count / Central count (disjoint)
– DRE / Marksense / Punchcard
– Optional features: straight-party voting, ranked order voting, split precincts, and all the other optional features
– IDV: VVPAT / Witness / Split process / End-to-end (crypto) (disjoint)

19 Profiles in compliance points
Restricts applicability of a compliance point to a precisely specified subset of voting systems
Requirement 4.4.3.5-1.2: Systems conforming to the DRE profile shall maintain an accurate Cast Vote Record of each ballot cast.
Requirement 4.4.2-2.13.1: In systems conforming to the Provisional / challenged ballots and DRE profiles, the DRE shall provide the capability to categorize each provisional/challenged ballot.
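The mechanism can be sketched as a filter: a compliance point applies only when every profile it names is claimed. The two requirement IDs come from this slide; the data structure and function are illustrative assumptions:

```python
# Sketch: selecting applicable compliance points from claimed profiles.
# Requirement IDs are from the slide; the representation is assumed.

COMPLIANCE_POINTS = {
    "4.4.3.5-1.2":  {"DRE"},
    "4.4.2-2.13.1": {"Provisional / challenged ballots", "DRE"},
}

def applicable_points(claimed_profiles: set) -> set:
    """Compliance points whose profile conditions are all satisfied."""
    return {req for req, needed in COMPLIANCE_POINTS.items()
            if needed <= claimed_profiles}
```

A system claiming only the DRE profile picks up 4.4.3.5-1.2; adding the provisional/challenged ballots profile brings 4.4.2-2.13.1 into scope as well.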

20 Profiles in conformity assessment
Like compliance points, test cases and expert reviews are associated with specific profiles or combinations of profiles
Only those conformity assessment activities indicated for the profiles claimed in the implementation statement are applicable

21 Profiles and traceability
Using the profiles mechanism, jurisdictions may formally define their own specializations of the standard, provided that they do not conflict with the standard (that which conforms to a subprofile must conform to the subsuming profile)
Conformance to the jurisdiction's profile is well-defined because the conformance clause is included by reference, as are all of the applicable compliance points

22 Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions (sections 4.3.1.1 and 4.3.4.1.1)
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

23 What they are
Mostly, requirements on the form (not function) of source code
Some requirements affecting software integrity, implemented as defensive coding practices:
– Error checking
– Exception handling
– Prohibit practices that are known risk factors for latent software faults and unverifiable code
– Unresolved overlap with STS…

24 Motivation
Started in the 1990 VSS, expanded in 2002, expanded more in IEEE P1583 5.3.2b
TGDC Resolution #29-05, "Ensuring Correctness of Software Code" (part 2)
Enhance workmanship, security, integrity, testability, and maintainability of applications

25 2002 VSS / VVSG 1 status
Mixture of mandatory and optional
Vendors may substitute "published, reviewed, and industry-accepted coding conventions"
Incorporated conventions have suffered from rapid obsolescence and limited applicability
Some mandatory requirements had unintended consequences

26 Recommended changes
Expand coding conventions addressing software integrity:
– Start with IEEE requirements
– Make defensive coding requirements (error and range checking) more explicit (Sec. 4.3.1.1.2)
– Require structured exception handling
Clarify length limits (modules vs. callable units)
Delete the rest; require use of "published, credible" coding conventions

27 Credible ≈ industry-accepted Coding conventions shall be considered credible if at least two different organizations with no ties to the creator of the rules or to the vendor seeking qualification independently decided to adopt them and made active use of them at some time within the three years before qualification was first sought.

28 Issues (1)
Definition of "credible" is problematic
General: prescriptions for how to write code versus open-ended expert review
– Performance requirement "shall have high integrity" is too vague, not measurable
– 2002 VSS and P1583 5.3.2b include prescriptions
– STS favors open-ended expert review
– Compromise: conservative prescriptions followed by STS review

29 Issues (2)
Public comment, April 14: "The NASED Technical Committee has previously ruled that assembler code is permitted as long as the code meets all other requirements." TBD – need to get a copy of that ruling and determine whether it covers tabulation code, and if so, why.
C doesn't have structured exception handling
Delete integrity requirements, etc., if "published, credible" replacements are found

30 Non-issues
Assembly language in "hardware-related segments" and operating system software
Grandfathering of stable code — part of general grandfathering strategy
COTS or "slightly modified" COTS — part of COTS strategy, driven by security requirements, T.B.D.

31 Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

32 Conformity assessment overview
Reviews:
– Logic verification
– Open-ended security review
– Accessibility, usability reviews
– Design requirement verification
Tests:
– Test protocols
– Performance-based usability testing
– Environmental tests

33 Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification (sections 4.5.2, 5.1.2, 5.3, & 6.3.1)
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

34 What it is
Formal characterization of software behavior within a carefully restricted scope
Proof that this behavior conforms to specified assertions (i.e., votes are reported correctly in all cases)
Complements [falsification] testing

35 Motivation
TGDC Resolution #29-05, "Ensuring Correctness of Software Code"
Higher level of assurance than functional testing alone
Clarify objectives of source code review

36 How it works
Vendor specifies pre- and post-conditions for each callable unit
Vendor proves assertions regarding tabulation correctness
Testing authority reviews, checks the math, and issues findings:
– Pre- and post-conditions correctly characterize the software
– The assertions are satisfied
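The pre-/post-condition idea can be sketched as executable annotations on a callable unit, which gives the reviewer something concrete to check against the proof. The decorator, the sample `tally` unit, and its conditions are all illustrative assumptions, not the draft's notation:

```python
# Sketch of pre-/post-conditions on a callable unit. The decorator and
# example unit are assumptions; a real submission would carry proofs,
# not just runtime checks.

def contract(pre, post):
    """Attach a precondition and postcondition to a callable unit."""
    def wrap(fn):
        def checked(*args):
            assert pre(*args), f"precondition of {fn.__name__} violated"
            result = fn(*args)
            assert post(result, *args), f"postcondition of {fn.__name__} violated"
            return result
        return checked
    return wrap

@contract(pre=lambda votes: all(v >= 0 for v in votes),
          post=lambda total, votes: total == sum(votes) and total >= 0)
def tally(votes):
    total = 0
    for v in votes:
        total += v
    return total
```

The testing authority's two findings map directly onto the two annotations: do the conditions characterize `tally`, and does `tally` satisfy them in all cases (not merely on test inputs).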

37 Issues
Training required
Limited scope = limited assurance; unlimited scope = impracticable
Overlaps with security reviews

38 Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols (section 6.4)
4. Casting, counting, and reporting requirements
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

39 What it is
General test template
General pass criteria
Collection of testing scenarios with implementation-specific behavior parameterized or abstracted out
Functional tests, typical case tests, capacity tests, error case tests
Performance-based usability tests might be integrated, T.B.D. (template may differ)

40 Motivation
TGDC Resolution #25-05, item 4 (test methods)
TGDC Resolution #26-05, "Uniform Testing Methods and Procedures"
Improved reproducibility

41 Changes from current
Augments implementation-dependent, white-box structural and functional testing
Raises the baseline level of testing that all systems must pass
Error rate and MTBF estimated from statistics collected across the run of the entire test suite
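The last point can be illustrated with the standard point estimates: divide observed errors by ballot positions processed, and operating time by failures, using counts accumulated over the whole suite run. The formulas are the usual ones; the function and field names are assumptions:

```python
# Sketch of estimating error rate and MTBF from whole-suite statistics.
# Standard point estimates; names and the zero-failure convention assumed.

def error_rate(errors: int, ballot_positions: int) -> float:
    """Observed errors per ballot position across the full test-suite run."""
    if ballot_positions <= 0:
        raise ValueError("no ballot positions processed")
    return errors / ballot_positions

def mtbf_hours(operating_hours: float, failures: int) -> float:
    """Point estimate of mean time between failures over the run."""
    if failures == 0:
        return float("inf")  # no failures observed; the data gives no upper bound
    return operating_hours / failures
```

Pooling counts across the entire suite, rather than per test, is what gives the estimate enough volume to be meaningful at the low error rates the standard targets.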

42 Issues
Implementation-dependent white-box testing: poor reproducibility, but hardly redundant
Testing combinations of features requires an extensive test suite
Typical case tests – need input on what is typical

43 Today’s agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements (sections 4.4.2 and 4.4.3)
5. Process model
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

44 Origins
Mostly derived from the 2002 VSS
Refactored requirements to clarify and reduce redundancy
Separated election administration concerns (best practices, Sec. 4.6)
Added precision

45 Motivation TGDC Resolution #25-05, “Precise and Testable Requirements”

46 Significant changes
In casting and counting, specified functional requirements for the many voting variations (optional profiles)
In reporting, significantly revised requirements on the content of reports
Accuracy of reported totals now defined by logic model

47 Reporting issues
Cast, read, and counted: three concepts, not two
Reporting levels: tabulator, precinct, election district, and jurisdiction (state)
– 2002 VSS unclear on what levels are required
– Generic facility to define arbitrary reporting contexts is not required
Manual / optional processing of ballots with write-in votes: outside the system, unverifiable. Can these possibly conform to the write-ins profile?
Define "unofficial"

48 Agenda
1. Standards architecture (profiles, compliance points, formalized implementation statements…)
2. Software integrity and coding conventions
3. Methods for conformity assessment
– Logic verification
– Test protocols
4. Casting, counting, and reporting requirements
5. Process model (section 4.5.1)
6. Performance and workmanship requirements
7. Issue paper
8. Research papers on VVSG maintenance and information sharing
9. Future work

49 What it is
A formal model of the elections process in both graphical and textual representations
An identification of the dependencies among activities and objects
Informative, not normative

50 Motivation
TGDC Resolution #33-05, "Glossary and Voting Model"
Complements glossary to further clarify vocabulary
Helps to specify intended scope of requirements

51 Origins
Review of previous work
Input from CRT subcommittee
Reconciliation of conflicting vocabulary and models
Elaboration as needed

52 Language
Graphical representation is the activity diagram as defined in Unified Modeling Language version 2.0
Textual representation is based on Petri Net Linear Form

53 Diagram: Register voters
Input: [Registration database [original]]
Output: [Voter lists]

[Registration database [original]] ->
{ ->(Register new voters)->,
  ->(Update voter information)->,
  ->(Purge ineligible, inactive, or dead voters)-> };
->[Registration database [updated]]
->(Generate voter lists)
->[Voter lists].
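One way to see what the textual form buys is to treat it as data: each activity consumes and produces named objects, so dependencies can be traced mechanically. The representation below is an assumption for illustration, not the draft's notation:

```python
# Sketch: the "Register voters" process as (activity, inputs, outputs)
# tuples, letting us trace which activities an object depends on.
# The tuple representation is assumed, not Petri Net Linear Form itself.

ACTIVITIES = [
    ("Register new voters",
     ["Registration database [original]"], ["Registration database [updated]"]),
    ("Update voter information",
     ["Registration database [original]"], ["Registration database [updated]"]),
    ("Purge ineligible, inactive, or dead voters",
     ["Registration database [original]"], ["Registration database [updated]"]),
    ("Generate voter lists",
     ["Registration database [updated]"], ["Voter lists"]),
]

def producers_of(obj: str) -> list:
    """Activities whose outputs include the named object (its direct producers)."""
    return [name for name, _inputs, outputs in ACTIVITIES if obj in outputs]
```

This is the "identification of dependencies among activities and objects" from slide 49: the voter lists depend on list generation, which in turn depends on the three database-updating activities.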

54 Issues
Vocabulary is still evolving
Will probably never match any state's processes perfectly

55 Accessibility, Usability and Privacy Requirements

56 Accessibility and usability are qualities of a voting system
Accessibility refers to the degree to which a system is available to, and usable by, individuals with disabilities. HAVA also includes alternative language access for voters with limited English proficiency, as required by the Voting Rights Act.
Usability means that voters can cast valid votes as they intended, quickly, without errors, and with confidence that their ballot choices were recorded correctly. It also concerns the setup, operation, and maintenance of voting equipment by poll workers and election officials.

57 Voting systems must be usable and accessible for everyone
The VVSG usability and accessibility section focuses on the voting process and on voters. Future work should focus on other users:
– Election officials
– Poll workers

58 Language in HAVA guided our work
The system must be "accessible for individuals with disabilities, including non-visual accessibility for the blind and visually impaired, in a manner that provides the same opportunity for access and participation (including privacy and independence) as for other voters." -- 301(a)(3)(A)
At least one voting system "equipped for individuals with disabilities" must be used at each polling place for federal elections held on or after January 1, 2006. -- 301(a)(3)(B)
And "provide alternative language accessibility as already required by section 203 of the Voting Rights Act." -- 301(a)(4)

59 Resolutions
Human factors and privacy rely on both having well-designed systems and the effective deployment of those systems in the polling place (#3-05)
Human abilities exist on a wide spectrum. Strong universal usability requirements make all voting systems not only more usable, but accessible to more people. (#6-05)
Ballot design, instructions, and error messages are a critical part of the voting experience. They are of particular importance for people with cognitive disabilities. (#8-05)
Setting performance, rather than design, standards will encourage innovation to address the complex, interlocking requirements for accessibility, functionality, security, and trust. (#5-05)

60 Our Practical Approach
Accessibility requirements were a top priority under HAVA (#2-05)
Other human factors and privacy requirements cover aspects of accurately capturing the indication of a voter's choice (#4-05)
All requirements involving human interaction must ensure that basic usability, accessibility, and privacy are maintained (#9-05)
The standards themselves must be usable. Voting system standards should be written in plain language understandable both by test experts and by voting officials who are not experts in human factors or design (#10-05)
Voting machines must be available to validate conformance tests and establish performance benchmarks (#11-05)

61 Usability should be part of all design stages
During design and development: evaluate the product's usability throughout the development process
Summative testing as part of qualification: evaluate the finished product against usability requirements to measure its success against human performance
For each election: ensure that the ballot design and use of the system in the polling place continue to meet requirements

62 Even the best standards have limitations
A standard should ensure a base level of usability, accessibility, and privacy. Good design and election procedures support and extend the standard's requirements.

63 VVSG 2.2.7 requirements maintain or upgrade VSS 2002
Comprehensive accessibility requirements and recommendations that point the way to future requirements
First usability requirements for voting systems, upgraded from an informative appendix
New privacy requirements focused on the voter–equipment interface
Other elements:
– Recommendation that vendors present a report of summative usability tests for both general and accessible voting systems
– Work to clarify ambiguous requirements and fill gaps
– Reflect what is readily achievable with current technology
– Some human factors requirements in the section on VVPAT

64 Research is currently under way at NIST
Usability performance benchmarks
Plain language guidance for ballots, instructions, error messages
Guidance for ballot design
Usability of standards
(Image courtesy Design for Democracy)

65 VVSG Section 2.2.7
1. Accessibility
  1.1 General accessibility
  1.2 Visual
  1.3 Dexterity
  1.4 Mobility
  1.5 Hearing
  1.6 Speech
  1.7 Cognitive
2. Alternate languages
3. Usability
  3.1 Usability testing
  3.2 Functional
  3.3 Cognitive
  3.4 Perceptual
  3.5 Interaction
4. Privacy
  4.1 Voting station configuration
  4.2 Anonymity for alternate formats

66 VVSG current draft

67 Human Factors – VVSG
4 areas:
– Accessibility
– Usability
– Limited English Proficiency
– Privacy
Based on current state of the art

68 Human Factors – Setting the stage for VVSG 2007
Require more advanced accessibility, but still within the industry state of the art:
– Synchronized audio and video
Performance measures for usability

69 VVPAT
VVPAT is not mandatory. These requirements are for states that choose to use VVPAT.
Requirements address:
– Operations
– Auditability
– Privacy

70 Independent Dual Verification – Setting the stage for VVSG 2007
Multiple records of ballots for verification
Several types currently being studied:
– Split process
– Witness
– End-to-end (cryptographic)
– VVPAT

71 Wireless
Control & identify usage
Encrypt and authenticate
Function without wireless
Protect against wireless-based attacks

72 Software Distribution & Setup Validation
Help ensure the correct version of voting software is used
Help ensure voting software is set up correctly
Use the National Software Reference Library: www.nsrl.nist.gov

