Federal vs. State

• The Florida 2000 election started the move towards eVote systems in the US
• Old-fashioned manual punch-card systems (Votomatic)
• Often used in low-income counties that had no money to buy new equipment
• "Hanging chads": holes not fully punched through
• Confusing paper ballot design
• → Uncertainty about voter intentions

• National Association of State Election Directors (NASED) program, in effect since 1994
• No federal funding
• Voting systems tested by "Independent Testing Authorities" (ITAs) against the 1990 Federal Election Commission Voting System Standards (VSS)
• Slightly updated in 2002 (before HAVA passed)
• NASED reviews the ITA report and certifies a system as "meeting federal standards"
• Conflict of interest: ITAs are commercial companies; vendors select the ITAs and pay them directly
• ITAs have no interest in writing negative reports
• Almost all systems used in US elections were NASED/ITA certified, yet the certification failed to prevent disasters like Florida 2000 or to find the errors later uncovered by the CA TTBR (see below)

• Passed in October 2002
• Objective:
◦ Modernize US election technology to avoid situations like Florida 2000 in the future, through
◦ Creation of the federal Election Assistance Commission (EAC), which would
◦ Establish uniform election system standards and create a new, more efficient federal certification system
• And: US$3.9 billion in federal funding for states to buy new technology, guided by the EAC

• HAVA required the EAC to develop new voting system standards by January 1, 2004
• These standards were meant to help states select technology to upgrade their election systems (using the federal funding) by January 1, 2006
• BUT: appointment of the EAC commissioners was delayed by almost 10 months
• BUT: only US$2 million of the US$30 million planned for the EAC's 2003 testing and R&D budget was provided
• → No guidelines in 2003

• In 2004, of the US$50 million budgeted for testing, research, and development of standards, only US$1.2 million was paid out
• → No standards / certification in 2004
• BUT: in 2004, US$1.3 billion was paid out to states to buy new technology
• The US Dept. of Justice insisted that states have new equipment ready by January 1, 2006
• → Huge new, unregulated market for voting equipment makers

• Equipment makers rushed to market
• Immature products: focus on features, not code design
• Insecure software
• Counties bought whatever looked good
• No in-house IT expertise to evaluate
• No EAC guidance on what's good and what isn't
• → Thousands of small and not-so-small disasters caused by faulty voting systems

• The Voluntary Voting System Guidelines (VVSG) were published only on December 13, 2005 (developed by NIST, approved by the EAC)
• Went into effect only in 2007
• To bridge the gap, in June 2006 the EAC essentially took over the NASED/ITA program, with all its flaws
• The EAC's own testing and certification program started only in January 2007

• Similar system to NASED's (ITAs are now "Voting System Test Laboratories", VSTLs)
• Testing against the VVSG 2005
• BUT: similar conflict of interest (vendors select and pay the VSTLs directly)
• Still voluntary: states may require EAC certification, but don't have to
• Better: a "Quality Monitoring Program" reviews systems after certification and may de-certify them for vendor misinformation, use of non-certified versions in the field, unauthorized changes, malfunctions and bugs in the field, etc.
• The updated VVSG II is still not finished; the EAC tests against the 2005 standards

• The VVSG 2005 are fairly comprehensive, but the EAC's testing methods to verify them are not sufficient
• EAC testing is "friendly" testing: test cases are defined based on the functions the equipment is supposed to have
◦ "Does it do what it says it does?"
◦ Predictable; does not anticipate unusual situations or creative attacks
• Adversarial testing: assemble a group of smart people and say, "Let's see if we can break this!"
• State certification programs took this route: the California TTBR, Ohio EVEREST, Florida SAIT

• The California Top-to-Bottom Review (TTBR): introduced in 2007 by Secretary of State (SoS) Debra Bowen in response to weak federal certification
• All certified systems then in use in CA were reviewed under the new methodology
• Severe security flaws were found in all systems
• The SoS office decertified all systems for use in California (both scanners and DREs)
• Strict usage conditions imposed for re-certification:
◦ For Sequoia and Diebold: early voting only; on Election Day, only one machine per polling place (for disabled access)
◦ All results from these machines must be manually recounted (100%)
◦ Hart InterCivic may be used more freely
◦ ES&S didn't submit its software and was decertified outright
• All vendors must produce plans to "harden" their equipment against the security vulnerabilities found by the TTBR

• States had been rushed by the Dept. of Justice to buy machines by January 1, 2006, even without EAC guidance
• Now, in CA, millions of US$ worth of equipment (especially DREs) sat in storage and could not be used → wasted taxpayer dollars
• Counties had to revert to paper elections (e.g. Santa Clara County) or buy different, certified machines, spending extra money

• Penetration analysis / red-team attacks
◦ First without system knowledge, then with full system knowledge
• Source code / architectural review
• Hardware review
• Documentation review
• Accessibility review
• → Threat assessment; define use conditions to mitigate the security weaknesses found
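As a small, hedged illustration of one step in a source-code review, here is a minimal pattern scanner for hard-coded secrets. The patterns and sample source are made up for this sketch; a hard-coded DES key of exactly this kind was famously found in one vendor's code, but real review teams go far beyond pattern matching, tracing data flow, crypto usage, and the update path by hand.

```python
import re

# Hypothetical red-flag patterns a reviewer might start with.
SUSPICIOUS = [
    (re.compile(r'(?i)\b(?:des|aes)_?key\s*=\s*["\'][^"\']+["\']'),
     "hard-coded crypto key"),
    (re.compile(r'(?i)\bpassword\s*=\s*["\'][^"\']+["\']'),
     "hard-coded password"),
]

def scan_source(text: str):
    """Return (line_no, description, stripped_line) for each suspicious line."""
    findings = []
    for n, line in enumerate(text.splitlines(), 1):
        for pattern, desc in SUSPICIOUS:
            if pattern.search(line):
                findings.append((n, desc, line.strip()))
    return findings

# Invented sample "source code" to scan.
sample = 'int checked = 0;\nchar *des_key = "F2654hD4";\nconst char *password = "admin";\n'
for n, desc, line in scan_source(sample):
    print(f"line {n}: {desc}: {line}")
```

Automated scans like this only surface candidates; every finding in the published TTBR reports was confirmed and analyzed manually.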

• The vendor pays the SoS, not the test lab
• The SoS then selects the team that will audit → no conflict of interest
• Audit teams come from state universities (professors and grad students), not commercial companies
• The name and CV of each participating auditor is published online → academic reputation as guarantor of integrity
• The teams produce a report, and the SoS issues:
◦ certification,
◦ conditional certification (under use conditions), or
◦ rejection
• The complete team reports are available online, not just summaries

• The SoS must be informed of each system change
• The SoS decides:
◦ if the change is "minor", the certification "rolls over" to the new version
◦ otherwise, a full new certification is required
• Temptation for vendors not to declare system changes, to avoid the cost of re-certification
◦ Case of ES&S: in Nov 2007, the SoS sued ES&S for selling 972 AutoMARK Model A200 ballot-marking machines to several counties that contained hardware changes not authorized by the Secretary of State
◦ Settled in 2009 for a fine of US$3.25 million
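A common technical safeguard against undeclared changes is to publish a cryptographic hash of each certified "trusted build" so officials can detect any deviation before deployment. The sketch below uses invented version names and image contents, and is an illustration of the general technique rather than the actual California procedure.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of a software image; any single-byte change alters it."""
    return hashlib.sha256(data).hexdigest()

# The certifying authority records the hash of the exact certified build
# (image bytes and version name are made up for this example).
certified_image = b"tabulator firmware v4.2, certified build"
registry = {"tabulator-fw-4.2": sha256_hex(certified_image)}

def verify(version: str, installed_image: bytes) -> bool:
    """True only if the installed image is byte-identical to the certified one."""
    expected = registry.get(version)
    return expected is not None and sha256_hex(installed_image) == expected

print(verify("tabulator-fw-4.2", certified_image))                      # True
print(verify("tabulator-fw-4.2", b"tabulator firmware v4.2, patched"))  # False
```

Under such a scheme, even a "minor" unauthorized hardware or software change that alters the shipped image would fail verification, removing the vendor's incentive to skip disclosure.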

• Problem: the need for system upgrades often arises on short notice
• Not enough time to develop new software and pass it through the certification process in time for elections (this takes months)
• Because EAC certification is weak, states run their own certification systems, but this forces vendors to pay for separate certification in every state they want to sell in
• Prohibitively costly and time-consuming → market consolidation; only the strongest vendors survive

• One strong federal certification system (modeled on state best practice) should make state certification superfluous
• Cheaper for vendors, easier market entry

Thank you!