Security Criteria for Electronic Voting

Peter G. Neumann
Computer Science Laboratory
SRI International, Menlo Park CA 94025
1-650-859-2375 Neumann@csl.sri.com

[Copyright 1993, Peter G. Neumann. This paper was presented at the 16th National Computer Security Conference, Baltimore, Maryland, September 20-23, 1993.]

Abstract. Some basic criteria for confidentiality, integrity, availability, reliability, and assurance are considered for computer systems involved in electronic voting. An assessment of the realizability of those criteria leads to the conclusion that, operationally, many of the criteria are inherently unsatisfiable with any meaningful assurance.

BACKGROUND

The election processes of voter registration, vote casting, vote counting, and ballot generation are becoming increasingly automated [Sal93]. Numerous cases of allegedly accidental errors have been reported, along with suspicions of fraud [Dug88,Neu90]. However, the borderline between accident and fraud is murky. Serious security vulnerabilities are commonplace in most voting systems, providing widespread opportunities for computer-system misuse --- particularly by insiders [NeuPar89,Mer93]. Indeed, incentives for bribery, collusion, and fraud are likely to be enhanced by the financial stakes involved in winning or losing an election.

At present there is no generally accepted standard set of criteria that voting systems are required to satisfy. This paper proposes a generic set of criteria similar in concept to existing security criteria such as the U.S. TCSEC (the Orange Book, TNI, TDI, etc.), the European ITSEC, the Canadian CTCPEC, and the draft U.S. Federal Criteria. We observe that essentially all existing voting systems would fail to satisfy even the simplest of the existing criteria. Worse yet, each of these criteria is itself incomplete in that it fails to encompass many of the possible risks that must ultimately be addressed. Unfortunately, previous attempts to define criteria specifically for voting systems [Sal88, Sha93, FEC, NYC87] are also incomplete. However, the risks lie in the inherent unrealizability of the criteria as well as in the incompleteness of those criteria.

ELECTRONIC VOTING CRITERIA

Generic voting criteria are suggested here as follows:

* System integrity. The computer systems (in hardware and system software) must be tamperproof. Ideally, system changes must be prohibited throughout the active stages of the election process. That is, once certified, the code, initial parameters, and configuration information must remain static. No run-time self-modifying software can be permitted. End-to-end configuration control is essential. System bootload must be protected from subversion that could otherwise be used to implant Trojan horses. (Any ability to install a Trojan horse in the system must be considered as a potential for subverting an election.) Above all, vote counting must produce reproducibly correct results.

* Data integrity and reliability. All data involved in entering and tabulating votes must be tamperproof. Votes must be recorded correctly.

* Voter anonymity and data confidentiality. The voting counts must be protected from external reading during the voting process. The association between recorded votes and the identity of the voter must be completely unknown within the voting systems.

* Operator authentication. All people authorized to administer an election must gain access with nontrivial authentication mechanisms. Fixed passwords are generally not adequate. There must be no trapdoors --- for example, for maintenance and setup --- that could be used for operational subversions.

* System accountability. All internal operations must be monitored, without violating voter confidentiality. Monitoring must include votes recorded and votes tabulated, and all system programming and administrative operations such as pre- and post-election testing. All attempted and successful changes to configuration status (especially those in violation of the static system integrity requirement) must be noted. This capability is similar to that of an aircraft flight recorder, from which it is possible to recover all important information. Furthermore, monitoring must be nonbypassable --- it must be impossible to turn off or circumvent. Monitoring and analysis of audit trails must themselves be nontamperable. All operator authentication operations must be logged. ([Gre93] analyzes accountability further; a minimal sketch of one tamper-evident logging technique appears after this list.)

* System disclosability. The system software, hardware, microcode, any custom circuitry, and all documentation must be open for random inspection at any time, despite cries for secrecy from the system vendors.

* System availability. The system must be protected against both accidental and malicious denials of service, and must be available for use whenever it is expected to be operational.

* System reliability. System development (design, implementation, maintenance, etc.) should attempt to minimize the likelihood of accidental system bugs and malicious code.

* Interface usability. Systems must be amenable to easy use by local election officials, and must not necessitate the on-line control of external personnel (such as vendor-supplied operators). The interface to the system should be inherently fail-safe, fool-proof, and overly cautious in defending against accidental and intentional misuse.

* Documentation and assurance. The design, implementation, development practice, operational procedures, and testing procedures must all be unambiguously and consistently documented. Documentation must also describe what assurance measures have been applied to each of those system aspects.

Other lower-level criteria from the TCSEC are also applicable, such as trusted paths to the system, trusted facility management, trusted recovery, and trusted system distribution. All of the above criteria elements require technological measures and some administrative controls for fulfillment. The following item requires primarily nontechnological factors.

* Personnel integrity. People involved in developing, operating, and administering electronic voting systems must be of unquestioned integrity. For example, convicted felons and gambling entrepreneurs are suspect.
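
As an illustration of the accountability criterion, the following is a minimal sketch of a tamper-evident audit trail in which each record carries a cryptographic hash of its predecessor, so that any after-the-fact alteration or deletion breaks the chain. The record fields and event names are illustrative assumptions, not part of any fielded system; as argued under Realizability below, such a chain is only as trustworthy as the system that appends to it.

    import hashlib
    import json

    GENESIS = "0" * 64  # assumed starting link for an empty chain

    def append_event(log, event):
        """Append an event to the log, chaining it to the previous record's hash."""
        prev_hash = log[-1]["hash"] if log else GENESIS
        body = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((prev_hash + body).encode()).hexdigest()
        log.append({"event": event, "prev": prev_hash, "hash": digest})

    def verify_chain(log):
        """Recompute every link; report False if any record was altered or removed."""
        prev_hash = GENESIS
        for record in log:
            body = json.dumps(record["event"], sort_keys=True)
            expected = hashlib.sha256((prev_hash + body).encode()).hexdigest()
            if record["prev"] != prev_hash or record["hash"] != expected:
                return False
            prev_hash = record["hash"]
        return True

    log = []
    append_event(log, {"op": "vote_recorded", "precinct": 12})  # no voter identity logged
    append_event(log, {"op": "config_change_attempt", "result": "denied"})
    assert verify_chain(log)
    log[0]["event"]["precinct"] = 13   # tampering with a recorded event ...
    assert not verify_chain(log)       # ... breaks the chain and is detected

Writing such chained records to once-writable media addresses alteration after the fact; it does not defend against a subverted system that logs false events in the first place, which is why the monitoring itself must be nonbypassable.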

The above set of skeletal criteria is by no means complete. There are many other important attributes that election computing systems need to satisfy operationally. For example, Saltman [Sal88] notes that voting systems must conform with whatever election laws may be applicable, the systems must not be shared with other applications running concurrently, ballot images must be retained in case of challenges, pre- and post-election testing must take place, warning messages must occur during elections whenever appropriate, would-be voters must be properly authorized, handicapped voters must have equal access, it must be possible to conduct recounts manually, and adequate training procedures must exist.

REALIZABILITY

No criteria can completely encompass all of the possible risks. However, even if we ignore the incompleteness and imprecision of the suggested criteria, numerous intrinsic difficulties make such criteria unrealizable with any meaningful assurance.

System trustworthiness

* Security vulnerabilities are ubiquitous in existing computer systems, and also inevitable in all voting systems --- including both dedicated and operating-system-based applications. Vulnerabilities are particularly likely in voting systems developed inexpensively enough to find widespread use. Evidently, no small kernel can be identified that mediates security concerns, and thus potentially the entire system must be trustworthy.

* System operation is a serious source of vulnerabilities, with respect to integrity, availability, and in some cases confidentiality --- even if a system as delivered appears to be in an untampered form. A system can have its integrity compromised through malicious system operations --- for example, by the insertion of Trojan horses or trapdoors. The presence of a superuser mechanism presents many opportunities for subversion. Furthermore, Trojan horses and trapdoors are not necessarily static; they may appear only for brief instants of time, and remain totally invisible at other times. In addition, systems based on personal computers are subject to spoofing of the system bootload, which can result in the seemingly legitimate installation of totally bogus software. Even in the presence of cryptographic checksums, a gifted developer or subverter can install a flaw in the system implementation or in the system generation. Ken Thompson's Turing-Lecture stealthy Trojan horse technique [Tho84] illustrates that no modifications to source code are required. (A sketch of a simple checksum comparison, and of its limits, follows this list.)

* System integrity can be enhanced by the use of locally nonmodifiable read-only and once-writable memories, particularly for system programs and preset configuration data, respectively.

* Data confidentiality, integrity, and reliability can be subverted as a result of compromises of system integrity. Nonalterable (e.g., once-writable) media may provide some assistance for integrity, but not if the system itself is subvertible.

* Voter anonymity can be achieved by masking the identity of each voter so that no reverse association can be made. However, such an approach makes accountability much more difficult. One-way hashing functions or even public-key encryption may be useful for providing later verification that a particular vote was actually recorded as cast (a minimal sketch of this idea follows this list), but no completely satisfactory scheme exists for guaranteeing voter anonymity, consistency of the votes tabulated with respect to those cast, and correct results. Any attempt to maintain a bidirectional on-line association between voter and votes cast is suspect because of the inability to protect such information in this environment.

* Operator authentication must no longer rely on sharable fixed passwords, which are too easily compromised in a wide variety of ways. Some other type of authentication scheme is necessary, such as a biometric or token approach, although even those schemes themselves have recognized vulnerabilities.

* System accountability can be subverted by embedded system code that operates below the accounting layers, or by low-layer trapdoors. Techniques for permitting accountability despite voter anonymity must be developed, although they must be considered inherently suspect. Read-only media can help ensure nontamperability of the audit trail, but nonbypassability requires a trusted system for data collection. Accountability can be subverted by tampering with the underlying system, below the layer at which auditing takes place. (See also [Gre93].)

* System disclosability is important because proprietary voting systems are inherently suspect. However, system inspection is by itself inadequate to prevent stealthy Trojan horses, run-time system alterations, self-modifying code, data interpreted as code, other code or data subversions, and intentional or accidental discrepancies between documentation and code.
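
To make the checksum point concrete, here is a minimal sketch of verifying installed system software against a manifest of digests assumed to have been computed at certification time and stored off-line. The file names, digests, and installation path are hypothetical. Such a check detects accidental or crude modification of the listed files, but, as argued above, it is vacuous if the verifier itself, the manifest, or the bootload path has been subverted.

    import hashlib

    # Hypothetical manifest of certified components and their SHA-256 digests,
    # assumed to have been computed at certification time and kept off-line.
    CERTIFIED_MANIFEST = {
        "tabulate.bin": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
        "ballot.cfg":   "60303ae22b998861bce3b28f33eec1be758a213c86c93c076dbe9f558c11c752",
    }

    def sha256_of(path):
        """Return the SHA-256 hex digest of a file's contents."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_install(root="/opt/voting"):
        """Compare each certified file against its recorded digest; list mismatches."""
        return [name for name, expected in CERTIFIED_MANIFEST.items()
                if sha256_of(root + "/" + name) != expected]

    # An empty list from verify_install() means every listed file matches the manifest.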
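
Similarly, the one-way hashing idea mentioned under voter anonymity can be sketched as follows: the system publishes, for each ballot, a hash of the vote combined with a secret receipt value retained only by the voter, who can later confirm that the committed ballot appears among those tabulated, without the published list identifying anyone. This is an illustrative construction under assumed names, not a complete protocol; as noted above, no such scheme fully reconciles anonymity, consistency, and correctness.

    import hashlib
    import secrets

    def commit_vote(vote):
        """Return (receipt_nonce, commitment); only the commitment is published."""
        nonce = secrets.token_hex(16)  # retained privately by the voter
        digest = hashlib.sha256((nonce + ":" + vote).encode()).hexdigest()
        return nonce, digest

    def voter_verifies(published, nonce, vote):
        """Check that the voter's committed ballot appears in the published list."""
        return hashlib.sha256((nonce + ":" + vote).encode()).hexdigest() in published

    nonce, commitment = commit_vote("candidate-A")
    published_list = {commitment}  # posted by the election system after tabulation
    assert voter_verifies(published_list, nonce, "candidate-A")

Even this weak form of verification is double-edged: a receipt with which a voter can prove how a ballot was cast can also be demanded by a briber or coercer, one more reason such mechanisms must be considered inherently suspect.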

System Robustness

* System availability can be enhanced by various techniques for increasing hardware fault tolerance and system security. However, none of these techniques is guaranteed.

* System reliability is aided by properly used modern software-engineering techniques, which can result in fewer bugs and greater assurance. Analysis techniques such as thorough testing and high-assurance methods can contribute. Nevertheless, some bugs are likely to remain.

* Use of redundancy can in principle improve both reliability and security. It is tempting to believe that checks and balances can help satisfy some of the above criteria. However, we rapidly discover that the redundancy management itself introduces further complexity and further potential vulnerabilities. For example, triple-modular redundancy could be contemplated, providing three different systems and accepting the results if two out of three agree. However, a single program flaw (such as a Trojan horse) can compromise all three systems. Similarly, if three separately programmed systems are used, it is still possible for common-fault-mode mistakes to be made (there is substantial evidence for the likelihood of that occurring) or for collusion to compromise two of the three versions. Furthermore, the systems may agree with one another in the presence of bogus data that spoofs all of them. Thus, both reliability and security techniques must provide end-to-end protection, and must check on each other. (A minimal sketch of such two-out-of-three voting appears after this list.)

In general, Byzantine algorithms can be constructed that work adequately even in the presence of arbitrary component failures (for example, due to malice, accidental misuse, or hardware failure). However, such algorithms are expensive to design, implement, and administer, and introduce substantial new complexities. Even in the presence of algorithms that are tolerant of n failed components, collusion among n+1 components can subvert the system. Moreover, those algorithms may be implemented using systems that have single points of vulnerability, which could permit compromises of the Byzantine algorithm to occur without n failures having occurred; indeed, one failure may be enough. Thus, complex systems designed to tolerate certain arbitrary threats may still be subvertible by exploiting other vulnerabilities.

* Interface usability is a secondary consideration in many fielded systems. Complicated operator interfaces are inherently risky, because they induce accidents and can mask hidden functionality. However, systems that are particularly user-friendly may be even more amenable to subversion than those that are not.

* Correctness is a mythical beast. In reliable systems, a probability of failure of 10**(-4) or 10**(-9) per hour may be required. However, such measures are too weak for voting systems. For example, a one-bit error in memory might result in the loss or gain of 2**k votes (for example, 1024 or 65,536). Ideally, numerical errors attributable to hardware and software must not be tolerated, although a few errors in reading cards may be acceptable within narrow ranges. Efforts must be made to detect errors attributable to the hardware through fault-tolerance techniques or software consistency checks. Any detected but uncorrectable errors must be monitored, forcing a controlled rerun. However, a policy that permits any detected inconsistencies to invalidate election results would be very dangerous, because it might encourage denial-of-service attacks by the expected losers. Note also that any software-implemented fault-tolerance technique is itself a possible source of subversion.
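
To illustrate the magnitude of such a single-bit error: flipping bit k of a binary vote counter changes the stored total by exactly 2**k, so one memory fault can silently add or remove a large block of votes. A trivially small demonstration:

    def flip_bit(count, k):
        """Return the counter value after a single-bit memory error in bit position k."""
        return count ^ (1 << k)

    count = 40000                       # true tally
    corrupted = flip_bit(count, 16)     # one-bit error in bit position 16
    print(corrupted - count)            # 65536 votes gained (bit 16 was previously 0)

Parity or error-correcting memory, together with redundant software consistency checks on running totals, can detect such faults, consistent with the requirement above that detected but uncorrectable errors force a controlled rerun.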
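
The two-out-of-three voting contemplated earlier in this list can be sketched as follows, with three hypothetical, separately programmed tabulators as stand-ins. The sketch also makes the stated limits concrete: a common flaw or Trojan horse replicated in all three versions, collusion between any two, or bogus input data that spoofs all three defeats the majority check.

    from collections import Counter

    def majority(results):
        """Accept a result reported by at least two of three tabulators, else fail."""
        value, count = Counter(results).most_common(1)[0]
        if count >= 2:
            return value
        raise RuntimeError("no two tabulators agree; controlled rerun required")

    # Three hypothetical, separately programmed tabulations of the same ballots.
    def tabulator_a(ballots): return sum(b == "A" for b in ballots)
    def tabulator_b(ballots): return sum(1 for b in ballots if b == "A")
    def tabulator_c(ballots): return len([b for b in ballots if b == "A"])

    ballots = ["A", "B", "A", "A", "B"]
    results = [t(ballots) for t in (tabulator_a, tabulator_b, tabulator_c)]
    print(majority(results))   # prints 3; accepted because all three agree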

System Assurance

* High-assurance systems demand discipline and professional maturity not previously found in commercial voting systems (and, indeed, not found in most commercial operating systems and application software). High-assurance systems typically cost considerably more than conventional systems in the short term, but have the potential for payoff in the long term. Unless the development team is exceedingly gifted, high-assurance efforts may be disappointing. As a consequence, there are almost no incentives for any assurance greater than the minimal assurance provided by lowest-common-denominator systems. (See [Neu93] for a discussion of some of the implications of attaining high assurance.) Furthermore, even high-assurance systems can be compromised, via insertion of trapdoors and Trojan horses, and operational misuse.

CONCLUSIONS

The primary conclusion from the above discussion of realizability is that certain criteria elements are inherently unsatisfiable with assurance that can be attained at an acceptable cost. Systems could be designed that would be operationally less amenable to subversion. However, some of those would still have modes of compromise requiring no collusion at all. Indeed, the actions of a single person may be sufficient to subvert the process, particularly if preinstalled Trojan horses or operational subversion can be used. Thus, although it is possible to build better systems, even those better systems can be subverted. Consequently, there will always be questions about the use of computer systems in elections. In certain cases, sufficient collusion will remain plausible, even to one who is not a confirmed conspiracy theorist.

There is a serious danger that the mere existence of generally accepted criteria coupled with claims that a system adheres to those criteria might give the naive observer the illusion that an election is nonsubvertible. Doubts will always remain that some of the criteria have not been satisfied with any realistic measure of assurance and that the criteria are incomplete:

* Commercial systems tend toward the lowest common denominator, with numerous serious security flaws. Custom-designed systems may be even worse, especially if their code is proprietary.

* Trojan horses, trapdoors, interpreted data, and other subversions can be hidden, even in systems that have received extensive scrutiny. The integrity of the entire computer-aided election process may be compromisible internally.

* Operational misuses can subvert system security even in the presence of high-assurance checks and balances, highly observant poll watching, and honest system programmers. Registration of bogus voters, insertion of fraudulent absentee ballots, and tampering with punched cards seem to be ever-popular techniques in low-tech systems. In electronic voting systems, dirty tricks may be indistinguishable from accidental errors. The integrity of the entire computer-aided election process may be compromisible externally.

* The requirement for voter confidentiality and the requirement for nonsubvertible and sufficiently complete end-to-end monitoring are conceptually contradictory. It is essentially impossible to achieve both at the same time without resorting to complicated mechanisms, which themselves may introduce new potential vulnerabilities and opportunities for more sophisticated subversions. Monitoring is always potentially subvertible through low-layer Trojan horses. Furthermore, any technique that permitted identification and authentication of a voter if an election were challenged would undoubtedly lead to increased challenges and further losses of voter privacy.

* The absence of a physical record of each vote is a serious vulnerability in direct-recording election (DRE) systems; the presence of an easily tamperable physical record in paper-ballot and card-based systems is also a serious vulnerability.

* Problems exist with both centralized control and distributed control. Highly distributed systems have more components that may be subverted, and are more prone to accidental errors; they require much greater care in design. Highly centralized approaches in any one of the stages of the election process violate the principle of separation of duties, and may provide single points of vulnerability that can undermine separation enforced elsewhere in the implementation.

There is a fundamental dilemma to be addressed.

* On one hand, computer systems can be designed and implemented with extensive checks and balances intended to make accidental mishaps and fraud less likely. As an example pursuing that principle, New York City [NYC87] is attempting to separate the processes of voting, vote collection, and vote tallying from one another, with redundant checks on each, hoping to ensure that extensive collusion would be required to subvert an election, and that the risks of detection would be high; however, that effort permits centralized vote tallying, which has the potential for compromising the integrity of the earlier stages.

* On the other hand, constraints on system development efforts and expectations of honesty and altruism on the part of system developers seem to be generally unrealistic, while the expectations on the operational practice and human awareness required to administer such systems may be unrealistic.

We must avoid lowest-common-denominator systems, instead trying to approach the difficult goal of realistic, cost-effective, reasonable-assurance, fail-safe, and nontamperable election systems.

Vendor-embedded Trojan horses and accidental vulnerabilities will remain as potential problems, for both distributed and centralized systems. The principle of separation is useful, but must be used consistently and wisely. The use of good software engineering practice and extensive regulation of system development and operation are essential. In the best of worlds, even if voting systems were produced with high assurance by persons of the highest integrity, the operational practice could still be compromisible, with or without collusion. Vigilance throughout the election process is simply not enough to counter accidental and malicious efforts that subvert the process. Some residual risks are inevitable.

ACKNOWLEDGMENT

The author is grateful to Rebecca Mercuri for her incisive feedback during the preparation of this position paper, and to Mae Churchill for continual inspiration.

REFERENCES

[Dug88] R. Dugger. Annals of Democracy (Voting by Computer). New Yorker, November 7, 1988.

[FEC] Federal Election Commission guidelines (voluntary standards).

[Gre93] G.L. Greenhalgh. Security and Auditability of Electronic Vote Tabulation Systems: One Vendor's Perspective. Proc. 16th National Computer Security Conference, NIST/NCSC, Baltimore MD, September 1993.

[Mer93] R. Mercuri. Threats to Suffrage Security. Proc. 16th National Computer Security Conference, NIST/NCSC, Baltimore MD, September 1993.

[NeuPar89] P.G. Neumann and D.B. Parker. A Summary of Computer Misuse Techniques. Proc. 12th National Computer Security Conference, NIST/NCSC, Baltimore MD, pp. 396--407, October 1989.

[Neu90] P.G. Neumann. Risks in Computerized Elections (Inside Risks). Comm. ACM 33, 11, p. 170, November 1990.

[Neu93] P.G. Neumann. Myths of Dependable Computing: Shooting the Straw Herrings in Midstream. Proc. 8th Annual Conf. on Computer Assurance (COMPASS '93), June 1993.

[NYC87] Electronic Voting System. Request for Proposal, Appendix G, Security and Control Considerations. New York City Board of Elections, New York City Elections Project, September 1987.

[Sal88] R.G. Saltman. Accuracy, Integrity, and Security in Computerized Vote-Tallying. NBS (now NIST) Special Publication 500-158, 1988.

[Sal93] R.G. Saltman. Assuring Accuracy, Integrity and Security in National Elections: The Role of the U.S. Congress. Position paper from Computers, Freedom and Privacy '93, pp. 3.8--3.17, March 1993.

[Sha93] M. Shamos. Electronic Voting --- Evaluating the Threat. Position paper from Computers, Freedom and Privacy '93, pp. 3.18--3.25, March 1993.

[Tho84] K. Thompson. Reflections on Trusting Trust. Comm. ACM, 27, 8, pp. 761--763, August 1984.