Voters confused by e-voting machines, study finds

With the presidential election less than 10 months away, a statistically significant portion of voters may accidentally vote for the wrong candidate on electronic voting machines because they find the displays confusing, according to results of a five-year study conducted by three universities.

The study, conducted by the universities of Maryland, Rochester and Michigan, found that 3 percent of people voting electronically selected a candidate they did not intend to choose. While much of the attention to electronic voting machines has been on the security issues involving the machines' software and how the systems are stored during elections, the study's authors emphasized that poor user interfaces -- the way candidates' names and ballot initiatives are displayed to voters -- pose a much greater risk of skewing elections.

"Recent history is clear: the election problem most likely to tilt a close race is not security, but the inability of voters to cast their ballots the way they intended," said Paul Herrnson, principal investigator and the director of the Center for American Politics and Citizenship at the University at Maryland. "The hazards of poor ballot design didn't end with Florida's hanging, pregnant and dimpled chads in 2000. Those people walked away not confident and not trusting the vote."

The study's authors said the 3 percent error rate is enough to affect the outcome of close elections, which have occurred more frequently in the past decade. "A 3 percent error rate sounds good until you consider that in the 2000 presidential race, the percentage of uncounted ballots was only 2 percent," Herrnson said. "Voters did pretty well with these machines. … But it's still enough to affect the outcome of a close election."

The researchers tested five current electronic voting systems and one prototype. These included a paper ballot with optical scanner (manufactured by Election Systems & Software); a manual advance touch-screen, which allows voters to control when the next ballot appears (Diebold AccuVote-TS); an auto advance touch-screen with paper trail (Avante Voting Systems); a dial and buttons interface (Hart InterCivic); a full-face ballot with membrane buttons, which are flat, springless buttons (Nedap Election Systems); and a zoomable touch-screen prototype designed by Benjamin Bederson at the University of Maryland.

The highest scores went to the manual advance machine and Bederson's zoomable prototype, both of which scored 5.92 on a scale of 1 to 10. The dial and buttons interface came in lowest (4.70) because voters found it cumbersome and time consuming and made more errors while using it. The full-face ballot, which displays the entire ballot at once, also scored low (5.08), while the paper ballot scored 5.48.

"All the systems had strengths and weaknesses," Herrnson said. "Most people found the touch-screens easy to use, though there were things on all of them that could be improved."

One of the problems voters encountered when using electronic voting machines was difficulty changing a vote because a screen automatically displayed the next race or ballot initiative once the voter selected a candidate. Voters also were likely to vote incorrectly if the ballot included a straight party option, which allows users to vote for all the candidates of one party at once. Many voters felt confused about whether or not they should still vote for individual candidates after voting for a party, according to the study.

The study also found that voters preferred manual advance systems and reported that touch-screen systems were easier to use than buttons or dials. They also said the ability to view each election separately tended to reduce confusion. These differences, however, were more likely to affect the user's proclivity to request assistance than the overall accuracy of the results.

Peter Lichtenheld, director of marketing for elections at Hart InterCivic, the manufacturer of the system that scored the lowest, defended it. "Our system is 100 percent accurate," he said. "Turn the dial, highlight your selection, hit the enter button, and the box fills red to show who you voted for and you move on. There's no mis-marking, and you get a summary screen at the end."

Additional approaches tested by the researchers included adding a paper trail component or an audio verification system to electronic voting machines to ensure accuracy. Voters did not approve of either system. "All the systems increased the number of people needing help, but they didn't really improve accuracy," said Herrnson. "Once you start adding something to the basic voting system, you increase the likelihood of problems and the difficulty of set-up."

The authors urged electronic voting machine manufacturers and election officials to improve how the machines display candidates' names and ballot initiatives. "In the short run, election officials should be very cognizant of the way the ballot is designed, whether it's on paper or an electronic system," said Richard Niemi, professor of political science at the University of Rochester. "They should think very carefully about how the [candidates] are laid out on the ballot."

Herrnson and Niemi said it is imperative that election officials test interfaces to determine whether they confuse voters. Niemi said the large number of elected offices and initiatives on ballots across states, congressional districts and cities makes it "almost impossible" to familiarize voters with a ballot.

"For election officials who've yet to purchase a system, the touch-screens were reviewed more favorably than any other electronic systems," Herrnson said. "For those that already have their systems, the most important thing is to pay very close attention to how they program the ballot into the system."
