The Investigating Software Podcast

Voting Machine Fail

June 14, 2020

We wind the clock back to November 2019 and investigate the failure of voting machines in Northampton County, Pa., USA. We break down what went wrong, what caused the problem, and what we can learn about the risks of software development from this high-profile incident.

Resources used to research and compile this podcast include:

A Pennsylvania County’s Election Day Nightmare Underscores Voting Machine Concerns

Press conference with Lamont McClure and Adam Carbullido from ES&S on the analysis of the voting machines

Recount underway for all Northampton County races after malfunction in voting machines

'Human error' blamed for Northampton County election problems

Not enough voters detecting ballot errors and potential hacks, study finds

Northampton County Voting System


Show Transcript:

Peter Houghton (00:00):
Hello and welcome to Investigating Software. My name is Peter Houghton. Today I'm taking you back to the 5th of November, 2019.

TV News Clips (00:08):
Yeah. Election day woes in Northampton County. We haven't heard anything. I know people have called the county and we haven't heard. I'm going to assume that we'll have to go to a paper count at some point. We can update now that we have heard through the newsroom, our news people confirming that the recount is actually happening.

Peter Houghton (00:25):
We'll look into some bugs that happened with some new voting machines in Northampton County, Pennsylvania. Now, if you don't know where that is, draw a line due west of New York City and a line due north of Philadelphia, and where those lines intersect is Easton, the county seat of Northampton County. Now, Northampton County has a population of 305,000, and on that day it was having an election for a local judge. There's nothing unusual about that; it happens every couple of years. The only new issue was the introduction of some new voting machines, and they appear to have caused some controversy and contention in the collation of the results. The voting machines used that day were supplied by a company called Election Systems & Software, or ES&S, of Omaha, Nebraska. Now, it's a fairly established company; it's been around for 40 years.

Peter Houghton (01:15):
Its ExpressVote XL machines are used all across the country; they have 6,300 of them in use at the moment in the United States. Now, those machines cost the county about $2.8 million. And that sounds like a lot of money. In fact, it is a lot of money to you and me, but that only equates to about 0.6 to 0.7% of Northampton County's annual budget. So a big purchase, but not a huge one by the county's standards. For those of you that haven't seen one of these machines, they sort of resemble a large desktop computer, where they have a large flat screen on the front. It's a touchscreen, so voters can just choose who they want to vote for from a list. If it's a simple election with, say, seven candidates, they can just click on one and it'll put a nice tick next to that.

Peter Houghton (02:02):
And then they can proceed to confirm their vote on the paper ballot. Some of the screens can, depending on the election, be a little more complicated and appear in a grid fashion, more like an Excel spreadsheet, where you have rows and columns and within those are the items you need to select. Now, those screens are programmed either by the local election officials or ES&S staff themselves. In this case, the majority of the work was done by ES&S, as the election officials themselves had not used the machines before, and they were newly introduced to the area, so we wouldn't expect them to know how to do that. Here's a clip from Votes PA on how to use the new machines.

Votes PA instructor (02:43):
The ballot will display on the screen. You'll mark your votes by touching anywhere inside the box around your choice. Once selected, your choice will be highlighted in green. After you've finished marking all your choices, don't forget to review every selection before casting your ballot. You can de-select a choice by touching it again.

Peter Houghton (03:00):
Now, the development of this software really takes two parts, two steps. Firstly, there's the actual development of the application itself, working with the hardware, the touchscreen. And secondly, there's another stage, also referred to as 'programming' by the election officials and ES&S, which is the configuration of that software for use in a particular election. Obviously each election has different candidates and maybe different rules about who can vote or what they're voting for. So if we go back to our Excel example, there's the development of Excel itself, the application that we buy off the shelf or download, and there's the actual work you do in Excel; often we do that ourselves, and we may enter scripts or data into it to produce a particular result. So in this case, the development of the application was done prior to the election, and in the days leading up to the election, the actual candidates and the details of each race that people were voting on that day were configured, or 'programmed' in ES&S terminology, into the system.
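That two-stage split can be sketched in a few lines of Python. This is purely an illustration: the race name, candidate labels, and functions below are invented for this podcast write-up and are not ES&S's actual code. Stage one is the generic application logic; stage two is the per-election configuration it consumes.

```python
# Stage 2: the per-election configuration, "programmed" for each county
# before election day. (All names here are hypothetical illustrations.)
election_config = {
    "race": "Judge of the Court of Common Pleas",
    "candidates": ["Candidate A (Dem)", "Candidate B (Rep)"],
    "instructions": "Touch one box to make your choice.",
}

# Stage 1: generic application logic, developed once and reused
# unchanged for every election the machine runs.
def build_screen(config):
    """Turn an election configuration into the rows shown on the touchscreen."""
    rows = [("instruction", config["instructions"])]
    rows += [("candidate", name) for name in config["candidates"]]
    return rows

for kind, text in build_screen(election_config):
    print(f"{kind}: {text}")
```

The point of the sketch is that `build_screen` can be tested exhaustively back at the factory, yet each new `election_config` still produces a screen layout the application has never seen before, which is exactly where the two kinds of testing come apart.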

Peter Houghton (04:04):
So there's really two different areas here. In fact, more than two, if we look from a software investigation or testing point of view, but the two key areas are the initial development, and obviously the testing and verification of that software, and the subsequent configuration, where we take the off-the-shelf software and configure it to make it appropriate for each election. Now, both of those have the potential to produce just what we want, or maybe not quite what we want. They could have bugs or misconfigurations or any number of issues with them. So on the day, the 5th of November, 2019, the leading candidates split along party lines. Now, this is a county that has traditionally had a mix of left and right, but historically a slight skew towards the Democratic Party. Now, in that election, you didn't have to vote along party lines. It was possible to cross-file, that is, a candidate could represent both parties. Northampton County election officials requested some instructional text to be placed on the screen to help voters, when presented with the screen, discern how to vote for one of these cross-filed candidates rather than voting along party lines.

Peter Houghton (05:15):
And it's that instructional text that appears to be the root of the issue seen on election day. Now, what were those issues?

New Speaker (05:21):
When tabulated, the votes got attributed to that instructional text. When we removed the instructional text, as you can see over here, the votes were correctly attributed to the proper candidates.

Peter Houghton (05:35):
One of the candidates, Abe Kassis, received 164 votes across the whole county when he was expected to get many tens of thousands. This looked a little suspicious, so they actually decided to disregard the automatically collated results from ES&S's systems and start looking at the actual paper ballots that are printed each time a voter casts a vote. Also, many people were reporting that the screens didn't actually select what they had chosen on the screen. So, for example, someone clicked on a Republican candidate and the Democratic candidate had been highlighted, or vice versa.
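To see how votes could end up credited to instructional text at all, here's a hypothetical sketch of a naive tabulator. The ballot layout, names, and logic are invented for illustration; this is not ES&S's implementation. The idea is that if the instructional text occupies a selectable row just like a candidate does, and the tally code never checks that a marked row is a real candidate, the instruction row silently absorbs votes:

```python
# Hypothetical ballot layout: the instructional text occupies row 0,
# structurally identical to a candidate row. (Invented for illustration.)
ballot_rows = [
    {"pos": 0, "text": "Touch one box only to cross-file", "candidate": None},
    {"pos": 1, "text": "Candidate A", "candidate": "Candidate A"},
    {"pos": 2, "text": "Candidate B", "candidate": "Candidate B"},
]

def tabulate(marked_positions, rows):
    """Naive tally: counts whatever occupies each marked row."""
    by_pos = {row["pos"]: row for row in rows}
    tally = {}
    for pos in marked_positions:
        row = by_pos[pos]
        # Bug: no check that the row is a real candidate, so marks that
        # land on the instruction row are tallied under its text.
        key = row["candidate"] if row["candidate"] else row["text"]
        tally[key] = tally.get(key, 0) + 1
    return tally

# Three ballots: two marks land on the instruction row, one on Candidate A.
print(tabulate([0, 0, 1], ballot_rows))
# → {'Touch one box only to cross-file': 2, 'Candidate A': 1}
```

Filtering non-candidate rows out of the tally would fix the sketch, which loosely mirrors the press-conference description: once the instructional text was removed, the votes were attributed to the proper candidates.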

Peter Houghton (06:12):
Now, it's interesting that the election officials decided to assume that, because people could have verified their vote using the paper ballot on the side of the machine, they actually did do this. Because a few months later, in January 2020, the University of Michigan actually published a report in which it said that 93% of voters missed incorrect information on a ballot. Now, that doesn't prove that the exact same issue happened in this University of Michigan study as happened in Northampton County. But it does make you think that if there was widespread misattribution caused purely by the screens, maybe that should have been treated as a more serious issue, something they could have considered more seriously, and maybe the race re-run. Now, they don't seem to join the dots here and think that maybe the fact that the screens were misbehaving, and people were clicking on one candidate and another was being highlighted, might have actually contributed to this issue.

Peter Houghton (07:08):
They seem to want to separate those two bugs. Now, I can see why you'd want to do that. The problem with the electronic storage not correctly tabulating the data is one that's easily remedied by a simple, albeit time-consuming, process of manually reviewing each of the ballots. But the problem with the screens is much harder to rectify. You don't know what the person was actually clicking on. They could have been clicking on a Republican, a Democrat, a cross-filed candidate or any other part of the screen. So you can't go back and retrospectively fix the data to find the correct answer. Now, that's one of the things that wasn't really mentioned in any of the news reports, but if you go back and look at the original recording, which is fairly low quality unfortunately, the ES&S representative actually mentioned that this particular type of configuration hadn't been tested.

Adam Carbullido (07:58):
I want to make clear that this was human error, and ES&S takes full accountability. The ballot for Northampton County was untested, and the issue should have been identified by ES&S staff and corrected prior to the election, during election testing.

Peter Houghton (08:15):
So this goes back to the point we made earlier, where there's two stages to the development of this system. The original system is developed back in Nebraska, and it's basically a menuing system with tabulation. And when they come to actually deploy it for a particular county or state, they reconfigure that system in another stage that's sometimes referred to as programming, but it's more of a configuration and on-site testing process. It's that stage that wasn't tested. At least, it was tested, but not this particular configuration, and hence the issue becomes a problem on the day. Now, this is kind of interesting because it brings to mind Conway's law. For those who don't know, Conway's law was coined by Melvin Conway back in the 1960s, and it goes like this: any organization that designs a system will produce a design whose structure is a copy of the organization's communication structure. And why I think that's relevant here is it appears that there's two teams, right?

Peter Houghton (09:14):
One team does the development and testing back in Nebraska. And there's other people, maybe a different team, that come in later on and configure that for deployment in each state or county election. And because of that, there's this sort of miscommunication between the teams. The two teams have different views of the software. They have different concerns. And in this case, they appear to have different understandings of what the software can be used to do and how that's been tested. Now, another interesting point that Lamont McClure, the county executive, states in his press conference is that the backup system worked: the paper ballots allowed for an accurate, confidence-building election to take place. People could look at those paper ballots and verify for themselves. And indeed, the election officials did verify that a certain number of votes had been applied for each of the candidates. Now, that's good.

Peter Houghton (10:04):
That was a backup, but you can't rely on a backup. And when you start relying on a backup, you essentially don't have a backup anymore. What you've got then is just how you do things. A good comparison is aircraft. If you've got a passenger plane with 200 passengers and one of your engines fails, you don't keep flying. The pilot doesn't turn around to the crew and go, 'Just chillax, right? Everyone just take it easy. We've got two of these things. We'll just keep going. We'll be there in a few hours.' He doesn't. He lands the plane, they do a long, hard investigation into what went wrong, and they try to make sure it doesn't happen again. So I hope that's what ES&S are doing here: that they've actually gone back and not only fixed this particular issue, allowing for cross-filed candidates or restricting the software from that sort of behavior, but maybe also gone back and asked, why didn't we find that?

Peter Houghton (10:51):
Okay, we've got these two teams. Maybe they should both have more testers, or developers more suited to this sort of testing, or maybe some new tools or processes they can use to help raise the bar of their quality, because whatever they were spending before is probably quite small in comparison to the amount of damage done by the bad publicity here. And this is where it comes down to exposure. Now, it's often a mistake to judge the amount of money you're spending on testing your software by how much you're spending per day. So the cost per day might be X number of thousand dollars. You're not taking into account there the cost of failure. Testing is a sort of insurance against failure. It's not going to catch all the issues, but it's going to help you reduce the number of issues that go live and cause these sorts of embarrassing incidents.

Peter Houghton (11:37):
So when you're comparing the amount you spend on testing, you're really going to be comparing the amount you could make from the software against how much you could lose if you don't have the appropriate testing in place. If one of these issues gets out there and becomes a nationally reported incident, on television, in the New York Times, those are the costs you have to look at. Typically the costs per day are fairly low compared to the sort of ultimate costs you'll see if you don't test your software fully. Of course, what do we mean by fully? Well, it's up to you, right? It depends on your market. Maybe your exposure is low. Maybe you can ship broken software out there, and the impact for you, at least in the short term, is quite small. Some companies can do that, particularly startups, but not everyone can.

Peter Houghton (12:21):
And even if maybe you don't actually lose much money directly, for example, you're not shipping a product that people won't buy anymore, so you don't lose that profit margin, you might have software that other people use. Facebook, for example, has a library that lots of people use, and it isn't used in a typical application's daily run. But if someone were, for example, to want to share something on Facebook, it would use that library. Now, there was a problem a short while ago where people had basically catastrophic failures in their applications; lots of different things were failing, like Spotify, for example. And the reason was there was a flaw in Facebook's software. Now, that's fine, Facebook didn't lose any money directly, but longer term people are going to start thinking, well, do we have to include that Facebook software in our application? The same situation is going to happen here.

Peter Houghton (13:05):
They may still keep their short-term contracts for voting machines, but longer term people are going to start thinking, well, maybe, you know, we won't choose that option next time. We'll choose a different company or a different technology. Now, this issue of exposure, and software quality in general, is potentially huge in our society. Now, let's just take this particular type of application, nothing in particular to do with ES&S, but any company that produces this sort of equipment: what if there is a similar issue in November 2020? Now, if that happens, that could cause civil unrest, and the situation could be worse if, for example, one of the candidates receives only slightly fewer votes than the other candidate, and that's due to a particular error. You may not find that error straight away. In this case, the election officials could see exactly what was wrong, or at least see that there was a problem and needed to investigate further.

Peter Houghton (13:57):
If the issue was more subtle, it may not come out straight away. It may come out in an audit later, in which case people are more likely to ascribe a more nefarious cause to it. They're not going to think, oh yeah, the machine failed, let's get it fixed. They're going to think, what have you done? Who did this? Why did they allow this to happen? Why did this particular party win? And what's more troubling is that maybe the actual issue that would cause this sort of slight variation in the results, this sort of minor bug that won't be noticed straight away, is still in the system. These sorts of systems have been tested, you know, back in the office and also, to a certain extent, on-site in regional elections. But that doesn't mean that everything's been tested yet. The low-hanging fruit have already been found; more subtle issues, ones that are maybe going to come out in the longer term, haven't been found yet. And these are the sorts of things that will come back to bite you. These are the things that ultimately raise the cost of your software, not through initial development and testing, but through the long-term liability, the exposure you and your project have. That's all for this podcast. Thank you, and you've been listening to Investigating Software.


