Goggle predicted the result of the 2020 US Election
December 8, 2020
With a discrepancy of just 10 votes, Goggle was the closest Electoral College predictor among established pollsters. On the Friday before the 2020 US election, Goggle.com predicted that Former Vice President Joe Biden would win the 2020 US Election with 296 Electoral College votes. When the Electoral College votes on December 14, 2020, President-elect Biden is expected to receive 306 votes.
Over the past year, we have licensed RIWI Corp.'s technology to bring monthly updates on which party Americans think will win in their state, which candidate they personally prefer, and our final prediction for the result of the 2020 US Election.
RIWI accurately predicted the results of the 2016 Presidential Election, so we chose to license RIWI's data-collection technology. We deployed it for our own use and built our own prediction model, applying our own algorithm and our own data analysis, to predict the winner of the Electoral College. We did not license any of RIWI's analysis tools or expertise, so nothing here is a suggestion or representation of RIWI Corp.'s own official prediction for the 2020 US Election. These observations are based only and entirely on Goggle's internal analysis.
What makes this data different
First, we want to highlight what makes our data unique compared to those of traditional polling methods. The tracker here at Goggle is different because we:
- Used the same proven, RIWI-powered scientific technology that predicted the 2016 election. Throughout the 2020 campaign season, RIWI (engaged by its clients in the financial services sector) also disconfirmed, week after week, the overwhelming media narrative of a supposed "Blue wave" decisive Electoral College victory for President-elect Biden
- Collected data continuously in real-time
- Captured both politically engaged and disengaged voices (traditional polling methods often only speak to those who are most engaged, skewing results)
- Did not collect any personally identifiable information, and did not incentivize our respondents, reducing social desirability bias and our exposure to the growing list of problems traditional panel companies face
- Achieved an unweighted sample that nearly matched US census data in age, gender, and geographic distribution
On October 30th, the Friday before the 2020 US election, Goggle.com released its Electoral College prediction. We predicted that Former Vice President Joe Biden would win 296 Electoral College votes, more than the 270 votes required to win the election over President Donald Trump.
As Mr. Biden is expected to formally win the Electoral College vote on December 14, 2020 with 306 total votes, we can analyze how Goggle's prediction performed compared to other well-known polling companies that do not leverage RIWI technology.
Goggle used the "wisdom of the crowd" to make our prediction. Rather than asking respondents whom they would vote for, we depersonalized the question and minimized response bias by asking respondents who they think will win in their state. We then filtered the data for likely voters, those individuals who are more politically aware, to make our call.
To make our prediction we used a method similar to the "RIWI Tie Break" rule. We assumed that the views of forecasters who are likely to vote reflect the views of the voting population, thus forcing statistically tied results into a prediction for one candidate: if the candidates were statistically tied at, say, 52% and 48%, the leading candidate was predicted to win. Sentiment often shifts over time, especially in the run-up to an election, so we wanted to balance (a) using the most recently collected data with (b) maintaining a robust sample size. We first tested the shortest time window containing at least 200 likely-voter forecasters in each state (e.g., California used only October data, while Kentucky used data going back to August). We also tested smaller per-state samples (100 and 50 likely voters); each of these shifted an even higher share of seats to Mr. Biden. In the end, we chose the most conservative estimate and used the 200-likely-voter model in our official prediction.
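The per-state call described above can be sketched in a few lines of code. This is an illustrative reconstruction only: the record format, the `call_state` function, and the month-widening loop are assumptions for the sketch, not Goggle's actual pipeline.

```python
from datetime import date

MIN_SAMPLE = 200  # per-state minimum used in the official prediction

def call_state(records, state, months):
    """Predict the winner for one state from likely-voter forecasts.

    `records` is a list of (state, response_date, forecast) tuples,
    already filtered to likely voters. The window widens month by
    month (newest first) until at least MIN_SAMPLE forecasts are
    accumulated; the state is then awarded to whichever candidate
    leads, even if the split is statistically tied (e.g., 52%/48%).
    """
    window = []
    for month in months:  # e.g., [10, 9, 8] for Oct, Sep, Aug
        window += [r for r in records
                   if r[0] == state and r[1].month == month]
        if len(window) >= MIN_SAMPLE:
            break
    tally = {}
    for _, _, forecast in window:
        tally[forecast] = tally.get(forecast, 0) + 1
    return max(tally, key=tally.get)
```

Under this rule, a state with enough October respondents uses October data alone, while a sparser state keeps pulling in earlier months until the 200-forecast floor is met.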
The lack of toss-up seats in Goggle's prediction is a key factor in its accuracy. Most established traditional pollsters that did not leverage RIWI technology defined 35 to 197 seats as "toss-ups", limiting their ability to produce such close predictions. See the table below for the full breakdown of vote predictions.
Swing State Predictions
Florida has often played a crucial role in election results. Leading into this election, all eyes were on President Trump's home state (he changed his residency in 2020). Many pollsters called it a toss-up. Inside Elections, The Economist's elections forecasting project, and FiveThirtyEight all predicted that it would lean Democrat. Only Goggle, along with Sabato, correctly predicted that Republicans would win Florida. RIWI's own predictions consistently showed Florida residing safely in President Trump's column. Goggle's data showed that President Donald Trump and the Republican Party held a consistent lead in Florida all through the summer and into the election, even as the share of undecided voters dropped in the weeks preceding it.
Typically a red state (Republican), Georgia was listed as a toss-up among many pollsters going into the election. In the month leading up to the election, there was still quite a bit of uncertainty among likely voters in Georgia, but we saw a slight edge for the Democratic party and correctly called a win for President Elect Biden. Georgia became a crucial state for Mr. Biden clinching his victory.
Ohio was another state for which most pollsters abstained from making a prediction. Goggle correctly predicted Ohio would stay red in 2020, as the signal for a Republican win held strong through September and October.
Below is a table comparing Goggle's results to our list of established pollsters across the swing states. The swing states here are defined as states that at least 6 pollsters called a toss-up (excluding Nebraska and Maine, for which Goggle only made state-wide predictions). Goggle correctly predicted 5 of the 7 swing states listed. Other than Sabato (which correctly predicted 6 of 7), no other pollster was nearly as close; FiveThirtyEight was a distant third, having predicted only 3 swing states correctly. The rest of the pack each correctly predicted a dismal zero to two of the 7 swing states.
Additionally, out of the swing states to watch in 2020 listed by Politico (Arizona, Florida, Georgia, Michigan, Minnesota, North Carolina, Pennsylvania, Wisconsin), Goggle correctly predicted 6 of the 8 states (Arizona and North Carolina being the two states incorrectly predicted, as shown below).
How did Goggle get it right?
Predicting the results of an election has never been easy, but it has become increasingly challenging as pollsters’ typical methodology becomes more and more outdated. With massive polling misses in both the 2016 and 2020 US elections, many are calling for an end to the traditional polling industry.
But how was Goggle so close? The key to our success is our methodology.
As RIWI said in their final US election report:
“For RIWI, one truly random forecast from one randomly engaged, anonymous person is worth far more than 10 non-randomly engaged, non-random traditional survey respondents who habitually offer their personal opinions in exchange for incentives. Further, ensuring the anonymity of any respondent, without ever collecting personally identifiable information, is essential to ethically responsible data collection.”
By accessing those typically quiet voices, Goggle predicted the Electoral College votes for both the Democratic and Republican parties within 10 seats, the closest margin of any established polling company. Additionally, Goggle correctly predicted 5 out of 7 swing states, most of which traditional pollsters abstained from calling. Only one polling company, Sabato, outperformed Goggle in swing state predictions, correctly calling 6 of the 7. But while Sabato outpaced Goggle on the swing states, its error rate in the final Electoral College seat count was 50% higher than Goggle's (Sabato was off by 15 seats versus Goggle's 10).
Goggle appears to have access to the technology, process, and visitor base to predict US elections more accurately than any of the leading pollsters. For more information on how to harness this type of powerful prediction analysis, contact RIWI Corp at firstname.lastname@example.org or visit https://riwi.com/.