A New Category of Political Intelligence Tools Will Define Tomorrow’s Political Winners

The 2016 U.S. Presidential election presented a seminal demonstration of the power of big data and analytics in modern political campaigns. The overall spending of nearly $1.6B by campaigns was record-breaking on its own, but 2016 was also unique for its unprecedented spend on social media and the behavioral targeting data used to hone, with laser precision, the targeting and placement of social media ads. By all accounts, the behaviorally driven microtargeting strategy worked. So much so that in 2018, the former CEO of an obscure, UK-based behavioral analytics firm called Cambridge Analytica claimed on hidden camera that his firm played a decisive role in U.S. President Donald Trump’s election victory.

Trump’s campaign officials have disputed both the use and utility of Cambridge Analytica’s behavioral data. But what is not in dispute is that elections, at the federal level and increasingly at the state and local levels, will be won and lost based on access to vast, disparate stores of data, on sophisticated analysis applications, and on people who can effectively interpret and leverage the most granular data to influence voters.

While campaigns have always been, to an extent, “data-driven,” the game of influencing voter turnout and winning elections is no longer just about raw data or polls. Before calling voters to the polls in 2020 and beyond, today’s campaign managers must deliver up-to-the-minute, immediately actionable, unbiased intelligence to their candidates from announcement day to Election Day, so that confident decisions and course corrections can be made faster. And to accomplish that, the types, sources, and use cases for data need to be reimagined.

As once stated by the venerable Republican political consultant Arthur Finkelstein, “Every election is decided before it even begins.” In today’s environment, the cycle is seemingly reset daily.

The modern, data-driven digital campaign was pioneered during Barack Obama’s 2008 meteoric rise from relative obscurity to the Presidency. But the process of building an intelligence advantage over his opponent was arduous and time-consuming. According to a 2012 analysis by MIT Technology Review, massive amounts of survey data were combined with algorithms to predict voter behavior, which was then leveraged to activate turnout:

“In the 2008 presidential election, Obama’s targeters had assigned every voter in the country a pair of scores based on the probability that the individual would perform two distinct actions that mattered to the campaign: casting a ballot and supporting Obama. These scores were derived from an unprecedented volume of ongoing survey work. For each battleground state every week, the campaign’s call centers conducted 5,000 to 10,000 so-called short-form interviews that quickly gauged a voter’s preferences, and 1,000 interviews in a long-form version that was more like a traditional poll. To derive individual-level predictions, algorithms trawled for patterns between these opinions and the data points the campaign had assembled for every voter — as many as one thousand variables each, drawn from voter registration records, consumer data warehouses, and past campaign contacts.”
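
The dual-score approach described in the quote can be sketched in a few lines of Python. This is a toy illustration under invented assumptions: the voter file, survey labels, and model are all synthetic stand-ins, not the campaign’s actual methodology, which was far larger in both variables and volume.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic voter file: each row is a voter, each column a variable
# (stand-ins for the "as many as one thousand variables" in the quote).
X = rng.normal(size=(5000, 10))

# Synthetic survey labels from short-form interviews: will the respondent
# vote, and do they support the candidate? (Invented for illustration.)
y_turnout = (X[:, 0] + X[:, 1] + rng.normal(size=5000) > 0).astype(float)
y_support = (X[:, 2] - X[:, 3] + rng.normal(size=5000) > 0).astype(float)

def fit_logistic(X, y, steps=200, lr=0.1):
    """Fit logistic-regression weights with plain gradient descent."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))          # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)      # gradient step
    return w

def predict(X, w):
    return 1 / (1 + np.exp(-X @ w))

# Assign every voter the pair of scores the quote describes.
turnout_scores = predict(X, fit_logistic(X, y_turnout))
support_scores = predict(X, fit_logistic(X, y_support))

# A turnout program might prioritize likely supporters unlikely to vote.
priority = support_scores * (1 - turnout_scores)
targets = np.argsort(priority)[::-1][:100]
```

The two scores are deliberately separate: a voter who is certain to turn out needs no mobilization, and a voter certain to oppose is not worth persuading, so the interesting targets sit in between.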

In contrast, in the 2016 Trump/Clinton Presidential cycle, better data and voter influence responses were collected and leveraged in much shorter time frames, and at much larger scale, enabling faster course corrections and more effective advertising. For example, by using features within the Facebook advertising platform called “Custom Audiences” and “Lookalike Audiences,” then-Trump Digital Director Bradley Parscale and his team were able to upload voter lists into Facebook’s advertising platform and match them with similar Facebook users, amplifying their reach hundreds or thousands of times. Because of the vast sums of funding the Trump campaign was able to deploy, and the massive customization of advertising tailored to its target audiences, the campaign was able to target and effectively influence very small groups of voters on issues tailored to their interests, helping Trump claim Electoral College victory on margins of fewer than 100,000 votes across key states.
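
Facebook’s actual lookalike matching is proprietary, but the underlying idea can be sketched as nearest-neighbor expansion: given a seed list of known supporters, rank a larger pool of users by similarity to that seed. Everything below is synthetic and illustrative, not Facebook’s algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Candidate audience pool: 10,000 users described by 8 feature values each.
pool = rng.normal(size=(10000, 8))

# The uploaded "Custom Audience": 50 known supporters drawn from the pool.
seed = pool[rng.choice(10000, size=50, replace=False)]

# Represent the seed audience by its mean feature vector, then rank the
# whole pool by cosine similarity to that centroid.
centroid = seed.mean(axis=0)
norms = np.linalg.norm(pool, axis=1) * np.linalg.norm(centroid)
similarity = pool @ centroid / np.where(norms == 0, 1, norms)

# The "Lookalike Audience": the 500 pool members most similar to the seed.
lookalike = np.argsort(similarity)[::-1][:500]
```

The amplification the article describes falls out of the ratio: a 50-person seed list expands into an audience an order of magnitude larger, and a real platform expands it further still.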

In an interview with CBS’ 60 Minutes, Parscale said the campaign was able to effectively reach small clusters of rural voters, such as “15 people in the Florida Panhandle that I would never buy a TV commercial for.” “I started making ads that showed the bridge crumbling,” he said. “I can find the 1,500 people in one town that care about infrastructure. Now, that might be a voter that normally votes Democrat.”

With the guidance of an embedded team of ad experts from Facebook, Parscale said the campaign typically tested 50,000 to 60,000 variations of ads in a single day, and sometimes as many as 100,000, each with minute changes in the design, color, background, and phrasing of content, in order to maximize their impact.
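
Testing tens of thousands of variants a day is tractable because variants are combinatorial: a handful of options per design element multiplies out quickly. The sketch below enumerates a small variant space and ranks it by simulated click-through rate; the element lists and response rates are invented for illustration.

```python
import itertools
import random

random.seed(42)

# A few options per design element; real campaigns vary many more elements.
headlines = ["Rebuild our roads", "Fix the bridge", "Jobs for the Panhandle"]
colors = ["red", "blue", "white"]
images = ["bridge", "factory", "family"]
calls_to_action = ["Donate", "Learn more", "Sign up"]

# 3 * 3 * 3 * 3 = 81 variants; the scale Parscale describes would come from
# longer option lists and more elements.
variants = list(itertools.product(headlines, colors, images, calls_to_action))

def simulate_ctr(true_rate, impressions=1000):
    """Simulate serving a variant and return its observed click-through rate."""
    clicks = sum(random.random() < true_rate for _ in range(impressions))
    return clicks / impressions

# Invented underlying response rates, one per variant, purely for the demo.
rates = [0.02 + 0.06 * i / len(variants) for i in range(len(variants))]
results = {v: simulate_ctr(r) for v, r in zip(variants, rates)}

# Keep spending on the best-performing variant.
best = max(results, key=results.get)
```

In practice a campaign would not test exhaustively but shift budget toward winners as data accumulates, which is why minute design changes can be evaluated within hours.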

Today, several factors contribute to risks that can adversely impact campaigns: the difficulty of accessing vast amounts of data across channels and sources, questions of data accuracy, induced bias, and an increasingly fragmented set of tools used to process and interpret data.

Accompanying these risks are the diminishing returns (and increased turnaround time) of reaching likely and registered voters via traditional public opinion polls and primary research. Mobile phone access is nearly ubiquitous, but consumers are increasingly reluctant to answer as a result of rampant robo- and spam calls. Landline phone access dropped from 92% in 2004 to 42% in 2018, making traditional evening phone polling more challenging among various demographic segments. And ad blockers and other browser privacy tools have had an adverse impact on the quality of online surveys.

To deliver results that match the depth and accuracy of previous cycles, campaigns will be forced to wade through larger datasets, or base decisions on weaker sample sizes, sacrificing efficiency and accuracy at every turn.

To add to these already formidable challenges, “poll lag,” whereby a candidate continues to promote a position or talking point while polling on it is still being conducted, can add fuel to the fire of an unpopular position or result in missed opportunities to amplify popular ones.

In an environment where every consumer has the ability to widely distribute content on a myriad of platforms and where the media is omnipresent, campaigns are caught in the midst of a perfect storm: too much data, too little context, and not enough time.

To address this new reality, a number of Artificial Intelligence (AI) technologies, including Natural Language Processing (NLP) and advanced machine learning, combined with rapidly scalable data storage and processing, have enabled the next generation of intelligence platforms and services to deliver context and insights from massive amounts of data from disparate sources and channels in near real-time.

Take the breadth of data collection and pattern analysis that made the Obama ’08 campaign so successful, combine it with the unique targeting and data access that made the Trump ’16 campaign so successful, and one can begin to formulate a view of what’s coming next. New technologies can mimic the reading behavior of an average human while processing text approximately 48,000 times faster. They can qualitatively measure every word in an article, blog, social media post, or broadcast transcript, effectively assessing the relevance, impact, and sentiment of candidates, policy positions, and their sound bites, all in a matter of minutes or hours rather than days.
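
The simplest form of this kind of text scoring can be sketched with a lexicon-based sentiment measure. Production platforms use trained NLP models rather than word lists, but the shape of the output, a per-document sentiment score, is similar. The lexicon and sample mentions below are invented for illustration.

```python
# Toy sentiment lexicons; real systems learn these signals from data.
positive = {"strong", "popular", "win", "support", "resonating"}
negative = {"weak", "unpopular", "lose", "crumbling", "backlash"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: positive minus negative word share."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in positive for w in words)
    neg = sum(w in negative for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# Invented media mentions standing in for articles, posts, and transcripts.
mentions = [
    "The candidate's health care message is resonating with strong support.",
    "Critics call the infrastructure plan weak and unpopular.",
]
scores = [sentiment_score(m) for m in mentions]  # -> [1.0, -1.0]
```

Scoring every mention as it appears, rather than waiting on polling, is what collapses the feedback loop from days to minutes.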

In addition, cognitive functionality within these platforms enables non-technical candidates, campaign managers, and support teams to ask questions in everyday, natural language, and receive accurate, unbiased answers. For example, a candidate can ask, “How did Iowa media and political journalists carry my health care message from the debate last night, and is it resonating positively with Iowa voters?”

While opponents wait out poll lag or rely on single-channel strategies to inform messaging, more agile candidates will be identifying opportunities and avoiding missteps, with talking points, location selections, and position papers informed by much more granular intelligence, delivered at much greater speed.

In today’s world of tribal politics, perpetual news cycles, and massive cross-channel spending and fundraising, the difference between winning and losing can be vanishingly small. Artificial intelligence has raised the bar immeasurably for gleaning political intelligence and putting it to a candidate’s advantage in real time. In this cycle, it is paramount for serious contenders at all levels to understand the state of the art and how it can be used to further their campaigns.

Leigh Fatzinger is Founder and CEO of Turbine Labs. Turbine Labs is on a mission to rebuild the way information is made available and consumed by leaders and their support teams. Combining the scale and speed of AI with human validation, Turbine Labs transforms information chaos into unvarnished, meaningful intelligence, surprisingly fast.
