The Science (and Company) Behind The Trump and Brexit Elections

Posted by Pile (25013 views)


[Psychology]
Did you know there's a single company behind some of the most controversial elections in modern history? An ad company that has been using Facebook and "psychometrics" to accurately predict, classify and manipulate people and elections?

Psychometrics and the (counter)revolution in marketing that is helping bring fascism to power around the world

Note: The following is an unauthorized translation of a December 2016 article that caused quite a stir in the German-language press. Das Magazin (Zurich) occupies a respected position within the German-language cultural and literary media landscape, functionally similar to (though perhaps not quite as prominent as) The New Yorker, and this work by investigative reporters Hannes Grassegger and Mikael Krogerus got a lot of attention—and generated some controversy, for apparently having scooped the English-language media with sensational observations about 2016’s most sensational story, the campaign and electoral victory of a fascist dictator in the United States.

Perhaps for this reason, the article has not appeared in translation in (or even had its investigative threads taken up by) English-language media outlets, even after nearly two months. Antidote presents, therefore, our own preemptive translation to fill this gap. We trust the skill of the reporters who wrote it and the veracity of their claims (which are verifiable by anyone with a search engine—we have embedded links where appropriate), and we question why this particular synthesis of public information is not being made available to non-German-speaking readers by outlets with more reach and respectability than us dirty DIY dicks.

On the occasion of this article’s authorized wider release in English, should that come to pass, we will consider removing this post if we are asked nicely. Until then: Enjoy!

“I just showed that the bomb was there.”
By Hannes Grassegger and Mikael Krogerus for Das Magazin (Zurich)
3 December 2016 (original post)

Psychologist Michal Kosinski developed a method of analyzing people’s behavior down to the minutest detail by looking at their Facebook activity—thus helping Donald Trump to victory.

On November 9th, around 8:30 in the morning, Michal Kosinski awoke in his hotel room in Zurich. The 34-year-old had traveled here to give a presentation to the Risk Center at the ETH [Eidgenössische Technische Hochschule or Federal Institute of Technology, Zurich] at a conference on the dangers of Big Data and the so-called digital revolution. Kosinski gives such presentations all over the world. He is a leading expert on psychometrics, a data-driven offshoot of psychology. Turning on the television this morning in Zurich, he saw that the bomb had gone off: defying the predictions of nearly every leading statistician, Donald J. Trump had been elected president of the United States of America.

Kosinski watched Trump’s victory celebration and the remaining election returns for a long while. He suspected that his research could have had something to do with the result. Then he took a deep breath and turned off the television.

On the same day, a little-known British company headquartered in London issued a press release: “We are thrilled that our revolutionary approach to data-driven communications played such an integral part in president-elect Donald Trump’s extraordinary win,” Alexander James Ashburner Nix is quoted as saying. Nix is British, 41 years old, and CEO of Cambridge Analytica. He only appears in public in a tailored suit and designer eyeglasses, his slightly wavy blond hair combed back.

The meditative Kosinski, the well-groomed Nix, the widely grinning Trump—one made this digital upheaval possible, one carried it out, and one rode it to power.

How dangerous is Big Data?

Anyone who didn’t spend the last five years on the moon has heard the term Big Data. The emergence of Big Data has meant that everything we do, online or offline, leaves digital traces. Every purchase with a card, every Google search, every movement with a cellphone in your pocket, every “like” gets stored. Especially every “like.” For a while it wasn’t entirely clear what any of this data would be good for, other than showing us ads for blood pressure medication in our Facebook feeds after we google “high blood pressure.” It also wasn’t entirely clear whether or in what ways Big Data would be a threat or a boon to humanity.

Since November 9th, 2016, we have known the answer, because one and the same company was behind Trump’s online ad campaigns and late 2016’s other shocker, the Brexit “Leave” campaign: Cambridge Analytica, with its CEO Alexander Nix. Anyone who wants to understand the outcome of the US elections—and what could be coming up in Europe in the near future—must begin with a remarkable incident at the University of Cambridge in 2014, in Kosinski’s department of psychometrics.

Psychometrics, sometimes also known as psychography, is a scientific attempt to “measure” the personality of a person. The so-called Ocean Method has become the standard approach. Two psychologists were able to demonstrate in the 1980s that the character profile of a person can be measured and expressed in five dimensions, the Big Five: Openness (how open are you to new experiences?), Conscientiousness (how much of a perfectionist are you?), Extroversion (how sociable are you?), Agreeableness (how considerate and cooperative are you?), and Neuroticism (how sensitive/vulnerable are you?). With these five dimensions (O.C.E.A.N.), you can determine fairly precisely what kind of person you are dealing with—her needs and fears as well as how she will generally behave. For a long time, however, the problem was data collection, because to produce such a character profile meant asking subjects to fill out a complicated survey asking quite personal questions. Then came the internet. And Facebook. And Kosinski.
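
To make the Ocean scoring idea concrete, here is a minimal sketch in Python of how a handful of Likert-scale answers could be folded into five trait scores. The items, their trait assignments, and the 0-to-1 scale are invented for illustration and are not taken from any validated instrument.

```python
# Toy illustration of scoring a short Big Five questionnaire.
# Items, keying, and scale are invented; real instruments use
# validated item pools and norms.
from __future__ import annotations

SURVEY_ITEMS = {
    # statement: (trait, reverse_keyed)
    "I enjoy trying new things.":         ("openness", False),
    "I pay attention to details.":        ("conscientiousness", False),
    "I feel comfortable around people.":  ("extroversion", False),
    "I am inclined to criticize others.": ("agreeableness", True),
    "I am easily irritated.":             ("neuroticism", False),
}

def score_ocean(answers: dict[str, int]) -> dict[str, float]:
    """Map 1-5 Likert answers onto 0-1 scores for each of the five traits."""
    buckets: dict[str, list[float]] = {}
    for statement, rating in answers.items():
        trait, reverse = SURVEY_ITEMS[statement]
        value = 6 - rating if reverse else rating          # flip reverse-keyed items
        buckets.setdefault(trait, []).append((value - 1) / 4)  # rescale 1-5 to 0-1
    return {trait: sum(vals) / len(vals) for trait, vals in buckets.items()}

# A respondent who answers "agree" (4) to every statement:
print(score_ocean({statement: 4 for statement in SURVEY_ITEMS}))
```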

A new life began in 2008 for the Warsaw-born student Michal Kosinski when he was accepted to the prestigious University of Cambridge in England to work in the Cavendish Laboratory at the Psychometrics Center, the first-ever psychometrics laboratory. With a fellow student, Kosinski created a small app for Facebook (the social media site was more straightforward then than it is now) called MyPersonality. With MyPersonality, you could answer a handful of questions from the Ocean survey (“Are you easily irritated?” – “Are you inclined to criticize others?”) and receive a rating, or a “Personality Profile” consisting of traits defined by the Ocean method. The researchers, in turn, got your personal data. Instead of a couple dozen friends participating, as initially expected, first hundreds, then thousands, then millions of people had bared their souls. Suddenly the two doctoral students had access to the then-largest psychological data set ever produced.

The process that Kosinski and his colleagues developed over the years that followed is actually quite simple. First, surveys are distributed to test subjects—this is the online quiz. From the subjects’ responses, their personal Ocean traits are calculated. Kosinski’s team then compiles every other available online data point for each test subject—what they’ve liked, shared, or posted on Facebook; gender, age, and location. From this combination the researchers began to find correlations, and to see that amazingly reliable conclusions could be drawn about a person simply by observing their online behavior. For example, men who “like” the cosmetics brand MAC are, to a high degree of probability, gay. One of the best indicators of heterosexuality is liking Wu-Tang Clan. People who follow Lady Gaga, furthermore, are most probably extroverted. Someone who likes philosophy is more likely introverted.
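
The like-to-trait step described above can be pictured as an ordinary supervised-learning problem: binary indicators for liked pages go in, a survey-derived trait label comes out. The sketch below, which uses scikit-learn on synthetic random data, is meant only to show the shape of that pipeline, not the actual models or data involved.

```python
# Toy version of the "predict a trait from likes" idea: each user is a
# binary vector over pages, and a simple classifier is fit against a
# survey-derived label. The data here is random noise, for shape only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 50

likes = rng.integers(0, 2, size=(n_users, n_pages))        # 1 = user liked page j
# A pretend trait label ("extroverted: yes/no"), loosely driven by a few pages
signal = likes[:, 0] + likes[:, 1] - likes[:, 2]
extroverted = (signal + rng.normal(0, 0.5, n_users) > 0.5).astype(int)

model = LogisticRegression(max_iter=1000).fit(likes, extroverted)
print("training accuracy:", model.score(likes, extroverted))
print("most predictive pages:", np.argsort(model.coef_[0])[-3:])
```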

Kosinski and his team continued, tirelessly refining their models. In 2012, Kosinski demonstrated that from a mere 68 Facebook likes, a lot about a user could be reliably predicted: skin color (95% certainty), sexual orientation (88% certainty), Democrat or Republican (85%). But there’s more: level of intellect; religious affiliation; alcohol-, cigarette-, and drug use could all be calculated. Even whether or not your parents stayed together until you were 21 could be teased out of the data.

How good a model is, however, depends on how well it can predict the way a test subject will answer certain further questions. Kosinski charged ahead. Soon, with a mere ten “likes” as input, his model could appraise a person’s character better than an average coworker. With seventy, it could “know” a subject better than a friend; with 150 likes, better than their parents. With 300 likes, Kosinski’s machine could predict a subject’s behavior better than their partner. With even more likes, it could exceed what a person thinks they know about themselves.

The day he published these findings, Kosinski received two phone calls. One was a threat to sue, the other a job offer. Both were from Facebook.

Only Visible to Friends

In the meantime, Facebook has introduced the differentiation between public and private posts. In “private” mode, only one’s own friends can see what one likes. This is still no obstacle for data-collectors: while Kosinski always requests the consent of the Facebook users he tests, many online quizzes these days demand access to private information as a precondition to taking a personality test. (Anyone who is not overly concerned about their private information and who wants to get assessed according to their Facebook likes can do so at Kosinski’s website, and then compare the results to those of a “classic” Ocean survey here).

It’s not just about likes on Facebook. Kosinski and his team have in the meantime figured out how to sort people according to Ocean criteria based only on their profile pictures. Or according to the number of their social media contacts (this is a good indicator of extroversion). But we also betray information about ourselves when we are offline. Motion sensors can show, for example, how fast we are moving a smartphone around or how far we are traveling (correlates with emotional instability). A smartphone, Kosinski found, is in itself a powerful psychological survey that we, consciously or unconsciously, are constantly filling out.

Above all, though—and this is important to understand—it also works another way: using all this data, psychological profiles can not only be constructed, but they can also be sought and found. For example, if you’re looking for worried fathers, or angry introverts, or undecided Democrats. What Kosinski invented, to put it precisely, is a search engine for people. And he has become more and more acutely aware of both the potential and the danger his work presents.
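
In software terms, such a "search engine for people" is little more than a filter over a table of estimated trait scores. A minimal sketch, with randomly generated profiles and arbitrary thresholds chosen purely for illustration:

```python
# Audience selection as a filter over estimated OCEAN profiles.
# All profile values below are random numbers, purely for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 10_000
profiles = pd.DataFrame({
    "user_id": range(n),
    "openness":          rng.random(n),
    "conscientiousness": rng.random(n),
    "extroversion":      rng.random(n),
    "agreeableness":     rng.random(n),
    "neuroticism":       rng.random(n),
    "parent":            rng.integers(0, 2, n).astype(bool),
})

# e.g. "worried parents": parents scoring high on neuroticism and low on openness
worried_parents = profiles[
    profiles["parent"]
    & (profiles["neuroticism"] > 0.7)
    & (profiles["openness"] < 0.4)
]
print(len(worried_parents), "users match this audience definition")
```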

The internet always seemed to him a gift from heaven. He wants to give back, to share. Information is freely reproducible, copyable, and everyone should benefit from it. This is the spirit of an entire generation, the beginning of a new era free of the limits of the physical world. But what could happen, Kosinski asked himself, if someone misused his search engine in order to manipulate people? His scientific work [e.g.] began to come with warnings: these prediction techniques could be used in ways that “pose a threat to an individual’s well-being, freedom, or even life.” But no one seemed to understand what he meant.

Around this time, in early 2014, a young assistant professor named Aleksandr Kogan approached Kosinski. He said he had received an inquiry from a company interested in Kosinski’s methods. They apparently wanted to psychometrically measure the profiles of ten million American Facebook users. To what purpose, Kogan couldn’t say: there were strict secrecy stipulations. At first, Kosinski was ready to accept—it would have meant a lot of money for his institute. But he hesitated. Finally Kogan divulged the name of the company: SCL, Strategic Communications Laboratories. Kosinski googled them [so did Antidote. Here. —ed.]: “We are a global election management agency,” said the company website [really, the website has even creepier language on it than that. “Behavioral change communication”? Go look already]. SCL offers marketing based on a “psychographic targeting” model. With an emphasis on “election management” and political campaigns? Disturbed, Kosinski clicked through the pages. What kind of company is this? And what do they have planned for the United States?

What Kosinski didn’t know at the time was that behind SCL there lay a complex business structure including ancillary companies in tax havens, as the Panama Papers and Wikileaks revelations have since shown. Some of these had been involved in political upheavals in developing countries; others had done work for NATO, developing methods for the psychological manipulation of the population in Afghanistan. And SCL is also the parent company of Cambridge Analytica, this ominous Big Data firm that managed online marketing for both Trump and the Brexit “Leave” campaign.

Kosinski didn’t know any of that, but he had a bad feeling: “The whole thing started to stink,” he remembers. Looking into it further, he discovered that Aleksandr Kogan had secretly registered a company to do business with SCL. A document obtained by Das Magazin confirms that SCL learned about Kosinski’s methods through Kogan. It suddenly dawned on Kosinski that Kogan could have copied or reconstructed his Ocean models in order to sell them to this election-manipulating company. He immediately broke off contact with him and informed the head of his institute. A complicated battle ensued within Cambridge University. The institute feared for its reputation. Aleksandr Kogan moved to Singapore, got married, and began calling himself Dr. Spectre. Michal Kosinski relocated to Stanford University in the United States.

For a year or so it was quiet. Then, in November 2015, the more radical of the two Brexit campaigns (leave.eu, led by Nigel Farage) announced that they had contracted with a Big Data firm for online marketing support: Cambridge Analytica. The core expertise of this company: innovative political marketing, so-called microtargeting, on the basis of the psychological Ocean model.

Kosinski started getting emails asking if he had had anything to do with it—for many, his is the first name to spring to mind upon hearing the terms Cambridge, Ocean, and analytics in the same breath. This is when he heard of Cambridge Analytica for the first time. Appalled, he looked up their website. His methods were being deployed, on a massive scale, for political purposes.

In July, after the Brexit vote, the email inquiries turned to insults and reproaches. Just look what you’ve done, friends and colleagues wrote. Kosinski had to explain over and over again that he had nothing to do with this company.

First Brexit, Then Trump

September 19th, 2016: the US presidential election is approaching. Guitar riffs fill the dark blue ballroom of the Grand Hyatt Hotel in New York: CCR’s “Bad Moon Rising.” The Concordia Summit is like the WEF in miniature. Decision makers from all over the world are invited; among the guests is Johann Schneider-Ammann [then nearing the end of his one-year term as president of Switzerland’s governing council].

A gentle women’s voice comes over the PA: “Please welcome Alexander Nix, Chief Executive Officer of Cambridge Analytica.” A lean man in a dark suit strides towards the center of the stage. An attentive quiet descends. Many in the room already know: this is Trump’s new Digital Man. “Soon you’ll be calling me Mr. Brexit,” Trump had tweeted cryptically a few weeks before. Political observers had already been pointing out the substantial similarities between Trump’s agenda and that of the rightwing Brexit camp; only a few had noticed the connection to Trump’s recent engagement with a largely unknown marketing company: Cambridge Analytica.

Before then, Trump’s online campaign had consisted more or less of one person: Brad Parscale, a marketing operative and failed startup founder who had built Trump a rudimentary website for $1,500. The 70-year-old Trump is not what one would call an IT whiz; his desk is unencumbered by a computer. There is no such thing as an email from Trump, his personal assistant once let slip. It was she who persuaded him to get a smartphone—the one from which he has uninhibitedly tweeted ever since.

Hillary Clinton, on the other hand, was relying on the legacy of the first social media president, Barack Obama. She had the Democratic Party’s address lists, collected millions of dollars over the internet, and received support from Google and DreamWorks. When it became known in June 2016 that Trump had hired Cambridge Analytica, Washington collectively sneered. Foreign noodlenecks in tailored suits who don’t understand this country and its people? Seriously?

“Ladies and gentlemen, honorable colleagues, it is my privilege to speak to you today about the power of Big Data and psychographics in the electoral process.” The Cambridge Analytica logo appears behind Alexander Nix—a brain, comprised of a few network nodes and pathways, like a subway map. “It’s easy to forget that only eighteen months ago Senator Cruz was one of the less popular candidates seeking nomination, and certainly one of the more vilified,” begins the blond man with his British diction that produces the same mixture of awe and resentment in Americans that high German does the Swiss. “In addition, he had very low name recognition; only about forty percent of the electorate had heard of him.”

Everyone in the room was aware of the sudden rise, in May 2016, of the conservative senator within the Republican field of presidential candidates. It was one of the strangest moments of the primary campaign. Cruz had been the last of a series of Republican opponents to come out of nowhere with what looked like a credible challenge to frontrunner Trump. “How did he do this?” continues Nix.

Cambridge Analytica had begun engaging with US elections towards the end of 2014, initially advising the Republican Ted Cruz and paid by the secretive American tech billionaire Robert Mercer. Up to that point, according to Nix, election campaign strategy had been guided by demographic concepts. “But this is a really ridiculous idea, the idea that all women should receive the same message because of their gender; or all African-Americans because of their race.” The Hillary Clinton campaign team—Nix does not even need to say it—was still operating on precisely such amateurish assumptions, dividing the electorate up into ostensibly homogeneous groups…exactly the same way as all the public opinion researchers who predicted a Clinton victory did.

Nix clicks to the next slide: five different faces, each representing a personality profile. It is the Ocean model. “At Cambridge, we’ve rolled out a long-form quantitative instrument to probe the underlying traits that inform personality. This is the cutting edge in experimental psychology.” It is now completely silent in the hall. “By having hundreds and hundreds of thousands of Americans undertake this survey, we were able to form a model to predict the personality of every single adult in the United States of America.” The success of Cambridge Analytica’s marketing arises from the combination of three elements: this psychological behavioral analysis of the Ocean model, Big Data evaluation, and ad targeting. Ad targeting is personalized advertisement tailored as precisely as possible to the character of a single consumer.

Nix explains forthrightly how his company does this (the presentation can be viewed on YouTube). From every available source, Cambridge Analytica buys up personal data: “What car you drive, what products you purchase in shops, what magazines you read, what clubs you belong to.” Voter and medical records. On the screen behind him are displayed the logos of global data traders like Acxiom and Experian—in the United States nearly all personal consumer data is available for purchase. If you want to know, for example, where Jewish women live, you can simply buy this information. Including telephone numbers. Now Cambridge Analytica crosschecks these data sets with Republican Party voter rolls and online data such as Facebook likes, and constructs an Ocean personality profile. From a selection of digital signatures there suddenly emerge real individual people with fears, needs, and interests—and home addresses.
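
The data-fusion step Nix describes, joining commercial records, voter-file entries, and like-derived trait estimates on the same individual, boils down to a database join on a shared identifier. A sketch with entirely invented rows and column names:

```python
# Joining purchased consumer data, a voter file, and trait estimates on a
# shared identifier. Every value and column name here is invented.
import pandas as pd

consumer = pd.DataFrame({
    "person_id": [1, 2, 3],
    "car_make":  ["Ford", "BMW", "Chevrolet"],
    "magazine":  ["Field & Stream", "The Economist", "Guns & Ammo"],
})
voter_file = pd.DataFrame({
    "person_id": [1, 2, 3],
    "party_reg": ["R", "D", "R"],
    "precinct":  ["IA-031", "IA-077", "IA-031"],
})
trait_estimates = pd.DataFrame({
    "person_id":   [1, 2, 3],
    "neuroticism": [0.82, 0.35, 0.64],
    "openness":    [0.30, 0.85, 0.41],
})

merged = (consumer
          .merge(voter_file, on="person_id")
          .merge(trait_estimates, on="person_id"))
print(merged)
```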

The process is identical to the one Michal Kosinski developed. Cambridge Analytica also uses IQ quizzes and other small Ocean test apps in order to gain access to the powerful predictive personal information wrapped up in the Facebook likes of users. And Cambridge Analytica is doing precisely what Kosinski had warned about. They have assembled psychograms for all adult US citizens, 220 million people, and have used this data to influence electoral outcomes.

Nix clicks to the next slide. “This is a data dashboard that we prepared for the Cruz campaign for the Iowa caucus. It looks intimidating, but it’s actually very simple.” On the left, graphs and diagrams; on the right, a map of Iowa, where Cruz had done surprisingly well in the caucuses. On this map, hundreds of thousands of tiny dots, red and blue. Nix begins to narrow down search criteria to a category of Republican caucus-goers he describes as a “persuasion” group, whose common Ocean personality profile and home locations are now visible, a smaller set of people to whom advertisement can be more effectively tailored. Ultimately the criteria can be narrowed to a single individual, along with his name, age, address, interests, and political leanings. How does Cambridge Analytica approach this person with political messaging?

Earlier in the presentation, using the example of the Second Amendment, Nix showed two variations on how certain psychographic profiles are spoken to differently. “For a highly Neurotic and Conscientious audience, you’re going to need a message that is both rational and fear-based: the threat of a burglary and the ‘insurance policy’ of a gun is very persuasive.” A picture on the left side of the screen shows a gloved hand breaking a window and reaching for the inside door handle. On the right side, there is a picture of a man and child silhouetted against a sunset in tall grass, both with rifles, obviously duck hunting: “for a Closed and Agreeable audience, people who care about traditions and habits and family and community, talking about these values is going to be much more effective in communicating your message.”

How to Keep Clinton Voters Away

Trump’s conspicuous contradictions and his oft-criticized habit of staking out multiple positions on a single issue result in a gigantic number of possible messaging options, which creates a huge advantage for a firm like Cambridge Analytica: for every voter, a different message. Mathematician Cathy O’Neil had already observed in August that “Trump is like a machine learning algorithm” that adjusts to public reactions. On the day of the third presidential debate between Trump and Clinton, Trump’s team blasted out 175,000 distinct variations on his arguments, mostly via Facebook. The messages varied mostly in their microscopic details, in order to communicate optimally with their recipients: different titles, colors, and subtitles, with different images or videos. The granularity of this message tailoring digs all the way down to tiny target groups, Nix explained to Das Magazin. “We can target specific towns or apartment buildings. Even individual people.”
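
A figure like 175,000 variants is less surprising once you notice that it is simply the product of a few independently varied elements. The breakdown in the sketch below is invented, chosen only so that the arithmetic lands on the reported number:

```python
# The number of distinct ad variations is just the product of the option
# counts for each varied element. The particular counts are invented.
from itertools import product

headlines = [f"headline_{i}" for i in range(10)]
images    = [f"image_{i}"    for i in range(10)]
palettes  = [f"palette_{i}"  for i in range(5)]
subtitles = [f"subtitle_{i}" for i in range(10)]
videos    = [f"video_{i}"    for i in range(5)]
ctas      = [f"cta_{i}"      for i in range(7)]

variants = list(product(headlines, images, palettes, subtitles, videos, ctas))
print(len(variants))   # 10 * 10 * 5 * 10 * 5 * 7 = 175,000 distinct combinations
```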

In the Miami neighborhood of Little Haiti, Cambridge Analytica regaled residents with messages about the failures of the Clinton Foundation after the 2010 earthquake in Haiti, in order to dissuade them from turning out for Clinton. This was one of the goals: to get potential but wavering Clinton voters—skeptical leftists, African-Americans, young women—to stay home. To “suppress” their votes, as one Trump campaign staffer bluntly put it. In these so-called dark posts (paid Facebook ads which appear only in the timelines of users with a suitably matching personality profile), African-Americans, for example, are shown the nineties-era video of Hillary Clinton referring to black youth as “super predators.”

“Blanket advertising—the idea that a hundred million people will receive the same piece of direct mail, the same television advert, the same digital advert—is dead,” Nix begins to wrap up his presentation at the Concordia Summit. “My children will certainly never understand this concept of mass communication. Today, communication is becoming ever increasingly targeted.

“The Cruz campaign is over now, but what I can tell you is that of the two candidates left in this election, one of them is using these technologies. And it’s going to be very interesting to see how they impact the next seven weeks. Thank you.” With that, he exits the stage.

It is not knowable just to what extent the American population is being targeted by Trump’s digital troopers—because they seldom attack through the mainstream broadcast media, but rather mostly with highly personalized ads on social media or through digital cable. And while the Clinton team sat back, confident in its demographic calculations, a new crew was moving into the Trump online campaign headquarters in San Antonio, Texas, as Bloomberg journalist Sasha Issenberg noted with surprise after a visit. The Cambridge Analytica team, apparently just a dozen people, had received around $100,000 from Trump in July, another $250,000 in August, and $5 million in September. Altogether, says Nix, they took in around $15 million.

And the company took even more radical measures: starting in July 2016, a new app was prepared for Trump campaign canvassers with which they could find out the political orientation and personality profile of a particular house’s residents in advance. If the Trump people ring a doorbell, it’s only the doorbell of someone the app has identified as receptive to his messages, and the canvassers can base their line of attack on personality-specific conversation guides also provided by the app. Then they enter a subject’s reactions to certain messaging back into the app, from where this new data flows back to the control rooms of Cambridge Analytica.

The company divided the US population into 32 personality types, and concentrated on only seventeen states. And just as Kosinski had determined that men who like MAC cosmetics on Facebook are probably gay, Cambridge Analytica found that a predilection for American-produced cars is the best predictor of a possible Trump voter. Among other things, this kind of knowledge can tell Trump himself which messages to use, and where. The decision to focus candidate visits in Michigan and Wisconsin over the final weeks of the campaign was based on this manner of data analysis. The candidate himself became an implementation instrument of the model.
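
The article does not say how the 32 types were derived, but the number equals two to the fifth power, so one purely hypothetical reading is that each of the five Ocean dimensions was split into "high" and "low". A sketch of that reading, not of Cambridge Analytica's actual method:

```python
# Hypothetical route to "32 personality types": split each of the five
# OCEAN dimensions at its midpoint and label people by the resulting
# high/low combination (2**5 = 32). Scores below are random numbers.
import numpy as np

rng = np.random.default_rng(42)
ocean = rng.random((1_000, 5))                    # rows: people; columns: O, C, E, A, N

high = (ocean > 0.5).astype(int)                  # 1 = above the midpoint on that trait
type_id = high @ (2 ** np.arange(5))              # encode the five bits as an integer 0..31
print(len(np.unique(type_id)), "distinct personality types")
```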

What is Cambridge Analytica Doing in Europe?

How great an influence did these psychometric methods have on the outcome of the election? Cambridge Analytica, when asked, did not want to disclose any documentation assessing the effectiveness of their campaign. It is possible that the question cannot be answered at all. Still, some indicators should be considered: there is the fact that Ted Cruz, thanks to the help of Cambridge Analytica, rose out of obscurity to become Trump’s strongest competitor in the primaries; there is the increase in rural voter turnout; there is the reduction, compared to 2008 and 2012, in African-American voter participation. The fact that Trump spent so little money on advertising could also point to the effectiveness of personality-specific targeting, as could the fact that three quarters of his marketing budget was spent in the digital realm. Facebook became his ultimate weapon and his best canvasser, as a Trump staffer tweeted. In Germany, the rightwing upstart party Alternative für Deutschland (AfD) may like the sound of this, as they have more Facebook friends than Merkel’s Christian Democrats (CDU) and the Social Democrats (SPD) combined.

It is therefore not at all the case, as is so often claimed, that statisticians lost this election because their polls were so faulty. The opposite is true: statisticians won this election. It was just certain statisticians, the ones using the new method. It is a cruel irony of history that Trump, such a detractor of science, won the election thanks to science.

Another big winner in the election was Cambridge Analytica. Steve Bannon, a Cambridge Analytica board member and publisher of the ultra-rightwing online site Breitbart News, was named Trump’s chief strategist. Marion Maréchal-Le Pen, ambitious Front National activist and niece of the presidential candidate, has tweeted that she has accepted the firm’s invitation to collaborate. In an internal company video, there is a live recording of a discussion entitled “Italy.” Alexander Nix confirms that he is in the process of client acquisition, worldwide. They have received inquiries out of Switzerland and Germany.

Kosinski has been observing all of this from his office at Stanford. After the election, the university was in an uproar. Kosinski responded to the developments with the most powerful weapon available to researchers: a scientific analysis. Along with his research colleague Sandra Matz, he conducted a series of tests that will soon be published. The first results seen by Das Magazin are unsettling: psychological targeting, as Cambridge Analytica deployed it, increases the clickthrough rate on Facebook ads by more than sixty percent. And the so-called conversion rate (the term for how likely a person is to act upon a personally-tailored ad, i.e. whether they buy a product or, yes, go vote) increases by a staggering 1400 percent.*
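
To put those percentages in absolute terms (a 1400 percent increase means roughly fifteen times the baseline), here is a tiny arithmetic sketch with invented baseline rates:

```python
# Illustrative arithmetic only; the baseline rates are made up.
baseline_ctr  = 0.010      # hypothetical untargeted clickthrough rate: 1.0%
baseline_conv = 0.002      # hypothetical untargeted conversion rate:   0.2%

targeted_ctr  = baseline_ctr * (1 + 0.60)     # +60%   -> 1.6%
targeted_conv = baseline_conv * (1 + 14.00)   # +1400% -> 3.0%, i.e. 15x the baseline

print(f"clickthrough: {baseline_ctr:.3%} -> {targeted_ctr:.3%}")
print(f"conversion:   {baseline_conv:.3%} -> {targeted_conv:.3%}")
```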

The world has been turned upside down. The Brits are leaving the EU; Trump rules America. It all began with one man, who indeed tried to warn of the danger, and who still gets accusatory emails. “No,” says Kosinski quietly, shaking his head, “this is not my fault. I did not build the bomb. I just showed that it was there.”

Translated by Antidote
