The whistleblower who sounded the alarm about Facebook’s data vulnerabilities in the 2016 election cycle isn’t done yelling — even on Election Day 2018.
Christopher Wylie blasted the social network on Tuesday as insufficiently attentive to the problems first revealed in the last election. Calling out Facebook for “making a digital clone of our society,” Wylie likened the company to the European trading giant that centuries ago plundered the resources of its colonial subjects.
“This is a story of colonialism. Facebook is our generation’s East India Company,” Wylie said during an onstage interview at the Web Summit in Lisbon, Portugal. “The problem is that our government is not equipped to handle this.”
He was equally unsparing toward the U.S. politicians tasked with keeping that plunderer in check. While he said he did not regret coming forward, Wylie was clearly agitated by the lack of action in the aftermath of his bombshell revelations.
“We can regulate nuclear power,” he said. “Why can’t we regulate some fucking code?”
Wylie, of course, knows a bit about voter manipulation. In the run-up to the 2016 presidential election, Wylie worked at a data analytics firm called Cambridge Analytica — that Cambridge Analytica, the same firm that collected Facebook data from millions of people for profiling, and that some believe helped Donald Trump get elected president.
Wylie says he saw how Facebook data was collected and used to create psychological profiles of potential voters. It was personal information that made it easier for those Facebook users to be manipulated or pushed toward a particular political view.
Coupling that profile information with Facebook’s algorithms — the software used to determine what you see and don’t see in your News Feed — can be dangerous, he said.
“When you look at what the alt-right is and what the role of Cambridge Analytica was in catalyzing the alt-right — it’s an insurgency. It was built to be an insurgency,” Wylie said. “People who were vulnerable to disinformation were profiled and targeted using the same kinds of techniques and tactics the military would use against ISIS.”
Wylie says that the game plan was to find potential supporters of alt-right causes and encourage them to visit alt-right pages or groups on Facebook. That would signal to Facebook that these people wanted to see more of that type of content, creating a feedback loop.
“Facebook’s algorithms were, at least at the time, very sensitive,” he added. “So if you brought people onto Pages, the News Feed would change. And Facebook would do half of the work for you.”
Facebook has spent the better part of the past two years trying to prevent this from happening ahead of Tuesday’s 2018 midterm elections. It has not only tweaked its algorithm to favor friend updates over those of publishers and Pages, but it has also continued taking down organized efforts by other countries to spread divisive content.
Late Monday night, Facebook removed a group of 115 Facebook and Instagram accounts that U.S. law enforcement agencies “believe may be linked to foreign entities.” Even just 24 hours before the U.S. elections, foreign groups are still trying to manipulate voters on Facebook.