In case you live under a rock and have not seen national or local news coverage recently, the 2018 midterms are nearly upon us. Two weeks from today, on November 6, voters will head to the polls across the nation, with 35 U.S. Senate seats, every seat in the U.S. House of Representatives, and countless state and local offices up for grabs. The latest forecasts have many races looking competitive, but it appears that Democratic efforts to take back the House will be successful, while Republicans are poised to pick up some seats in the Senate.
Regardless of the outcomes on November 6, the fact of the matter is that, at this moment, there are countless races that may very well be decided by the slimmest of margins, including crucial toss-up House and Senate races. Why does this matter? Beyond the political ramifications of who wins and who loses, these closely contested races will not be decided until the last votes are counted on Election Day. The outcomes will depend heavily on which demographic groups show up to vote and in what numbers. For the final two weeks, candidates in tightly contested races will be putting in as much effort as possible to mobilize their key voting bases and, potentially, to discourage those who would vote against them from showing up to the polls.
This ties into the behavior of Russian operatives in the 2016 election and their efforts to mobilize certain bases to show up, encourage other bases to stay home, and generally undermine faith in the American democratic process. I will not waste time rehashing the nuances of the Russian disinformation campaign, as we have discussed those efforts and looked closely at the ads themselves during class time. Suffice it to say, Russian operatives working for the Internet Research Agency utilized various social media platforms, most notably Facebook, to spread disinformation and deepen polarization among Americans. Despite spending only $100,000 or so on ads, Russia was able to reach an audience potentially as large as 70 million people, according to our reading from Siva Vaidhyanathan.
Unsurprisingly, we are now seeing the same tactics out of Russia during this midterm season. Since the 2016 election, the Internet Research Agency and its employees have been crafting ads and running campaigns of disinformation to sow chaos and ultimately undermine the results of the upcoming midterm elections. A recent Department of Justice criminal complaint named Elena Khusyaynova, an employee of the Internet Research Agency, for her role in Project Lakhta, an effort to interfere with the 2018 midterms. The strategies in use have not changed much: Russia has continued to purchase ads and operate fake accounts on Facebook and Twitter to spread lies, discourage certain groups from voting, and make political exchanges online more visceral and caustic than ever before.
The DOJ complaint comes at an awkward time for Facebook and its supposed efforts to crack down on misinformation spread through its platform. Facebook had just unveiled its election "war room," dedicated to tracking and taking down deliberately misleading ads and pages; yet according to the DOJ, Facebook has been allowing these ads to continue over the past two years, even after such schemes were exposed in the wake of the 2016 presidential election.
I, for one, was not particularly surprised by this report, as I maintain a healthy distrust of corporate statements like the one Facebook gave in 2016, in which it claimed to be "in a position to constructively shape the emerging information ecosystem" and then failed to provide any concrete examples of how it planned to do so. Still, this latest DOJ complaint shows both the extent to which Russia will continue trying to undermine U.S. elections and how Facebook's efforts to curb trolling and state-sponsored disinformation have been unsuccessful, to put it generously.
Despite its pledges to use better filtering, AI, and machine learning to flag and remove misinformation and fake accounts, Facebook still has a long way to go. Given the tendency of Facebook as an organization to keep its cards close to the vest (at least when it comes to sharing the methods it employs to fight deliberate misinformation), one has to wonder whether Facebook has made a sincere effort at actually stopping trolls and Russian state-sponsored ads, or whether it is primarily concerned with the public perception of the platform, a motive which is at least partially fueling the Facebook election war room and ads like this one:
Facebook claims in the ad above that things (see: gross negligence in stopping Russian misinformation, wanton disregard for users' private data, etc.) are going to change. If anything, the evidence above shows that little has changed thus far. Facebook had better stop talking the talk and start walking the walk. Both the outcome of our midterm elections and, indeed, the very health and resiliency of our democracy depend on it.