Disinformation

GOP Bringing "Moscow Rules" on Disinformation to American Politics

A series of events over the past several months raises the question of whether Russian-style disinformation tactics have become a core part of the GOP’s electoral strategy for the 2020 elections.

Let’s review what we’ve seen so far. In June, the Trump campaign used foreign-shot stock footage to manufacture fake people, who were then featured in ads run on Facebook. A top Trump campaign consultant built a series of websites falsely purporting to be the official sites of Democratic Presidential candidates. The President tweeted out a video of Nancy Pelosi he knew had been altered, and one morning retweeted dozens of accounts, almost all of which were certainly, and obviously, fake. A new set of Trump campaign Facebook ads includes one which lies about the Vice President and other Democratic candidates supporting single-payer health care, using an image taken from a different question at the most recent Democratic debate.


This morning, the Chairwoman of the RNC, Ronna McDaniel, retweeted a tweet by Senator Marco Rubio which featured selectively and misleadingly edited remarks by Rep. Ilhan Omar. It had already been established that the video was misleading and grossly misrepresented what she said. Yet the GOP Chair shared it anyway.

And of course there is the relentless, grinding flood of disinformation coming from the vast network of right wing bots and trolls. We’ve put together a list of some of the top right wing “amplifiers” here so as to better understand this critical part of the right’s disinfo dystopia. 

While we shouldn’t be surprised that the American political party which so enthusiastically embraced and amplified Russian active measures and disinformation in 2016 would be at it again, it does not mean that responsible Americans should accept these tactics as normal and routine. They aren’t. They are outside of what should be permissible in a mature democracy; and that we are seeing them emerge in this election should challenge all of us to do something concrete about it. Here are some ideas on what can and should be done:

Name and shame — First, we have to begin openly talking about what is going on here; condemn it when it happens; and be prepared to rebut and respond to these false attacks when they come. This tweet from the DNC’s War Room this morning is a good example.


Next, the social media platforms should be notified and encouraged to take down blatantly false material. Someday we may have to find a way to more formally regulate all this, as my friend Amb. Karen Kornbluh has recommended. But in the short term, pressure should be applied to the platforms to be as aggressive as they can in not knowingly spreading false information.

Finally, the mainstream media should be judicious in how they cover these moments so they don’t end up simply promoting false and misleading videos, statements and attacks. The role of the traditional media is particularly important here. The day the President took to Twitter and retweeted dozens of accounts purporting to be firefighters who supported him, the Washington Post ran a story whose headline read “Trump retweets dozens of people taking issue with a firefighters union’s endorsement of Biden.” The problem, of course, is that The Post had no idea whether these accounts were real people; reviewing them, very few looked real. A more accurate headline would have been “Trump retweets dozens of accounts taking issue with a firefighters union’s endorsement of Biden.” Every story going forward has to be written with an awareness that such accounts may be fake and that the entire episode may have been “disinformation”: the use of fake accounts and other means to create an impression of something which is not true.

It is my hope that all news organizations are having internal conversations now about how they are going to handle these kinds of moments in the coming months. Have they trained their reporters and editors in common disinformation tactics? Is there a special editor assigned to adjudicate when questions are raised about authenticity and whether something is disinformation? Do internal practices need to be reviewed and updated for the moment? I hope all of this is happening now inside every news organization as we get deeper into the 2020 election. For not understanding, or being surprised, can no longer be a legitimate excuse for anyone in the information or media business.

Non-proliferation — If we view disinformation and fraudulent representations as a societal “harm,” something dangerous and improper, then Democrats and other responsible actors in the political system should commit not to use these illicit tactics in their own operations. Vice President Biden has made such a commitment, and the 50 state Democratic Parties have called on the national party to seek such a commitment from all Democrats at all levels of government across the country. My hope is that other organizations in the day-to-day scrum of national politics — trade associations, advocacy groups, lobbying campaigns — will make similar commitments. Using these kinds of Russian-inspired disinformation tactics should be seen as not just wrong but unpatriotic, a betrayal of our democracy. Knowingly misleading your fellow citizens through fraudulent means can simply never become okay.

Of course, the fakery and fraud discussed here is of a very conventional kind. We should all expect artificial intelligence-enhanced “deep fakes” to be deployed in this election. As you can see in this presentation, determining that something which looks so real could in fact be fabricated is going to be very hard for our system, and for an American people still struggling to handle the more conventional fraudulent representations described above.

After what we’ve seen these last few months, the relentless daily lying by the President, and Mitch McConnell’s years of blocking legislation to protect our democracy and discourse, it is perhaps unreasonable to expect the Republican Party here in the US to do anything other than play by Moscow Rules in 2020. But the rest of us cannot be naive and unprepared this time. We need to condemn it, counter it, combat it and ultimately ensure that these kinds of illicit tactics have no place in a democracy like ours.

This essay was originally published on the Medium website on Friday, July 26th, 2019. 

Protecting the 2020 Dem Primary from Disinformation, Bots and Hacking

In a long thread this past weekend, I called on the 2020 Democratic Presidential candidates, the DNC and the State Parties to band together to fight disinformation and illicit campaign tactics in the Democratic Presidential primary. Among the things I called for is a pledge, signed by each Presidential campaign, committing to forgo the use of bots, trolls, troll farms, fake accounts, fake sites, deepfakes and faked images, and hacking and the use of hacked materials; and committing the campaigns to be vigilant about reporting illicit activity to the proper authorities, the platforms and the Party, and to discourage the use of these tactics by their supporters. My hope is that either the DNC or the State Parties will demand the campaigns sign such a pledge, or that the campaigns will make their own pledges now, without delay, rather than waiting for the Party to get involved.

I know the DNC is in the process of standing up a disinformation unit - it should be given ample funding and broad support to become a leader in this space and not a laggard. This unit can help all the campaigns stand up their own disinformation teams, something which is now, unfortunately, a requirement in today's politics - a must-have, not a nice-to-have.

Simply, we cannot let what happened in 2016 happen again. As we learned at the DCCC in 2018, we are not powerless. We got the social media platforms to do takedowns, referred illegal activity to the FBI, and helped train our campaigns to protect themselves and win in a fast-changing information landscape. Healthy democracies cannot accept the poisoning of their discourse, and must do everything they can to ensure that the residents of their nation, not inauthentic voices from inside or outside the country, drive the daily debate.

For more on the work we did at the DCCC this past cycle battling disinformation, see this NBC News op-ed I wrote with DCCC Chief of Staff Aaron Trujillo; this Washington Post article about our strategy to first "flood the zone" to make it harder for disinfo to work; a Reuters story which looks into one of our major takedowns; and a great NBC News piece recounting right wing trolls complaining about the more aggressive countermeasures coming from the platforms - in this case, the takedown they were complaining about was a multi-platform campaign the DCCC found and worked with Twitter, Facebook and YouTube to mitigate. Here is a link to the DCCC's pledge from the fall of 2018, and a smart deep dive from the Atlantic's Natasha Bertrand on the pledge and the GOP's refusal to sign on even after months of negotiation.

NDN originally dove into this world through a paper we published on bots and disinformation back in the fall of 2017, and that same fall I worked on a project which deployed bot-detection technology for the Democratic candidates in both the Virginia Governor's race and the Alabama Senate race. Articles about that work in Virginia can be found in the Washington Post and Politico. And this Alabama work should not be confused with the illicit campaign waged as a test by some Democrats - we were working for Doug Jones, not the other guys.

Finally, my vision for how we counter disinformation and illicit tactics goes far beyond a pledge. What we learned in 2018 is that we are not powerless; using modern tools and being smart about hunting disinfo organizations and campaigns can provide an effective counter. It is my belief that over the next two years parties, campaigns, governments, NGOs, and even the media must go on offense here, be louder and smarter on the Internet and social media, and learn how to better manage the discourse in their own particular areas of debate and discussion. If we all do our part, together we can eliminate a lot of the low-hanging fruit, the easy stuff. The platforms, while they have much more to do, have made things harder for bad actors. But we can't wait for them, or for governments, to act. We all need to do our part to clean up our areas of the Internet and social media, and ensure that authentic and well-intentioned voices prevail. The tools to counter disinformation are cheap and readily available; it is more about learning how this all works and making a true commitment to win debates in a very new and changing information landscape.

Fri, 2/1 Update - Justin Hendrix has taken this pledge concept and turned it into a thought piece on the Internet. Go check it out and offer feedback. He also discusses the idea in this new piece on Just Security.

Mon, 2/4 Update - Got to talk about all of this on Joy Reid's MSNBC program on Saturday. Check it out - it was a very good segment.
