Facing up to ‘national security challenge of the 21st century’

From left, Google's Law Enforcement and Information Security Director Richard Salgado, Twitter's Acting General Counsel Sean Edgett, and Facebook's General Counsel Colin Stretch, appear together during a Senate Committee on the Judiciary, Subcommittee on Crime and Terrorism hearing on Capitol Hill in Washington, Tuesday, Oct. 31, 2017, on more signs from tech companies of Russian election activity. (AP Photo/Andrew Harnik)

Over the past several years, and especially during the most recent presidential campaign, you may have noticed accelerating fervor around the hot-button topics of gun control, LGBTQ rights, immigration, and social justice.

While these issues and the political discourse around them are legitimate, “agents of a hostile foreign power” involved in “sophisticated misinformation campaigns” have targeted Americans — Republicans, Democrats, independents, liberals, and conservatives — through social media.

Testimony on Capitol Hill this week from tech giants Facebook, Twitter, and Google revealed a new understanding of the phenomenon along with increased commitment from lawmakers toward recognizing and stopping such threats.

Two main sources of targeted misinformation have been identified as the Kremlin-backed Internet Research Agency and RT (formerly Russia Today), also backed by the Russian government.

Many others persist.

Foreign propaganda, identified in these instances from Russia, is understandably of great concern to the tech companies and national security analysts testifying before congressional committees.

But while the misinformation campaigns targeted supporters of both Hillary Clinton and Donald Trump in the run-up to the 2016 election, they most often exploited hot-button issues — using our biases to sow discontent and division among us.

Inciting specific Facebook followers

For example, ads meant to stir up disagreement on racial and social justice issues were targeted to residents of Baltimore and Ferguson, Missouri, sites of two notorious police slayings of black men.

U.S. Sen. Chris Coons, D-Delaware, pointed out one of the most egregious examples of that tactic.

Foreign operatives — in this case, Russians — created Facebook pages aimed at attracting a very specific following.

One such page, the “Heart of Texas,” billed itself as a patriotic home for lovers of the Lone Star State, BBQ and beer.

Another page, “Islam and Proud,” posed as a home for American Muslims who were proud of their heritage, religion, and home in America.

The pages seemed safe and passed the “sniff test” for harmless Facebook fun. Each had hundreds of thousands of followers. Then, the Russian operatives behind the pages created real-life events pitting the two groups against each other.

At noon on May 21, 2016, in Houston, “Heart of Texas” followers showed up to “prevent the Islamization of Texas.” At the same time in the same place, “Islam and Proud” supporters showed up to protect Muslim literature.

Fomented into a fury, the groups arrived and confronted each other. Not an organizer was to be found.

The ad campaign cost the Russians just $200, paid in rubles.

Which brings up an even bigger issue.

Election ad campaigns in the U.S. are restricted to Americans. Or at least, American dollars.

In this case, as U.S. Sen. Al Franken, D-Minnesota, brought up Tuesday, not only were the ads paid for by people outside of the U.S., but paid for in foreign currency — something strictly prohibited by FEC regulations.

Foreign nationals are banned from buying ads in American elections. Television and radio have rules against this. But Facebook, Twitter, and Google see themselves as neutral tech companies providing a service — not as media publishers.

More oversight and legislation are on the way, as promised by U.S. Sen. Amy Klobuchar, D-Minnesota, who along with senators John McCain, R-Arizona, and Mark Warner, D-Virginia, has a bill already working its way through Congress.

An even more challenging problem for government, the tech industry, and national security analysts will be coming up with regulations to keep American shell companies from buying ads with foreign money.

Over at Google, RT’s YouTube channel is making money since its videos are getting lots of clicks, and the ads on those videos generate dollars for RT.

International issue

Ahead of the 2016 election, most social media content criticized Clinton. After the election, content criticized Trump.

But U.S. Sen. Ted Cruz, R-Texas, said calls for Facebook to police content “raises troublesome concerns” — especially in light of findings that Google, Facebook, and Twitter lean toward the left on what’s shown in search results, and what types of content are deemed vile and pulled.

And it’s not just an American problem. With the developing world as the largest growing audience for social media and tech in general, the stakes are even higher, said U.S. Sen. Patrick Leahy, D-Vermont.

He cited the Rohingya people of Myanmar and those in Cambodia facing political violence in light of a newly minted Russian handbook of exploitative social media tactics. Leahy said it will no doubt be used in other, less regulated frontiers.

Exercise due diligence

But in all the promises of transparency, searchable ad listings, and continued investigation, social media users are urged to be vigilant and to do our due diligence.

That means being mindful about what we like on Facebook, the accounts we follow on Instagram and Twitter, and the sources we trust and share.

For instance, don’t like a page just because it shares something funny. Before you click, check out the fine print on the post that lists the domain name.

Is it something you’ve heard of before? Who shared it in the first place?

It’s a good habit to click into the page and see what other things have been shared and posted. You’ll usually find out pretty quickly if it’s what it seems or if it’s strange and spammy.

Sometimes, we’re added to groups without even knowing it. On Facebook, in particular, it’s good practice to look into your history.

While logged on, check out the options on the right side. Click on Pages, then click Liked Pages. Scroll through the list, and if a page isn’t “verified” with a little blue check mark and isn’t someone you know personally, consider unfollowing it.

Then go to Groups and do the same thing. If you never wanted to be there in the first place, it’s as good a time as any to opt out of some of those spammy groups.

Think of it this way. If a stranger came up and told you something on the street, would you tell your grandma? No! Same goes for the internet. Sharing something you can’t confirm only helps the spread of misinformation.

As proved in the 2016 election, social media is vulnerable. Our desire to share common points of view and disagree online has been exploited for economic and other reasons.

With social media in its angsty adolescence, this is not going to go away quickly.

But Congress is promising increased oversight, and online corporations are pledging more transparency. Educating American users and making them more aware can all contribute toward being better prepared for what U.S. Sen. Lindsey Graham, R-South Carolina, has called the “national security challenge of the 21st century.”
