What do conservative college student Harmony, former model Chloe, and proud Black American Amy G all have in common? They’re all active on social media, but none of them are real. They’re just three of the accounts featured in the Spot the Troll Quiz, an online resource developed by Clemson University professors Darren Linvill and Patrick Warren with the help of Charlotte firm Interactive Knowledge to help push back against internet misinformation campaigns.
That attractive young woman posting about voter fraud on Twitter, the Nevada police support group on Instagram and the Facebook user who reposts inspirational quotes from Michelle Obama? There’s a good chance they’re not who they seem. In fact, it’s quite possible that they are trolls.
Trolls are fake social media accounts. They’ve grown so numerous and ubiquitous online that there are now internet troll farms, like the one created by the Internet Research Agency (IRA) in St. Petersburg, Russia. This troll factory employs people to concoct reasonable facsimiles of real people who want to engage you or be your friend online. Trolls’ tactics, goals and employers may vary, but they’ve all been created to spread misleading information that creates confusion and deepens divisions already present in American society.
“A troll is … not what they purport to be,” says Warren, an associate professor in the Department of Economics at Clemson. “They’re entering into a conversation in order to further some goal that is not apparent.”
Foreign actors like Russia, China, Cuba and Iran often set up internet troll accounts to sow discord in America.
“They want to make sure that [the U.S.] can’t walk and chew gum at the same time, which ideally, is what a strong democracy would be able to do,” says Linvill, an associate professor of communication.
A frequent intention is to distract folks in the U.S. from actions each of these actors undertakes in its own backyard.
“The more the United States has to look inward, the more we talk about what is going on in the streets of Seattle or Kenosha,” Linvill says, “the less we’re talking about what Putin is doing in Ukraine, and the more of a free hand he has to poison his political rivals without any kind of international response.”
The professors’ quiz presents eight profiles found on Instagram, Facebook and Twitter, with a sampling of posts from each of the accounts. The quiz takers’ challenge is to examine each of the profiles and decide which ones are real and which ones are internet trolls. Through each profile, users can learn tell-tale signs of trollery.
But if Linvill and Warren wanted to teach people about virtual bad actors and online disinformation, why did they do it in the form of a quiz?
“If you want to really engage people, you have to make the content engaging,” Linvill offers. So, he and Warren designed an activity that is fun, challenging and something that people would want to share. (When Queen City Nerve spoke with Warren and Linvill on Oct. 9, they reported that 680,000 users had taken the quiz.)
Warren offers one clue for users when it comes to spotting an internet troll.
“Don’t look for things that they do but rather look for things that they don’t do,” he says. Real people do things that inauthentic accounts can’t do, Warren maintains, like showing pictures of themselves and their families, posting about local restaurants or shops, or commenting on local politics.
“If all they do is talk about national politics all the time, you don’t want to spend your time with them,” Warren says.
A Child of the Cold War
“I never really meant to become a disinformation researcher,” Linvill says, “but sometimes life does with you what it wants to.”
A self-described child of the Cold War who says he spent more time watching Rocky IV and Top Gun than was healthy for a 1980s adolescent, Linvill had been working with social media since the beginning of his academic career.
He was well-positioned when the Russians launched cyberattacks in 2015 and 2016. They hacked the Democratic National Committee’s server, targeted Hillary Clinton’s account, and flooded the internet with fake social media accounts spreading disinformation and fiery opinions about what the hack supposedly revealed.
There’s little doubt the operation helped pave the way for Donald Trump to occupy the White House. Last August, a Republican-controlled Senate panel determined that the Russian government disrupted the 2016 election to help Trump become president, and that Trump’s advisers were eager for the help.
But internet trolls are interested in far more than presidential politics. In Charlotte, a 2016 Black Lives Matter protest following the shooting death of Keith Lamont Scott by a CMPD officer was later found to be planned by a Russian troll farm.
At Christmastime 2017, Linvill and his friend and colleague Warren were playing board games and talking about the unprecedented attack on democracy, specifically the piece that fell within their wheelhouse — social media.
They knew that Twitter had released suspended accounts attributed to the IRA to the House Intelligence Committee, which in turn made the accounts public. This action ran counter to Twitter’s usual practice of purging suspended accounts, a procedure that David Carroll, associate professor of media design at the New School, equated with erasing history “in an almost Orwellian way.”
“Twitter didn’t necessarily want the world to have those [accounts],” Linvill remembers. “They didn’t oppose [the release] per se, but they only released the Twitter handles. They didn’t release the content and metadata.”
Using a marketing and public relations tool called Social Studio in a way that the makers never intended, Linvill and Warren were able to get access to the tweets. They examined them at length, ultimately sharing them with journalists and other researchers.
Linvill says he looked at tweets “until my eyeballs bled.” He felt obligated to undertake the time-consuming task because he finds the idea of a foreign power meddling in our election offensive.
The Spy Who Sold Me
In effect, internet trolls aren’t engaging in espionage; they’re practicing marketing, Linvill says. They’re spreading narratives designed to fan the flames of fires constantly smoldering in America.
Trolls also don’t produce most of their content, Warren says. They often copy it from real people here in America and then pass it off as their own, while accentuating certain aspects of the stolen messages. Warren cites a hypothetical social media flare-up over a video of someone being racist as an example.
“The IRA would move into those conversations and they would pull out and emphasize certain parts, the parts that would encourage people to take real world action, like sharing a person’s home address, phone number or employer.” And if sometimes those actions escalate into violence, that’s all the better.
Internet trolls have grown far more sophisticated in recent years, says Linvill. The IRA used to utilize a lot more automated accounts, or bots, that are less successful at mimicking humans. In 2016 the IRA was a lot more brazen, because few were looking, and businesses weren’t scrutinizing where the money came from. The IRA was paying for Facebook ads in rubles, and they were registering their accounts with Russian phone numbers.
Nowadays trolls slowly infiltrate online communities to gain trust and status before unleashing their messages of confusion and division upon online users, says Warren.
These coordinated inauthentic informational operations are like terrorism in that they don’t have to be successful in their direct goals in order to have a major impact on society, Warren offers.
A terrorist who plants bombs that never go off or cause any harm can still foment terror, he maintains.
“We have an open society that has a ton of benefits, but one of the costs is that it leaves us open to [disinformation] operations,” Warren offers. “Just their existence is a harm, but that’s a harm we’re stuck with.”
The fact that we know about the IRA and other internet trolls getting into the game may be a double-edged sword. On one hand they have to be sneakier to attain their goals, but on the other, visibility can give them more power. It may sometimes seem that trolls are everywhere and invincible.
One of the biggest effects of disinformation is to make us all less trusting on social media, says Linvill.
“I can now disagree with you not because you have a different perspective than me, but because you’re a Russian troll; not because you have a different lived experience than I do, but because you don’t even exist. I can dismiss you completely,” he says.
Playing Whack-a-Mole
Warren and Linvill have worked hard to identify troll accounts and have them suspended, often in cooperation with the government. They were recently instrumental in getting an internet troll farm in Ghana shut down. The farm was an outsourcing operation of the IRA, which funded and trained its personnel.
The social media platforms, while far from perfect, are doing more to address the issue than they used to. The FBI recently tipped off Twitter, which suspended more than 100 Iranian accounts that were talking about the presidential debate.
“But you can’t stop this problem by playing whack-a-mole,” Linvill says. Simply shutting down accounts is not enough because it’s so cheap and easy to create more.
“We also have to address this from the educational perspective, to try to make a more resilient public. We have to address this problem by making better users.”
Since Warren and Linvill were in the unique position of understanding trolls’ strategy and tactics, they had the knowledge, context and obligation to create their quiz. But they couldn’t do it all.
“Neither Darren nor I are web developers,” Warren says.
To bring their quiz concept to fruition they turned to Charlotte firm Interactive Knowledge.
“They took our vision and made it a reality,” Warren offers.
In the end, a better educated online public may be the most effective and lasting tactic to battle the internet trolls. Between the Russians, the Cubans, the Iranians and our own homegrown disinformation spreaders like QAnon, we may just have to live with trolls, and simply become more adept at spotting these virtual manipulators.
“Frankly, a much bigger problem is our own domestic disinformation,” says Linvill, who published a paper with Warren last April about homegrown accounts — many of them real — spreading COVID disinformation.
While foreign state actors try to spread disinformation, Linvill and Warren say, all the content is homegrown, originating here in the United States. It’s like America is in a cheesy thriller, where the call from the Russian troll is coming from inside the house — or even the White House.
“We’re our own worst enemy,” Linvill says. “Foreign nations are only making a bad thing worse.”
October 16, 2020
'Spot the Troll' Quiz Aims to Make a More Informed Internet - qcnerve.com