
Austin startup helps companies fight disinformation attacks


Facebook has removed 115 accounts that may be linked to foreign operatives and is working to confirm whether the accounts are tied to the Russian Internet Research Agency. The removals came after the social media giant was tipped off by the FBI, which says Russia and other foreign countries are still trying to influence Americans, and possibly the midterm election, through disinformation. (Photo: Colin / Wikimedia Commons / CC BY-SA 4.0)

Facebook has removed 115 accounts that may be linked to foreign operatives and is working to confirm whether the accounts are tied to the Russian Internet Research Agency. The removals came after the social media giant was tipped off by the FBI, which says Russia and other foreign countries are still trying to influence Americans, and possibly the midterm election, through disinformation.

Disinformation is the whole reason New Knowledge, a startup based in Downtown Austin, exists. "They're using an army of social media accounts to manipulate the public conversation by forcing information to trend," said New Knowledge CEO Jonathon Morgan. "You can convince people that it's probably true, it's probably legitimate, I've heard it in so many different places."

New Knowledge helps companies fight disinformation -- something Morgan says can happen when outside entities decide it is politically expedient to cause problems for private companies. Bad information that sparks a boycott against a company is one example. So when Morgan heard the FBI had tipped off Facebook to more than 100 questionable accounts, he wasn't surprised. "It starts out on the fringes, the political fringes and the idea is to move as many people to those fringes as possible whether it's on the right or the left," he said.

We found Paul Hastings in the parking lot outside of a polling place today. "I am very political -- I am not being paid to be here. I am volunteering," he said. He was talking to voters in support of one of the Austin city propositions on the ballot. Like most Americans, Paul has seen dubious posts on social media. "I've definitely seen friends share what I would consider kind of bogus," he said, nodding. But Paul says he can spot disinformation. "One is common sense. B is looking at spelling and typos and grammar," he said.

Morgan says the trolls are getting smarter. No longer are they the hyperactive accounts that post 24 hours a day. "They're changing their behavior to seem more realistic every day," he said. The big tip-off that you might be dealing with disinformation, no matter what your political stripe, is how it makes you feel.

"If it makes you feel too angry or really provokes that type of almost tribal response, then it may be designed to manipulate you," Morgan said. If it seems too juicy -- it's probably not true. "People should be concerned about things that encourage them to change their behavior."

It's important to note that this is one of the first public instances of the FBI working with a social media platform to combat foreign entities online. The FBI says foreign agents are still working to interfere in US politics and elections. Morgan says it's important for law enforcement and social media companies to work together to get a handle on the problem.
