As the coronavirus pandemic fuels a wave of online disinformation, governments, public personalities and companies are turning to technology to weed out the fake social media profiles and bad actors behind the deception.

With millions more people spending time at home, social media feeds have a growing captive audience, and those feeds have become the prime source of disinformation, explained Dan Brahmy, co-founder and CEO of Israel-based startup Cyabra, an information verification company that uses artificial intelligence to track down fake social media profiles and expose online disinformation campaigns. Its clients range from large public corporations to the U.S. State Department.

“These actors, they see an opportunity where people are staying at home,” Brahmy said in an episode of SALT Talks. “That represents an ocean of opportunity for good and for bad. People are what they call ‘wired.’ They are wired constantly. They are consuming content and news and pieces of information much more broadly than they were previously.”

Cyber defense

Three of Cyabra’s four founders are veterans of the Israeli army’s elite cyber-defense units. The company hit its stride in 2020, just as the coronavirus pandemic began to present a daunting new challenge in online disinformation.

Because the pandemic, and the regulations to control it, affect nearly everyone around the world, people globally are paying close attention to news and information about the virus itself, making the topic ripe for disinformation, said Vincent O’Brien, a foreign service officer at the U.S. State Department, one of Cyabra’s clients.

“Because you’ve got a large audience, it provides a large opportunity for nefarious actors to use it as a hook to either push new types of disinformation narratives, or to connect long-standing disinformation narratives to exploit this new opportunity,” O’Brien said.

Add to that the fact that there is still a lot that is not known about the virus, and it becomes a perfect topic for deception, O’Brien said, citing as an example the online spread of false information about the origins of the virus, including a theory that the United States created the virus and let it loose from a lab in China.

“The actual information on it isn’t clear, so that’s why it’s a great space for disinformation, because even the experts who are trying their very best to define and explain it and help the public, don’t have all the information,” O’Brien said.  “It’s exacerbated by the fact that the point of origin of the virus, the country that it came from – China – has not been completely transparent about it.”

Damaging information

This uptick in pandemic-related online deceit comes as both governments and corporations have struggled in recent years to detect and respond to dangerous and damaging information. Cyabra’s platform uses more than 800 parameters to quickly identify content produced by fake profiles or bots. It can also detect fabricated videos that appear to show real people, known as deepfakes.

“You can literally create people from the ground up,” Brahmy said.  “We look at the problem of visual manipulation as a rising problem and threat.”
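Cyabra has not published how those hundreds of parameters work, so the following is only a hypothetical sketch of the general technique of parameter-based profile scoring: a handful of made-up signals (account age, follower ratio, posting rate and so on) are combined into a single bot-likelihood score. The profile fields, thresholds and scoring rule here are illustrative assumptions, not Cyabra’s actual model.

# Hypothetical illustration only: Cyabra's real parameters and methods are not public.
# This sketch shows the general idea of combining many per-account signals
# into a single "likely fake" score.

from dataclasses import dataclass

@dataclass
class Profile:
    account_age_days: int
    followers: int
    following: int
    posts_per_day: float
    has_default_avatar: bool
    bio_length: int

def fake_profile_score(p: Profile) -> float:
    """Return a score in [0, 1]; higher means more bot-like signals fired."""
    signals = [
        p.account_age_days < 30,                 # very new account
        p.following > 10 * max(p.followers, 1),  # follows far more accounts than follow it
        p.posts_per_day > 50,                    # implausibly high posting rate
        p.has_default_avatar,                    # no profile picture
        p.bio_length == 0,                       # empty bio
    ]
    return sum(signals) / len(signals)

if __name__ == "__main__":
    suspect = Profile(account_age_days=5, followers=3, following=1200,
                      posts_per_day=80.0, has_default_avatar=True, bio_length=0)
    print(f"Fake-profile score: {fake_profile_score(suspect):.2f}")

A production system would use far more signals, weight them statistically rather than equally, and look at network-level behavior across many accounts, but the toy example conveys what a “parameter” means in this context.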

The U.S. State Department added Cyabra to its tool chest to help monitor online information around topics like elections both at home and abroad.

“We have to see which of these are false personas attempting to direct the conversation and why,” O’Brien said. “Technology companies like Cyabra and others that we use can help us identify that, and then reverse-engineer it, bring it back to the source.

“Cyabra is one of the organizations that have really done a good job at identifying the information and presenting it to you in a way that you can best make a decision about it,” he said.

Government agencies in other countries are also signing up for the platform. Governments should be open to hiring or using products from private companies like Cyabra to help detect disinformation, O’Brien said.

“The private sector can move much more quickly, they’re developing the AI products and machine learning products and big data analysis products that government just simply by its very nature can’t go out and design themselves, just like we don’t just go out and design the office furniture,” he said.

Companies and advertising agencies also rely on Cyabra as they see growing damage to their images and brands from disinformation, whether through bad reviews left by bots or false information spread on social media, often by profiles not connected to real people.

“A lot of them are suffering,” Brahmy said, adding that such information can result in decreased sales or falling stock prices. “It’s absolutely insane, it’s becoming more and more accessible. Anyone can skew the public opinion for better or worse. The private companies are being targeted and are being skewed by bad and fake actors, no doubt in my mind, we are seeing this on a weekly basis.”

Malicious information

Cyabra leaves it up to its clients, whether they are governmental bodies or private companies, to choose how to respond to detected disinformation, Brahmy said. While the platform can identify fake or malicious information, it is up to clients to analyze its potential impact and figure out what to do about it.

Governments are still trying to figure out how and when to respond, especially in democracies with free speech, according to O’Brien. While the U.S. government does respond to criminal activities and threats to national security, handling disinformation without violating the right to free speech is difficult.

“That’s a very slippery slope,” O’Brien said. He said the most effective way to deal with disinformation is to educate people to recognize it and to keep encouraging the sharing and reporting of real, true information.

“In the course of human history, we’ve never been able to stop people from lying.  Now they are just lying better and on a medium that has global reach and moves faster,” he said. “But the remedy for speech that is false is speech that is true.”

Meanwhile, experts don’t expect an easy solution to disinformation any time soon, and say fake and malicious online activities will likely become more sophisticated.

“It is a cat and mouse play,” Brahmy said.  “We will get better at some things, but they will too.  The addressable market for a company like Cyabra or other similar company is, I’d like to say, endless. The more companies we have trying to solve this huge problem, the better we are at understanding it, and getting to the last pieces of the puzzle to solve it through technology. But I’m optimistic.”
