Fighting Extremist Content Online: Feds Dedicate $1.9M to Terrorist Analytics Tool


The federal government is offering new funding to continue development of an automated tool to find and flag terrorist content online.

In a press release Tuesday evening, Public Safety Canada detailed a $1.9-million, three-year investment in funding “to combat terrorist and violent extremist content online.”

“We need to confront the rise of hate and violent extremism,” Prime Minister Justin Trudeau said in a tweet on Tuesday.

“At the Christchurch Call Summit, I announced that Canada will fund a new tool that helps small and medium-sized online platforms better identify and combat content related to terrorism and violent extremism.”


The tool Trudeau mentioned is the Terrorist Content Analytics Platform.


Created in 2020 by the UN-backed Tech Against Terrorism initiative, the tool scours different corners of the internet for terrorist content and flags it for review by tech companies around the world, which can then remove it if they choose.

Public Safety Canada has funded the construction of the tool through the Community Resilience Fund. However, according to the tool’s website, the government is kept at arm’s length from what the TCAP does, despite its funds backing it.

How does the Terrorist Content Analytics Platform (TCAP) work?

Typically, terrorists share their content on “smaller platforms” first, according to Adam Hadley, executive director of Tech Against Terrorism.

“Unfortunately, even smaller platforms have limited ability to handle terrorist use of their services,” he explained to Global News in an emailed statement.

“With TCAP we are able to quickly alert this content to smaller platforms and thereby prevent the content from spreading across the internet before it goes viral.”


TCAP begins with a team of open source intelligence (OSINT) analysts figuring out which platforms are preferred among terrorist entities. This team then identifies links to smaller terrorist-operated websites and social media platforms where their content is hosted, and uploads those links to TCAP.

Automated scrapers also extract data from the platforms identified by the OSINT team, uploading relevant links to TCAP.


Video: ‘Terror survivors aim for the future to help others’ – September 12, 2022

Once these links are uploaded to TCAP, they are verified and categorized by the terrorist organization concerned. If the verified links are on a platform registered with TCAP, the tech company receives an automated alert and decides whether or not to moderate the content. The platform also monitors what the tech companies decide to do.

As a final step, TCAP archives the flagged material for what it describes as “academic and human rights purposes.” Although the archive is not yet available, it will eventually be opened up to researchers and academics.


To date, the TCAP tool has sent 20,000 alerts to 72 different platforms, according to its website.

The alerts have covered a total of 34 different terrorist entities.

In its latest transparency report, which covered the period from December 2020 to November 2021, Tech Against Terrorism said that 94 percent of the content about which the TCAP tool alerted tech platforms was eventually removed.

However, the removals did not occur evenly across the different terrorist groups. On average, tech companies that received alerts about Islamist terrorist content took down 94 percent of the flagged content.

By contrast, the removal rate for far-right terrorist material after an alert was only 50 percent.

On top of that, far-right material was submitted to the TCAP at a much lower rate. While 18,787 submissions were made about Islamist terrorist content – resulting in 10,959 alerts sent – only 170 submissions were made about far-right terrorist material, resulting in 115 alerts sent.



The very low submission rate may be due to the stringent verification process the TCAP uses. To be considered for an alert, content must be linked to a designated terrorist organization – an official classification created in Canada under the Anti-Terrorism Act.

Canada only began adding right-wing extremist groups to its list of outlawed terrorist organizations in 2019, when it added the names of Blood and Honor and Combat 18.

“We also closely follow the designation of violent far-right organizations, and will include any newly designated organizations in the TCAP as soon as they are legally designated by democratic institutions and nation states,” Hadley said.

“We would argue that major democracies need to do more to ensure that far-right violent extremist organizations, groups and individuals are designated.”

Debate on the efficacy of automatic flagging

According to Hadley, part of the goal of the latest round of funding is to help TCAP ramp up its efforts to archive the content it flagged.


He added that funding from Canada will, in part, “ensure that content referrals are auditable and accountable, providing access to original content after referrals for removal.”

Auditing content that has been flagged for removal is an important step in this process, according to JM Berger, an extremism-focused author and researcher who has written four critically acclaimed books.

“There is a dire need of some sort of organized effort to archive extremist material that is vulnerable to takedowns, which is one of the functions of TCAP,” he told Global News.

“This material is not only important for prosecution and research, but it is an essential component in any effort to audit how tech companies conduct takedowns.”

As things stand now, the current takedown regime is “quite opaque,” Berger said.

“The collection may enable some of the first steps towards accountability, but much more needs to be done.”

Video: ‘Canada adds 13 organizations including Proud Boys to terrorist list’ – February 3, 2021

However, not everyone agrees that automation is the best route for managing terrorist content online, including Stephanie Carvin, a former CSIS analyst who now teaches at Carleton University.


“I’m not necessarily against it,” Carvin said of the TCAP tool.

However, she said tech companies should take more initiative to deal with far-right content on their platforms without relying on automated tools.

“The fact is you have problems with the far right that have to be addressed with the (tech) companies themselves.”

Some major tech companies have recently taken steps to crack down on content from white supremacists and far-right militias.

According to Reuters, in 2021 several U.S. tech companies – including Twitter, Alphabet, Meta (still known as Facebook at the time) and Microsoft – began contributing to the Global Internet Forum to Counter Terrorism (GIFCT) database.

This allows them to share their data across different platforms to better identify and remove extremist content.

Still, despite the efforts of tech companies and TCAP, some far-right content risks falling through the cracks, given how quickly their symbols and memes change relative to Islamist terrorist groups like Daesh.

“When you had groups like Daesh using their flags and stuff… it was very easy,” Carvin explained.


“But with the far right, for example, which I think is the primary concern of the Government of Canada, the memes and content change very quickly.”


Meanwhile, the Canadian government says ensuring online safety is a “central part” of efforts to keep Canadians “safe,” according to a statement sent to Global News by Audrey Champoux, a spokesperson for the minister of public safety.

“We must confront the rise of hate, misinformation and propaganda, and violent extremism that very often spread online – which can result in real-world consequences.”

— With files from Reuters

© 2022 Global News, a division of Corus Entertainment Inc.
