Taliban content banned on Facebook, Instagram, WhatsApp

Taliban fighters with a vehicle on a highway in Afghanistan.

Saibal Das | The India Today Group | Getty Images

Facebook said Tuesday it would not lift its ban on Taliban content across its major platforms, including Instagram and WhatsApp.

The social media giant told CNBC that it considers the Afghan group, which has been using social media platforms to get its messages across for years, to be a terrorist organization.

Facebook said it has a dedicated team of content moderators who monitor and remove Taliban-related posts, pictures, videos and other content. It is unclear how many people are on the team.

Afghanistan fell to the militant Islamic group over the weekend when its fighters captured the capital, Kabul, and the presidential palace. After President Joe Biden’s decision in April to withdraw U.S. troops from Afghanistan, the Taliban made stunning advances on the battlefield, and almost the entire country is now under the group’s control.

A Facebook spokesperson told CNBC: “The Taliban is sanctioned as a terrorist organization under U.S. law and we have banned them from our services under our Dangerous Organizations policy.”

The Taliban have been banned from Facebook’s platforms for several years, the spokesperson said.

Facebook said the ban means removing accounts maintained by or on behalf of the Taliban, as well as accounts that praise, support or represent the group.

“We also have a dedicated team of Afghanistan experts, who are native Dari and Pashto speakers and have knowledge of local context, helping to identify and alert us to emerging issues on the platform,” the Facebook spokesperson said.

Facebook said it does not make decisions about recognizing national governments. Instead, it follows the “authority of the international community.”

WhatsApp dilemma?

The Taliban reportedly continue to use WhatsApp to communicate. The chat platform is end-to-end encrypted, which means Facebook cannot see what people are sharing on it.

“As a private messaging service, however, we do not have access to the contents of people’s personal chats,” the Facebook spokesperson said.

The spokesperson told CNBC that WhatsApp uses AI software to analyze unencrypted group information, including names, profile photos and group descriptions, to meet its legal obligations.

YouTube, which is owned by Alphabet, said its community guidelines apply to everyone equally and that it enforces them based on the content itself and the context in which it is presented. The company said it allows content that has sufficient educational, documentary, scientific or artistic context.

A Twitter spokesperson told CNBC: “The situation in Afghanistan is rapidly evolving. We’re also seeing people in the country using Twitter to seek help and assistance. Twitter’s top priority is keeping people safe, and we remain vigilant.”

“We will continue to proactively enforce our rules and review content that may violate Twitter rules, particularly those glorifying violence, platform manipulation and spam,” they added.

Rasmus Nielsen, professor of political communication at the University of Oxford, told CNBC it was important that social media companies act consistently in crisis situations.

“Every time someone is banned, there is a risk that they were also using the platform for legitimate purposes,” he said.

“Given the disagreement over terms such as ‘terrorism’ and over who can designate individuals and groups as such, civil society groups and activists will want clarity on the nature and extent of cooperation with governments on these decisions,” Nielsen added. “And many users will want reassurance that any technology used for enforcement will protect their privacy.”
