All social media platforms need to do more to monitor their services for activity involving extremist groups or conspiracy theories. New rules have been set out indicating how social media firms should moderate their content.

Facebook claimed it had removed 30,000 pages, events and groups related to what it called “militarised social movements”. Monika Bickert, Facebook’s vice president of global policy management, stated: “We have a 24-hour operation centre where we are looking for content from groups… of citizens who might use militia-style language.” Yet, as recently as last year, half of all designated white supremacist groups still had a presence on Facebook.

Twitter was also questioned on its decision to ban former President Donald Trump. In response, Nick Pickles, the head of Twitter’s public policy strategy, declared that it was time to “move beyond” the debate about whether social networks are enforcing their own rules properly.

Finally, while TikTok did not play a major part in motivating the riots, Yvette Cooper, the chair of the Home Affairs Committee, found that the video platform contained a significant amount of anti-Semitic content.


The post Social Media Fails to Monitor Extremist Content appeared first on IT Security Guru.