by MWNUK Team
Concerns over the spread of misinformation fanning the flames of hatred have led many to leave X. Recent months have seen the exodus of well-known organisations, including The Guardian, which has described X as a ‘toxic media platform.’
In a bid to challenge this negativity, on 18th March 2025 Muslim Women’s Network UK, in collaboration with X, hosted an online safety training session for Muslims. The event was part of the Muslim Heritage Month initiative on tackling anti-Muslim prejudice through positive storytelling, and it aligned with the UN International Day to Combat Islamophobia, observed annually on 15th March.
The training session was led by Hakim Charles, who works in Global Government Affairs at X. The session covered the basics before diving into X’s policies and mechanisms for safety and reporting. Tips on account safety, such as using two-factor authentication and enabling password reset protection, were recommended to add extra layers of security to an X account. While all accounts on X are public by default, there is the option of protecting posts by making them visible only to followers, enabling users to control the conversation around their posts and who can track them. Other options for controlling interaction include: requiring approval of follow requests, restricting replies to followers or to accounts the user follows, and utilising the block and mute features to manage engagement.
‘We really do care and would like everyone’s voice to be heard,’ said Charles. While these are reassuring words that demonstrate a willingness to foster better online social ties, more is needed to build confidence in X’s response to the virality of fake news, particularly as many Muslims have expressed concerns over the way social media amplified harmful content that aggravated racism and religious hostility during and after the UK summer riots in 2024.
During the training session, questions were asked about the policy guidelines: how the platform ensures that posts and account holders do not incite hatred, and what the consequences are for those who repeatedly violate the policies.
An explanation of X’s rules and policies on violations involving violent speech, hateful conduct and harassment dispelled some assumptions that X is a poorly regulated platform with weak content moderation policies. Enforcement actions taken against users who violate the guidelines include: account bans; a 30-day read-only mode for accounts that violate policy after posting; de-amplification of posts and accounts that have reached their limit of violations; and action against privacy violations (i.e. posting someone else’s private content without consent). These enforcement options indicate that accountability and safety standards exist for users. However, the effectiveness of these tools depends on how well AI recognises violations and on human moderation, which can be subjective. For more details, users can refer to The X Rules to stay informed about the platform’s policies.
Community Notes was highlighted as a useful tool for countering misinformation, as it brings in the perspectives of people from the community. In the UK, there are over 71,000 contributors to Community Notes, and this number is increasing by the day. When contributors flag posts as inaccurate, those posts are less likely to be amplified; additionally, users who previously engaged with a flagged post will receive an alert about its inaccuracy. Muslim organisations and individuals can engage with this tool and be part of it, providing counter-perspectives and content. Grok, an AI assistant integrated into X, can also be useful for finding information, as it answers questions and provides unfiltered responses drawn from vast amounts of data, although this feature is exclusive to premium users.
Reporting is important for X to continuously improve its moderation process. Users are encouraged to report any post that violates the policy guidelines by tapping the three dots in the right corner of the post and selecting ‘Report post’.
It is worth acknowledging that X has made notable improvements in platform safety, as reflected in the H2 2024 Global Transparency Report:
· 28% decrease in total account violations, largely due to a drop in platform manipulation and spam violations.
· 61% decline in total content violations, with a 64% reduction in platform manipulation and spam.
· 19% decrease from H1 to H2 in reports not related to platform manipulation and spam.
Additionally, the DSA Transparency Report (October 2024) provides further insights, with an updated version expected soon.
The training session highlighted the importance of engagement and action. By staying vigilant and making use of the available safety tools, the Muslim community can create a safer and more positive experience on the X platform.
Raise your voice and get connected