Social media giant Facebook removed 20 million pieces of content that contained misinformation about the coronavirus pandemic across its various platforms in the second quarter.
In its latest transparency report, the company, which also owns Instagram, said the prevalence of hate speech on its platforms has declined sharply due to improvements in detecting it and changes made to its News Feed.
The company said that in addition to removing 20 million posts containing Covid misinformation from Facebook and Instagram, it deleted more than 3,000 accounts, pages and groups for repeatedly violating its rules against spreading false information related to the pandemic.
More than 190 million pieces of Covid-related content on Facebook had warnings displayed on them, the company added.
Facebook said the level of hate speech has fallen for three consecutive quarters, with removals of such content increasing 15-fold since it first began reporting the figure.
Overall, the prevalence of hate speech fell to 0.05 per cent, or five views per 10,000, down from 0.05 to 0.06 per cent, or five to six views per 10,000, in the first quarter.
The company said it took action on 6.2 million pieces of organised hate content on Facebook in the second quarter, down from 9.8 million in the first three months of 2021. It also removed 16.8 million pieces of content related to suicide and self-injury, up from 5.1 million in the first quarter, after a technical fix allowed the company to go back and catch material it had previously missed.
Action was also taken on 34.1 million pieces of violent and graphic content on Facebook in the second quarter, it said.
On Instagram, some 667,000 pieces of organised hate content were actioned, along with 3 million pieces of suicide and self-injury content and 7.6 million pieces of violent and graphic content.
In terms of child safety, Facebook said it took action on 2.3 million pieces of content related to child nudity and physical abuse on its flagship platform, as well as 458,000 pieces on Instagram. For child sexual exploitation content, it took action on 25.7 million pieces on Facebook and 1.4 million on Instagram.
Facebook has more than 15,000 content moderators globally, with a large number of these based in Dublin. Many of these are employed by Covalen, a subsidiary of the Irish recruitment company CPL, which is contracted by the social media giant.