A think tank estimates that approximately 10,000 tweets per day contain racial or ethnic slurs – roughly 1 in every 15,000 tweets sent.
UK-based think tank Demos published a study on the way hate speech is employed online.
The team scraped all tweets containing one or more pre-defined slurs (a list crowdsourced via Wikipedia) over a nine-day period in November. These tweets were filtered to ensure the slurs occurred in the body of the tweet, as opposed to the username of the individual sending it. In all, the researchers studied 126,975 tweets.
Using two types of analysis, the researchers determined that there are approximately 10,000 English tweets per day that contain some sort of slur. However, not all slur-containing tweets were meant to be offensive.
The study found that slurs were employed in six distinct ways: negative stereotype; casual use of slurs; targeted abuse; appropriated; non-derogatory; and offline action/ideologically driven.
Notably, the most common uses of slurs were non-offensive and non-abusive, or expressed in-group solidarity. This category represented between 47.5 and 70 percent of tweets, depending on whether human or computer analysis was employed.
The study also found that different specific slurs were used in different ways, with some lending themselves more to descriptive uses while others were used more derogatorily.
You can read the study in its entirety here [PDF].