The Umati project seeks to understand the dissemination of hate and dangerous speech in the Kenyan online space. During the 2007 general election in Kenya, when Internet penetration in the country was still quite low, mobile text messages (SMS) were used to incite the public to violence that left over 1,100 people dead and about 300,000 displaced. With today’s technological advancements and the rapid growth of social media, hate speech can reach thousands of people online and potentially cause much more damage. Because the Internet’s reach and influence make it both a possible threat and a possible solution, the Umati project set out to create the tools and methodology to monitor online discourse and analyse the roots of hate and dangerous speech.
The first phase of Umati ran from September 2012 through May 2013. After a one-month recess, the second phase began in late July 2013. This quarterly report summarizes the progress and findings of the period from July to September 2013.
In the second phase, we have continued to monitor online content and record incidents of hate and dangerous speech in online forums, social media networks and blogs. We were also able to expand the project methodology to include some efforts at automating the monitoring process through Machine Learning (ML) and Natural Language Processing (NLP).
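The report does not detail how the automated monitoring works. As a purely illustrative sketch (not Umati’s actual pipeline), one common ML/NLP approach is a supervised text classifier that flags candidate posts for human review; all texts, labels, and model choices below are hypothetical placeholders.

```python
# Hypothetical sketch: train a small classifier to flag posts for
# human review. The texts and labels are invented placeholders, not
# Umati data; in practice, monitors' labelled incidents would supply
# the training examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny placeholder corpus with two illustrative labels:
# "flag" (send to a human monitor) vs "ok" (ignore).
texts = [
    "they should all be chased away from this county",
    "those people are not welcome here anymore",
    "great turnout at the community meeting today",
    "congratulations to the team on the election coverage",
]
labels = ["flag", "flag", "ok", "ok"]

# Character n-grams are a reasonable choice for informal,
# code-switched online text, since they do not depend on a
# fixed vocabulary or clean word boundaries.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(texts, labels)

# Score a new post; in a monitoring workflow, "flag" results
# would be queued for a human monitor to categorize.
print(model.predict(["chase them away from here"])[0])
```

In such a design the classifier only triages content; final categorization of severity would remain with human monitors, consistent with the project’s methodology.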
During the period from the end of July through September 2013, our monitoring yielded several distinct findings:
- First, hate speech has become more diverse: it is no longer only tribal, but also targets religion, sexuality, and gender. In the previous phase, most hate speech found online focused on tribal divisions. This recent period also saw offensive speech (the lowest category of hate and dangerous speech) as the most common form, followed by moderately dangerous speech, and finally the extremely dangerous category. This roughly follows the trend observed in the first monitoring phase, and marks a shift from the election period, when direct calls to violent action were more common.
- We also found that identifiable commenters were the most active disseminators of hate speech. This could be attributed to the lack of serious action taken against propagators by the National Cohesion and Integration Commission (NCIC), the government body tasked with addressing such speech; there may be a perceived lack of consequences for disseminating hate speech.
- Speech encouraging discrimination against other groups was the most frequent call to action. These statements, and the reactions to them, mostly imply that the discriminatory views are commonly held or acceptable opinions about the communities targeted. The calls to action are rarely explicit, and suggest a sense of resignation to holding discriminatory views of other groups.
As ‘netizens’ congregate and converse online, forming networks around issues of interest, online conversations are increasingly likely to prompt organized offline reactions. Authorities, while acknowledging and endorsing online media through adoption (as various arms of the Kenyan government have done), have yet to appreciate these findings and address them effectively.
Also, as we have previously observed, most dangerous speech occurs in response to major ongoing events on the ground, a trend that could help relevant stakeholders better understand reactions in such instances. Authorities, for instance, should be on high alert after noteworthy events, ready to deal with potential backlash violence signalled by subsequent hate and dangerous speech.
We continue to monitor and surface these insights, and to build a database from which problem statements can be devised for effectively addressing and mitigating hate speech in the Kenyan online space.
The Umati end-of-year report will be released in December and will offer insights into trends in online hate speech in Kenya in 2013.