Death by social media: thoughts on the emerging social challenge.

Rumors have spread since the beginning of time, sometimes with disastrous consequences for society. They are harmful because there is no good way to fight them, and they can be everywhere. The more vicious they are, the more they are repeated and believed, and their targets can suffer irreparable harm. Modern social media platforms have multiplied the risk: whereas a rumor once limped its way around a community, today it spreads far and wide in a matter of seconds. Platforms like Facebook and WhatsApp are not inherently bad, but they have accentuated the risk of toxic rumors. In India, this phenomenon has at times had tragic consequences that warrant government intervention in the interest of public safety.

Recurring tragedies
The following cases provide examples of this challenge:

  • In 2017, a WhatsApp message falsely alleging that child abductors speaking Hindi, Bangla and Malayalam were on the prowl led to the lynching of innocent people in Jharkhand.
  • In May 2018, a 55-year-old woman was killed by a mob in the southern Indian state of Tamil Nadu, amid false WhatsApp messages and rumors about child kidnapping.
  • In June 2018, two men who had stopped to ask for directions were beaten to death by a large mob in the north-eastern state of Assam. They were mistaken for child abductors after a message spread on WhatsApp led unsuspecting villagers to believe that their children might be targeted by abduction gangs.

A significant adaptive challenge

It may well be that mischief-makers bent on causing community alarm and despondency spread these messages, with tragic consequences including breaches of the peace. We cannot, however, simply pin the blame on the technology, because that assumes people were good prior to the advent of social platforms.

What is clear is that there is no technical fix to this challenge of vigilante justice; there is no silver bullet and no known answer. This is a typical adaptive challenge, with no easy answers. Nonetheless, remedies can be found by analysing the key stakeholders in the challenge, who can be mapped and managed as follows:

  • Facebook and WhatsApp
    With over half a billion internet users in India, according to the Washington Post, both platforms have a significant customer base: Facebook has over 200 million users and WhatsApp over 270 million. Since WhatsApp is owned by Facebook, the same company effectively reaches nearly the entire internet population of India. Because India contributes almost a quarter of Facebook's 2.27 billion users, it is important that Facebook understands its responsibility for public safety in a market where it has a significant stake. To that end, engagement with Facebook and WhatsApp ought to focus on:
    – Engaging both platforms, which derive significant value from India, to send messages warning their users about the spread of toxic rumours, especially once alerted by the public or the authorities.
    – Requesting that users flag or alert the platforms when a suspiciously toxic and dangerous message or video is being circulated. This crowd-sources risk management, since it may not be possible for the platforms to check all the content their users circulate. It is especially important for WhatsApp, which does not currently have such a feature.
    – Engaging Facebook to prioritise posts by professional media, which usually flag such rumors when they start circulating. This is especially important because in March 2018 Facebook introduced a separate newsfeed that prioritises content from family and friends while tucking away posts by professional news organisations. This promotes and amplifies rumors among friends and family while making professional debunking of those rumors less prominent.
    – Nudging Facebook and WhatsApp to engage local-language moderators who understand local nuances to sift through and moderate content. This would be complicated for WhatsApp, which encrypts its messages, but that is precisely where it needs to crowdsource the risk by implementing a reporting function.
    – Pushing both platforms to set up a local corporate entity and appoint a grievance officer to help curb the spread of rumors that have claimed several lives.
  • Local Community/Village Leaders
    For local communities, the fundamental task is education that promotes community cohesion and peace. For example, once rumors start spreading, community leaders can flag the issue with the authorities so that law enforcement kicks in. This goes beyond the social media platforms; it is about promoting tranquility among people in local communities. Dealing with the community entails engaging community leaders, and for this to succeed the ministry can arrange training, for example via seminars, for both law enforcement officers and community leaders in order to curb vigilante justice. In addition, there is no point in wasting a crisis: the ministry should coordinate with the justice ministry to ensure that the law is not just enforced, but that clear examples are made of perpetrators of vigilante justice.
  • Technology Ministry/State Government
    The ministry and state government both play a role in ensuring that the above stakeholders play their part and that education programs are implemented. Evolving adaptive challenges like these require leadership, and it's important for the ministry to lead these efforts, including engaging all stakeholders. In addition, the ministry can also try the following:
    – Holding social media group administrators responsible. This is particularly important for WhatsApp, where toxic messages can be circulated or broadcast rapidly within groups. Holding administrators responsible helps ensure that someone in each group is accountable and encourages responsible behaviour.
    – In emergencies where there is a severe breach of the peace or loss of life arising from these rumours, the ministry can, as a last resort, temporarily switch off Internet access to the two social media platforms via local internet service providers. While turning off the Internet can be a plan B in times of crisis, it is not a solution to the spread of hateful rumors; it simply slows the pace. What matters is working towards social cohesion: neither Facebook nor WhatsApp created the distrust in these communities.
  • Local Media/Radio Stations
    Professional media can also be mobilised to correct and counter toxic rumors that have had tragic consequences. Radio would be most effective where it has coverage, since the message can be broadcast to a wide audience.

A double-edged sword
Technology has brought massive advantages to information dissemination. Platforms like Facebook have democratized communication by giving anyone with a smartphone the ability to broadcast. However, these are just tools, and they can be used to spread constructive messages or hate. Traditional mass media has been used to mobilize mass violence in the past; the genocide in Rwanda in the 1990s provides an instructive case study. We cannot wait for tragedies and genocides to occur before we move to manage these new e-platforms. They must be responsible, and must be held responsible, because they can supercharge content that ratchets up tribal and religious hate and upsets fragile social balances in communities.
