First, one must correct possible misconceptions. Nzimande says newspapers must be held responsible for the comments they carry. They already are, in both a legal and a moral sense - so there is no need to change the law, nor for the intervention of state bodies such as the Film and Publications Board, which has been itching to regulate Net content. Existing criminal and civil law can be used to take on those responsible for hate speech and racism online.
Secondly, the real problem lies not with the sites controlled by the major media groups, which are subject to law and self-regulation, but with the maverick, often one-person sites out there that purvey hate, prejudice and violence.
Media houses, editors and journalists - at least some of them - have been concerned about this for some time. The World Editors' Forum (WEF) last year published a guide to emerging international best practice for dealing with the problem. Sanef, the national editors' forum, has been holding discussions aimed at developing guidelines on how to deal with it. Independent Newspapers has initiated its own inquiry into the matter. Other newspapers already have controls, such as not allowing anonymous commentary and moderating all comments, which seems to cut out much of the worst material. The Press Council already oversees the websites of its members, and its code outlaws hate speech and racism, along with a host of other things.
The problem is that one wants to restrict hate speech and other illegal speech without prejudicing the power of the internet to enrich and expand the public conversation. The internet opens up journalism and news to participation in an unprecedented way. Previously, ordinary citizens could only write a letter or try to phone a radio station, and only a few would be published or broadcast. Citizens were just consumers of media, but the internet has made everyone a potential participant, contributor and influencer of news and opinion. While this potential is still stymied by unequal access to the Net, it is a powerful tool for citizen engagement in public affairs - and one does not want to compromise it.
How then does one balance the desire to encourage an informed and interesting conversation that adds value to news and public opinion against the fact that there are those who would abuse this space, sometimes dangerously? We have to try to stop hate speech, but avoid giving anyone the power to censor opinions just because they offend or discomfit them. We have to ensure that we target the dangerous material, and not just the uncomfortable.
WEF recommends an approach based as much on promoting useful discussion as on preventing abuse:
Media organisations should publish clear, thorough and transparent guidelines for comments, including zero tolerance for hate speech and illegal content.
Media houses should hire a dedicated community manager to cultivate constructive discussion and prevent abuse.
Journalists and others who have valuable commentary to add (such as Blade Nzimande) should be encouraged to participate in these conversations. Journalists can pose and answer questions, respond to criticism and highlight the most interesting comments.
Give prominence to the most interesting comments.
Give feedback to readers and educate them, rather than simply deleting unsuitable comments.
The shift towards a journalism that is more about dialogue between reporter and audience means that these problems are not going to go away. But, as WEF says, "they can be addressed and the potential of comments to make a positive contribution to a news outlet is considerable".