5 tips for recognizing and dealing with online hate

Published in 2022

1. Understand what online hate is

According to the UN, online hate is defined as "any kind of communication in speech, writing or behavior, that attacks or uses pejorative or discriminatory language with reference to a person or a group on the basis of who they are, in other words, based on their religion, ethnicity, nationality, race, color, descent, gender or [any] other identity factor."

It is hard to estimate the psychological damage that results from online hate, because people's reactions and nonverbal behavior are largely invisible online. Examples of online hate include 'trolls' who enter Zoom meetings and hurl hateful, racial slurs at participants, or hateful groups that publish posts dehumanizing identifiable groups. These types of posts and attacks are not OK. Everyone has the right to non-discrimination and to be free from hate.

2. Use human rights to educate yourself and others about online hate

Children and young people have the right to be protected from all forms of discrimination, both online and offline, as do adults. General Comment 25, adopted in March 2021 by the UN Committee on the Rights of the Child, acknowledges that "children may be discriminated against [...] by receiving hateful communications." To know what constitutes a hateful post, it is also important to differentiate it from what is typically referred to as cyberbullying.

3. Know the difference between online hate and cyberbullying

Both cyberbullying and online hate are serious and can cause victims a great deal of distress. Bullying usually targets an individual, while hate may incite violence towards an entire group of people. Both can have devastating consequences. Recognizing how toxic conversations draw people in makes it easier to avoid being pulled into them.

4. Recognize conflict in conversations

Social media encourage, and even guilt, people into reacting quickly. Inflammatory posts – ones that garner a lot of attention and provoke people – are often used deliberately by hateful groups as a way to solicit young people to join their online forums and communities. Trolls are well-versed in tactics for provoking others: trolling means deliberately posting inflammatory comments in order to cause outrage. Young people may fall prey to these posts and engage in conversations where they are being manipulated or victimized, often without realizing it.

A question to ask yourself when deciding whether to respond to a post is: am I feeling angered by this? Awareness, as a first step, can help you monitor your own reactions to posts more effectively; it can also help you judge whether a conversation is hosted by a safe, well-meaning group or by one instigating hatred.

5. Engage in empathic and peace-building conversations online

When we see discrimination and injustice online, it is stifling not to be able to talk about it. Unfortunately, digital spaces and social media are not always supportive of people speaking out against injustice. When users defend their rights and promote social justice, they often face an outpouring of cancelling and backlash, which can escalate into attacks from hateful groups online. Good practices for discouraging online hate include replacing negativity with positivity, and using counterspeech – through hashtags and digital allyship – to respond to hateful posts. Users themselves have the power to influence digital culture, to amplify marginalized voices, and to build more empathic and inclusive online communities.

N.B. The points provided are not intended to give legal advice to readers. For legal information, please visit Canadian legal information websites, your provincial or federal Ministry of Justice site, or your provincial human rights commission.