Using the ‘Harm’ Argument to Censor Online Speech
October 4, 2022
From left: Canadian PM Justin Trudeau, Canadian Heritage minister Pablo Rodriguez, Gov. Gen. Mary Simon at cabinet swearing-in ceremony. Image via Flickr (Public Domain)

 

What constitutes harm?

This is the question Canadian lawmakers are currently debating as they develop the Online Harms Bill — a controversial piece of legislation that seeks to protect vulnerable people online and curb the dissemination of ‘harmful’ content.

In July 2022, the Canadian Heritage Department — the department tasked with overseeing the bill — restarted the consultation process as part of the federal government’s efforts to revive it.

Canadian Heritage Minister Pablo Rodriguez has continued to push the bill forward even though an earlier version, introduced last year, was met with widespread criticism. While lawmakers haven’t yet reached a consensus on what exactly is meant by ‘harm,’ there seems to be unanimous agreement that “we have to do something,” Rodriguez told reporters.

Rodriguez added, “[P]eople are seeing things that they shouldn’t be seeing on the internet, facing threats, receiving all kinds of stuff, very nasty stuff… and it’s our obligation as a government to act.”

Rodriguez is also advancing another highly contested bill, Bill C-11, which would see a more regulated internet for Canadian users and creators alike. It seems Rodriguez is on a mission to regulate the internet in more ways than one.

Based on the initial proposal, we know that lawmakers seek to target five types of content they deem harmful: terrorist content; content that incites violence; hate speech; non-consensual sharing of intimate images; and child sexual exploitation content. While these categories are defined in the Criminal Code, lawmakers have noted that they may further modify the definitions for a ‘regulatory context,’ which may cause confusion, increase censorship, and potentially undermine the Canadian Charter of Rights and Freedoms.

Every one of these categories is subject to different interpretations depending on who is assigning meaning to it. We already know that hate speech laws can be used as a tool to silence marginalized voices or quash dissent. We know that terrorist content can be subject to various interpretations, e.g., some Canadians have expressed concern that Muslim Canadians may be disproportionately impacted. Even defining ‘child sexual exploitation content’ has become more contested in this age of polarization and cultural degradation.

Some Canadians are even calling for the removal of ‘conspiracy theory’ content. This concept, much like the concept of ‘harm,’ comes up against the same challenges and limitations. Who would determine what is a ‘conspiracy theory’? This concept could easily be used as a tool to silence anyone who holds an opinion that deviates from the status quo.

Lawmakers are also tossing around the idea of implementing a 24-hour takedown requirement for ‘harmful’ content. This may push online platforms toward over-compliance, preemptively taking down content that does not even fall within the ‘harmful’ categories. Canadians who want to challenge the removal of their content may also be subject to a new appeal system, further bureaucratizing an increasingly sanitized and regulated internet.

The government is expected to hold public hearings to give Canadians the opportunity to voice their concerns, though based on past hearings (particularly for Bill C-11), it’s unclear whether these concerns will actually be fairly heard by MPs.

Ultimately, those who argue for heavy-handed moderation often cite ‘harm’ as the reason certain speech should be censored, but they are frequently short on details as to what would actually be considered harmful.

While defining ‘harm’ is one part of the problem, the other is actually enforcing it.

While the government can pass laws stipulating what is harmful and on what grounds, it’s up to online platforms to actually enforce these new rules. These platforms already wield enormous power online — this move would give them even more power to censor content with the government’s go-ahead.

Tech companies are already susceptible to their own set of biases. Content moderators may apply the law liberally depending on their own belief systems; a wide-ranging definition of ‘harm’ would make it easier for them to justify removals. Platforms may go further and remove content they personally disagree with, not content that is necessarily ‘harmful.’ While this already happens on major tech platforms, this law would shift even more power to Big Tech and away from the user.

Bills like this are dangerous because they rely on broad concepts that could easily be interpreted in such a way that punishes users.

Nations, including Canada, already have hate speech laws that delineate what is considered hateful, but regulating the much broader concept of ‘harm’ would go considerably further.

In a polarized climate, this bill could easily be used to silence wrongthink, punish dissidents, and create a chilling effect on online discourse. Ultimately, passing a bill that revolves around so elastic a premise as what is deemed ‘harmful’ would be a disaster for online speech.

 
