Using the ‘Harm’ Argument to Censor Online Speech
October 4, 2022
From left: Canadian PM Justin Trudeau, Canadian Heritage minister Pablo Rodriguez, Gov. Gen. Mary Simon at cabinet swearing-in ceremony. Image via Flickr (Public Domain)

 

What constitutes harm?

This is the question Canadian lawmakers are currently debating as they develop the Online Harms Bill — a controversial piece of legislation that seeks to protect vulnerable people online and curb the dissemination of ‘harmful’ content.

In July 2022, the Canadian Heritage Department — the department tasked with overseeing the online harms bill — restarted the consultation process, part of the federal government’s efforts to revive the bill.

Canadian Heritage Minister Pablo Rodriguez has been trying to push the bill forward even though an earlier version, introduced last year, was met with widespread criticism. While lawmakers haven’t yet reached a consensus on what exactly is meant by ‘harm,’ there seems to be unanimous agreement that “we have to do something,” Rodriguez told reporters.

Rodriguez added, “[P]eople are seeing things that they shouldn’t be seeing on the internet, facing threats, receiving all kinds of stuff, very nasty stuff… and it’s our obligation as a government to act.”

Rodriguez is also behind another highly contested bill, Bill C-11, which would mean a more regulated internet for Canadian users and creators alike. It seems Rodriguez is on a mission to regulate the internet in more ways than one.

Based on the initial proposal, we know that lawmakers seek to target five types of content they deem harmful: terrorist content; content that incites violence; hate speech; the non-consensual sharing of intimate images; and child sexual exploitation content. While these categories are defined in the Criminal Code, lawmakers have noted they may further modify the definitions for a ‘regulatory context,’ which may cause confusion, increase censorship, and potentially undermine the Canadian Charter of Rights and Freedoms.

Every one of these categories is subject to different interpretations depending on who is assigning meaning to it. We already know that hate speech laws can be used as a tool to silence marginalized voices or quash dissent. We know that terrorist content can be subject to various interpretations, e.g., some Canadians have expressed concern that Muslim Canadians may be disproportionately impacted. Even defining ‘child sexual exploitation content’ has become more contested in this age of polarization and cultural degradation.

Some Canadians are even calling for the removal of ‘conspiracy theory’ content. This concept, much like the concept of ‘harm,’ comes up against the same challenges and limitations. Who would determine what is a ‘conspiracy theory’? This concept could easily be used as a tool to silence anyone who holds an opinion that deviates from the status quo.

Lawmakers are also tossing around the idea of a 24-hour takedown requirement for ‘harmful’ content. This may push online platforms toward over-compliance, preemptively taking down content that does not even fall within the ‘harmful’ categories. Canadians who want to challenge the removal of their content would also have to navigate a new appeals system, further bureaucratizing an increasingly sanitized and regulated internet.

The government is expected to hold public hearings to give Canadians the opportunity to voice their concerns, though based on past hearings (particularly for Bill C-11), it’s unclear whether these concerns will actually be fairly heard by MPs.

Ultimately, those who argue for heavy-handed moderation often cite ‘harm’ as the reason certain speech should be censored, but they are often short on details as to what would actually be considered harmful.

While defining ‘harm’ is one part of the problem, the other is actually enforcing it.

While the government can pass laws stipulating what is harmful and on what grounds, it’s up to online platforms to actually enforce these new rules. These platforms already wield enormous power online; this move would give them even more power to censor content with the government’s go-ahead.

Tech companies are already susceptible to their own set of biases. Content moderators may be liberal in how they apply this law depending on their own belief systems, and a wide-ranging definition of ‘harm’ makes it easier to justify removals. Platforms may go the extra step and remove content they personally disagree with, not content that is necessarily ‘harmful.’ While this already happens on major tech platforms, this law would shift even more power to Big Tech and away from the user.

Bills like this are dangerous because they rely on broad concepts that could easily be interpreted in such a way that punishes users.

Nations, including Canada, already have hate speech laws that delineate what is considered hateful, but regulating the broader concept of ‘harm’ would go much further.

In a polarized climate, this bill could easily be used to silence wrongthink, punish dissidents, and create a chilling effect on online discourse. Ultimately, passing a bill that revolves around the premise of what is deemed ‘harmful’ would be a disaster for online speech.

 
