Digital policy experts say the Liberal government should exercise caution in developing its promised legislation to regulate online harms.
In an interview with CBC Radio’s The House that aired Saturday, Emily Laidlaw, an associate professor at the University of Calgary, said the federal government faces major challenges in crafting a bill that is technically workable and doesn’t go too far.
“Writing legislation is really difficult,” Laidlaw, who holds the Canada Research Chair in Cybersecurity Law, told host Katherine Cullen.
“Everywhere you turn, this raises issues of freedom of expression. It’s difficult to get it right because some of the solutions are actually technical solutions. It takes a lot of effort and detailed work, and not everyone will agree on the final outcome.”
The House | 10:36 — For the Liberals, there is growing urgency to introduce online harms legislation
Between the rise in online antisemitism and Islamophobia, and recent horrific sextortion cases involving young people, there are a number of pressing issues that the Liberals’ long-promised online harms bill could address. Host Catherine Cullen speaks with Emily Laidlaw, Canada Research Chair in Cybersecurity Law, and Matt Hatfield of OpenMedia about what the new law could do and why it is taking so long.
Prime Minister Justin Trudeau’s federal Liberal Party has long promised legislation to address online harm, which includes issues ranging from harassment to child sexual exploitation.
Earlier this week, The Canadian Press reported that the government is also considering making sure the bill covers exploitative deepfakes, such as the images of Taylor Swift that drew attention around the world, including from the White House, in January.
In an emailed statement to The Canadian Press, Justice Minister Arif Virani said keeping children and young people safe online is a vital part of the government’s legislative agenda and a top priority, especially given the evolving capabilities of AI.
He cited deepfakes as content that has the potential to “exacerbate online exploitation, harassment, and cyberbullying.”
An earlier version of the Online Harms Bill, introduced just before the 2021 election, faced strong criticism from a range of stakeholders.
Possible change in approach
The concept of the bill was unpopular with Conservative Leader Pierre Poilievre, who has consistently accused the Liberal government of overreach and censorship, saying the government “cannot distinguish between hate speech and speech they hate.”
But Laidlaw said there was hope the government would reconsider its approach to the issue, moving away from a “removal model” and toward a “duty of care” approach.
“My biggest concern right now is that the moment this bill is introduced, it’s going to be positioned as the savior of the internet,” Laidlaw said. “If this works, it will actually be relatively narrow. It won’t solve all the problems of online harm.”
Matt Hatfield, executive director of advocacy group Open Media, agreed that the first version of the government’s bill would not have been a good outcome.
“There were a lot of very serious problems with that proposal at the time. It took a very simplistic, very punitive approach, and I think it would have led to the removal of a lot of legitimate content,” he said.
“I sincerely hope that sensible legislation will be introduced that directly addresses the most sensitive content, strengthens platform transparency, and establishes a regulator that can get more information about what is happening on these platforms. Perhaps further legislation will be warranted in the future.”
Hatfield said he was worried about how much the government has actually learned from the wider consultation that was pursued during the development of this new law.
Canada’s push to enact online harms legislation is part of a broader international push to regulate social media companies. The federal government is already at loggerheads with big tech companies over the Online News Act.
Meta, the parent company of Facebook and Instagram, recently said it would restrict teens from viewing content related to suicide, self-harm and eating disorders.
“Internet companies should be accountable for enforcing standards on harmful content. It’s impossible to remove all harmful content from the internet, but when people use dozens of different sharing services, all with their own policies and processes, we need a more standardized approach,” CEO Mark Zuckerberg wrote in 2019.
“I think this bill, done right, has the potential to do more good than either [the government’s Streaming Act or Online News Act],” Hatfield said.
“If you do it wrong, it can cause even more harm. So I think it’s really important that we get this right.”