City of Nelson pushes for federal ‘duty of care’ law for digital platforms

Is it a municipal government’s job to confront online companies such as OpenAI, X, and Roblox about the online harms they cause?

Nelson city councillor Keith Page thinks it is.

He says society expects a car manufacturer or a food producer to be held accountable for the quality of their products and to face consequences if that product proves to be harmful. Online platforms, he says, should be held to the same duty of care.

At council’s Feb. 3 meeting, Page proposed that Nelson council join with other municipalities to lobby the federal government to create new consumer protections against sextortion, sexual deepfakes, encouragement of self-harm, online grooming and other abusive practices that are especially dangerous for young people.

He is involved in discussions about this with his fellow tech professionals and with politicians across the country. But he also deals with it at street level in Nelson.

As the owner of The Repair Factory, a computer service company, Page says parents often ask for advice about how to protect their kids online.

“They want to know what they should be looking out for,” he says. “They need to lock down their systems, they need to understand what devices and programs their kids can be involved in because it’s often not clear. A lot of these things are pitched to family audiences (like Roblox and the YouTube Kids app) but then underneath there aren’t real human moderators, it’s all algorithmic.”

Page says the gaming platform Roblox, “a game delivered to our homes on a regular basis,” has somehow survived multiple investigations into the presence of active predators in its spaces designed for children. But the company, Page says, still does not take its duty of care seriously because, like many online platforms, it is concerned only with its numbers and earnings.

Stay in your lane?

When it was suggested at the council meeting that this issue is not within the jurisdiction of a municipal government, Page argued that this abusive activity happens in every local community.

Page told council that online social media and gaming platforms have become digital public spaces.

“It is daily life, in every home, throughout our lives, so there is a consumer responsibility, a duty of care, to create spaces that are … good for the community and the well-being of the people who use them.”

In an interview following the meeting, he said a municipal councillor’s job is not only to deal with local issues but to lobby senior governments about issues that affect everyone. He said families are struggling with these issues of online harm and “it makes sense to take it to the federal and provincial governments and say, ‘We’re seeing this kind of struggle and harm in our community. What can you do for us?’”

The lobbying mechanism is a chain of influence starting with the Association of Kootenay Boundary Local Governments (AKBLG), then the Union of BC Municipalities (UBCM), and finally the Federation of Canadian Municipalities (FCM). The latter two organizations have influence over senior governments, and they regularly lobby on behalf of local governments about a wide variety of issues.

Nelson City Council unanimously agreed to take a resolution to the annual meeting of the AKBLG to be held in Trail in April.

The resolution proposes that the AKBLG and the UBCM “advocate to the government of Canada to establish clear federal legislation creating a duty of care for digital platforms, requiring reasonable measures to prevent foreseeable harm, protect youth and vulnerable users, and ensure accountability where platforms fail to meet these obligations.”

If the resolution passes at the AKBLG, it will then be presented to the annual meetings of the UBCM and the FCM later this year.

Councillor Jesse Woodward, supporting the resolution, said during the meeting he has to be vigilant at home about his child’s computer use.

“I am sure for every household with young kids, it feels like all that weight lands in the parents’ lap, rather than the companies. We have to end up being the guardrails – this is an everyday experience as a parent.”

Legislation or not

In 2024, following the suicide of a 12-year-old in Prince George who fell victim to online sextortion, the provincial government drafted online harms legislation (Bill 12, the proposed Public Health Accountability and Cost Recovery Act).

At the time, Premier David Eby likened addictive and toxic algorithms to the dangers presented by tobacco and opioids.

But in April 2024, the B.C. government put the proposed law on hold after coming to a deal with Meta, TikTok, X and Snapchat to meet collaboratively to find ways to prevent harm.

This did not prevent another online harms tragedy in B.C. in February. After the mass shooting in Tumbler Ridge, B.C., OpenAI, the company behind ChatGPT, disclosed it had not reported its prior intelligence about the shooter’s state of mind to the police.

“From the outside, it looks like OpenAI had the opportunity to prevent this tragedy, to prevent this horrific loss of life, to prevent there from being dead children in British Columbia,” Eby said.

Meanwhile in Ottawa, the federal government drafted Bill C-63 in 2024 to enact the proposed Online Harms Act. The bill was then split into two sections – one about protection of children and the other about violence, hate speech, and terrorism.

Neither bill has yet reached the floor of the House of Commons.

But Page thinks this will happen soon. He is privy to national discussions about this through his membership on three committees of the Federation of Canadian Municipalities: equity and inclusion, social and economic development, and the rural caucus.

“I think we’re seeing a lot of federal movement,” he says. “You see conversations from the Minister of AI, specifically around ChatGPT and the child sexual abuse material on Grok and X, and you’re starting to see political awareness from federal leaders that this is still an issue.”

In his discussions of this issue Page often returns to the central concept of duty of care.

“When companies develop products for the Canadian market and the Canadian consumer, we have a regulatory responsibility to ensure that those products are operated responsibly and that the money they make is to the benefit of Canadians overall and not a detriment to our sovereignty or our ability to talk to each other, or frankly the safety of our children.”
