B.C.’s attorney general says if the federal government will not step in to regulate social media and AI chatbot use by youth, then her government will do so itself in concert with other provinces.
“If we see that there is no action, then I think an alliance amongst provinces to take action would be the way,” Attorney General Niki Sharma told reporters on Tuesday.
Sharma sent a letter to the federal government on April 17 providing a rundown of protections that B.C. wants implemented.
These include age-related restrictions on platforms unless companies can demonstrate they are safe for youth; the inclusion of AI chatbots alongside social media platforms in that regime; reporting standards requiring companies to disclose when users may be planning violence; and a mandatory oversight regime to ensure compliance.
The federal Liberal Party voted at a policy convention earlier this month to back an age restriction on social media use, but the government has yet to introduce a bill.
Manitoba plans to take this concept a step further by introducing age-related bans for both social media and AI chatbot use. Sharma backs this type of plan for B.C., but sees the federal government as having better oversight tools than provincial governments.
“I’m really interested in how Manitoba is thinking that they’re going to implement that,” she said. “It’s been our view that the federal level — they have the best tools and the best ability to implement such a ban.”
Restrictions on youth chatbot use would go a step beyond Australia’s recently introduced youth social media ban, which made headlines worldwide. Sharma says that at this stage, a ban that excludes chatbots is “worthless.”
“All social media platforms have AI embedded now in their platforms,” Sharma said. “So if we’re going to have real action and we’re going to think of design standards, and the best interests of vulnerable people and children, then we need to include AI chatbots.”
Sharma mentioned the Tumbler Ridge school shooting in the letter to the federal government, saying the tragedy shows the law needs to “catch up.” OpenAI banned the alleged shooter last summer for troubling online activity, but the company did not report the ban to police until after the shooting. CEO Sam Altman has apologized for the failure.
“We can’t have these companies that control a lot of the wealth in the world also deciding what’s safe or unsafe for our children and our vulnerable people in our society,” Sharma said. “That’s government’s job to step in and set a standard to make sure that people are safe.”
Beyond high-profile cases such as Tumbler Ridge, Sharma said youth use of these platforms is linked to rising rates of eating disorders, suicides and sexploitation.
“Clearly, self-regulation is not working,” she said.