Arab Canada News
Published: December 16, 2023
Collège Béliveau, a Winnipeg school, is dealing with the dark side of artificial intelligence after AI-generated nude images of underage students were discovered circulating online.
An email sent to parents on Thursday afternoon said school officials learned late Monday that altered images of students in grades seven through twelve at the French immersion school had been shared online, and that they had contacted police.
The message said, "We are grateful and proud of the students who came forward to bring this matter to our attention."
The mother of one of the girls whose altered images were among those circulating told CBC News she hopes the person responsible is held accountable, and questioned why AI companies allow this to happen.
Another mother, who has a son in grade twelve at Collège Béliveau, said she was shocked by the news.
Noni Kopchowski said, "I think it’s more terrifying for the girls than for the boys, and that’s just my opinion."
"I have three boys, but I think the best advice for my kids is to always be respectful... Put yourself in someone else’s shoes, this is hurtful and the damage it causes is long-lasting."
Kopchowski said there had been discussions about AI in her home, including the risks that come with its use.
She said, "It’s gotten scarier there, we are much more teaching ourselves."
The school said the original images appear to have been collected from publicly available social media and then altered to make them sexually explicit. It did not say how many images it believes were shared or how many girls were victimized.
The email to parents said the school "is investigating to gain a better understanding of the extent of what happened and who was involved," and officials are "taking necessary steps to respond to the actions of individuals who shared these images."
The email added, "While we cannot assume we have evidence of all the manipulated images, we will directly contact the caregivers of those students whose images were altered."
According to its website, the school, located in the Windsor Park neighborhood, has just under 600 students.
School officials have also contacted Cybertip.ca, an online abuse tip line run by the Canadian Centre for Child Protection in Winnipeg.
Images received by the school will be uploaded to Cybertip’s Arachnid project, which can help have them removed.
Danny MacKinnon said the police exploitation unit is investigating, but that it is too early in the active investigation to provide further details. He added, "Artificial intelligence is a new, complex, and delicate part of our world. It's definitely the latest form of stealthy behavior; we are in unknown territory."
No charges have been laid at this time.
School officials said support teams are available for any student affected directly or indirectly by what happened.
The school’s email stated that additional support and resources are available through Cybertip.ca and Need Help Now, another Canadian resource for youth and families.
Cybertip's director, Steven Swaine, said the exploitation of youth through images altered without their knowledge in photo-editing programs and then shared online has been a troubling issue for years.
"AI has definitely accelerated that," he said.
Although there are cases of AI being used to create child sexual abuse material in Canada, including a case in Quebec, Swaine is not aware of a previous case in this country where teenagers at a school were victimized by other teenagers in this way.
It’s not clear at this stage whether the person responsible is a fellow student, and neither the police nor the school commented on that.
Swaine was also unable to speak specifically about the investigation at Collège Béliveau, but he said people who create explicit images often do not understand the long-term consequences and effects.
He said, "Even if these materials were created because people think they’re funny or think it’s [a] harmless prank... it can still be considered child pornography or child sexual abuse material under the law."
"These materials can come back to haunt the victim later in other stages of life if shared online with personal information."
The Arachnid project, a web crawler that scans the internet for known child sexual abuse images and issues notifications to companies to have them removed, can reduce the spread of such material online. But Swaine said the key is catching it quickly, before it has been widely distributed.
He added, "There is definitely no magic solution, but what it does is send notifications to companies to alert them when material is posted on their service... so they can keep their networks clean."
Mora Grossman, a research professor at the University of Waterloo's School of Computer Science who has studied the real-world implications of AI-generated images, said this appears to be the first case in Canada of students deepfaking other students in this way.
She said there have been a few recent cases in the United States, in New Jersey and Seattle, "but I haven’t heard it has reached Canada."
"It’s somewhat alarming and not difficult to do, and you can do it for free, there are many websites online."
Although the ability to swap faces onto other bodies has existed for some time, the technology used to be poor, Grossman said. "Now anyone can create convincing images."
The hard part now is figuring out how to control it.
She explained, "You’re dealing with people all over the world, and it’s very difficult to have jurisdiction over this person in a legal matter."
"It’s very hard, and I think this problem will get worse, not better."