
Geoff Johnson: Legislation alone can't tackle online harms to kids

Parents need to play a role in ensuring children and teens aren’t exploited online — and that includes not running your child’s Instagram account where they can be preyed on by pedophiles
A recent investigation by two New York Times reporters looked at parents who operate social media accounts for their young daughters, often in hopes of turning the girls into influencers or models. Predictably, many of these accounts have attracted a following of men who acknowledge on other platforms that they are sexually attracted to children, writes Geoff Johnson. MICHAEL DWYER, AP FILES

Invasion of privacy and dignity is an ongoing problem across the country.

The Canadian Centre for Child Protection says millions of cases of suspected child-sexual-abuse material on sites such as Facebook, Instagram, TikTok and Pinterest are flagged each year, according to a January CBC News item.

Federal legislation known as the Online Harms Act (Bill C-63) passed second reading as of March 12. Once implemented, it will amend the Criminal Code, the Canadian Human Rights Act and the Mandatory Reporting Act.

The latter imposes reporting duties on Internet service providers when they are advised of an Internet address where child-sexual-abuse material may be available to the public, or if they have reasonable grounds to believe that their Internet service is being or has been used to commit a child-sexual-abuse material offence.

Bill C-63 puts some teeth into B.C.’s Intimate Images Protection Act, which came into effect Jan. 29 and applies retroactively to March 6, 2023.

As reported by the CBC, B.C. is the ninth province in Canada to draft and enact an intimate images act, which provides a path for victims to regain control of their private images and for perpetrators to be held accountable.

The legislation should encourage the B.C. School Trustees Association to work with individual school boards to develop both local policies and regulations to govern the inappropriate use of social media by anybody on school property.

If only it were that simple. It is not.

A recent investigation by two New York Times reporters, Jennifer Valentino-DeVries and Michael H. Keller, analyzed 2.1 million Instagram posts, monitored months of online chats of professed pedophiles and interviewed more than 100 people, including parents and children.

Valentino-DeVries and Keller unearthed a world of Instagram influencers whose accounts are managed by their parents. Although the site prohibits children under 13, parents can open so-called mom-run accounts for them, and those accounts can live on even when the girls become teenagers.

Their investigation revealed, unbelievably, examples of parents operating accounts for their young daughters, often in hopes of turning the girls into influencers or models. Predictably, many of these accounts have attracted a following of men who acknowledge on other platforms that they are sexually attracted to children.

According to Times writer David Leonhardt, one calculation performed by an audience demographics firm for the NYT investigation found 32 million connections to male followers among the 5,000 accounts examined by the Times.

While there is still much that researchers don’t understand about the uses and abuses of digital technology and smartphones, such technology is here to stay and, properly supervised, is probably not only necessary but healthy.

But in the absence of proper supervision, the notion that smartphones are beneficial or harmless to mental health on the whole — an argument that technology executives sometimes make — looks much weaker than it once did.

And that’s why simply passing legislation like the Intimate Images Protection Act might not be enough to solve the problem.

As mentioned in a previous column, the widely distributed B.C. Adolescent Health Survey (also known as the McCreary Report) includes an analysis of some of the impacts of COVID-19 on B.C. youth ages 12-19.

Kids now are more likely to live on their screens, less likely to have in-person friends and more likely to have online friends whom they have never met in person.

Why is it so difficult, despite attempts to legislate it, to control the negative influence of social media on teenagers?

Natasha Tusikov, an assistant professor at York University and author of Chokepoints: Global Private Regulation on the Internet, notes that the few proposals governments have brought forward fail to tackle what she says is the heart of the problem: the business model.

Maximizing user engagement generates advertising revenue, so social-media companies make money regardless of “whether the content is excellent, or whether it’s terrible, disgusting, hateful content,” Tusikov writes.

In 2022, a Global News report by Rachel Gilmore quoted Sulemaan Ahmed, who has worked in senior positions with brands such as Apple, Sears and Air Canada and was responsible for driving their social-media initiatives globally, as saying: “I think parents have a responsibility, and government does, to ensure that when [children] are young and impressionable, that they don’t see certain things that could be traumatizing to them.”

Will legislation control what kids can see on social media? Probably not, but parents must.

[email protected]

Geoff Johnson is a former superintendent of schools.

P.S. This will be my final column — at least for the time being. At age 80-plus, it is time for a break. I would like to thank those folks at the Times Colonist for their continued guidance, patience with my ramblings and interest in issues surrounding public education.

I’d also like to thank those many folks who have read my stuff and responded encouragingly by email.