TORONTO –
Facebook whistleblower Frances Haugen feels “extremely hopeful” about society’s ability to push social media platforms into being safer, but for change to come, she says these companies need to be motivated in a new way.
“We are not powerless,” said Haugen in an interview during a visit to Toronto, where she was due to speak at a conference aimed at online safety for kids.
“These systems are not impossible to fix. It’s just we lack the incentives today to have these platforms act in a positive way.”
Haugen’s criticism of social media platforms and the broader societal systems that have fostered them has mounted over the course of her more than 20 years in tech, including stops at Google, Hinge, Yelp and Pinterest.
She reached a breaking point in 2021 and disclosed thousands of internal documents from her previous employer Facebook, which detailed how the company prioritized profits over addressing problems it knew existed on its social network.
One of the internal studies she leaked showed 13.5 per cent of teen girls in the U.K. said their suicidal thoughts intensified after joining Instagram. Another said 32 per cent of teen girls who felt bad about their bodies felt even worse after visiting the platform.
Three years later, Haugen doesn’t feel getting the platforms to change is a lost cause, even if the amount of research suggesting they’re harmful has only grown.
The trick, she said, lies in changing what she sees as the businesses’ current modus operandi: to make as much money as possible, even if it comes at a cost to kids, families and society.
She’d like to see the platforms instead be incentivized by data she wants them to compile and release monthly, resembling a scorecard of sorts.
The data would cover roughly 30 problems through 100 metrics, such as how many kids received unwanted sexual communication on each platform in the past week.
She predicts the results would convince advertisers to spend on safer platforms — and quickly.
“If advertisers had this data, they would move fast,” Haugen said.
The data would also bolster non-profits working to improve social networks, which Haugen said are underfunded.
“Part of the reason for that is it’s very hard to demonstrate you accomplished anything,” she said.
“In a world where you had those 100 metrics that were published every month … I guarantee you would have 100 non-profits founded because they would say ‘I’m going to go after this number.'”
Haugen’s vision for social media doesn’t stop at the release of data. She has a long wish list for the platforms, including several ideas that are aimed at protecting children but are just as likely to be useful for adults.
Among them is a suggestion that platforms could detect when users were up late scrolling through their feed and then ask the next day what time someone hoped to go to bed.
If they entered something like 11 p.m., Haugen said the company could then begin slowing down the app gradually two hours before.
“Around 11 p.m., they’d get tired and go to bed (because) it’s less satisfying,” Haugen said.
The idea is based on a reality Haugen said tech companies have known about for at least 20 years: “If you add a little latency into an experience, people use it less” but often stick around long-term.
She also has ideas about what platforms could do to help people who get served reams of content that makes them sad or provokes negative thoughts.
Haugen thinks these issues could be improved if companies allowed users to reset their algorithms.
“Imagine you’re a kid and you start getting pulled down an eating disorder rabbit hole. You got shown an ad for a diet lollipop and they picked up that you lingered on there for a little while. You didn’t even click it and now you’re getting all this food and body dysmorphia content,” Haugen said.
“Imagine if that kid could just push a button and reset to the vanilla algorithm.”
The platforms could also detect when people have looked at upsetting content or posts that make them feel insecure for long periods of time. They could then present users with a pop-up noting that exposure and asking if they still want to be served that content.
While social media platforms haven’t implemented the kinds of tools Haugen dreams of, many have made changes directed at safety for kids.
TikTok gives teens one-hour screen time limits that can only be bypassed with a code from their parents. The platform also offers family pairing, which allows parents to link their accounts directly with their teens’ and ensure their kids’ TikTok settings are agreed upon as a family.
Snap has measures designed to keep teens from being contacted by anyone other than friends or people who already have their phone number, and location-sharing is turned off by default.
“From the beginning, we made deliberate design choices to help prevent the spread of misinformation or harmful content — including moderating user content before it can reach a large audience – and we don’t have live streaming,” Snap’s head of communications, Tonya Johnson, said in an email.
Last week, Instagram said it would automatically default to giving teens private accounts that nudge them to spend less time logged on and have a sleep mode muting notifications between 10 p.m. and 7 a.m.
“We know parents want to feel confident that their teens can use social media to connect with their friends and explore their interests, without having to worry about unsafe or inappropriate experiences,” Meta said in a statement.
“Teen Accounts were designed to address those concerns and provide teens with an experience designed specifically for them, with automatic defaults in place so parents can be reassured that their teens have heightened protections in place,” the statement said.
TikTok did not respond to a request for comment.
While Haugen thinks the focus on making platforms safer for kids is important, what she has seen so far has plenty of loopholes. For example, she said Instagram’s sleep mode hours default to allowing kids to keep getting notifications during the school day, a period during which parents and teachers are trying to keep students off their phones.
The battle reached a new height last year when several Ontario school boards sued Meta, TikTok and Snap for billions over accusations that the companies negligently design their products for compulsive use and rewire the way children think, behave and learn.
Many provinces, including Ontario, Saskatchewan, Nova Scotia, Manitoba and Alberta, upped the ante even further, banning cellphones from classrooms this year.
Haugen applauded the moves.
“If you’re splitting your attention between math class and your phone, there’s no way you can keep up,” she said.
“There’s studies that show you lose like a half a year of instruction, having your phone in your lap because you just can’t focus.”
This report by The Canadian Press was first published Sept. 26, 2024.