Facebook Will Try to ‘Nudge’ Teens Away From Harmful Content

By Reuters
October 10, 2021
This illustration photo shows the Facebook logo on a smartphone in front of a computer screen in Los Angeles, Calif., on Aug. 12, 2021. (Chris Delmas/AFP via Getty Images)

WASHINGTON—A Facebook executive said Sunday that the company would introduce new measures on its apps to nudge teens away from harmful content, as lawmakers scrutinize how Facebook and subsidiaries like Instagram affect young people’s mental health.

Nick Clegg, Facebook’s vice president of global affairs, also expressed openness to the idea of letting regulators have access to the Facebook algorithms used to amplify content. But Clegg said he could not say whether those algorithms had amplified the voices of people who breached the U.S. Capitol on Jan. 6.

The algorithms “should be held to account, if necessary, by regulation so that people can match what our systems say they’re supposed to do from what actually happens,” Clegg told CNN’s “State of the Union.”

He spoke days after former Facebook employee and whistleblower Frances Haugen testified on Capitol Hill about how the company entices users to keep scrolling, harming teens’ well-being.

Facebook whistleblower Frances Haugen appears before the Senate Commerce, Science, and Transportation Subcommittee during a hearing entitled ‘Protecting Kids Online: Testimony from a Facebook Whistleblower’ at the Russell Senate Office Building in Washington on Oct. 5, 2021. (Matt McClain/Pool via Getty Images)

“We’re going to introduce something which I think will make a considerable difference, which is where our systems see that the teenager is looking at the same content over and over again and it’s content which may not be conducive to their well-being, we will nudge them to look at other content,” Clegg told CNN.

In addition, “we’re introducing something called ‘take a break,’ where we will be prompting teens to just simply take a break from using Instagram,” Clegg said.

U.S. senators last week grilled Facebook on its plans to better protect young users on its apps, drawing on leaked internal research that showed the social media giant was aware of how its Instagram app damaged the mental health of youth.

Sen. Amy Klobuchar (D-Minn.), who chairs the Senate Judiciary Committee’s antitrust subcommittee, has argued for stricter regulation of technology companies like Facebook.

“I’m just tired of hearing ‘trust us,’ and it’s time to protect those moms and dads that have been struggling with their kids getting addicted to the platform and been exposed to all kinds of bad stuff,” Klobuchar told CNN on Sunday after Clegg’s interview.

She said the United States needs a new privacy policy so that people can “opt in” if they favor allowing their online data to be shared. The United States also should update children’s privacy laws and its competition policy, and require tech companies to make their algorithms more transparent, Klobuchar said.

Clegg noted that Facebook had recently put on hold its plans for Instagram Kids, a version of the app aimed at pre-teens, and was introducing new optional controls for adults to supervise teens.