(Reuters) – Instagram said on Tuesday it will be stricter about the types of content it recommends to teens in the photo-sharing app and will nudge them toward different areas if they dwell on one topic for a long time.
In a blog post, the social media service announced a slew of changes for teen users. Instagram head Adam Mosseri is set to testify at a congressional hearing on Wednesday about protecting kids online.
Instagram and its parent company Meta Platforms Inc, formerly Facebook, have been under scrutiny over the ways their services could harm the mental health, body image and online safety of younger users.
In the post, Mosseri also said Instagram was switching off the ability for people to tag or mention teens who do not follow them on the app. He said that starting in January, teen Instagram users would be able to bulk delete their content and previous likes and comments.
He said Instagram was exploring controls to limit potentially harmful or sensitive material suggested to teens through its search function, hashtags, short-form video Reels and its ‘Suggested Accounts’ feature, as well as on its curated ‘Explore’ page.
The blog post also said that on Tuesday, Instagram was launching its ‘Take a Break’ feature in the United States, United Kingdom, Canada and Australia, which reminds people to take a brief pause from the app after using it for a certain amount of time.
It said Instagram in March next year would launch its first tools for parents and guardians to see how much time their teens spend on the app and set time limits.
Republican Senator Marsha Blackburn criticized the company’s product announcement as “hollow,” saying in a statement: “Meta is attempting to shift attention from their mistakes by rolling out parental guides, use timers, and content control features that consumers should have had all along.”
An Instagram spokeswoman said the company would continue its pause on plans for a kids' version of Instagram, a project it suspended in September amid growing opposition.
The move followed a Wall Street Journal report that said internal documents, leaked by former Facebook employee Frances Haugen, showed the company knew Instagram could have harmful mental health effects on teenage girls, for example on their views of body image. Facebook has said the leaked documents have been used to paint a false picture of the company’s work.
State attorneys general and lawmakers had also raised concerns about the kids-focused app.
Last month, a bipartisan coalition of U.S. state attorneys general said it had opened a probe of Meta for promoting Instagram to children despite potential harms.
(Reporting by Elizabeth Culliford in New York; Editing by David Gregorio and Dan Grebler)