The heightened measures arrive as social media companies face increased scrutiny over how their platforms affect the mental health and well-being of younger users. That includes protecting children from predatory adults and scammers who solicit nude images from them and then use those images for extortion.
Meta said teen users blocked more than a million accounts and reported another million after seeing a “safety notice” that reminds people to “be cautious in private messages and to block and report anything that makes them uncomfortable.”
Earlier this year, Meta began testing the use of artificial intelligence to determine whether kids are lying about their ages on Instagram, which officially requires users to be at least 13. If the system determines that a user is misrepresenting their age, the account automatically becomes a teen account, which carries more restrictions than an adult account. Teen accounts, which Meta made private by default in 2024, also restrict private messages so teens can only receive them from people they follow or are already connected to.
Meta faces lawsuits from dozens of U.S. states accusing it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.