Meta has launched new safety features to protect teenagers on its platforms, including tools to manage messages and one-tap block and report options.
The big picture: Meta disclosed that it has removed roughly 635,000 accounts: about 135,000 that left sexualized comments or requested sexual images from accounts of children under 13, plus another 500,000 accounts linked to those offenders.
- The move comes amid growing concern about social media's impact on young users' mental health, with Meta aiming to shield children from predators and scammers seeking to exploit them for explicit content.
- In response, teenage users have blocked over a million accounts and reported an additional million after receiving safety notifications prompting them to exercise caution with private messages and report any unsettling content.
- Meta is also using artificial intelligence to detect users who lie about their age on Instagram, switching misrepresented accounts to teen status, which carries tighter restrictions than adult accounts, including limiting private messages to people they follow or are already connected to.