Snapchat launched parental controls on its app last year through its new ‘Family Center’ feature. Today, the company announced in a post on its Privacy and Safety Hub that it will add content filtering capabilities, allowing parents to restrict teens from being exposed to content identified as sensitive or suggestive.
To enable the feature, parents can toggle on the “Restrict Sensitive Content” filter in Snapchat’s Family Center. Once enabled, teens will no longer see the blocked content on Stories or on Spotlight, the platform’s short-form video section. The text under the toggle specifies that turning on this filter won’t affect content shared in Chat, Snaps, and Search.
Accompanying this change, Snapchat is also publishing its content guidelines for the first time, giving creators on Stories and Spotlight more insight into what kinds of posts may be recommended on the platform and what content will now be considered “sensitive” under its community guidelines. The platform said it had previously shared these guidelines with a set of creators in the Snap Stars program and with its media partners, but it is now making them available to everyone through a page on its website.
The company already prohibits hateful content, terrorism, violent extremism, illegal activity, harmful false or deceptive information, harassment and bullying, threats of violence, and more from appearing on its platform. The guidelines now specify what content in various categories will be considered “sensitive”: content that may be eligible for recommendation but may be blocked from teen users under these new controls, or from other users based on their age, location or personal preferences.
For example, under the sexual content category, Snap explains that content will be considered “sensitive” if it includes “all nudity, as well as all depictions of sexual activity, even if clothed, and even if the imagery is not real” (such as AI-generated images). The category also covers “explicit language” describing sex acts and other sex-related topics like sex work, taboos, genitalia and sex toys, along with “overtly suggestive imagery,” “insensitive or demeaning sexual content” and “manipulated media.”
It addresses what will be considered sensitive in other categories as well, including harassment, disturbing or violent content, false or deceptive information, illegal or regulated activities, hateful content, terrorism and violent extremism, and commercial content (overt solicitation to buy from non-approved creators). This covers a range of material, like depictions of drugs, engagement bait (“wait for it”), self-harm, body modifications, gore, violence in the news, graphic imagery of human physical maladies, animal suffering, sensationalized coverage of disturbing incidents, like violent or sexual crimes, dangerous behavior, and much, much more.
The changes come long after a 2021 Congressional hearing where Snap was grilled about surfacing adult-oriented content in the app’s Discover feed, such as invites to sexualized video games and articles about going to bars or porn. As Senators rightly pointed out, Snap’s app was listed as 12+ in the App Store, but the content it was sharing was clearly intended for a more adult audience. Even some of the video games it advertised were rated for older users.
“We hope these new tools and guidelines help parents, caregivers, trusted adults and teens not only personalize their Snapchat experience, but empower them to have productive conversations about their online experiences,” the social media company said in a blog post.
However, while the new feature may go a long way toward limiting teens’ exposure to sensitive content in some areas, it doesn’t tackle one of the areas Congress called out: the Discover feed. There, Snap features publisher content, including from publishers whose material could be considered “sensitive” under its guidelines. It’s frankly a lot of clickbait. And yet this area is not addressed by the new controls.
Plus, the feature requires parents to take action by turning on a toggle they likely don’t even know exists.
In short, this is another example of how the lack of legislation and regulations regarding social media companies has led to self-policing, which doesn’t go far enough to protect young users from harm.
In addition to the content controls, Snap said that it is working on adding tools to give parents more “visibility and control” around teens’ usage of the new My AI chatbot.
Last month, the social network launched this chatbot, powered by OpenAI’s GPT technology, as part of the Snapchat+ subscription. Notably, Snapchat’s announcement comes after the chatbot went rogue while chatting with a Washington Post columnist who was posing as a teen: the bot reportedly advised the supposed teen on hiding the smell of pot and alcohol at a birthday party. Separately, researchers at the Center for Humane Technology found that the bot gave sex advice to a user pretending to be 13 years old.
The additional tools targeting the chatbot have not yet been rolled out.