Digital Smarts - Google Implementing Policies to Protect Minors

Google is now blocking ad targeting based on the age, gender, or interests of users under 18, accepting minors' requests to have their images removed from search results, and disabling location history for minors' accounts. The company is also rolling out protections on YouTube, such as defaulting video uploads by users aged 13 to 17 to a private setting and removing "overly commercial content" from YouTube Kids.

Google says the changes respond to new regulations being introduced in some countries, and that it wants to offer “consistent product experiences and user controls” globally. Requesting an image’s removal from Google’s image search won’t remove it from the web entirely, the company cautions, but it says this should give minors more control over the spread of their images. Alongside its changes to ad targeting, Google also says it’s expanding safeguards to stop “age-sensitive ad categories” from being shown to teens.

The new features are being introduced on different timelines. The option to request that images be removed from Google’s image search, as well as the changes to default YouTube video privacy settings, will roll out in the coming weeks. The new restrictions on ad targeting, SafeSearch changes, and tools to block content on Google Assistant-enabled smart devices are launching in the coming months.