Weeks after Instagram rolled out increased protections for minors using its app, Google is now doing the same for its suite of services, including Google Search, YouTube, YouTube Kids, Google Assistant and others. The company this morning announced a series of product and policy changes that will give younger users more privacy and protection online, along with others that will limit ad targeting.
The changes in Google’s case are even more expansive than those Instagram announced, as they span an array of Google’s products rather than being limited to a single app.
Though Congress has been pressing Google and other tech companies on the negative impacts their services may have on children, not all of the changes are required by law, Google says.
“While some of these updates directly address upcoming regulations, we’ve gone beyond what’s required by law to protect teens on Google and YouTube,” a Google spokesperson told TechCrunch. “Many of these changes also extend beyond any single current or upcoming regulation. We’re looking at ways to develop consistent product experiences and user controls for kids and teens globally,” they added.
In other words, Google is making some changes based on where it believes the industry is headed, rather than where it stands today.
On YouTube, Google says it will “gradually” start adjusting the default upload setting to the most private option for users ages 13 to 17, which will limit the…