As lawmakers and advocacy groups call for more child and teen safety online, tech giants are under pressure to provide more protection for minors.
In its latest effort to address the issue, Google last week unveiled a policy that gives minors more control over their images in the platform’s search tool.
Kids and teens, along with their parents and guardians, can request in a few online steps that their images be removed from search results.
As a result, these images won’t appear in the Images tab or as thumbnails in any Google Search feature, the company said in a blog post. It added that removing an image from Google results doesn’t remove it from the internet, and that people should also contact a site’s webmaster to ask that the content be taken down.
“We believe this change will help give young people more control over their digital footprint and where their images can be found on Search,” Google said.
In October, Facebook, another tech giant, announced it would be doing more to protect kids on its platforms, including Instagram and WhatsApp, from harm and bullying.
The added safety measures also extend to protecting politicians and public figures from unwanted harassment, particularly harassment of a sexual nature.
“We do not allow bullying and harassment on our platform, but when it does happen, we act,” Facebook said in a statement. “We also regularly pressure test these policies with our safety experts, making changes as…