Google says it plans additional privacy measures to protect teenage users on YouTube and its search engine, becoming the latest technology giant to adopt tougher standards in the face of criticism that companies are not doing enough to protect children.
In a blog post on Tuesday, Google announced that videos uploaded to YouTube by users 13 to 17 years old would be private by default, allowing the content to be seen only by the users and people they designate.
Google will also begin allowing anyone under 18 years old, or a parent or guardian, to request the removal of that minor’s images from Google Image search results, the company said. It is unclear how easy or responsive this process will be, given Google’s historical reluctance to remove items from search results.
In addition, Google said it would turn off location history for all users younger than 18 and eliminate the option for them to turn it back on.
The company said it plans to roll out the changes in the “coming weeks.”
There is growing bipartisan support in Washington for pressing technology companies to do more to protect children. In the last few months, lawmakers have introduced two bills, one in the House and one in the Senate, that seek to update the Children’s Online Privacy Protection Act. The 1998 law, known as COPPA, restricts the tracking and targeting of children under 13 years old, and the bills would extend those protections to teenagers.
Google has repeatedly faced scrutiny over its handling of data…