Senator Markey, Representatives Castor and Trahan urge FTC to use its authority to force tech companies to comply with new platform policies
In response to the United Kingdom's new Age Appropriate Design Code (AADC), popular apps and websites recently announced significant changes to their official policies for young users.
“The need to protect young people from threats to online privacy is more urgent than ever. Since 2015, American children have spent nearly five hours a day staring at their screens, and the daily screen time of children and adolescents has increased by 50% or more during the coronavirus pandemic,” the lawmakers wrote in their letter. “We therefore encourage you to use all the tools at your disposal to carefully scrutinize companies’ data practices and ensure that they meet their public commitments.”
In response to the AADC, Instagram publicly announced that it is “defaulting young people into private accounts, making it harder for potentially suspicious accounts to find young people, [and] limiting the options advertisers have to reach young people with ads.” Google and its affiliate YouTube have announced that they will “make product experiences suitable for children and teens” by changing the default video upload setting to “private” for teens aged 13 to 17; turning off location history (with no option to turn it back on) for users under the age of 18; and “block[ing] ad targeting based on the age, gender, or interests of people under 18,” among other changes. Last year, similarly ahead of the AADC’s enactment, TikTok said it had disabled direct messaging for accounts belonging to users under 16 and expanded parental controls.
The lawmakers note, “These policy changes do not replace Congressional action on children’s privacy, but they are important steps in making the Internet safer for young users.”