Instagram is beginning to roll out changes to help verify the ages of its youngest US users, following reports last year that its parent company, Meta, had downplayed the platform's potentially negative effects on teens.
On Thursday, Instagram published a blog post outlining the methods it is testing to verify a user's age. It describes two new options: submitting a selfie video for age verification, and "social certification," which requires three adults to vouch for the user's age.
Both options build on an earlier requirement that users upload their ID cards to confirm their age. That approach, however, raised privacy and security concerns and was easily spoofed by submitting someone else's ID or a fake one.
“Understanding someone’s age online is a complex challenge for the entire industry. We want to work with others in our industry and with governments to establish clear standards for online age verification,” Instagram said in the blog post.
Instagram is partnering with Yoti, a British identity verification firm, to estimate whether the face in a submitted selfie video matches the user's stated age. According to the company, Yoti is certified under the Age Check Certification Scheme, a UK-based program, and is also endorsed by German regulators.
Yoti builds its data sets using anonymized images of people around the world who have consented to Yoti using their likeness to train its software. Both companies maintain that the selfie video will be deleted as soon as the user's age is verified.
As for the social certification option, a user must ask three mutual followers to confirm their age. Those vouchers must be at least 18 years old and cannot vouch for multiple people at the same time. The three adults will then receive a request from Instagram and have three days to confirm the prospective user's age.
Since 2019, Instagram has asked new users to provide their age, and in 2021 it made doing so a requirement to sign up for an account.
Scrutiny of the service intensified after a trove of leaked Facebook documents revealed internal research showing the company was aware of Instagram's potentially negative impact on teens, contradicting its assurances to the public.
Last year, Instagram began developing an "Instagram for kids" that would offer more parental involvement and stricter privacy settings on a user's account. The company shelved the plan in September 2021 in favor of building better parental monitoring tools into the existing platform.