
The Instagram logo is seen on a cellphone in Boston, USA, Oct. 14, 2022. (AP Photo)
Instagram is testing artificial intelligence in Canada to identify users who may be underage, even if they list an adult birthdate.
The platform’s new AI system examines patterns such as account creation dates, user interactions, and content engagement to estimate the likelihood of someone being a teen or an adult. By analyzing these behaviors, Instagram hopes to protect younger users more effectively.
How the AI Works
Instagram’s AI model relies on behavioral data rather than just the birthdate entered. Teens and adults often interact differently with content and profiles. The system uses these differences to make informed predictions about a user’s true age.
If the AI suspects a teen has entered a false adult birthdate, the platform will automatically shift the account to a teen profile. Teen accounts come with special protections. These include limits on who can contact them, restrictions on certain content, and tools to monitor screen time.
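The behavior described above, scoring behavioral signals and automatically switching a flagged account to a teen profile, can be sketched in code. This is purely illustrative: Instagram has not published its model, and every signal name, weight, and threshold below is a hypothetical stand-in for whatever the real system uses.

```python
# Illustrative sketch only. Instagram has not disclosed its actual model;
# all signal names, weights, and the 0.6 threshold here are hypothetical.

def estimate_minor_probability(signals: dict) -> float:
    """Combine toy behavioral signals into a pseudo-probability (0.0 to 1.0)."""
    score = 0.0
    if signals.get("account_age_days", 9999) < 90:
        score += 0.3  # very new accounts lean younger in this sketch
    if signals.get("follows_mostly_teen_accounts", False):
        score += 0.4
    if signals.get("engages_with_teen_content", False):
        score += 0.3
    return min(score, 1.0)

def apply_account_policy(signals: dict, threshold: float = 0.6) -> dict:
    """Shift the account to a teen profile when the score crosses the threshold."""
    if estimate_minor_probability(signals) >= threshold:
        return {
            "account_type": "teen",
            "private_by_default": True,     # default private account
            "contact_limits": True,         # limits on who can contact the teen
            "content_restrictions": True,   # restrictions on certain content
            "screen_time_tools": True,      # tools to monitor screen time
        }
    return {"account_type": "adult"}

# An account claiming an adult birthdate but showing teen-like behavior:
flagged = apply_account_policy({
    "account_age_days": 30,
    "follows_mostly_teen_accounts": True,
    "engages_with_teen_content": True,
})
print(flagged["account_type"])  # teen
```

The key design point the sketch captures is that the classification runs on observed behavior rather than the self-reported birthdate, so a false adult birthdate alone does not keep the protections from being applied.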
Strengthening Teen Safety
Instagram began implementing these teen-specific accounts last year. The platform designed the measures to provide a safer, more controlled environment for young users. Features include default private accounts for teens and limitations on how advertisers can reach them.
The new AI is an additional layer of protection, aiming to prevent teens from bypassing age restrictions by entering misleading information.
Previous Measures and Challenges
Previously, Instagram could catch some users lying about their age, for example when a user first entered a teen birthdate and later switched it to an adult one, or when birthday interactions on the account suggested the user was actually a minor.
However, these methods relied heavily on manual checks and user reports. The AI system is expected to streamline the process, making age verification more accurate and consistent.
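One of the earlier checks described above, flagging accounts that first entered a teen birthdate and later switched to an adult one, is simple enough to sketch as a rule. The field names and the rule itself are hypothetical; Instagram has not published how these checks worked.

```python
# Illustrative sketch of one rule-based check described in the article.
# The history format and the rule are hypothetical assumptions.

def birthdate_switch_is_suspicious(history: list) -> bool:
    """Flag accounts whose birthdate history ever implied a teen age
    but currently implies an adult age."""
    ages = [entry["age_at_entry"] for entry in history]
    return any(age < 18 for age in ages) and ages[-1] >= 18

# A user who registered as a 15-year-old, then changed the birthdate to read 21:
print(birthdate_switch_is_suspicious([
    {"age_at_entry": 15},
    {"age_at_entry": 21},
]))  # True
```

Rules like this are cheap but brittle, which is consistent with the article's point that the older approach leaned on manual checks and user reports, while a trained model can weigh many weaker signals at once.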
Privacy Considerations
Instagram has emphasized that the AI does not use personal messages to determine age. Instead, it focuses on publicly available behavior patterns and interactions. The company says this approach balances user privacy with the need for enhanced safety measures for teens.
Looking Ahead
Canada is the first country where Instagram is piloting the system. The company has not announced whether it will expand to other countries.
The move reflects growing concern about teen safety online, especially as younger users navigate social media platforms with limited understanding of privacy risks. By combining AI with existing safety measures, Instagram aims to create a more secure space for its youngest users.
Experts say the initiative could set a precedent for other social media platforms seeking to balance accessibility with protection. The company plans to monitor the AI’s effectiveness and adjust its approach based on results and user feedback.
Through this initiative, Instagram demonstrates a commitment to keeping teenagers safer while providing them with a tailored online experience.

