Australia is moving ahead with a world-first requirement that search engines like Bing and DuckDuckGo implement age-verification tools to protect minors from harmful online content. Under new draft industry standards released by the eSafety Commissioner, search services must either prevent access to material deemed inappropriate for children—such as pornography, self-harm, and extreme violence—or introduce mechanisms to verify a user’s age before allowing access to such content. While the rules stop short of mandating specific verification technologies, they demand measurable compliance and accountability from search providers.
Google is exempt from the immediate requirements because of its ongoing legal challenge against the regulator's existing takedown notice for explicit content, though it could still face future enforcement. Privacy advocates and tech companies have raised concerns about the implications of age checks for user anonymity and data privacy, warning of potential overreach and surveillance. The Australian government is pressing forward regardless, positioning the policy as a critical step toward making the internet safer for children. A final decision on the draft standards is expected later this year, following public consultation.
Background Links:
- eSafety Commissioner site – Age verification consultation
Details the 2023 roadmap, stakeholder consultations, technical assessments, and supporting documents underpinning the current regulatory push. (eSafety Commissioner)
- ACS/AI Australia – "Australians to face age checks from search engines"
Summarizes the requirement for age assurance systems for logged-in users under 18 within six months, default "safe search," and the types of age verification under consideration. (Information Age)
- Biometric Update – "Australia's safety code for search tools takes effect"
Reports that the first three of nine online safety codes (covering search engines, hosting, and carriage services) were officially registered on July 7, 2025. (Biometricupdate.com)