Sydney: Search engines in Australia will soon have to blur pornographic and violent images in some cases, to limit the chances that children accidentally encounter this content.
This is one of several rules outlined in a new online safety code covering internet search engines that comes into force on December 27.
Why are these rules being introduced?

In 2022, Australia’s online safety body, eSafety, surveyed more than 1,000 Australians aged 16 to 18 years.
The research found that one in three were under age 13 when they were first exposed to pornography. This exposure was “frequent, accidental, unavoidable and unwelcome,” with content described by young people as “disturbing” and “in your face”.

The eSafety Commissioner, Julie Inman Grant, has said “a high proportion” of accidental exposure is through search engines, which are “the primary gateway to harmful content”.
The new code was co-developed by the Digital Industry Group Inc, an industry association representing tech companies including Google, Meta and Microsoft, and the Communications Alliance – the peak body of the Australian telecommunications industry.
The code was announced in July 2025, but has been in development since July 2024.
A single breach could result in fines of up to AUD 49.5 million.

How will account holders’ ages be assured?

The code requires providers of internet search engine services, such as Google and Microsoft (which owns Bing), to “implement appropriate age assurance measures for account holders” in Australia by 27 June 2026.
Age checks will identify whether search engine account holders are over or under 18.
Currently, children as young as 13 can create and hold a Google account, the search engine used by more than 90 per cent of Australians.