Australia’s Online Safety Act to enforce age checks across social media

New internet regulations will require age verification on platforms to protect children from harmful content

Australians will soon face mandatory age verification and identity checks when using online services, as part of new regulations set to take effect from December.

Developed by the tech sector and eSafety commissioner, these rules aim to protect children from harmful content online, such as pornography and violent material.

The new codes, made under the Online Safety Act, will require services like search engines, social media platforms, and app stores to implement age assurance measures for all users. This could involve using account history, facial recognition, or bank card checks to verify users' ages, as reported by The Guardian.

Starting in December, search engines must activate "safe search" features for users under 18, filtering out inappropriate content. Platforms hosting harmful material, including self-harm and violent content, will also need to ensure children cannot access it.

The eSafety commissioner, Julie Inman Grant, told the Guardian that such safeguards are vital to protecting children.

"It's critical to ensure a layered safety approach, placing responsibility at key points in the tech ecosystem," she said.

While these rules target specific platforms, critics argue they grant excessive power to large tech companies.

Non-compliance could lead to fines of up to $49.5 million or removal from search results. Despite these concerns, the eSafety commissioner's office has reaffirmed that the codes are necessary to safeguard young internet users, particularly on search engines.

With the regulations set to be implemented in December, the full impact on internet users in Australia will soon unfold.