The Online Safety Act 2021 (Cth) commenced on 23 January 2022, replacing a patchwork of online safety legislation with a more consistent and clearer regulatory framework. The Act gives new powers to the eSafety Commissioner, Julie Inman Grant, and aims to protect Australians from online harm and bullying.

Who does the Online Safety Act protect?

The Online Safety Act creates a world-first Adult Cyber Abuse Scheme to protect Australians aged 18 and over. The eSafety Commissioner can now fine or otherwise penalise individuals who post cyber-abuse material targeting adults.

The Online Safety Act also introduces a broader Cyberbullying Scheme to protect Australian children from harm occurring on additional online services, expanding on the previous regime, which addressed only harm on social media. The eSafety Commissioner may order service providers to remove illegal and restricted content, such as child sexual exploitation material and terrorist content, from any online service. This includes online game chats, websites, and direct messaging platforms.

What is considered adult cyber-abuse?

The Online Safety Act sets a much higher threshold for what is considered online bullying of adults than for online bullying directed at children.

According to the eSafety Commissioner's office (eSafety), the abuse must be both ‘intended to cause serious harm’ and ‘menacing, harassing or offensive in all the circumstances’. This could include making realistic threats, placing individuals in real danger, or being excessively malicious or unrelenting. Offensive, defamatory, or merely disagreeable comments alone will not be considered adult cyber-abuse under the Act.

The Government set this high threshold to address concerns about censorship and free speech. However, even where content does not meet the threshold, the eSafety Commissioner will provide support, information and advice to individuals affected by online bullying.

What does the Online Safety Act do?

The Online Safety Act makes social media platforms, websites and online services more accountable for the online safety of their users. The eSafety Commissioner can now receive reports of bullying content or the non-consensual posting of intimate images where a platform such as Facebook or Twitter has failed to remove the content first. A platform that fails to remove the bullying content or image within 24 hours of being notified by eSafety may be fined up to $555,000.

The eSafety Commissioner may also require internet service providers to block access to content that “promotes, incites, instructs in or depicts abhorrent violent conduct”. This could include content involving rape, torture, murder, attempted murder and terrorist acts.

What should I do if I am being bullied?

Under the Online Safety Act, you should first ask the website or platform to remove the bullying content. If the website or platform does not remove the content, you can report it to the eSafety Commissioner. An investigation will commence, and if your complaint is upheld, the eSafety Commissioner will direct the website or platform to remove the content within 24 hours or risk a fine.

However, the Act only allows the eSafety Commissioner to order the removal of content and impose fines. If further action is required, you should contact the police.

If you want to find out more about what the new Act means for you or your business, you can visit the eSafety website at esafety.gov.au.

If you have any questions about the Online Safety Act, get in touch with us.

This article was originally published by OneTrust DataGuidance.