Australia introduces new online safety standards to combat terror and child sexual abuse content


Children spend an increasing amount of time online, where they learn, imagine, and develop their social networks with the aid of computers, smartphones, and gaming consoles. Used well, and made accessible to all, the internet can broaden horizons and ignite creativity around the globe.

Despite these opportunities, there are also serious risks involved.

The use of social media and instant messaging platforms can expose kids to cyberbullying and other forms of peer-to-peer violence. Children browsing the internet may also come across messages that encourage self-harm and even suicide, as well as hate speech and violent content.

New Online Safety Standards by the Australian Government

A new set of standards released by Australia’s online safety regulator appears to sidestep a potential battle with Apple over iMessage, the company’s encrypted messaging service. According to the regulator, the new standards will address terrorist content and child abuse material without compromising end-to-end encryption.

In June, the eSafety commissioner, Julie Inman Grant, rejected two industry-designed regulatory codes because they did not require email, cloud storage, or encrypted messaging services to detect child abuse material. On Monday, rather than finalizing those codes, Inman Grant released draft mandatory standards covering relevant electronic services.

The draft standards would require online services to identify and remove known child sexual abuse and pro-terror material, and to disrupt and deter the spread of new material of the same kind. eSafety emphasizes that it is not asking providers to build back doors into end-to-end encrypted services or otherwise undermine privacy and security, and says detection will only be required where it is technically feasible.

Inman Grant said eSafety does not expect companies to design systematic weaknesses or vulnerabilities into their end-to-end encrypted services. End-to-end encryption, however, does not absolve companies of responsibility, and it does not mean nothing can be done about these crimes.

By hinging its requirements on what is “technically feasible”, the regulator may have spared the government a repeat of the battle Apple fought over encryption earlier this year.

Earlier this year, Apple had threatened to withdraw iMessage, along with other encrypted communications apps, in response to online safety laws that would have required scanning of encrypted messages. At the end of September, those scanning plans were shelved until they became technically feasible.

Technically feasible methods for detecting harmful content include hashing, which assigns known material a unique value so it can be matched against databases of previously identified content. Meta, the parent company of Instagram, Facebook, and WhatsApp, already uses hashing technology on its platforms to detect known child abuse material. Meta reported some 27m cases of child sexual abuse material in 2022, whereas Apple reported just 234.
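To make the hashing idea concrete, here is a minimal, hypothetical Python sketch that checks a file against a database of hashes of previously identified material. It uses a SHA-256 cryptographic hash purely for illustration, which only matches byte-identical files; production systems such as Meta’s rely on perceptual hashes (for example PhotoDNA or PDQ) that tolerate minor edits. The database contents and function names below are assumptions, not any regulator’s or company’s actual tooling.

```python
import hashlib

# Hypothetical database of SHA-256 digests of previously identified material.
# In practice this would be a large, vetted hash set maintained by trusted
# organizations, not a hard-coded placeholder.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",  # placeholder entry
}

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_material(path: str) -> bool:
    """True if the file's hash matches an entry in the known-material database."""
    return sha256_of_file(path) in KNOWN_HASHES
```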

Reports suggest that where detection is considered technically unfeasible for a service, other measures would be required to meet the standard, including clear and identifiable mechanisms for reporting users and systems for identifying patterns in user behavior. The standards are due to be enacted by April next year and are open for consultation until the end of December.

Despite the development of the draft standards, Digital Rights Watch’s program lead, Samantha Floreani, remains concerned about the detection methods eSafety references. Privacy and security researchers have criticized such approaches for their questionable effectiveness, the risk of false positives, increased vulnerability to security threats, and the possibility that such systems could be expanded to police other types of content. In her view, implementing the draft standards as proposed would compromise the safety of users’ digital information.

In addition, companies that provide generative artificial intelligence will be covered by clauses intended to prevent the technology from being used to generate child sexual exploitation or pro-terror content. They must use lists of terms associated with such material, or hashes of those terms, to detect and block attempts to generate it, and users who enter those terms should be warned about the risk and criminality involved.
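As a rough illustration of how a term-based safeguard might work, the hypothetical Python sketch below screens a prompt against a hashed blocklist and returns a warning instead of passing the request on to a model. The placeholder terms, the choice of hash, and the warning wording are all assumptions made for illustration; the actual obligations and mechanisms are defined in eSafety’s standards, not here.

```python
import hashlib

# Hypothetical hashed blocklist: storing hashes rather than the plain terms
# means the list itself does not contain the offending vocabulary.
BLOCKED_TERM_HASHES = {
    hashlib.sha256(term.encode("utf-8")).hexdigest()
    for term in ("example-banned-term-1", "example-banned-term-2")  # placeholders
}

WARNING = (
    "This request appears to seek illegal material. Creating or seeking such "
    "content is a criminal offense, and the request has been blocked."
)

def screen_prompt(prompt: str) -> tuple[bool, str | None]:
    """Return (allowed, warning). Blocks the prompt if any token's hash
    appears in the hashed blocklist."""
    for token in prompt.lower().split():
        token_hash = hashlib.sha256(token.encode("utf-8")).hexdigest()
        if token_hash in BLOCKED_TERM_HASHES:
            return False, WARNING
    return True, None

# Example usage: allowed, warning = screen_prompt(user_input)
```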

Further Recommendations

A position paper from the Department of Infrastructure, Transport, Regional Development, Communications and the Arts, which oversees online safety policy, has also recommended a significant overhaul of Australia’s cyber defenses to accompany the new standards.

The new regulatory requirements sit alongside a comprehensive review of the online content scheme, which sorts material into categories and focuses first on the worst content. A second set of codes is planned to help industry players, particularly small businesses, meet the requirements of the standards. The commencement of the new codes represents an important online safety milestone for all Australians, bringing a broad spectrum of services under Commonwealth legislation.

The proposed codes also include a sixth code covering search engines. Providers, including smaller businesses, would take on proactive detection obligations, scanning their services to reduce the risk of illegal and harmful content surfacing and to comply with the standards set out by eSafety.

The current consultation also covers a set of mandatory measures addressing security and privacy protection, particularly for cloud storage services and AI-driven content. These sit alongside the Basic Online Safety Expectations (BOSE) Amendment Determination and the federal government’s seven-year cyber security strategy, which emphasizes the cyber security workforce and funding for cyber law enforcement. Prime Minister Anthony Albanese and Communications Minister Michelle Rowland have both been involved in shaping this broader approach, which applies to large and small businesses alike.

Internet Safety Tips for Kids

To keep your children safe and secure online, and to let them have fun, it is important to teach them about the risks associated with their internet activity. It can, however, be difficult to keep kids safe on the internet playground. After all, there is no teacher to keep an eye on them, and you can’t always watch over them.

Here are some ways you can protect your children against the everyday dangers they may encounter on the internet.

  • 1. Avoid chatting with strangers

Every day, children come into contact with strangers while playing online games or chatting on social media, and some of those strangers may expose them to harmful content, including pro-terror material.

Cybercriminals also prowl comment threads, chat rooms, and private messages online. Children can be tricked into handing over personal information by scammers hiding behind friendly avatars, and that information can then be used to steal their identity and money. Young and otherwise vulnerable users are particularly susceptible to these phishing scams.

  • 2. Report and block online bullies

It is common for people to harass and taunt others on gaming sites and social media. People who behave this way are referred to as cyberbullies.

Cyberbullying can be difficult to control and prevent. Moderators of online games often try to ban offending players, but with so many players it is hard to catch every one. Social media platforms respond to cyberbullies in equally varied ways, and each platform’s definition of harassment and abusive content may differ from the next. Encourage your children to use the reporting and blocking tools each platform provides.

  • 3. Avoid clicking suspicious links

Children love free music, software, and games, and they are more likely than adults to trust links and email attachments. Offers of free downloads, however, are a common way to deliver malware and phishing pages.

Internet safety for children cannot be achieved without parental guidance. You can help your children by simply talking to them about not automatically clicking “yes” and about walking away from bullies and cybercriminals. Internet security suites with parental controls complement this by monitoring their online activities.

  • 4. Don’t overshare online

No matter how private something seems, it can never really be removed from the internet. Children may not comprehend that what they say, show, or share on the internet is effectively permanent, but they should be aware that this information lives on in a variety of ways.

The way the internet works makes it nearly impossible to permanently delete anything: artifacts, like breadcrumbs, remain behind in the data, and other people can store private information far longer than intended. Someone can always save a picture, message, or other data that your child sends, and man-in-the-middle attacks and spyware can even eavesdrop on your child’s devices.

Conclusion

Prevention is always better than cure. The best way to deal with child sexual abuse material and pro-terror content is to take meaningful steps to stop it from spreading rather than to repair the damage after it has happened. That means building systems and habits that tackle these harms before they occur, not just responding once they have.

The new mandatory standards and codes support children’s safety and growth online: they protect children from online risks, build trust with families, prepare service providers for potential incidents, ensure compliance with regulation, and demonstrate a serious commitment to user safety.