TRANSCRIPT:
"Social media has a big role to play in the lives of young Australians... These social media companies couldn't care less about Australian kids. So we have to."
That's Opposition Leader Peter Dutton, reacting to the federal government's proposal of a new digital duty of care to work alongside new social media age restrictions.
He says the government can't delay age restrictions any longer, and certainly can't rely on social media companies to look after young people.
Communications Minister Michelle Rowland says the plan has always been to reduce online harms.
"There have been questions about how age assurance technology will work and whether the enforcement is child proof. The Government does not pretend any solution from industry will be 100 per cent effective, or that children won’t find a way around it. The reality is: some will. But the fact is the normative value of the age limit will be immense, because parents will draw on this reference point during their kitchen table discussions with their children."
But some have raised concerns about the legislation, arguing an age ban would fail to address the source of harmful content.
The digital duty of care is the next step in the government's battle with tech companies, one of the key recommendations outlined in an independent statutory review of the Online Safety Act.
Lisa Given is a Professor of Information Sciences at R-M-I-T University in Melbourne.
She has told SBS that where the social media age limits fall short of targeting harmful content, the digital duty of care offers a potential solution.
"A ban doesn't solve the kind of content problem at source. The digital duty of care does try to do that. So effectively it puts the onus onto tech companies to be constantly monitoring for inappropriate information, pulling it down, and then addressing challenges when people put those forward. The ban for social media for kids under 16 doesn't deal with that type of content at source. What it does is effectively say, we don't want anyone under these ages able to access that kind of content. So I think a critical question is going to be if the duty of care legislation comes in and we can make the platforms safer for everyone, is the social media ban for children still necessary?"
Michelle Rowland says the digital duty of care will place the legal responsibility for keeping Australians safe online in the hands of tech companies.
"Our legislation will also contain positive incentives as part of an exemption framework to encourage safe innovation, and also provide for access to social media type services that enable education or health support for young people. Social media platforms that can demonstrate they meet set criteria and do not employ harmful features, or provide positive benefits for children, may apply to the regulator for approval."
A duty of care is a legal obligation to take all reasonable steps to avoid causing harm to others.
For example, a shop or restaurant owner has a duty of care to their customers and can be held legally responsible if someone injures themselves by slipping on a wet floor.
Doctors have a duty of care to their patients, teachers to their students, and employers to their employees.
In these cases, however, the harm in question is often tangible, like an injury or damage to property.
Online, harmful content is far harder to define, and the harms it causes may not be realised until well after the material was consumed.
But the Minister says they will target specific forms of online harm.
"To complement the overarching Digital Duty of Care, the Albanese Government will legislate enduring categories of harm, which we propose could include: Harms to young people; Harms to mental wellbeing; The instruction and promotion of harmful practices; and Other illegal content, conduct and activity. An essential component of satisfying the Duty of Care is undertaking regular risk assessments against the enduring harms."
The proposed legislation aligns with existing approaches adopted in the United Kingdom and the European Union.
The E-U's Digital Services Act establishes what's known as a 'notice and action regime', requiring platforms to restrict content violating either their terms of service or the laws of an E-U member state.
If 'very large online platforms' or search engines fail to comply, they can in turn be fined up to 6 per cent of their annual global revenue.
While the government is yet to announce what penalties would apply to companies in breach of the duty of care, Professor Given says the laws mean complaints can't simply be ignored.
"So effectively it says to the tech companies, you've got problems, there's potentially harmful content, there's misinformation online, and we need you to figure out what the solution is. But the other piece of it, the way that it's been enacted in Europe and the United Kingdom is that it actually puts a lot of power in the hands of consumers so an individual can complain about content. There are, in Europe for example, each country has its own digital services coordinator where complaints can also be lodged and at the end of the day, people can also take things to the courts."
While the European Commission is yet to issue any fines, it has opened formal proceedings against a host of platforms including TikTok and X.
In the U-K, the Online Safety Bill promised to make the country the safest place in the world to be online.
But it's proving more difficult than first expected.
Concerns have been raised that the legislation could have a chilling effect on free speech and freedom of expression, given how hard it is to pin down what constitutes harmful content.
Professor Given says while it's certainly something to keep an eye on, it hasn't been a major problem overseas.
"I think that there have been people raising those kinds of concerns, but we don't actually have any clear case examples coming out of Europe or the UK at this point. I think that we will see over time whether tech companies decide to kind of impose a more stringent approach to this. There could well be challenges about information that's been inappropriately taken down as well as content that should be taken down."