Ruminants: For the Love of the Law
Long-form conversations with the people shaping law in Australia.
Ruminants is a legal podcast for people who want more than surface-level commentary.
Hosted by Louis White, the series brings together leading lawyers, law firm partners, academics, regulators, and senior figures working across government and public policy to examine how law is evolving in real time. From workers’ compensation reform and copyright law to artificial intelligence, environmental regulation, sports tribunals, and the future of legal practice, each episode explores the decisions, institutions, and pressures shaping the legal landscape in Australia.
This is not a podcast of black-letter summaries. It is about how the law actually works in practice, how influence, reform, and legal power operate, and where the law may be heading next.
Louis White is a journalist and Juris Doctor graduate from UNSW with a long-standing interest in law, public policy, and institutional accountability.
His interview style focuses on open-ended, conversational discussion, allowing guests the time and space to explore ideas in depth rather than through rehearsed commentary. Through Ruminants, he brings together leading voices from across the legal profession to examine not only what the law is — but what it is becoming.
Produced by Angela Stretch and published by BarNet Open Law, the team that brings you JADE, Jasmine, and Ledger.
Dr Rob Nicholls: Social media regulation, platform power, and digital accountability
Dr Rob Nicholls examines the growing legal and political battle surrounding digital platforms including Google and Meta.
The episode explores misinformation, online harms, free speech, regulatory intervention, and the increasing tension between governments and technology companies.
Nicholls discusses whether existing legal frameworks are capable of dealing with global digital platforms and how artificial intelligence may further complicate regulation and liability.
Dr Rob Nicholls is a Senior Research Associate at the Centre for AI, Trust and Governance, University of Sydney.
Ruminants is a legal podcast series that goes beyond the headlines to unpack the ideas shaping law, policy, and public life. Hosted by Louis White and produced by BarNet Open Law.
Produced by Angela Stretch and published by BarNet Open Law. The team behind the JADE legal research platform and other legal informatics.
Music by Out To The World, Bulletin World, courtesy of Epidemic Sound.
Welcome to Ruminants, the legal podcast exploring the ideas, decisions, and forces shaping the profession in Australia today. Hosted by Louis White and produced by BarNet Open Law, each episode brings together leading voices from across the legal community, from judges and barristers to academics and regulators, for thoughtful, in-depth conversations on everything from AI and regulatory reform to major court decisions and the evolving role of lawyers.
SPEAKER_01: I am Louis White, a seasoned journalist who has covered major news stories in Australia and England for the past three decades. I've always been fascinated with the law, and in 2025 I completed my Juris Doctor at the University of New South Wales, which has enabled me to further my practical understanding and knowledge of the law. Now I'm the host of the Ruminants podcast series. Welcome, Rob, to the podcast.

SPEAKER_02: Great to be here. Thank you.

SPEAKER_01: So we'll start with a bit of background about a recent case in America. A Los Angeles jury found Meta and Google liable for intentionally designing predictive features on Instagram and YouTube that harmed a young user's mental health. In a March 2026 landmark verdict, the companies were found negligent for causing addiction-related issues, with a six million dollar total damages award, of which Meta was responsible for 70% and Google for 30%. At the centre of this shift is a series of lawsuits filed in California, often referred to as the LA social media addiction cases, where plaintiffs allege that platforms like Meta and Google deliberately design their products to be addictive, particularly for young users. These cases draw on product liability, negligence, and failure-to-warn arguments, essentially framing social media platforms not just as publishers but as designers of potentially harmful products. So the question is: if platforms can be liable for addiction-related harm in the US, what does that mean for Australia?
SPEAKER_02: I'm going to start by digging into some of the issues that flow from that case, because I think it's important to frame how we'd do it in Australia. One of the reasons why this case was successful where other cases had not been was the focus specifically on the platform. In the US, Section 230 of the Communications Decency Act essentially means that platforms are not responsible for third-party content. So if I try to take an action saying, well, I'm addicted to cat videos on YouTube and it's ruined my life, Google would come back and say, yes, that's very sad, but that's third-party content and we're not liable for it. What happened in this case is that the young woman concerned drew a very narrow approach, saying it's not the content, it's the platform itself. It's the infinite scroll; it's the approach which meant that I, the plaintiff, spent 16 hours a day as a young woman on Instagram, and that caused harm. My parents couldn't do anything about it, and that harm included suicidal ideation, body dysmorphia, a series of mental health harms. And as you say, there were six million dollars awarded: three million of that was damages for those harms, and three million was punitive damages, because the platforms were likely to have understood what was going on. Now, that case was taken primarily in negligence in the US, but I think here we take a slightly different view. If a product is defective, there is strict liability in Australia. A manufacturer or importer of a product, the importer being treated as the manufacturer, is strictly liable for the safety of that product. In Australia, software is defined to be a good, so it is a product. Interestingly, the EU also has strict product liability, and it recently had a legislative change, in 2024, to make it clear that software is a product. Interesting that it occurred before this case.
So if I'm here in Australia, I'm going to say: using the material from the US discovery, which became public in court, I know that there were issues with the two products, YouTube and Instagram, and that those products were faulty. They had to be faulty, because what they caused was addiction in young people who were susceptible to addiction. So using the Australian Consumer Law, I'm going to say, well, actually, those products are defective. If they're defective and I suffer a harm, and that harm causes a damage, I have a claim in damages against the manufacturer of that defective product. My argument would be: the social media platforms are addictive; that must be a design flaw, because nobody would supply a product which was designed to be harmful; so there's a design fault, and somebody has been damaged by that design fault. In practice, I'm sure we'd end up with a class action, or a series of class actions, where the plaintiffs in that class would say: due to the design fault, I have had mental health issues that flowed from the addictive nature of the product. That addictive aspect of the product was a design fault, and therefore I have a claim for damage. The class action itself would look for people who have used social media platforms through the app on a device rather than through the web version, because it's the software that is the product that's potentially defective. But you would run that action as a class action, saying that harm has flowed from a product defect. And the same theory of the case could apply in the European Union as well.
SPEAKER_01: So these are product liability claims.

SPEAKER_02: Absolutely. Yep.

SPEAKER_01: But are you saying that if you did it through the web and not through an app, could you still make that claim?
SPEAKER_02: No. And the reason for that is that the product that's defective, or the argument is that the product that's defective, is the app: the software used primarily by people accessing social media on phones rather than on the web. There is no product in the web-based service, because it's a service, as opposed to the software, which is a good.
SPEAKER_01: So that could become a very technical legal case down the line, couldn't it?
SPEAKER_02: It could, but how many people do you know who use Insta on anything other than their phones? So if I'm acting for the defendant, I'm going to say: have you ever used the web-based version? Are you sure you didn't just use it on Chrome on your phone rather than using the app? But I'm sure that the class action firm that brings the class together would do their homework on the group that forms that class.
SPEAKER_01: Rob, thank you. Will this change the law? Will these now become product liability claims around the world, or how will it be framed going forward?
SPEAKER_02: In the US, where there isn't strict product liability, it will continue being run as negligence torts at a state level, because that's typically where the courts have jurisdiction to hear those cases. In countries like ours, like many other countries outside the US where product liability is strict, then yes, go product liability. You could pursue a negligence tort separately, but product liability is much easier to run and much harder for the other side to defend.
SPEAKER_01: And so how does the idea of addictive design play out legally?
SPEAKER_02: Well, it sounds very similar to tobacco, to alcohol, and to gambling. So how would you do it? In practice, you'd probably end up with the social media platforms having health warnings for people over the age of 18, saying this product is addictive. Do you remember when tobacco laws first came in, there was just a small sticker on the side of cigarette packets? Something similar to that. Something not dissimilar to the way that alcohol is labelled, or can be labelled, particularly in other countries, and the "you win some, you lose more" warnings that appear on gambling as well. So for over-18s, it's essentially: we'll stick a health warning on it, and I think that Meta and Google will be looking at putting those health warnings on. For under-18s, it becomes more complicated. It's also where addiction is more likely. In this country, for under-16s there's a social media ban in any case, so it's really 16- and 17-year-olds for whom both Meta and Google will probably end up looking at slightly different product design. Instead of having infinite scroll on Instagram, if you have a mandatory five-minute break every hour or every two hours, that breaks the potential cycle for addiction. In fact, all of the same things that are recommended to stop poker machine addiction still apply in the social media context.
SPEAKER_01: So then, is the onus on the plaintiff to prove harm and causation?
SPEAKER_02: Because liability is strict, there's an argument to say no: all you have to do is show that there was harm created by the faulty product. This is why it's different to the negligence approach. With negligence, you first have to show that there's been a harm caused by a breach. With product liability, you show that there is a fault in the product, exactly the same way as the toaster you bought that didn't work the way it was supposed to. Hopefully you just get your money back, but if you got an electric shock, you've got an action against the supplier of that faulty product. It looks much more like that than having to prove negligence, or even prove a wilful design. It's just a design fault.
SPEAKER_01: And so, with Australian law as it stands today, would a claim like that succeed?
SPEAKER_02: In my view, it has a good prospect of success. And I can't promise any more than that.
SPEAKER_01: So, what are the key friction points? Duty of care, causation, statutory protections?
SPEAKER_02: You've got the statutory protection that comes from the Australian Consumer Law. Then the duty of care. The duty of care was recommended in two places: a review of the Online Safety Act, which was completed by Delia Rickard in December 2024, to which the government responded on 13 April 2026; and a joint parliamentary inquiry into social media in Australia, which also made that recommendation more broadly than just the Online Safety Act. The government's response to the Rickard review is to say yes, we will introduce a duty of care. So this will change the emphasis: the social media platforms will have a responsibility of care to the users of those platforms. Now, the government's response gets a bit chequered after that, because Delia Rickard put forward a whole series of groups that were likely to be more susceptible to problems on social media platforms, and that only got partly a "noted" and partly a "support in principle". Like most government responses, "noted" and "support in principle" mean we aren't going to do anything about it. Whereas the duty of care itself got a "support", so we are likely to see legislative change in that area. And indeed, the government response says there will be legislative change there.
SPEAKER_01: Would that be a new act, do you think? Or will it be strengthening the Online Security Act?
SPEAKER_02: The Online Safety Act, yes. It will be an amendment to the Online Safety Act, and in fact that's precisely how the government response frames it.
SPEAKER_01: And how do you think the Australian courts would rule if a case similar to what happened in the US came before them now? I know it's speculative, but I'm just curious.
SPEAKER_02: I think if you tried it in negligence, it would be complicated, which is why doing it through strict product liability is more useful. If I were on the other side, my defence would be: ah, but it's not just the software, it's effectively software as a service, it's a whole bundle of services. And the response would be: yep, sure, but the services aren't the thing that's causing the addiction; it's the software that's causing the addiction and therefore the potential harm.
SPEAKER_01: And so you've proposed strict liability as a framework. What would that actually mean for platforms?
SPEAKER_02: It means that they would actually have to think about whether anything in the design of their product, their software product, has a potential for causing harm. What we've seen flow from the US case, and what would be useful in Australia, is that the discovery process in California led to revelations about internal documentation within each of Meta and Google about their product design, and that product design looking for, if you're being charitable, maximal engagement, and if you're not being charitable, addiction or the potential for addiction. Now, why would you do this? Well, in effect, what you're doing is saying: here's a free product, there is zero price on this product, but what we're trying to do is grab your attention. And then on the other side of the platform, we're saying to potential advertisers: look, I have a group of people who pay a lot of attention to this platform. Because they pay a lot of attention, they're valuable, more valuable than people who are simply skimming through a website or looking at another form of interaction. So it's capturing the value of that attention, and you capture that value by having a micro-auction for advertising to address that attention. Platform design necessarily wants to grab attention; it wants high levels of engagement. The issue is to make sure that that engagement keeps its value by having advertisers on board, but does not lead to addiction. And I think the issue that the platforms need to deal with, to avoid actions in Australia and everywhere else in the world, is to have breaks which say: yes, we're selling advertising, but advertisers, don't worry that there's going to be a reputation for addiction for this product, because we've got anti-addictive approaches being taken.
So I think we'll see shifts in product design and shifts in the way that the product is sold, probably leading to the platforms getting more money for their advertising. But that's the nature of things.
SPEAKER_01: But what would those anti-addictive devices look like? What could they do practically?
SPEAKER_02: Just breaks. A little bit of friction makes a huge difference. Even if it's a five-minute break every two hours, that helps. If, instead of having infinite scrolling, you have scrolling that stops at a point or after a set time, you actually have to think about shifting to another enquiry, another look on Insta. You have to go back to your home page, something that just reduces that monotonic, or catatonic, approach of scroll, scroll, scroll, in the same way that poker machines have that tap, tap, tap addiction. The approach there is time limits on machines, so that you break the potential for addiction by shifting from a basically reactive response to one where you actually have to think for a little while.
SPEAKER_01: And so would platforms simply be liable because harm occurred, or is it more detailed than that?
SPEAKER_02: For strict liability, yes, they're liable if harm occurred. I think the issue is that in this country it's very unlikely that there would be punitive damages. And what is the damage? How do you actually work out the nature of the damage and the value of the damage? Yes, the costs of psychologists if you were having some sort of addiction treatment, but it's a bit less clear what those harms are in terms of "Your Honour, the cost, the damage, was X thousand dollars". I think we'll genuinely see significantly smaller potential damages cases in Australia, which is why you'd probably run them as a class action, to get to the value that a litigation funder would need to see.
SPEAKER_01: And how would we define the scope of harm?
SPEAKER_02: It would almost certainly be a mental health harm that's occurred to a number of people. The bits that get harder are: well, I was addicted, so I lost my job; or, I was addicted and that led to some other financial consequence. Those are the types of issues that the courts would need to think about.
SPEAKER_01: So we would need psychologists and all sorts of...
SPEAKER_02: Yes. To say that this is a faulty product and has the potential for causing addiction. Which again is why a class action approach potentially makes more sense: if you're going to have experts appearing before the court, especially in a consumer law matter, you probably want to have a small number of experts. You probably want to hot-tub them and see where both sides agree and where they differ, to determine whether the addiction that was found in the California courts is a design fault, and therefore a liability on the supplier of the product.
SPEAKER_01: And so we're going to see fundamental product redesign going forward, aren't we?
SPEAKER_02: Yeah, I think we are. Though I think we've actually already started to see that. Meta has introduced, particularly in the European Union, approaches such as an Instagram designed for kids, so that where social media access is not prohibited, children have an experience which is more curated, safer, and has a lower likelihood of addiction.
SPEAKER_01: And speaking of that, in Australia we have the Online Safety Act, which imposes obligations but doesn't create a general duty of care. Does that need to be addressed?
SPEAKER_02: Yes, and the government report that was issued on 13 April agrees that there should be a duty of care. I think the only risk with the imposition of a duty of care is that it needs to be very clear and comprehensive, and not overly simplified. And who would carry that burden? The social media platforms. If you're going to have millions of users, your responsibility is to ensure that the well-being of those users is maximised.
SPEAKER_01: And do you see issues with individual countries having their own safety acts while dealing with multinational companies? How hard is it then to bring someone within jurisdiction, start legal action, and successfully win?
SPEAKER_02: I think that is a real issue, and we've seen it in the action that the eSafety Commissioner took against X a couple of years ago, after the stabbing of a bishop in a Western Sydney church, where X essentially said: well, the video of the stabbing of the bishop was not harmful, so why should we take it down? Now, why would X say it's not harmful? Basically, the shot of the bishop was a shot of the bishop celebrating mass, so you saw his back. The assailant then went to stab him in the back; you saw a stabbing motion; there wasn't very much blood to see; there was not very much that looked violent. And so X's automated system for determining whether this was socially acceptable content said: well, if this were in a movie, you might get a PG rating, you might get an MA rating, so it meets our guidelines. It doesn't meet guidelines in the context of a terrorist action, but that part X didn't accept. X argued in court that it was X, it wasn't Twitter, and therefore couldn't be sued. In the end, the court found that there was a nexus with the issue, but that X had acted reasonably by taking down access to the video in Australia, even though it could be obtained by simply using a VPN from other parts of the world. So there is a real problem: does the provider of the service have a presence in Australia that can sue and be sued in relation to all of the services it offers in Australia? One of the recommendations from the Online Safety Act review was that there should be licensing of platforms. Why? So that you've got a presence in Australia and can therefore enforce the relevant parts of the Online Safety Act. That was rejected in the government response, and I think that's entirely problematic.
So how you know who you're going to enforce against is a real problem, as is how you take action against an entity which is not domiciled in Australia and which says that the service is provided from another country where the issues of the Online Safety Act are treated differently, saying: well, jurisdictionally, comity should apply, and therefore we can't be expected to do the things in the US that we're expected to do in Australia. These are all problematic issues, and they're fundamental to the way that platform businesses are, or should be, regulated in Australia.
SPEAKER_00: You're listening to Ruminants: For the Love of the Law. Conversations shaping Australian law, from principle to practice. Insight, reflection, legal thought in motion.
SPEAKER_01: There's no easy solution though, is there?
SPEAKER_02: Oh, I think there is. You license them. And the license, I'm not saying charge a lot of money. You just say: in order to be a content service provider that provides platform services, social media services (you take two definitions, one from the Telecommunications Act, another from the Online Safety Act), if you meet those two requirements, you need to have a license. The obligation of your license is to have an entity that can sue and be sued in Australia and is responsible for all of the products and services supplied in your name in Australia. That's it. You might charge a dollar for the license, but that's a perpetual license for a buck. It just means that actions can be taken in Australia in relation to harms in Australia. So if Dr Norman Swan is deep-faked on social media, which happens quite a lot, and is in those deepfakes suggesting therapies that would never come out of Dr Norman Swan's mouth, he has an action in Australia, rather than having to think about suing internationally in the same way that Andrew Forrest has had to sue in the US over deepfakes showing him promoting crypto scams. Having jurisdictional nexus is really important. It's important for online safety, it's important for individuals within Australia who are being misrepresented, and it's important to ensure that advertising meets the expectations of society in Australia, which may be different to those in other jurisdictions.
SPEAKER_01: So why was the licensing recommendation rejected, then?
SPEAKER_02: It was said that it might potentially be problematic under the Australia-US Free Trade Agreement. I can't resist: that's the agreement that says we will agree on tariffs between ourselves, and which has been breached. But nevertheless, if we want to say there's been no repudiation of the Australia-US Free Trade Agreement, it's in place. You have to think about whether introducing a license that looks as if it only affects US firms might be a potential issue. There isn't one, because TikTok is there as well, so it's ByteDance as well. This is not just a "get the US" approach; it's going to apply to other businesses outside the US. And if there were a fabulously successful Australian native social media firm, they would also be affected. So I think it's a very, very conservative approach that says, well, there's a potential that the US Trade Representative will say that we've done something wrong. Firstly, read the agreement; and secondly, a little bit of backbone might say this isn't problematic. In my view, it's not problematic. It's just that the concern that there might be an issue means that the government won't actually address it in the way that it needs to be addressed.
SPEAKER_01: And we've talked about the shift of social license on platforms. What do you think has brought that about?
SPEAKER_02: When Facebook was a place where you could share news about you and your community with your Facebook friends, there was a high level of social license associated with that. This looked like something that was good. I think more recently the issues have shifted. There is no longer the clear sense that Instagram is good, Facebook is good, YouTube is good, TikTok is good. It's much more: these are very big businesses making a very, very large amount of money from advertising, and that doesn't mean that they're necessarily working in my best interest. So that social license, I think, has shifted. There's a lot more cynicism about platforms and social media platforms than there used to be. And so we're giving a free kick to social media platforms by saying: we're not going to require that you're licensed. And if the eSafety Commissioner has real problems trying to enforce the Online Safety Act, well, that's really a problem. As opposed to saying: no, we've got a regulator; she's asked for teeth, she's got some teeth. Why would you put something in her way when she wants to enforce the law that was adopted by the legislature of this country?
SPEAKER_01: And do you see, going forward, the battle between regulators, private companies, governments, and public servants empowered with certain decision-making just getting more and more murky?
SPEAKER_02: I don't, and for this reason: almost all of the power is the power to take action in courts. I actually have a lot of confidence that the relevant federal courts will ensure that that potential murkiness goes away by considering cases on their merits. I think where it gets problematic is where you can't take an entity to court because it has no presence in Australia, yet it's responsible for delivering services which have caused some harm. And we haven't addressed that problem. That requires a bit of political will, not terribly much, and perhaps a little bit more political courage, to say: okay, the social media companies and these businesses are huge, they're huge in Australia, but they still need to comply with Australian norms, and if that means requiring presence in Australia in order for there to be regulatory enforcement, we'll take that step and require that presence.
SPEAKER_01: So basically, social media platforms should be treated like regular utilities, shouldn't they?
SPEAKER_02: Perhaps as a utility, probably not as an essential service, but a bit like telcos. Well, how do you treat telcos? You say: if you're a carriage service provider, that is, you're providing telco-type services to the public, you need to be a member of the Telecommunications Industry Ombudsman scheme in order for the complaints mechanisms to work. If you're a telco that uses infrastructure, you need to have a carrier license, and your obligations go up. There is a parallel regime in the Telecommunications Act for content service providers. It's parallel at one level: the content service providers are class licensed, and they have to comply with the obligations of the class. There just isn't that next step of saying that really big content service providers, that is, people who provide social media platforms and that type of service, need to have an individual license the same way that carriers like Telstra, Optus, and TPG Vodafone do because they're using infrastructure. So we've got this idea in the legislation, in the Broadcasting Services Act and the Telco Act, that businesses that have more impact are more highly regulated, but we seem reluctant to apply that same principle to social media platforms.

SPEAKER_01: Do you think that's just a cultural thing, and that it will change going forward?

SPEAKER_02: Well, I hope it will change going forward. I think it goes back to that social license question you asked me. While Meta was still Facebook and was young, we could tolerate Zuckerberg's "move fast and break things" type approach. It's a nascent industry, they're being innovative, regulation could stifle that innovation. But we've moved on from there. These are now huge businesses. Why don't we treat them in the same way that other large businesses are treated in this country? Why should there be a special rule, which effectively there is, that we don't treat these the same way that we treat big players in any other sector?
SPEAKER_01: And the ACCC has proposed reform, but that's stalled. Why is that, do you think?
SPEAKER_02Why is it stalled? I honestly don't know. I was part of a Chatham House rule workshop at the end of 2024, almost a year and a half ago now, a long time ago, where it was suggested that the legislation was only weeks away from being drafted. But Treasury didn't come up with it. It's really important that competition law deals with the fact, as it did in the telco sector and as it does in the energy sector, that you have some very big businesses which you might say have a significant degree of market power, and therefore we should treat them in a way that says: we know that power could be abused; we don't think you're abusing it, but it means we look at you from an ex ante position, before you do something wrong, and that gives the highest protection to two things. One, competition, and two, consumers. So for the ACCC this is not an abnormal thing to do. It's not world-leading legislation; it's logical, and it's consistent with what's been done in the past. The government committed to do it before the last election and again after the last election, and they're still committed. It's just that we haven't seen any exposure draft legislation.
SPEAKER_01So what's the cost of the delay?
SPEAKER_02I think the key thing is that there's now a view that perhaps we don't need to worry until we're caught. What's the point of ex ante regulation? It's to say: we're watching you, so you will be caught. Under ex post regulation, you're only caught if you've done something wrong. So what we're doing is encouraging large platforms to test the edges of competition law and consumer law, platforms that should, according to the government and according to Treasury, but not according to the law, be treated on an ex ante basis. And do you entrench market power by doing nothing? Absolutely. If you've got a competition authority and a government that say we need to treat certain businesses differently, that we need to treat them on an ex ante basis from a competition law perspective because of their market power, well, if you don't do it, that market power gets further entrenched. In some areas the merger reforms have slightly changed that balance, but in general these platforms are just subject to general competition law. If you had that ex ante regime, you would reduce the risk of entrenchment of market power and certainly reduce the risk of abuse of that market power.
SPEAKER_01And do you think age restrictions are enough? We've seen the change in legislation in Australia, but it hasn't been that effective when you read the surveys and polls and speak to parents about enforcing the under-16s social media ban.
SPEAKER_02Yeah, so actually the eSafety Commissioner's own report says a couple of things. One, her survey of parents found that, surprisingly, a big chunk of kids weren't using social media before the ban, but that after the ban, two-thirds of the kids who were using it are still using it. And her enforcement approach, I think, has been pretty smart. She said, look, here are five things that the social media platforms are doing wrong. Number one, basically allowing you as many attempts as you want to try to prove that you're over 16. This is not like forgetting a password; you either are or you aren't. And it's partly because, let's say you were found to be under 16, well, there should be some easy appeal mechanism where you can say, "I'm actually older than that, and here is some proof", rather than going through multiple attempts, where my own very small samples say kids try: let's try a facial check when I look like I'm under 16, try it with a mask, try it with mum's makeup, try putting up somebody else's image, until you get through. That doesn't make any sense at all. Another thing she found was that even though the social media companies knew somebody was a particular age, for example 13, which is Meta's old threshold for when you could go onto social media, and there were celebrations of, and photos on Insta of, "my 13th birthday party" from last year, so you might conclude that that person is either 13, or if you've got the date wrong might be 14, but certainly under 16, why would you give them an opportunity to try to prove to you that they're over 16?
These are very simple parts of the product that I think the eSafety Commissioner has been very smart in pointing out, saying: here's a list of five things that you're doing wrong, and actually just fixing those would stop me from having to go off to the Federal Court for enforcement, because that 60 per cent of kids would drop quite significantly. Actually, VPN sales will also go up, but that's a separate issue.
SPEAKER_01But do you think legislation like that is addressing symptoms rather than causes?
SPEAKER_02I think that if you've got the fundamental thing, which, I'm in New South Wales, is that you can't use your phone in school, you drop your phone off when you go into school and pick it up on the way out, and if you also have that social media is only available if you're 16 or over, you still haven't addressed all of the issues. Because what's the fundamental thing here? It's harm, it's online bullying, it's addictive behaviour. You're not totally fixing it, but you're doing something which is helping to fix it. And the responsibility under the relevant legislation is on the social media companies, so your enforcement has to be: well, make it work. Are you doing everything reasonable? Well, I've just given you five things we found you're doing which aren't reasonable. They're only unreasonable in my opinion, but I'm prepared to go off to the Federal Court to see if a judge will agree with what I think is reasonable. So those five things will be addressed. I think we'll see another report in another four or five months, which will have another five things. In the end, is it going to be perfect? Absolutely not. Is it going to reduce online harms for under-16s? Probably, and probably in a good way.
SPEAKER_01And last question: if you could make one reform tomorrow, what would it be?
SPEAKER_02It would be to introduce licensing for social media companies.
SPEAKER_01I'm not surprised by that. Rob, thank you for joining the Ruminants podcast series today.
SPEAKER_02Thank you. Thanks, Louis.
SPEAKER_00Support for this episode comes from JADE by BarNet, providing real-time Australian case law, integrated citation tracking, and intelligent legal search. JADE: research that keeps pace with the law.