Take 10 - 6 December 2024
Welcome to RPC's Media and Communications law update. This month's edition reports on key media developments and the latest cases.
House of Lords' Future of News Report
On 25 November, the House of Lords' Communications and Digital Select Committee published its Future of News report following an inquiry in early 2024 into issues affecting the UK media. The Committee's report made a number of recommendations to help counteract some of the challenges likely to be faced by media publishers. Of particular concern was the emergence of a "two-tier" media environment, in which a growing percentage of society will have limited engagement with professionally produced news, favouring more easily accessible, and potentially lower-quality, free information.
Generative Artificial Intelligence
The report urgently calls for the development of an AI regime in the UK, aimed at encouraging innovation whilst protecting publisher copyright. The Committee acknowledges that the UK must remain competitive in AI development, but insists that a framework is needed to govern how news content is used to train generative AI. In the Committee's view, this framework “must include transparency mechanisms that enable rights holders to check whether their data has been used”, with meaningful sanctions for non-compliance. It also calls for a mechanism that would enable creative industries to strike mutually beneficial deals with tech firms. The report suggests that, in light of tech companies acting increasingly as publishers, they may need to be regulated as such, and it questions the decision to exclude online intermediaries from the media plurality regime.
Recommendations to tackle SLAPPs
The report accuses the Government of "failing to prioritise" anti-SLAPP legislation, blaming a lack of political will despite various consultations and the availability of viable legislative options. The Committee has called on the Government to publish draft legislative proposals by the 2025 summer recess, suggesting that the Victims, Courts and Public Protection Bill, announced in the recent King's Speech, could be used to expedite the process. Whilst applauding the progress made by the SRA on SLAPPs, the Committee criticised the exclusion of law firms accused of malpractice from the SRA's evaluations. The Committee also suggested that the SRA's fining powers should be raised from £25,000 to £250 million, noting that this figure accords with the level of fines the SRA can impose on other types of licensed bodies. Concern was also raised regarding the potential use of illicit money to fund SLAPPs, and it was suggested that s.327 Proceeds of Crime Act 2002 should be amended to address this issue.
Mis/disinformation
Whilst the Committee welcomes initiatives to improve public trust in information, it insists that a balance must be struck with protecting freedom of speech. It cautions against an overzealous counter-mis/disinformation strategy that places too much emphasis on measures in the Online Safety Act, such as algorithmic tweaking, which may not address the root causes of supply and demand. To address the issue, the Committee has suggested four areas of prioritisation: (1) ensuring a financially sustainable news sector which upholds factual understanding; (2) engaging media organisations in protocols for responding to foreign interference, especially in elections; (3) adopting stronger deterrence measures against adversaries, such as using cyber capabilities; and (4) strengthening media literacy with a clear, coordinated strategy, such as by integrating the topic into school curricula.
Too late for anonymity in medical negligence claim
Mr Justice Nicklin has rejected an anonymity application in a medical negligence case, emphasising the importance of open justice and the difficulty of imposing an anonymity order when information about the Claimant and the case was already in the public domain. The claim was filed in March 2023 and had already progressed through a number of stages, including the filing of Particulars of Claim which contained extensive details about the Claimant's disability. Last month, the Claimant's solicitor issued, for the first time and without notice, an anonymity application, citing concerns over the Claimant's vulnerability and his and his family's Article 8 rights. This application was prompted by a journalist's inquiry regarding publishing an article about the Claimant's case, based on a copy of the Particulars of Claim they had obtained.
Nicklin J confirmed that s.37 Senior Courts Act 1981 provides the statutory basis for granting the reporting restriction, as opposed to CPR 39.2(4) or s.6 HRA 1998 [125]. He also indicated that JX MX, the authority widely relied on for anonymity applications, was inconsistent with principles of open justice and failed to consider and apply the correct tests [108] to [114]. Despite the fact that the Claimant was a child, who was "allegedly vulnerable to exploitation", and the proceedings were likely to involve consideration of private medical information, Nicklin J considered that the Claimant had not discharged his burden of showing by clear and cogent evidence that anonymity was necessary. Crucially, the Judge highlighted that both the Claimant's mother and solicitor had previously voluntarily disclosed substantial information to the media about the Claimant and the claim, and the proceedings had been conducted to date without an anonymity order [132]. He held that it was not practically possible to secure meaningful anonymity for the Claimant [133] and, conversely, there was "a clear and continuing public interest" in the claim [137] and an anonymity order would "represent a disproportionate interference with the Article 10 rights of the [media publishers] and the public generally" [141]. The Claimant has been granted permission to appeal and therefore remains anonymous for the time being.
Libel claim struck out over failure to prove Defendant was publisher
The High Court has dismissed claims for libel and malicious falsehood on the basis that the Claimant failed to prove the Defendant was responsible for the allegedly defamatory publications. The Claimant, a solicitor and sole practitioner, had previously acted for the Defendant who was dissatisfied with her services, prompting him to file complaints with the SRA and the Legal Ombudsman, which were not upheld. The Claimant alleged that the Defendant, whose name was Christopher John Henry, had posted three negative reviews on her "Google business profile" under the aliases "Chris H", "John H", and "P.R". However, Deputy Master Marzec found that there was no evidence to prove that the Defendant was responsible for the reviews, particularly given the Claimant acknowledged that she had received negative reviews from other clients [36] and the reviews complained of were "of a fairly generic kind that could have been posted by any unhappy client" [39]. Surprisingly, the Claimant had not made an application for third-party disclosure to prove who had authored the reviews. The Judge also dismissed the Claimant's allegation of malice, finding that there was nothing in the reviews to indicate that the reviewer was not honestly expressing their views.
First annual risk assessment due under the Digital Services Act
As of last week, the 19 very large online platforms (VLOPs) and very large online search engines (VLOSEs) designated under the Digital Services Act (DSA) in April 2023 were due to publish their first annual risk assessments and audit reports under Article 42 of the DSA. The reports will detail the risk assessments carried out by the providers of VLOPs and VLOSEs, aimed at identifying and assessing the risks arising from their services, including the dissemination of illegal content and disinformation, as well as particular risks to minors using the service. The reports must also set out the measures VLOPs and VLOSEs have implemented to mitigate the risks identified. The European Commission's Questions and Answers page provides useful information for VLOPs and VLOSEs, including guidance on what must be published and the timeframes for doing so, how providers should use redactions in respect of confidential information, and the consequences if information is redacted that does not fall under Article 42(5) of the DSA.
Australia approves world's strictest laws on children's social media use
The Australian Parliament has approved a law which will ban children under the age of 16 from using social media, representing the highest minimum age for social media use globally. Unlike similar proposals in other countries, the law does not include exemptions for existing users or children who have parental consent. The Australian Communications Minister will determine which social media companies will be in scope of the new law, following advice from the country's internet regulator. However, the minister has indicated that the ban will include Snapchat, Facebook, Instagram, TikTok and X, but not gaming and messaging platforms or sites that can be accessed without an account, such as YouTube. Unspecified age verification technology will be used to implement the restrictions, and service providers will be responsible for ensuring such technology is employed on their platforms. The law, which will take at least a year to come into force, could impose fines of up to A$50m (£25.7m) for non-compliance by tech companies caught by the legislation. Critics have warned that the restrictions could be easily circumvented, for example by using a VPN, and concerns have also been raised over the impact on privacy and social connection. Interestingly, a similar law in Utah was found to be unconstitutional and was overturned by a federal judge, and research into the efficacy of a law in France banning under-15s from social media without parental consent indicates that almost half of users circumvented the ban through a VPN.
Ofcom's Plan of Work for 2025/26
On 4 December, Ofcom published its draft Plan of Work for 2025/26, which details its regulatory priorities for the next financial year. Two of Ofcom's stated priorities concern (1) promoting media reliability and accountability and (2) implementing the UK's new online safety framework. In respect of media accountability, Ofcom will seek to ensure that audiences have access to a plurality of trusted news that is duly accurate and impartial, that competition between media providers for audiences is fair and open, and that audiences are protected from harm whilst freedom of speech is safeguarded. One of its key projects will be implementing the Media Act. Regarding online safety, Ofcom has indicated that, whilst continuing its work to implement the Online Safety Act (OSA), its focus will shift to ensuring providers comply with their legal obligations to protect users. Ofcom has also published an update on its estimated timeframes for implementation of the OSA, indicating that service providers will need to comply with their illegal content safety duties from March 2025 and their child protection safety duties from July 2025. The draft Plan of Work is open for consultation until 5pm on 29 January 2025, with feedback events scheduled in Belfast, London, Cardiff and Edinburgh. The final version of the plan is expected in March 2025.
Metropolitan Police refers itself to ICO following potential data breach
The Met Police has referred itself to the ICO after sending an email to alleged victims of the Westminster honeytrap scandal which contained the email addresses of other victims. The Met has said that the email was "sent in error" and that the police officers involved would like to "personally apologise" to the victims. The ICO is reviewing the information provided and an indication as to next steps is awaited, but it has commented that "people have the right to expect that organisations will handle their personal information securely and responsibly".
Keir Starmer's stance on free speech questioned
During last week's PMQs, in the context of calls for legislation to combat Islamophobia, Labour MP Tahir Ali raised the question of whether the Prime Minister would commit to implementing measures to ban desecration of all religious texts and prophets. The proposal has been interpreted by some as advocating for a blasphemy law. Keir Starmer condemned religious hate speech and confirmed his commitment to tackling hatred but did not expressly reject the MP's proposal, raising concern amongst free speech advocates over his commitment to freedom of speech and the importance of maintaining the right to criticise religion. On X, Sir David Davis said: "For centuries, one of the most important features of Britain’s freedom of speech is the absolute right to criticise religion. Freedom of speech is fundamental to everything we have and everything we stand for. I regret Keir Starmer did not make that clear to Mr Ali at PMQs”.
Quote of the fortnight
“The new Government is failing to prioritise anti-SLAPP legislation. This is troubling and has serious potential consequences for press freedom and the future of the news industry. There has already been a public consultation. Viable legislative options and precedents exist. What is missing now is political will. Its absence reflects poorly on the new Government’s values and commitment to justice. We are not persuaded that the complexity of the issue, or the need for cross-government engagement, are a valid excuse for lengthy delays. The Government should publish draft legislative proposals before the 2025 summer recess and allow time for proper scrutiny. If necessary it should explore using the Victims, Courts and Public Protection Bill, announced in the recent King’s Speech, as a vehicle.”
Future of News Report
Stay connected and subscribe to our latest insights and views