Publishing & social media Archives - Society for Computers & Law
https://www.scl.org/category/publishing-social-media/ | Thu, 01 May 2025 19:23:58 +0000

This Week’s Techlaw News Round-up
https://www.scl.org/this-weeks-techlaw-news-round-up-51/ | Fri, 02 May 2025 09:00:00 +0000

UK law
Ofcom issues guidance on mandatory age checks for pornographic content services

Last week we wrote about Ofcom’s new guidance on protecting children under the Online Safety Act 2023. Among other things, it imposes age assurance requirements on online services that allow pornographic content. From 25 July 2025, affected services must implement ‘highly effective age assurance’ measures to prevent under-18s from accessing such content. The requirements apply to services within scope of the Online Safety Act 2023, and Ofcom has sent notification letters to hundreds of services whose primary purpose is hosting pornographic material.

Ofcom launches consultation on extending Online Safety Act user controls

Ofcom is consulting about amendments to the Illegal Content Codes of Practice under the Online Safety Act. The amendments would extend blocking and muting controls and comment disabling features to smaller user-to-user service providers likely to be accessed by children. The consultation ends on 22 July 2025.

Ofcom establishes Online Information Advisory Committee under Online Safety Act 2023

Ofcom has established its Online Information Advisory Committee under section 152 of the Online Safety Act 2023. Five expert members have been appointed to the Committee for three-year terms, and it will advise on misinformation and disinformation matters from 1 May 2025. The Committee will support Ofcom’s statutory duty to ensure platforms address illegal content and material harmful to children through appropriate systems, without making decisions on individual content.

CMA publishes guidance on 4Ps under the Digital Markets Competition Regime

The CMA has set out how it plans to implement the so-called 4Ps under the digital markets competition regime. Through pace, predictability, proportionality and process, it says that it will promote business trust and confidence, encourage investment and innovation, and deliver positive outcomes for UK businesses and consumers. It also sets out the approach it will take, including pursuing deeper collaboration with stakeholders to inform its work; ensuring transparency around the prioritisation of investigations and interventions; and delivering efficient and streamlined processes so that stakeholders can meaningfully engage with its work.

FCA publishes engagement paper for AI live testing

The Financial Conduct Authority has published an engagement paper on its proposal for AI Live Testing. The proposal builds on the FCA’s new five-year strategy, which sets out how it aims to support growth through a tech-positive approach. It also aims to help the FCA become a smarter regulator by embracing data and technology to be more effective and efficient. The FCA has asked for feedback on the engagement paper by 10 June 2025.

ICO issues statement following ransomware attack on British Library

In October 2023, the British Library reported a ransomware attack to the ICO, which escalated because of the lack of multi-factor authentication on an administrator account. Following the incident, the British Library published a cyber incident review in March 2024, providing an overview of the cyber-attack and key lessons learnt to help other organisations that may experience similar incidents. Having carefully considered this particular case, the Information Commissioner has decided that, given its current priorities, further investigation would not be the most effective use of its resources. It has provided guidance to the British Library, which has reassured the ICO of its commitment to continue to review and ensure that appropriate security measures are in place to protect people’s data.

Ofcom sets out principles and methods for designation of television selection services under Media Act 2024
https://www.scl.org/ofcom-sets-out-principles-and-methods-for-designation-of-television-selection-services-under-media-act-2024/ | Mon, 28 Apr 2025 12:15:00 +0000

Ofcom has set out the principles and methods that it intends to follow when preparing its recommendations to the Secretary of State on designating connected TV platforms, as part of its work to implement the Media Act 2024.

It says that it is critical that viewers can easily find and discover the diverse range of high-quality content public service broadcasters (PSBs) offer for UK audiences, including trusted and accurate news. The Media Act 2024 introduced a new online availability and prominence regime for how PSB TV players – such as BBC iPlayer, ITVX, Channel 4 stream, 5, STV player, S4C Clic – are distributed on connected TV platforms – referred to in the Act as television selection services.

Television selection services designated by the Secretary of State for Culture, Media and Sport will be required to ensure designated PSB TV players and their content are available, prominent, and easily accessible. BBC iPlayer will be automatically designated under the legislation, but Ofcom will designate the other PSB TV players.

Following a consultation, Ofcom has now set out the principles and methods it intends to follow in preparing its recommendations to the Secretary of State on which television selection services should be designated.

Ofcom will:

  • Proceed with its proposed principles and methods for assessing the number of users of services in the UK;
  • Consider the number of people using such services and the manner of use; and
  • Consider a service to be capable of functioning as a regulated service if it can carry the designated PSB players, can present TV players and programmes with different levels of prominence, and can include features to ensure players and programmes are accessible to people with disabilities.

In Summer 2025, it will consult on its recommendations on the designation of television selection services, before submitting its final report to the Secretary of State.

Ofcom publishes final guidance on protecting children under Online Safety Act 2023
https://www.scl.org/ofcom-publishes-final-guidance-on-protecting-children-under-online-safety-act-2023/ | Mon, 28 Apr 2025 09:20:00 +0000

Ofcom has published its final guidance on protecting children under the Online Safety Act 2023.  This follows consultation, including with children.

The guidance includes more than 40 measures for tech firms to meet their duties under the Online Safety Act. These will apply to sites and apps used by UK children in areas such as social media, search and gaming. The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Ofcom’s Codes demand a ‘safety-first’ approach in how tech firms design and operate their services in the UK. The measures include:

  • Safer feeds. Personalised recommendations are children’s main pathway to encountering harmful content online. Any provider that operates a recommender system and poses a medium or high risk of harmful content must configure their algorithms to filter out harmful content from children’s feeds.
  • Effective age checks. The riskiest services must use highly effective age assurance to identify which users are children. This aims to ensure that they can protect them from harmful material, while preserving adults’ rights to access legal content. That may involve preventing children from accessing the entire site or app, or only some parts or kinds of content. If services have minimum age requirements but are not using strong age checks, they must assume younger children are on their service and ensure they have an age-appropriate experience.
  • Fast action. All sites and apps must have processes in place to review, assess and quickly tackle harmful content when they become aware of it.
  • More choice and support for children. Sites and apps are required to give children more control over their online experience. This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute accounts, and to disable comments on their own posts. There must be supportive information for children who may have encountered, or searched for, harmful content.
  • Easier reporting and complaints. Children must have straightforward ways to report content or complain, and providers should respond with appropriate action. Terms of service must be clear so children can understand them.
  • Strong governance. All services must have a named person accountable for children’s safety, and a senior body should annually review the management of risk to children.

Providers of services likely to be accessed by UK children now have until 24 July to finalise and record their assessment of the risk their service poses to children, which Ofcom may request. From 25 July 2025, they should then apply the safety measures set out in Ofcom’s Codes to mitigate those risks.

If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.

In recent weeks, it has been suggested that the UK government is coming under pressure from the US government to reduce the protections in the Online Safety Act as part of a UK-US trade deal. In addition, the government has been keen for regulators to prioritise growth. However, The Times reported on 24 April that Peter Kyle, the technology secretary, said that he was not afraid to encourage Ofcom to use its powers to fine technology companies over breaches.

Ofcom has also announced that it is consulting on proposals to extend the measures in the Illegal Content Codes on blocking and muting user accounts and disabling comments to a wider range of services. This is because it now considers it proportionate for these measures to apply to certain smaller services that are likely to be accessed by children. The consultation ends on 22 July.

European Commission finds Meta in breach of the Digital Markets Act
https://www.scl.org/european-commission-finds-meta-in-breach-of-the-digital-markets-act/ | Fri, 25 Apr 2025 12:19:45 +0000

The European Commission has fined Meta €200 million for breaching the Digital Markets Act.

Under the Digital Markets Act, gatekeepers must seek users’ consent for combining their personal data between services. Those users who do not consent must have access to a less personalised but equivalent alternative.

In November 2023, Meta introduced a binary “Consent or Pay” advertising model. Under this model, EU users of Facebook and Instagram had a choice between consenting to the use of their personal data for personalised advertising or paying a monthly subscription for an ad-free service.

The Commission found that this did not comply with the DMA, as it did not give users the required specific choice to opt for a service that uses less of their personal data but is otherwise equivalent to the ‘personalised ads’ service. In addition, Meta’s model did not allow users to exercise their right to freely consent to the combination of their personal data.

In November 2024, after numerous exchanges with the Commission, Meta introduced another version of the free personalised ads model, offering a new option that Meta says uses less personal data to display advertisements. The Commission is currently assessing this new option and continues its dialogue with Meta, requesting the company to provide evidence of the impact that this new ads model has in practice.

Without prejudice to this ongoing assessment, the Commission’s latest decision finding non-compliance concerns the time period during which end users in the EU were only offered the binary ‘Consent or Pay’ option between March 2024, when the DMA obligations became legally binding, and November 2024, when Meta’s new ads model was introduced.

The fine imposed on Meta also considers the gravity and duration of the non-compliance, while noting that this is one of the first non-compliance decisions adopted under the DMA.

In better news for Meta, the Commission also found that Meta’s online intermediation service Facebook Marketplace should no longer be designated under the DMA. The decision follows a request submitted by Meta on 5 March 2024 to reconsider the designation of Marketplace. Following a careful assessment of Meta’s arguments and because of Meta’s additional enforcement and continued monitoring measures to counteract the business-to-consumer use of Marketplace, the Commission found that Marketplace had less than 10,000 business users in 2024. Meta therefore no longer meets the relevant threshold giving rise to a presumption that Marketplace is an important gateway for business users to reach end users.

According to the German news outlet Tagesschau, Joel Kaplan, Chief Global Affairs Officer at Meta, has claimed that the European Commission wants to hinder successful US firms. He also said that it would cost Meta a billion dollars to change its business model and that doing so would provide a worse service to its customers. The Commission’s decision to levy a fine while it is still reviewing Meta’s revised model has also attracted some comment.

Select Committee report on British film and high-end television
https://www.scl.org/select-committee-report-on-british-film-and-high-end-television/ | Fri, 25 Apr 2025 09:18:00 +0000

The House of Commons Culture, Media and Sport Committee has published a report on British film and high-end TV that includes several conclusions and recommendations of interest to tech lawyers.

The Committee has considered how the responsible use of artificial intelligence (AI) tools might transform the industry. For AI to be a positive force in film and HETV, it says, the government must strengthen the copyright framework by requiring licensing of creative works in all cases where they are used to train AI models. The government must also protect the UK’s screen heritage. Screen archives face barriers to connecting the public with the UK’s filmmaking culture, and the Committee recommends the government explore a statutory deposit scheme for the moving image, minor changes to copyright legislation and the introduction of a national screen heritage strategy to put archives on a stronger footing.

HETV

The Committee says that the success of the UK’s HETV sector relies on continuing to attract inward investment while maintaining a vibrant domestic industry underpinned by strong intellectual property rights. Yet the dynamic between independent producers and subscription video-on-demand (SVoD) platforms is not sustainable, and successful production companies are being damaged by deals that deny them the ability to fully monetise their IP. While differences in business models mean it may not be appropriate to extend to streamers the existing terms of trade that apply to public service broadcasters, similar mechanisms must be considered. As a result, the Committee recommends that the government immediately commission research on how regulatory measures could be applied to SVoD platforms to ensure that independent production companies developing IP in the UK maintain a minimum level of ownership over those rights.

AI

Industry guidelines based around protecting human creativity in the use of generative AI are welcome, but the film and TV sectors are calling out for help to embrace the growth potential of generative AI in a way that is fair, responsible and legally compliant.

The Committee says that at the next Spending Review, the government should fund the British Film Institute’s development of an AI observatory and tech demonstrator hub to enable it to provide effective leadership around the industry’s use of AI.

The government’s AI Sector Champion for the creative industries, once appointed, should work with the industry to develop an AI certification scheme for the ethical use of generative AI in film and HETV. In setting out guidelines for the responsible use of generative AI, the scheme should consider the interests of copyright holders, creative workers and audiences. To ensure compliance and protect the industry from irresponsible use of AI tools, the government should mandate certification for UK-based broadcasters or productions claiming tax incentives and National Lottery funding.

Proposed “opt out” regime and copyright reform

Getting the balance between AI development and copyright wrong will undermine the growth of the UK’s film and HETV sectors, and the wider creative industries. The Committee says that proceeding with an ‘opt-out’ regime stands to damage the UK’s reputation among inward investors. The government should abandon its preference for a data mining exception for AI training with a rights reservation model, and instead require AI developers to license any copyrighted works before using them to train their AI models. Although the film and HETV industry may be motivated to protect performers’ interests, with its history of collective bargaining agreements equipping it to do so, that situation is not common across all the creative industries. The UK’s patchwork of copyright, intellectual property and data protection legislation is failing to protect performers from the nefarious use of generative AI technologies, such as unauthorised voice cloning and deepfakes. The government should legislate to prevent historical contract waivers from being interpreted to allow the use of recorded performances by AI tools.

Within the next six months the government should also conduct a review of the Copyright, Designs and Patents Act 1988 and the UK’s GDPR framework to consider whether further legislation is needed to prevent unlicensed use of data for AI purposes.

The Committee has also repeated its predecessor Committee’s calls for the government to implement the Beijing Treaty within the next six months, including extending unwaivable moral rights to audiovisual performances. The government should introduce targeted copyright exemptions that allow greater access to archive material without harming copyright holders. These include adjusting legislation concerning ‘dedicated terminals’, broadening the definition of ‘educational establishments’, amending the ‘2039’ rule, and introducing exemptions for orphan works and commercially unavailable works.

This Week’s Techlaw News Round-up
https://www.scl.org/this-weeks-techlaw-news-round-up-50/ | Fri, 25 Apr 2025 08:57:01 +0000

UK law
Courts and Tribunals Judiciary publishes updated AI guidance and introduces Copilot Chat for judges

The Courts and Tribunals Judiciary has published updated guidance to help judicial office holders use AI. It updates and replaces the guidance document issued in December 2023. It sets out key risks and issues associated with using AI and some suggestions for minimising them. Examples of potential uses are also included. Any use of AI by or on behalf of the judiciary must be consistent with the judiciary’s overarching obligation to protect the integrity of the administration of justice. The guidance also introduces a private AI tool, Microsoft’s “Copilot Chat”, which is now available on judicial office holders’ devices through eJudiciary. The guidance applies to all judicial office holders under the responsibility of the Lady Chief Justice and the Senior President of Tribunals, as well as their clerks, judicial assistants, legal advisers/officers and other support staff.

Ofcom investigates misuse of telephone numbers

Ofcom is investigating whether communications provider Primo Dialler has misused numbers sub-allocated to it, including to perpetrate scams. Ofcom allocates telephone numbers, usually in large blocks, to telecoms firms. They can then transfer the numbers to individual customers or other businesses. In line with Ofcom’s consumer protection rules and industry guidance, phone companies must not misuse numbers which have been sub-allocated to them. Services must also ensure numbers are used correctly in accordance with the National Telephone Numbering Plan. Ofcom believes that the numbers sub-allocated to Primo Dialler are potentially being misused, including to facilitate scams. Its investigation will seek to establish whether Primo Dialler is complying with its obligations, specifically General Conditions B1.8, B1.9(b) and B1.9(c), and section 128(5) of the Communications Act 2003. The investigation falls under Ofcom’s enforcement programme, launched last year, looking specifically at phone and text scams. The aim of the programme is to protect customers by supporting best practice in the use of phone numbers and to ensure providers are following Ofcom’s rules. If Ofcom has reasonable grounds to suspect that rules have been broken, it may launch further investigations.

Ofcom takes action regarding “Global Titles” in mobile sector

Mobile operators use Global Titles as routing addresses for the exchange of signalling messages between 2G and 3G mobile networks and to support their provision of mobile services. Ofcom has now announced new rules to ban their leasing. This is because criminals can use Global Titles to intercept and divert calls and messages, and obtain information held by mobile networks. This could, for example, enable them to intercept security codes sent by banks to a customer via SMS message. In extreme cases they can be exploited to track the physical location of individuals anywhere in the world. The ban on entering new leasing arrangements is effective immediately. For leasing that is already in place, the ban will come into force on 22 April 2026. This will give legitimate businesses who currently lease Global Titles from mobile networks time to make alternative arrangements.  Alongside this, Ofcom has published new guidance for mobile operators on their responsibilities to prevent the misuse of their Global Titles.

ICO fines law firm £60,000 following cyber attack

The ICO has fined Merseyside-based DPP Law Ltd (DPP) £60,000, following a cyber attack that led to highly sensitive and confidential personal information being published on the dark web. It found that DPP failed to put appropriate measures in place to ensure the security of personal information held electronically. This failure enabled cyber hackers to gain access to DPP’s network via an infrequently used administrator account that lacked multi-factor authentication, and to steal large volumes of data. DPP specialises in law relating to crime, military, family, fraud, sexual offences, and actions against the police. The very nature of this work means it is responsible for both highly sensitive and special category data, including legally privileged information. As the information stolen by the attackers revealed private details about identifiable individuals, the ICO highlights that DPP has a responsibility under the law to ensure it is properly protected. In June 2022, DPP suffered a cyber-attack which affected access to the firm’s IT systems for over a week. A third-party consulting firm established that a brute force attempt gained access to an administrator account that was used to access a legacy case management system. This enabled the attackers to move laterally across DPP’s network and extract over 32GB of data, a fact DPP only became aware of when the National Crime Agency contacted the firm to advise that information relating to its clients had been posted on the dark web. DPP did not consider that the loss of access to personal information constituted a personal data breach, so did not report the incident to the ICO until 43 days after becoming aware of it.

ICO fines compensation company £90,000 for unlawful marketing calls

The ICO has also fined AFK Letters Co Ltd (AFK) £90,000 for making more than 95,000 unsolicited marketing calls to people registered with the Telephone Preference Service, in a clear breach of electronic marketing laws. AFK writes letters seeking compensation and refunds for its customers. Between January and September 2023, AFK used data collected through its own website and a third-party telephone survey company to make 95,277 marketing calls without being able to demonstrate valid and specific consent from the people contacted. Despite AFK claiming it could not provide evidence of consent because it deleted all customer data after three months, when challenged by the ICO, it was also unable to provide consent records for several calls made within a three-month timeframe. AFK’s third-party data supplier was using consent statements which did not specifically name AFK when asking the public for consent to be called. Additionally, AFK’s own privacy policy only mentioned contact by email, and did not state that people would also receive phone calls. The ICO’s investigation found that AFK failed to comply with Regulation 21 of the Privacy and Electronic Communications Regulations.

EU law

European Commission consults on revision of EU Cybersecurity Act

The European Commission is consulting about revising the 2019 EU Cybersecurity Act. The consultation focuses on the European Union Agency for Cybersecurity mandate, the European Cybersecurity Certification Framework, and ICT supply chain security. It aims to simplify cybersecurity rules and streamline reporting obligations. The consultation ends on 20 June 2025.

Irish Data Protection Commission announces inquiry into X

The DPC has announced an inquiry into the processing of personal data comprised in publicly-accessible posts posted on the ‘X’ social media platform by EU/EEA users, for the purposes of training generative AI models, in particular the Grok Large Language Models (LLMs). The inquiry will examine compliance with the GDPR, including the lawfulness and transparency of the processing. Grok is the name of a group of AI models developed by xAI. They are used, among other things, to power a generative AI querying tool/chatbot, which is available on the X platform. Like other modern LLMs, the Grok LLMs have been developed and trained on a wide variety of data. The DPC’s inquiry considers a range of issues concerning the use of a subset of this data which was controlled by X, that is, personal data in publicly accessible posts posted on the X social media platform by EU/EEA users. The purpose of the inquiry is to determine if the personal data was lawfully processed to train the Grok LLMs. The DPC has notified X of its decision to conduct the inquiry under Section 110 of the Irish Data Protection Act 2018.

Coimisiún na Meán publishes Strategy Statement and Work Programme

Coimisiún na Meán has published its first three-year strategy, which sets out its vision for the media landscape in Ireland. The Strategy Statement 2025-2027 is accompanied by a 2025 Work Programme, which lists priority projects across Coimisiún na Meán’s remit of online safety, media sector development and regulation. The Strategy Statement 2025-2027 is built on key outcomes including children, democracy, trust, diversity and inclusion, and public safety. Among the priority projects outlined in Coimisiún na Meán’s 2025 Work Programme are the development of a pilot programme for children at imminent risk of harm from online content, the development of an Election Integrity Strategy across all media sources, the creation of educational materials relating to online hate, the preparation of a new Broadcasting Services Strategy and a revised Media Plurality Policy, and the continuation of the Sound & Vision and Journalism funding Schemes.

The post This Week’s Techlaw News Round-up appeared first on Society for Computers & Law.

]]>
Ofcom launches first investigation of individual service provider under Online Safety Act 2023 https://www.scl.org/ofcom-launches-first-investigation-of-individual-service-provider-under-online-safety-act-2023/ Tue, 22 Apr 2025 09:11:55 +0000 https://www.scl.org/?p=18402 Ofcom has launched an investigation into whether the provider of an online suicide forum has failed to comply with its duties under the Online Safety Act 2023. This is the first investigation opened into an individual online service provider under the OSA. Specifically, Ofcom is investigating whether this provider has failed to: Due to its...


]]>
Ofcom has launched an investigation into whether the provider of an online suicide forum has failed to comply with its duties under the Online Safety Act 2023.

This is the first investigation opened into an individual online service provider under the OSA. Specifically, Ofcom is investigating whether this provider has failed to:

  • put appropriate safety measures in place to protect its UK users from illegal content and activity;
  • complete – and keep a record of – a suitable and sufficient illegal harms risk assessment; and
  • adequately respond to a statutory information request.

Due to the sensitive nature of the case, Ofcom has decided not to name the provider or the forum. However, the BBC has reported that it has been investigating the same forum and believes it is linked to 50 deaths in the UK.

Legal obligations under the Online Safety Act

Providers of services in scope of the OSA had until 16 March 2025 to assess how likely people in the UK are to encounter illegal content on their service, and how their service could be used to commit or facilitate ‘priority’ criminal offences – including encouraging or assisting suicide.

On 17 March, duties came into force that mean providers must now take steps to protect their UK users from illegal content and activity, including by using proportionate measures to:

  • mitigate the risk of their service being used to commit or facilitate a priority offence;
  • prevent individuals from encountering priority illegal content; and
  • swiftly take down illegal content once they become aware of it.

Ofcom’s codes of practice and guidance set out ways providers can comply with these duties. Providers are also required to respond to all statutory information requests from Ofcom in an accurate, complete and timely way.

Ofcom’s starting point in driving compliance is to give service providers an opportunity to engage with its compliance teams about what they need to do under their new duties. However, it says that failure to comply with the new online safety duties, or to respond adequately to its information requests, may result in enforcement action, and that it will not hesitate to take swift action where it suspects there may be serious breaches.

In this case, Ofcom says that it has made several attempts to engage with the service provider about its duties under the Act and has issued a legally binding request for the record of its illegal harms risk assessment. Having received a limited response to that request, and unsatisfactory information about the steps being taken to protect UK users from illegal content, Ofcom has launched its investigation. It will now gather and analyse evidence to decide whether a contravention has occurred. If its assessment indicates a compliance failure, Ofcom will issue a provisional notice of contravention to the provider, which can then make representations on Ofcom’s findings before Ofcom makes its final decision.

Ofcom says that it will provide an update on this investigation as soon as possible.

The post Ofcom launches first investigation of individual service provider under Online Safety Act 2023 appeared first on Society for Computers & Law.

]]>
This Week’s Techlaw News Round-Up https://www.scl.org/this-weeks-techlaw-news-round-up-48/ Fri, 04 Apr 2025 08:39:07 +0000 https://www.scl.org/?p=18065 UK law Secretaries of State reply to Select Committees’ joint response to copyright and AI consultation The Secretaries of State for Science, Innovation and Technology and for Culture, Media and Sport have replied to the February 2025 CMS and SIT Committees’ joint response to the government’s consultation on AI and copyright. They have shared the...


]]>
UK law
Secretaries of State reply to Select Committees’ joint response to copyright and AI consultation

The Secretaries of State for Science, Innovation and Technology and for Culture, Media and Sport have replied to the February 2025 CMS and SIT Committees’ joint response to the government’s consultation on AI and copyright. They have shared the Committees’ joint response with officials at the Intellectual Property Office and asked them to consider the Committees’ comments. They also said that there have been over 11,500 responses to the consultation. The government is carefully reviewing responses and has not yet made any decisions. The implementation of any text and data mining exception depends on having workable technical solutions in place for rights reservation, and the government will not proceed with legislation unless and until those technical requirements are met.

Ofcom sets out 2025/26 Plan of Work and longer-term blueprint to support economic growth

Ofcom has issued its Plan of Work for 2025/26, which outlines its strategic priorities aimed at enhancing communication services, ensuring online safety, and promoting competition in the media and telecommunications sectors across the UK. The plan focuses on four main priorities: ‘Internet and post we can rely on’, ‘Media we trust and value’, ‘We live a safer life online’, and ‘Enabling wireless in the UK economy’. Key initiatives include supporting investment in gigabit-capable broadband, improving telecoms network security, reforming the universal postal service, implementing the Media Act, enforcing content standards, establishing the Online Safety regime, managing radio spectrum efficiently, and facilitating innovation in mobile and satellite services. Ofcom also says that it will address the unique needs of each home nation, ensuring tailored approaches and stakeholder engagement. The plan emphasises collaboration with domestic and international partners, investing in data and technology capabilities, and using evidence-based regulation to inform policy decisions.

Adult sites start rolling out age assurance

Ofcom has indicated that providers of online pornography are implementing highly effective age assurance across thousands of sites, in response to Ofcom’s enforcement programme in this area. Earlier this year, Ofcom wrote to hundreds of providers, collectively covering thousands of sites that publish their own pornographic content, telling them about their new obligations under Part 5 of the Online Safety Act to implement highly effective age assurance to prevent children from accessing this material. So far, it says that it has had positive engagement from across the sector and several providers have implemented highly effective age assurance in response to its enforcement programme. It is currently reviewing compliance plans and implementation timescales for other services in scope of these duties. It is also assessing the age assurance measures of providers who have not responded, and several services have been referred to Ofcom’s enforcement team, which will consider in the coming weeks whether formal enforcement action is appropriate. Details of any new investigations will be published on the Ofcom website. By July 2025, all services that allow pornography, including sites that allow user-generated pornographic content, will need to have highly effective age checks in place to protect children from accessing it.

Patents Court considers patent validity and infringement and FRAND terms claims validly served out of jurisdiction

In Mediatek Inc and others v Huawei Technologies Co Ltd and another [2025] EWHC 649 (Pat), the Patents Court held that claims concerning the validity and infringement of telecommunications patents, and the fair, reasonable and non-discriminatory (FRAND) terms for a global cross-licence, had been validly served on a defendant out of the jurisdiction. Huawei wanted to license its SEPs at the chipset rather than the device level. MediaTek brought proceedings against Huawei in the Patents Court seeking, among other things, the determination of a global FRAND licence. Huawei pointed to the fact that the relevant acts took place in and around China, as well as the existence of parallel proceedings brought by Huawei and MediaTek in China.

The post This Week’s Techlaw News Round-Up appeared first on Society for Computers & Law.

]]>
Online Safety Act in force: platforms must start tackling illegal material from 17 March 2025 https://www.scl.org/online-safety-act-in-force-platforms-must-start-tackling-illegal-material-from-17-march-2025/ Wed, 19 Mar 2025 11:14:03 +0000 https://www.scl.org/?p=17867 As of 17 March, online platforms must start putting in place measures to comply with the duties to prevent illegal harms under the Online Safety Act. Ofcom has launched its latest enforcement programme to assess industry compliance. Providers of services in scope of the OSA had until 16 March to carry out a suitable and...


]]>
As of 17 March, online platforms must start putting in place measures to comply with the duties to prevent illegal harms under the Online Safety Act. Ofcom has launched its latest enforcement programme to assess industry compliance.

Providers of services in scope of the OSA had until 16 March to carry out a suitable and sufficient illegal harms risk assessment, to understand how likely it is that users could encounter illegal content on their service, or, in the case of “user-to-user” services, how they could be used to commit or facilitate certain criminal offences.

From 17 March, the next set of illegal harms duties has come into force. Platforms must therefore start implementing appropriate measures to remove illegal material quickly when they become aware of it, and to reduce the risk of “priority” criminal content appearing in the first place.

Ofcom will be assessing platforms’ compliance with their new illegal harms obligations under the OSA, and launching targeted enforcement action where it uncovers concerns.

Ofcom will also initially prioritise the compliance of sites and apps that may present particular risks of harm from illegal content due to their size or nature, for example because they have a large number of users in the UK, or because their users may risk encountering some of the most harmful forms of online content and conduct.

Given the acute harm caused by the spread of online child sexual abuse material (CSAM), assessing providers’ compliance with their safety duties in this area has been identified as one of Ofcom’s early priorities for enforcement. Ofcom’s evidence shows that file-sharing and file-storage services are particularly susceptible to being used for the sharing of image-based CSAM. It is now assessing the safety measures being taken, or that will soon be taken, by file-sharing and file-storage providers to prevent offenders from disseminating CSAM on their services.

Ofcom has written to a number of these services to put them on notice that it will shortly be sending them formal information requests regarding the measures they have in place, or will soon have in place, to tackle CSAM, and requiring them to submit their illegal harms risk assessments to Ofcom. It highlights its investigatory and enforcement powers, including the ability to levy hefty fines if organisations do not comply.

Ofcom expects to make additional announcements on formal enforcement action over the coming weeks.

The post Online Safety Act in force: platforms must start tackling illegal material from 17 March 2025 appeared first on Society for Computers & Law.

]]>
Tackling non-consensual intimate image abuse https://www.scl.org/tackling-non-consensual-intimate-image-abuse/ Tue, 11 Mar 2025 14:00:00 +0000 https://www.scl.org/?p=17766 The House of Commons Women and Equalities Select Committee has issued a report saying that possessing non-consensual intimate images (NCII) should be made an offence, putting it on the same legal footing as child sexual abuse material (CSAM). Non-consensual intimate image (NCII) abuse occurs when intimate content is produced, published, or reproduced without consent, often...


]]>
The House of Commons Women and Equalities Select Committee has issued a report saying that possessing non-consensual intimate images (NCII) should be made an offence, putting it on the same legal footing as child sexual abuse material (CSAM).

Non-consensual intimate image (NCII) abuse occurs when intimate content is produced, published, or reproduced without consent, often online.

NCII abuse can also include material that is considered “culturally intimate” for the victim, such as a Muslim woman being pictured without her hijab. The Committee says that the government should expand the legal definition to include such images.

The OSA creates criminal offences for individuals relating to NCII and places duties on regulated search services and user-to-user services (e.g. social media), including a requirement to take down NCII content. Ofcom also has powers to enforce providers’ compliance with the Act, such as imposing fines and applying for service restriction orders. While many platforms remove NCII content voluntarily, some fail to comply with requests to take material down. Around 10% of content remains online, invariably hosted on sites based overseas. The Committee believes that the new regulatory regime overseen by Ofcom is unlikely to have much impact on such sites, because Ofcom’s current enforcement powers are too slow and are not designed to help individuals get NCII content on non-compliant websites taken down. In such circumstances, access to those sites should be blocked. For internet infrastructure providers to take this threat seriously and block access to websites that refuse to comply, NCII should be brought in line with child sexual abuse material (CSAM) in law.

The government should bring forward amendments to the Crime and Policing Bill to make possession of NCII an offence. It should also create voluntary guidance for internet infrastructure providers on tackling NCII, as it has for CSAM.

The government should also take a holistic approach to legislating against NCII abuse by introducing a swift and inexpensive statutory civil process, as has been established in other jurisdictions. In addition, there should be a registry of NCII content that internet infrastructure providers are requested to prevent access to, similar to the current arrangements for CSAM. The statutory regime should enable civil courts to make orders, including designating an image as NCII content and ordering its inclusion on the registry, as well as requiring an individual to delete any such images.

The government should also set up an Online Safety Commission, like the eSafety Commission in Australia, with a focus on support for individuals. The new Commission would be able to apply for and send such court orders and oversee the NCII registry.

There have been cases where, following the criminal justice process, perpetrators have had devices containing the NCII returned to them, which is harrowing for victims. The Committee says that the government, Sentencing Council and Crown Prosecution Service must each take steps to ensure that those charged with NCII offences are deprived of that material.

The Revenge Porn Helpline has launched a free ‘hashing’ tool designed to protect people from NCII abuse. Hashing generates a digital fingerprint that uniquely identifies an image or video. The fingerprint is distributed to participating platforms, allowing them to prevent that content from being uploaded. The Committee expresses disappointment that some major platforms have so far not joined the 13 currently participating platforms and says that they should do so urgently. The Committee welcomes Ofcom’s plans to consult on expansions to its Codes of Practice that would include proposals on the use of hashing.
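The matching workflow behind such hashing tools can be sketched in a few lines. This is a minimal illustration under simplifying assumptions: it uses a plain cryptographic hash and an exact-match registry, whereas real NCII-matching systems typically use perceptual hashes, which tolerate resizing and re-encoding. The function and registry names are hypothetical.

```python
import hashlib


def fingerprint(media_bytes: bytes) -> str:
    """Generate a digital fingerprint identifying an image or video.

    Illustrative only: a cryptographic hash matches exact bytes;
    production tools use perceptual hashes that survive re-encoding.
    """
    return hashlib.sha256(media_bytes).hexdigest()


# Hypothetical registry of fingerprints reported via the hashing tool.
# Only hashes are shared with platforms, never the images themselves.
reported_fingerprints: set[str] = set()


def report_image(media_bytes: bytes) -> None:
    """Add a reported image's fingerprint to the shared registry."""
    reported_fingerprints.add(fingerprint(media_bytes))


def allow_upload(media_bytes: bytes) -> bool:
    """A participating platform checks an upload against the registry."""
    return fingerprint(media_bytes) not in reported_fingerprints


report_image(b"bytes of a reported intimate image")
print(allow_upload(b"bytes of a reported intimate image"))  # False
print(allow_upload(b"bytes of an unrelated image"))         # True
```

The key design point is privacy-preserving matching: the victim’s image never leaves their device; only its fingerprint is circulated for platforms to block matching uploads.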

Synthetic NCII, also known as ‘deepfakes’, refers to any sexual or nude media created using AI that represents the likeness of another individual without their consent. The government’s plans to criminalise their creation are welcome. However, the Committee says that the offence must be based on the lack of consent of the victim, not the motivation of the perpetrator. The creation and use of nudification apps should also be criminalised.

The OSA represents considerable progress in this area, as do the additional offences included in the Crime and Policing Bill and Data (Use and Access) Bill, but significant gaps in the legislative and regulatory framework remain.

The post Tackling non-consensual intimate image abuse appeared first on Society for Computers & Law.

]]>