Privacy & Data Protection Archives - Society for Computers & Law

This Week’s Techlaw News Round-up (2 May 2025)

UK law
Ofcom issues guidance on mandatory age checks for pornographic content services

Last week we wrote about Ofcom’s new guidance on protecting children under the Online Safety Act 2023. Among other things, it sets out age assurance requirements for online services that allow pornographic content. From 25 July 2025, affected services must implement ‘highly effective age assurance’ measures to prevent under-18s from accessing such content. The requirements apply to services within scope of the Online Safety Act 2023, and Ofcom has sent notification letters to hundreds of services whose primary purpose is hosting pornographic material.

Ofcom launches consultation on extending Online Safety Act user controls

Ofcom is consulting about amendments to the Illegal Content Codes of Practice under the Online Safety Act. The amendments would extend blocking and muting controls and comment disabling features to smaller user-to-user service providers likely to be accessed by children. The consultation ends on 22 July 2025.

Ofcom establishes Online Information Advisory Committee under Online Safety Act 2023

Ofcom has established its Online Information Advisory Committee under section 152 of the Online Safety Act 2023. Five expert members have been appointed to the committee for three-year terms.  The Committee will advise on misinformation and disinformation matters from 1 May 2025. The Committee will support Ofcom’s statutory duty to ensure platforms address illegal content and child-harmful material through appropriate systems, without making decisions on individual content.

CMA publishes guidance on 4Ps under the Digital Markets Competition Regime

The CMA has set out how it plans to implement the so-called 4Ps under the digital markets competition regime. Through pace, predictability, proportionality and process, it says that it will promote business trust and confidence, encourage investment and innovation, and deliver positive outcomes for UK businesses and consumers. It sets out the approach the CMA will take, including pursuing deeper collaboration with stakeholders to inform its work; ensuring transparency around the prioritisation of investigations and interventions; and delivering efficient and streamlined processes so that stakeholders can meaningfully engage with its work.

FCA publishes engagement paper for AI live testing

The Financial Conduct Authority has published an engagement paper for its proposal for AI Live Testing. The proposal builds on the FCA’s new five-year strategy which sets out how it aims to support growth through a tech-positive approach. It also aims to support the FCA to be a smarter regulator by embracing data and technology to be more effective and efficient. The FCA has asked for feedback on the engagement paper by 10 June 2025.

ICO issues statement following ransomware attack on British Library

In October 2023, the British Library reported a ransomware attack to the ICO, which escalated because of the lack of multi-factor authentication on an administrator account. Following the incident, the British Library published a cyber incident review in March 2024, which provided an overview of the cyber-attack and key lessons learnt to help other organisations that may experience similar incidents. Having carefully considered this particular case, the Information Commissioner has decided that, due to its current priorities, further investigation would not be the most effective use of its resources. It has provided guidance to the British Library, which has reassured the ICO of its commitment to keep its security measures under review and to ensure that appropriate measures are in place to protect people’s data.

SCL Podcast “Technology & Privacy Laws Around The World” – Episode 5: Australia and New Zealand (30 April 2025)

In two common law nations where regulation intersects with digital innovation, and with relatively small populations, Australia and New Zealand offer distinct yet complementary perspectives on technology regulation and privacy law.

How do their legal systems address issues of safety in the digital age, privacy rights, and the interests of Indigenous communities? And in what ways do they align with, or diverge from, international standards set by Europe and the United States?

In this episode, host Mauricio Figueroa is joined by three experts to discuss the policy and normative landscape of Australia and New Zealand. Tune in for an interesting, thought-provoking conversation about privacy and tech in these two countries. Listen to the episode here: https://bit.ly/3Yquyz8

The Panel:

Mauricio Figueroa is a legal scholar and educator. His area of expertise is Law and Digital Technologies, and he has international experience in legal research, teaching, and public policy. He is the host of the SCL podcast “Technology & Privacy Laws Around The World”.

Andelka Phillips is an academic and writer whose research interests lie broadly in the areas of Technology Law, Privacy and Data Protection, as well as Medical Law, Intellectual Property, Cyber Security, and Consumer Protection. She has taught in law schools in four countries: the United Kingdom, the Republic of Ireland, New Zealand, and Australia. She is currently an Affiliate with the Bioethics Institute Ghent, Ghent University, Belgium and an Academic Affiliate with the University of Oxford’s Centre for Health, Law and Emerging Technologies (HeLEX). She is also an Associate Editor for the Journal of the Royal Society of New Zealand (JRSNZ), the first to be appointed from the discipline of Law. www.andelkamphillips.com

John Swinson is a former partner of a major international law firm and has 30 years of law firm experience in New York and Australia, with a principal focus on technology law and intellectual property law. He is a Professor of Law at The University of Queensland, where he teaches privacy law, cybersecurity law, and Internet & IT law.

Raffaele Ciriello is Senior Lecturer in Business Information Systems at the University of Sydney, whose research focuses on compassionate digital innovation and the ethical and societal impacts of emerging technologies. His work critically examines issues of digital responsibility, decentralised governance, and public interest technology, with recent projects spanning AI companions, blockchain infrastructures, and national digital sovereignty.

About the podcast

Join host Mauricio Figueroa and guests on a tour of tech law from across the globe. Previous episodes have focused on the use of ‘robot judges’ in several jurisdictions and developments in India, the USA and Japan. Future episodes will look at South America, Africa and Europe.

Ofcom publishes final guidance on protecting children under Online Safety Act 2023 (28 April 2025)

Ofcom has published its final guidance on protecting children under the Online Safety Act 2023.  This follows consultation, including with children.

The guidance includes more than 40 measures for tech firms to meet their duties under the Online Safety Act. These will apply to sites and apps used by UK children in areas such as social media, search and gaming. The steps include preventing minors from encountering the most harmful content relating to suicide, self-harm, eating disorders and pornography. Online services must also act to protect children from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

Ofcom’s Codes demand a ‘safety-first’ approach in how tech firms design and operate their services in the UK. The measures include:

  • Safer feeds. Personalised recommendations are children’s main pathway to encountering harmful content online. Any provider that operates a recommender system and whose service poses a medium or high risk of harmful content must configure its algorithms to filter out harmful content from children’s feeds.
  • Effective age checks. The riskiest services must use highly effective age assurance to identify which users are children. This aims to ensure that they can protect them from harmful material, while preserving adults’ rights to access legal content. That may involve preventing children from accessing the entire site or app, or only some parts or kinds of content. If services have minimum age requirements but are not using strong age checks, they must assume younger children are on their service and ensure they have an age-appropriate experience.
  • Fast action. All sites and apps must have processes in place to review, assess and quickly tackle harmful content when they become aware of it.
  • More choice and support for children. Sites and apps are required to give children more control over their online experience. This includes allowing them to indicate what content they don’t like, to accept or decline group chat invitations, to block and mute accounts and to disable comments on their own posts. There must be supportive information for children who may have encountered, or have searched for, harmful content.
  • Easier reporting and complaints. Children must have straightforward ways to report content or complain, and providers should respond with appropriate action. Terms of service must be clear so children can understand them.
  • Strong governance. All services must have a named person accountable for children’s safety, and a senior body should annually review the management of risk to children.

Providers of services likely to be accessed by UK children now have until 24 July to finalise and record their assessment of the risk their service poses to children, which Ofcom may request. From 25 July 2025, they must apply the safety measures set out in Ofcom’s Codes to mitigate those risks.

If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.

In recent weeks, it has been suggested that the UK government is coming under pressure from the US government to reduce the protections in the Online Safety Act as part of a UK-US trade deal. In addition, the government has been keen for regulators to prioritise growth. However, The Times reported on 24 April that Peter Kyle, the technology secretary, said that he was not afraid to encourage Ofcom to use its powers to fine technology companies over breaches.

Ofcom has also announced that it is consulting on proposals to extend the measures on blocking and muting user accounts and disabling comments in the Illegal Content Codes to a wider range of services. This is because it now considers that it would be proportionate for these measures to apply to certain smaller services that are likely to be accessed by children. The consultation ends on 22 July.

European Commission finds Meta in breach of the Digital Markets Act (25 April 2025)

The European Commission has fined Meta €200 million for breaching the Digital Markets Act.

Under the Digital Markets Act, gatekeepers must seek users’ consent for combining their personal data between services. Those users who do not consent must have access to a less personalised but equivalent alternative.

In November 2023, Meta introduced a binary “Consent or Pay” advertising model. Under this model, EU users of Facebook and Instagram had a choice between consenting to the use of their personal data for personalised advertising or paying a monthly subscription for an ad-free service.

The Commission found that this did not comply with the DMA, as it did not give users the required specific choice to opt for a service that uses less of their personal data but is otherwise equivalent to the ‘personalised ads’ service. In addition, Meta’s model did not allow users to exercise their right to freely consent to the combination of their personal data.

In November 2024, after numerous exchanges with the Commission, Meta introduced another version of the free personalised ads model, offering a new option that Meta says uses less personal data to display advertisements. The Commission is currently assessing this new option and continues its dialogue with Meta, requesting the company to provide evidence of the impact that this new ads model has in practice.

Without prejudice to this ongoing assessment, the Commission’s latest decision finding non-compliance concerns the time period during which end users in the EU were only offered the binary ‘Consent or Pay’ option between March 2024, when the DMA obligations became legally binding, and November 2024, when Meta’s new ads model was introduced.

The fine imposed on Meta also considers the gravity and duration of the non-compliance, while noting that this is one of the first non-compliance decisions adopted under the DMA.

In better news for Meta, the Commission also found that Meta’s online intermediation service Facebook Marketplace should no longer be designated under the DMA. The decision follows a request submitted by Meta on 5 March 2024 to reconsider the designation of Marketplace. Following a careful assessment of Meta’s arguments and because of Meta’s additional enforcement and continued monitoring measures to counteract the business-to-consumer use of Marketplace, the Commission found that Marketplace had fewer than 10,000 business users in 2024. Meta therefore no longer meets the relevant threshold giving rise to a presumption that Marketplace is an important gateway for business users to reach end users.

According to the German news outlet Tagesschau, Joel Kaplan, Chief Global Affairs Officer at Meta, has claimed that the European Commission wants to hinder successful US firms. He also said that changing Meta’s business model would cost the company a billion dollars and would result in a worse service for its customers. The Commission’s decision to levy a fine while it is still reviewing Meta’s revised model has also attracted some comment.

This Week’s Techlaw News Round-up (25 April 2025)

UK law
Courts and Tribunals Judiciary publishes updated AI guidance and introduces Copilot Chat for judges

The Courts and Tribunals Judiciary has published updated guidance to help judicial office holders use AI. It updates and replaces the guidance document issued in December 2023. It sets out key risks and issues associated with using AI and some suggestions for minimising them. Examples of potential uses are also included. Any use of AI by or on behalf of the judiciary must be consistent with the judiciary’s overarching obligation to protect the integrity of the administration of justice. The guidance also introduces a private AI tool, Microsoft’s “Copilot Chat”, which is now available on judicial office holders’ devices through eJudiciary. The guidance applies to all judicial office holders under the responsibility of the Lady Chief Justice and the Senior President of Tribunals, their clerks, judicial assistants, legal advisers/officers and other support staff.

Ofcom investigates misuse of telephone numbers

Ofcom is investigating whether communications provider Primo Dialler has misused numbers sub-allocated to it, including to perpetrate scams. Ofcom allocates telephone numbers, usually in large blocks, to telecoms firms. They can then transfer the numbers to individual customers or other businesses. In line with Ofcom’s consumer protection rules and industry guidance, phone companies must not misuse numbers which have been sub-allocated to them. Services must also ensure numbers are used correctly in accordance with the National Telephone Numbering Plan. Ofcom believes that the numbers sub-allocated to Primo Dialler are potentially being misused, including to facilitate scams. Its investigation will seek to establish whether Primo Dialler is complying with its obligations, specifically General Conditions B1.8, B1.9(b) and B1.9(c), and section 128(5) of the Communications Act 2003. The investigation falls under Ofcom’s enforcement programme, launched last year, looking specifically at phone and text scams. The aim of the programme is to protect customers by supporting best practice in the use of phone numbers and to ensure providers are following Ofcom’s rules. If Ofcom has reasonable grounds to suspect that rules have been broken, it may launch further investigations.

Ofcom takes action regarding “Global Titles” in mobile sector

Mobile operators use Global Titles as routing addresses for the exchange of signalling messages between 2G and 3G mobile networks and to support their provision of mobile services. Ofcom has now announced new rules to ban their leasing. This is because criminals can use Global Titles to intercept and divert calls and messages, and obtain information held by mobile networks. This could, for example, enable them to intercept security codes sent by banks to a customer via SMS message. In extreme cases they can be exploited to track the physical location of individuals anywhere in the world. The ban on entering new leasing arrangements is effective immediately. For leasing that is already in place, the ban will come into force on 22 April 2026. This will give legitimate businesses who currently lease Global Titles from mobile networks time to make alternative arrangements.  Alongside this, Ofcom has published new guidance for mobile operators on their responsibilities to prevent the misuse of their Global Titles.

ICO fines law firm £60,000 following cyber attack

The ICO has fined Merseyside-based DPP Law Ltd (DPP) £60,000, following a cyber attack that led to highly sensitive and confidential personal information being published on the dark web. It found that DPP failed to put appropriate measures in place to ensure the security of personal information held electronically. This failure enabled cyber hackers to gain access to DPP’s network, via an infrequently used administrator account which lacked multi-factor authentication, and to steal large volumes of data. DPP specialises in law relating to crime, military, family, fraud, sexual offences, and actions against the police. The very nature of this work means it is responsible for both highly sensitive and special category data, including legally privileged information. As the information stolen by the attackers revealed private details about identifiable individuals, the ICO highlights that DPP has a responsibility under the law to ensure it is properly protected. In June 2022, DPP suffered a cyber attack which affected access to the firm’s IT systems for over a week. A third-party consulting firm established that a brute force attack gained access to an administrator account that was used to access a legacy case management system. This enabled the cyber attackers to move laterally across DPP’s network and take over 32GB of data, a fact DPP only became aware of when the National Crime Agency contacted the firm to advise that information relating to its clients had been posted on the dark web. DPP did not consider that the loss of access to personal information constituted a personal data breach, so did not report the incident to the ICO until 43 days after becoming aware of it.

ICO fines compensation company £90,000 for unlawful marketing calls

The ICO has also fined AFK Letters Co Ltd (AFK) £90,000 for making more than 95,000 unsolicited marketing calls to people registered with the Telephone Preference Service, in a clear breach of electronic marketing laws. AFK writes letters seeking compensation and refunds for its customers. Between January and September 2023, AFK used data collected through its own website and a third-party telephone survey company to make 95,277 marketing calls without being able to demonstrate valid and specific consent from the people contacted. AFK claimed it could not provide evidence of consent because it deleted all customer data after three months; however, when challenged by the ICO, it was also unable to provide consent records for several calls made within that three-month timeframe. AFK’s third-party data supplier was using consent statements which did not specifically name AFK when asking the public for consent to be called. Additionally, AFK’s own privacy policy only mentioned contact by email, and did not state that people would also receive phone calls. The ICO’s investigation found that AFK failed to comply with Regulation 21 of the Privacy and Electronic Communications Regulations.

EU law

European Commission consults on revision of EU Cybersecurity Act

The European Commission is consulting about revising the 2019 EU Cybersecurity Act. The consultation focuses on the mandate of the European Union Agency for Cybersecurity (ENISA), the European Cybersecurity Certification Framework, and ICT supply chain security. It aims to simplify cybersecurity rules and streamline reporting obligations. The consultation ends on 20 June 2025.

Irish Data Protection Commission announces inquiry into X

The DPC has announced an inquiry into the processing of personal data comprised in publicly-accessible posts posted on the ‘X’ social media platform by EU/EEA users, for the purposes of training generative AI models, in particular the Grok Large Language Models (LLMs). The inquiry will examine compliance with the GDPR, including the lawfulness and transparency of the processing. Grok is the name of a group of AI models developed by xAI. They are used, among other things, to power a generative AI querying tool/chatbot, which is available on the X platform. Like other modern LLMs, the Grok LLMs have been developed and trained on a wide variety of data. The DPC’s inquiry considers a range of issues concerning the use of a subset of this data which was controlled by X, that is, personal data in publicly accessible posts posted on the X social media platform by EU/EEA users. The purpose of the inquiry is to determine if the personal data was lawfully processed to train the Grok LLMs. The DPC has notified X of its decision to conduct the inquiry under Section 110 of the Irish Data Protection Act 2018.

Coimisiún na Meán publishes Strategy Statement and Work Programme

Coimisiún na Meán has published its first three-year strategy, which sets out its vision for the media landscape in Ireland. The Strategy Statement 2025-2027 is accompanied by a 2025 Work Programme, which lists priority projects across Coimisiún na Meán’s remit of online safety, media sector development and regulation. The Strategy Statement 2025-2027 is built on key outcomes including children, democracy, trust, diversity and inclusion, and public safety. Among the priority projects outlined in Coimisiún na Meán’s 2025 Work Programme are the development of a pilot programme for children at imminent risk of harm from online content, the development of an Election Integrity Strategy across all media sources, the creation of educational materials relating to online hate, the preparation of a new Broadcasting Services Strategy and a revised Media Plurality Policy, and the continuation of the Sound & Vision and Journalism funding Schemes.

European Commission issues Apple with fine under Digital Markets Act (24 April 2025)

The European Commission has announced that it has decided to close its investigation into Apple’s user choice obligations under the Digital Markets Act (DMA). In less good news for Apple, it has also decided that Apple’s steering rules breach the DMA and fined it €500 million. It has also issued preliminary findings regarding Apple’s contract terms for alternative app distribution.

Closure of investigation into Apple’s user choice obligations

The Commission closed the investigation against Apple regarding the DMA obligation that gives users in the EU the opportunity to easily uninstall any software applications and change default settings on iOS, as well as choosing their default web browser from a choice screen. This follows what the European Commission describes as a constructive dialogue between the Commission and Apple. As a result, Apple changed its browser choice screen, streamlining the user experience of selecting and setting a new default browser on iPhone. Apple also made it easier for users to change default settings for calling, messaging, call filtering, keyboards, password managers, and translation services on iPhones. A new menu now allows users to adjust their default settings in one centralised location, streamlining the customisation process. In addition, users can now uninstall several Apple pre-installed apps, such as Safari, a functionality which was previously unavailable. The Commission will keep monitoring Apple’s measures and continue its regulatory dialogue to ensure full and effective user choice, as required by the DMA.

Commission’s fine regarding Apple’s steering rules

Under the DMA, app developers distributing their apps via Apple’s App Store should be able to inform customers, free of charge, of alternative offers outside the App Store, steer them to those offers and allow them to make purchases.

The Commission has found that Apple fails to comply with this obligation. Due to several restrictions imposed by Apple, app developers cannot fully benefit from the advantages of alternative distribution channels outside the App Store. Similarly, consumers cannot fully benefit from alternative and cheaper offers as Apple prevents app developers from directly informing consumers of such offers. Apple has failed to demonstrate that these restrictions are objectively necessary and proportionate. The Commission has ordered Apple to remove the technical and commercial restrictions on steering and to refrain from perpetuating the non-compliant conduct in the future, which includes adopting conduct with an equivalent object or effect.

The fine imposed on Apple takes into account the gravity and duration of the non-compliance. Apple has indicated that it will appeal the fine.

Preliminary findings on Apple’s contract terms

Under the DMA, Apple is required to allow for the distribution of apps on its iOS operating system by means other than through the Apple App Store. In practical terms, this means that Apple should allow third party app stores on iOS and apps to be downloaded to the iPhone directly from the web.

Following an investigation, the Commission takes the preliminary view that Apple failed to comply with this obligation in view of the conditions it imposes on app (and app store) developers. Developers wanting to use alternative app distribution channels on iOS are disincentivised from doing so as this requires them to opt for business terms which include a new fee (Apple’s Core Technology Fee). Apple also introduced overly strict eligibility requirements, hampering developers’ ability to distribute their apps through alternative channels. Finally, according to the Commission, Apple makes it overly burdensome and confusing for end users to install apps when using such alternative app distribution channels.

Therefore, the Commission has preliminarily found that Apple has failed to demonstrate that the measures put in place are strictly necessary and proportionate. Apple can now respond.

Ofcom launches first investigation of individual service provider under Online Safety Act 2023 (22 April 2025)

Ofcom has launched an investigation into whether the provider of an online suicide forum has failed to comply with its duties under the Online Safety Act 2023.

This is the first investigation opened into an individual online service provider under the OSA. Specifically, Ofcom is investigating whether this provider has failed to:

  • put appropriate safety measures in place to protect its UK users from illegal content and activity;
  • complete – and keep a record of – a suitable and sufficient illegal harms risk assessment; and
  • adequately respond to a statutory information request.

Due to the sensitive nature of the case, Ofcom has decided not to name the provider or the forum. However, the BBC has reported that it has been investigating the same forum and believes it is linked to 50 deaths in the UK.

Legal obligations under the Online Safety Act

Providers of services in scope of the OSA had until 16 March 2025 to assess how likely people in the UK are to encounter illegal content on their service, and how their service could be used to commit or facilitate ‘priority’ criminal offences – including encouraging or assisting suicide.

On 17 March, duties came into force that mean providers must now take steps to protect their UK users from illegal content and activity, including by using proportionate measures to:

  • mitigate the risk of their service being used to commit or facilitate a priority offence;
  • prevent individuals from encountering priority illegal content; and
  • swiftly take down illegal content once they become aware of it.

Ofcom’s codes of practice and guidance set out ways providers can comply with these duties. Providers are also required to respond to all statutory information requests from Ofcom in an accurate, complete and timely way.

Ofcom’s starting point in driving compliance is to give service providers an opportunity to engage with its compliance teams about what they need to do under their new duties. However, failure to comply with the new online safety duties or to respond adequately to its information requests may result in enforcement action, and Ofcom says it will not hesitate to take swift action where it suspects there may be serious breaches.

In this case, it says that it has made several attempts to engage with this service provider about its duties under the Act and issued a legally binding request to submit the record of its illegal harms risk assessment to it. Having received a limited response to its request, and unsatisfactory information about the steps being taken to protect UK users from illegal content, Ofcom has therefore launched its investigation. It will now gather and analyse evidence to decide if a contravention has occurred. If its assessment indicates a compliance failure, Ofcom will issue a provisional notice of contravention to the provider, who can then make representations on its findings, before Ofcom makes its final decision.

Ofcom says that it will provide an update on this investigation as soon as possible.

This Week’s Techlaw News Round-Up (11 April 2025)

UK law
Electronic Communications (Networks and Services) (Designated Vendor Directions) (Penalties) Order 2025

The Electronic Communications (Networks and Services) (Designated Vendor Directions) (Penalties) Order 2025 SI 2025/443 has been made.  It makes consequential amendments to the Electronic Communications (Networks and Services) (Penalties) (Rules for Calculation of Turnover) Order 2003, SI 2003/2712 which in summary covers how certain penalties are calculated in relation to turnover under the Communications Act 2003. It came into force on 3 April 2025.

CAP and BCAP update advertising codes to align with Digital Markets, Competition and Consumers Act 2024

CAP and BCAP have published amendments to their advertising codes, which took effect on 8 April 2025.  The amendments align the Codes with the unfair commercial practices provisions in the Digital Markets, Competition and Consumers Act 2024 which came into force on 6 April. The changes include new rules on drip pricing and fake reviews.  Both the CMA and the ASA will delay enforcement on fake reviews for three months. The ASA has also said that it will align its enforcement on drip pricing with the CMA’s approach.

DSIT and NCSC launch new Cyber Governance Code of Practice for boards

The Department for Science, Innovation and Technology (DSIT) and the National Cyber Security Centre (NCSC) published a new Cyber Governance Code of Practice on 8 April 2025, following industry consultation in 2024. The Code outlines actions for boards and directors to manage cyber security risks across five areas: risk management, strategy, people, incident planning, and assurance. It forms part of a wider governance package that includes training and an implementation toolkit, primarily targeting medium and large organisations. The Code was developed in response to data showing that 74% of large businesses experienced cyber attacks in the past year.

EU law

European Commission’s Expert Group on B2B data sharing and cloud computing contracts publishes final report

The European Commission’s Expert Group on B2B data sharing and cloud computing contracts has published its final report.  It contains non-binding model contractual terms on data access and use, as well as standard contractual clauses for cloud computing contracts under Article 41 of the EU Data Act.

Joint letter published on the EU’s need for AI liability rules

Several civil society organisations and BEUC have written to Executive Vice-President Virkkunen and Commissioner McGrath to share their concerns about the withdrawal of the AI liability directive proposal (AILD) and to urge them to begin preparatory work on new AI liability rules. They seek, at the very least, a non-fault-based liability approach that will make it easier for consumers harmed by an AI system to seek compensation.

European Commission launches AI Continent Action Plan

The European Commission has launched its AI Continent Action Plan. It revolves around five pillars: building a large-scale AI data and computing infrastructure; increasing access to large and high-quality data; developing algorithms and fostering AI adoption in strategic EU sectors; strengthening AI skills and talents; and simplifying regulation. The Commission will also launch the AI Act Service Desk to help businesses comply with the AI Act. It will serve as the central point of contact and hub for information and guidance on the AI Act. In May, the Commission will consult on its Data Union Strategy.

European Commission consults on cloud and AI policies in the EU

The European Commission is consulting on the preparatory work for the Cloud and AI Development Act and the single EU-wide cloud policy for public administrations and public procurement. The Commission seeks views on the EU’s capacity in cloud and edge computing infrastructure, especially in light of increasing data volumes and demand for computing resources, both fuelled by the rise of compute-intensive AI services. The Commission also seeks views on the use of cloud services in the public sector. The consultation ends on 4 June 2025.

European Commission launches public consultation and call for evidence on the Apply AI Strategy

The Commission’s AI Office has called for evidence and is consulting on its Apply AI Strategy, planned to be published later this year. The Apply AI Strategy is part of President von der Leyen’s Political Guidelines to make Europe a global leader in AI innovation. The Strategy will serve as a blueprint for the full adoption of AI in EU strategic sectors. In particular, the Apply AI Strategy aims to foster the integration of AI technologies into strategic sectors. These sectors include advanced manufacturing; aerospace; security and defence; agri-food; energy; environment and climate; mobility and automotive; pharmaceutical; biotechnology; robotics; electronic communications; advanced material design; and cultural and creative industries. The consultation aims to identify priorities, current challenges to the uptake of AI in specific sectors as well as potential solutions and policy approaches. The consultation also includes specific questions on the challenges in the AI Act implementation process and how the Commission and member states can support stakeholders better in implementing the legislation. The consultation ends on 4 June 2025.

Commission updates guidelines on responsible use of generative AI in research

The European Commission’s Directorate-General for Research and Innovation has published the second version of its guidelines on responsible use of generative AI in research. One of the goals of the guidelines is to ensure that the scientific community uses generative AI responsibly. They take into account key principles of research integrity as well as existing frameworks for the use of AI in general and in research specifically. The principles include honesty, reliability, respect and accountability. The Commission is also consulting on its AI in Science Strategy; that consultation ends on 5 June 2025.

ICO reports on use of children’s data in financial services sector (9 April 2025)

The ICO’s Assurance department has recently approached organisations in the financial services sector to review how they process information. The review looked at how they use children’s data and how they use AI and automated decision making.

The ICO also wanted to gather views on experiences of implementing good data protection practice, compliance challenges, competing regulatory or legislative priorities and any general data protection concerns.

Recital 38 of the UK GDPR says that

“children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data.”

The ICO25 strategic plan identifies children as a vulnerable group, and protecting them through the responsible use of their information is a current priority.

The review of children’s data processing focussed on the following areas:

  • Governance: the measures in place to control the processing of children’s data.
  • Transparency: the information given to children which tells them what their data will be used for.
  • Use of information: what information is processed, for what purpose, and which lawful basis is used.
  • Individual Rights: how individual rights relating to children’s data are handled, whether received from children, parents or other third parties.
  • Age Verification: the methods used to identify, and verify the age of, children.
  • Further contact and marketing: how children are contacted about their accounts and information provided to them about other products and services.

The review focussed on these areas with all participants so that common themes could be identified and included in the report for the benefit of other organisations who carry out similar processing. It summarises evidence of good practice; evidence of risks to data protection compliance; and instances where the ICO found that improvements may be necessary to data practices.

Key findings

Children are important customers for many financial services. Several participants highlighted children’s products as a key area of focus for development as they represent the future customer base for the wider range of products and services offered. The review of processing of children’s data provided the following key findings.

Governance

Most organisations had policies in place to control the use of children’s information. However, there was limited monitoring of compliance with these policies. Nearly all organisations provided data protection training to staff. However, less than a fifth included specific training about the use of children’s information.

Transparency

Only half of organisations reported having age-appropriate privacy information. However, the ICO said the number that it considered to have effective age-appropriate privacy information was lower. The examples of privacy information that were suitable for children included age-appropriate language and engaging descriptions of how organisations use their information. The ICO said that the approach taken by several organisations appeared to pass their own transparency responsibilities onto parents. As a result, there was a significant risk that children are recorded as agreeing to terms and conditions or privacy information that they do not actually understand. Providing privacy information was also often a one-time-only exercise, not revisited as children age and their understanding increases.

Use of information

Most organisations regularly reviewed the categories of children’s data collected to ensure it was limited to what is necessary, particularly for special categories of data. There were effective controls in place to prevent excessive data collection or purpose creep across all organisations observed. Consent was used for some purposes for processing. However, some organisations asked parents to provide consent on behalf of their child in the first instance but failed to keep this consent under review. This means that as the child gets older and their ability to understand the processing increases, the original consent is likely to become invalid unless it is refreshed and obtained from the child.

Individual rights

Respondents reported that requests to exercise the individual rights set out in UK GDPR by, or on behalf of, children are infrequent and low in volume. However, as a result of the issues found with explaining privacy information and their rights to children, parents’ wishes often, unfairly, supersede those of children. In several cases the decision whether to accept requests for children’s information from the child or their parent is made using a predetermined age limit rather than an assessment of the child’s competence.

Age verification

Processes to verify the age of children were robust across all organisations.

Contact (including marketing)

Many organisations provided administrative communications. Nearly all had a policy that prevents marketing to children. The ICO noted that there was limited distinction between parents and children in how communications were provided, which was sometimes based simply on whose contact information was available. This creates a high risk of non-compliance with communications and marketing requirements.

The findings of the review of the use of AI and automated decision making in the financial services sector are contained in a separate report.

ICO publishes anonymisation guidance (8 April 2025)

The Information Commissioner’s Office has published guidance about anonymisation and pseudonymisation techniques, aimed at helping organisations handle personal data in compliance with data protection laws. 

Anonymisation is the process of representing data in such a way that individuals are not identifiable.  Once data is anonymised, it falls outside the scope of data protection laws.

Pseudonymisation involves replacing identifiable information with a pseudonym, but the data can still be linked back to an individual if additional information is available. 

The ICO’s guidance aims to help organisations comply with the UK GDPR and the Data Protection Act 2018 (DPA 2018).  It also considers the implications of freedom of information legislation.

It emphasises the concept of identifiability, saying it is crucial to assess whether data can be linked to an individual, considering all means reasonably likely to be used for identification. It also considers the so-called “motivated intruder test”, which assesses whether a determined person with access to resources could identify individuals from anonymised data; this is a practical approach to evaluating the risk of re-identification. Finally, it covers the “spectrum of identifiability”: identifiability is context-specific and can change over time due to technological advancements and the availability of additional data.

The guidance outlines two main approaches to anonymisation:

  • Generalisation: Reducing the specificity of data, such as grouping ages into ranges.
  • Randomisation: Adding noise to data to reduce the certainty that it relates to a specific individual.
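
To make these two approaches concrete, here is a minimal illustrative sketch in Python. It is not drawn from the ICO guidance itself; the function names, the ten-year band width and the noise scale are assumptions chosen purely for demonstration.

```python
import random

def generalise_age(age: int, band_width: int = 10) -> str:
    """Generalisation: replace an exact age with a coarser band, e.g. 37 -> '30-39'."""
    lower = (age // band_width) * band_width
    return f"{lower}-{lower + band_width - 1}"

def randomise(value: float, noise_scale: float = 2.0) -> float:
    """Randomisation: add noise so the recorded figure no longer certainly
    matches the individual's true value."""
    return value + random.uniform(-noise_scale, noise_scale)

print(generalise_age(37))  # '30-39'
print(randomise(72.5))     # e.g. 73.1 (varies on each run)
```

Whether such transformations amount to effective anonymisation still depends on context, which is the point the guidance makes with the spectrum of identifiability.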

Pseudonymisation techniques include:

  • Hashing: Transforming data into a fixed-length output using a hash function.
  • Encryption: Using symmetric or asymmetric encryption to protect data.
  • Tokenisation: Replacing sensitive data with randomly generated tokens.
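
As a rough sketch of how two of these techniques differ in practice, consider the following Python fragment. It is our illustration rather than anything in the guidance; the keyed hash, the hard-coded key and the in-memory vault are simplified assumptions (a real deployment would manage the key and the token vault under separate, strict access controls).

```python
import hashlib
import hmac
import secrets

# Hypothetical secret key: a keyed hash (HMAC) resists the dictionary attacks
# that defeat plain hashing of predictable identifiers such as email addresses.
SECRET_KEY = b"replace-with-a-securely-managed-key"

def hash_pseudonym(identifier: str) -> str:
    """Hashing: derive a fixed-length, stable pseudonym from an identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Tokenisation: swap the value for a random token and keep the mapping elsewhere.
token_vault: dict[str, str] = {}

def tokenise(identifier: str) -> str:
    """Tokenisation: re-identification is possible only via the guarded vault."""
    token = secrets.token_hex(16)
    token_vault[token] = identifier
    return token

print(hash_pseudonym("jane.doe@example.com"))  # same input always yields the same pseudonym
print(tokenise("jane.doe@example.com"))        # a fresh random token on each call
```

Either way, while the key or the vault exists the output remains pseudonymised personal data rather than anonymised data, and must be protected accordingly.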

The benefits include enhanced data security and privacy, support for compliance with data protection principles, and the ability to share and process data for research and other purposes. The risks are that:

  • Ineffective anonymisation can lead to re-identification.
  • Pseudonymised data remains personal data and must be protected accordingly.

The guidance stresses the importance of robust governance measures, including:

  • Data Protection Impact Assessments (DPIAs): these are crucial for identifying and mitigating risks associated with anonymisation and pseudonymisation.
  • Transparency: organisations must be clear about their anonymisation processes and the purposes for which data is anonymised.
  • Staff Training: ensuring that staff involved in data processing understand the techniques and risks associated with anonymisation and pseudonymisation.

The guidance includes practical case studies demonstrating the application of anonymisation and pseudonymisation techniques, for example pseudonymising employee data for recruitment analytics and using trusted third parties for market insights.
