Uncategorised Archives - Society for Computers & Law
https://www.scl.org/category/uncategorised/

SCL AI for Schools Programme
https://www.scl.org/scl-ai-for-schools-programme/ – Wed, 03 May 2023

An SCL outreach initiative aimed at helping underprivileged 6th form students consider a career in the tech law sector

The SCL AI for Schools Programme is an outreach initiative aimed at helping underprivileged 6th form students, who may never have heard of tech law, consider a career in the sector.

It has been conceived by SCL Trustee Matthew Lavy and Iain Munro, both barristers at 4 Pump Court.

AI is the topic focus because of the opportunities for interesting and relatable use cases. The chosen scenarios make it easy to show students why the law of AI matters and the profound effect AI could have upon their lives.

Matthew and Iain have run several sessions with 6th formers, most recently with the Sutton Trust at Warwick University. They have evaluated what works and what doesn’t and refined the programme content accordingly. We are now ready to roll out this initiative more widely.

We are therefore looking for SCL members who are interested in presenting this training in schools and colleges. Potential presenters could range from SCL student ambassadors to senior partners or barristers. We are keen to attract as diverse an audience as possible and are looking for confident and competent presenters to help us take this initiative forwards. If you already have a relationship with a school in a disadvantaged area, even better.

We have all the materials to help you deliver the programme to students: scenario slides, notes and suggested questions, and a training video.

To learn more about the scheme, please watch the launch event training video.

Please see below to:

  • Download the presenter slides, available in PDF and PowerPoint formats
  • Download the presenter notes to accompany the slides

If you would like to get involved and volunteer to be an SCL AI for Schools presenter, please contact hello@scl.org


Tech aspects of 2023 Budget
https://www.scl.org/12833-tech-aspects-of-2023-budget/ – Fri, 17 Mar 2023

The Chancellor of the Exchequer has issued the 2023 Budget. This article sets out the aspects of interest to technology lawyers.

Regulation of Technologies Review

At the Autumn Statement 2022 the government asked Sir Patrick Vallance to lead the Pro-Innovation Regulation of Technologies Review. The government has now announced that it will be taking forward all Sir Patrick’s recommendations on the regulation of emerging digital technologies. These include:

  • a multi-regulator sandbox;
  • amending the Computer Misuse Act 1990 to include a statutory public interest defence;
  • announcing a clear policy position on the relationship between intellectual property law and generative AI to provide confidence to innovators and investors;
  • facilitating greater industry access to public data, and prioritising wider data sharing and linkage across the public sector;
  • bringing forward the Future of Transport Bill to unlock innovation across automated transport applications;
  • working with the Civil Aviation Authority to establish an operating standard for drones, moving away from relying on operators to prove they are safe;
  • the ICO should update its guidance to clarify when an organisation is a controller, joint controller or processor for processing activities relating to AI as a service;
  • in line with the Commons Science and Technology Select Committee’s recommendation in their report on the UK space strategy and UK satellite infrastructure, the UK government should implement a variable liability approach to granting licences by June 2023.

The report also said that the government should avoid regulating emerging digital technologies too early, to avoid the risk of stifling innovation.

The government has now asked Sir Patrick to report on how regulators can better support innovation, and the government’s new Chief Scientific Adviser, Professor Dame Angela McLean, will oversee future reviews into creative industries, advanced manufacturing, and the regulator growth duty.

Budget announcements

The UK government says that it will support growth in the sectors it considers key for the future: green industries; digital technologies; life sciences; creative industries and advanced manufacturing.

In addition, it says that further investment is needed in infrastructure for research and innovation. Powerful computing capability is essential to progress in AI research. However, the Future of Compute Review, which was published earlier in March, found that the UK’s AI community has immediate requirements for large-scale, accelerator-driven compute to remain internationally competitive. It said that urgent action is needed to bolster the UK’s compute infrastructure and create a world-class compute ecosystem.

The government is following two key recommendations of the Future of Compute Review: it will invest, subject to the usual business case processes, in the region of £900 million to build an exascale supercomputer and it will establish a new AI Research Resource.

The government says that recent developments in AI, such as the launch of ChatGPT and the announcement of Google Bard, have shown the powerful potential for technologies which are based upon foundation models, including large language models. The UK government will establish a taskforce to advance UK sovereign capability in foundation models, including large language models.

The government will award a £1 million prize every year for the next ten years to researchers that drive progress in critical areas of AI.

Quantum technologies are expected to have transformative and wide-ranging impacts, through the development of new types of computers, secure communications, and wider improvements to sensing, imaging and timing. The Quantum Strategy sets out a quantum research and innovation programme. The government will invest a total of £2.5 billion over 10 years, focusing on four goals: ensuring the UK is home to significant quantum science and engineering; supporting businesses through innovation funding opportunities and by providing access to high quality R&D facilities; driving the use of quantum technologies in the UK; and creating a national and international regulatory framework.

The government has allocated £100 million funding for the Innovation Accelerators programme to 26 transformative R&D projects. It has also committed to lead on the regulation of AI and on the future of web technology. The government will work to maximise the potential of Web3 and to spur UK growth and innovation, alongside empowering individuals to influence how their data is used, and managing downside risks to privacy, security and harms.

Ministry of Justice issues final report on electronic execution
https://www.scl.org/12832-ministry-of-justice-issues-final-report-on-electronic-execution/ – Thu, 16 Mar 2023

The Industry Working Group on Electronic Execution of Documents has published its final report, which considers the challenges arising from the use of electronic signatures in cross-border transactions and how best to use electronic signatures to optimise their benefits when set against the risk of fraud. It sets out the Group’s further recommendations for reform.

The Report’s main recommendations for reform, or work towards reform, are as follows:

  • Enhanced certification through the role of the ICO and a review of the National Cyber Security Centre Technical Assurance Principles initiative.
  • Self-certification involving the ICO/Department for Science, Innovation and Technology or another government body working as a moderator that:
    - develops a set of signing platform ‘basic performance standards’;
    - publishes the standards on a ‘dedicated/go-to’ webpage that is easily locatable for prospective platform users;
    - invites signing platforms to confirm whether they meet the standards;
    - publishes a list of signing platforms that submit self-certification on that webpage; and
    - confirms listings annually.

  • Work towards uniformity of approach to e-signing and online identification by way of an international standard or mutual recognition. The Group recommends that the UK consider adopting the UNCITRAL Model Law on Electronic Signatures, at least in some form. It says that Article 12 of the Model Law is particularly instructive on the point of cross-jurisdictional recognition, and UK movement on this point, given the UK’s significance as an international commercial hub, would send a strong signal, and perhaps act as an incentive, to other jurisdictions.
  • UK government consideration of wholesale adoption of e-signatures for all purposes, and investigation into modernising any area where wet ink signatures are mandated.
  • Review by the Law Commission of the law of deeds with a view to the abolition of at least some of their current requirements.
  • Removal of the requirement for a third party to be involved in making a statutory declaration, and allowing declarations to be made wholly electronically in a straightforward manner and without risk of invalidity.
  • The establishment by the UK government, or a suitable Department, of a standing body similar to the Industry Working Group, comprising legal, industry and academic membership, that is able to focus solely on these issues and to keep abreast of developments as they occur.

For background, see here and here.

Can Machine Learning and black box systems properly exercise contractual discretion?
https://www.scl.org/12793-can-machine-learning-and-black-box-systems-properly-exercise-contractual-discretion/ – Mon, 23 Jan 2023

An increasing number of contractual decisions are now being made with or by machine learning. Tom Whittaker questions whether such activity meets the implied duty to exercise discretion properly and suggests some ways to mitigate the risks.

The courts may impose restrictions on, and subsequently scrutinise, how a party exercises its discretion under a contract. Traditionally, only humans have been capable of exercising that discretion – the current law reflects this. But what if machine learning were used to assist with, or exercise, that discretion? Will those using machine learning systems to exercise contractual discretion be able to justify their decisions?

These are not questions for the future. Machine Learning (ML) is already being used to assist and make real-world decisions. However, the technology is quickly evolving and being used in a greater variety of use cases. Regulators and institutions, including the Financial Conduct Authority and Prudential Regulation Authority,1 and separately the Medicines and Healthcare products Regulatory Agency,2 are looking at whether existing regulations are suitable for the rapidly evolving technology and its expanding set of use cases. Similarly, existing laws may require a different approach for evolving technologies.

In this article we look at:

  • potential issues arising where contractual discretion exercised with the assistance of or by machine learning systems is scrutinised, and
  • what those procuring, developing and deploying ML systems will want to consider to address those challenges and justify their decisions.3

An implied duty to exercise contractual discretion properly

The courts may imply a contractual duty that, when a contract gives party A (usually, but not always, the party in the stronger negotiating position) the discretion to make a decision that affects both party A and party B, party A must exercise that discretion rationally.

If the duty is implied, the court will scrutinise the decision-making, looking at:

(a) the process – have “the right matters been taken into account”?

(b) the outcome – is “the result so outrageous that no reasonable decision-maker could have reached it”?4

This is sometimes referred to as the Braganza duty after a 2015 Supreme Court case.5 In that case, BP, which had the contractual discretion to decide how Mr Braganza had died, was held to have wrongly concluded that he had committed suicide, a conclusion which also released BP from the obligation to pay his widow death-in-service benefits. It was held that BP’s decision-making process did not take the right matters into account and that the outcome was not reasonable in the circumstances, with the result that BP’s decision was reversed.

Knowing when the Braganza duty will be implied is not always straightforward. It requires, amongst other things, consideration of:

(i) the contractual wording;

(ii) the nature of the discretion being exercised;

(iii) whether the party exercising discretion has a conflict of interest; and

(iv) whether it is necessary to imply the term.

However, what is relevant to our questions is that the Braganza duty has been found in various types of contract and sector, for example: employment;6 financial services;7 and professional services.8 Whether the decision was made entirely by a human or with computer assistance has not been a relevant factor in the case law to date. So is it possible that an organisation exercising contractual discretion with or by an ML system will have a Braganza duty implied and its decision-making scrutinised?

What is Machine Learning?

Machine learning is usually seen as a sub-set of artificial intelligence. The key points relevant to how the law concerning contractual discretion operates are:

  • ML allows a system to learn and improve from examples without all its instructions being explicitly programmed.
  • An ML system is trained to carry out a task by analysing large amounts of training data and building a model that it can use to process future data, extrapolating its knowledge to unfamiliar situations (a minimal illustration in Python follows this list).
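
By way of illustration only (none of this code appears in the article, and the data, task and library choice are invented assumptions), a minimal Python sketch using the widely available scikit-learn library shows both points: a rule is learned from labelled examples rather than being explicitly programmed, and the resulting model is then applied to an input it has never seen.

# Hypothetical sketch: a model "learns" from examples and then extrapolates.
# Assumes scikit-learn is installed; the data and task are invented for illustration.
from sklearn.tree import DecisionTreeClassifier

# Invented training examples: [years_as_customer, missed_payments] -> discretionary benefit granted (1) or not (0)
X_train = [[5, 0], [7, 1], [1, 4], [2, 3], [8, 0], [0, 5]]
y_train = [1, 1, 0, 0, 1, 0]

model = DecisionTreeClassifier(max_depth=2)
model.fit(X_train, y_train)      # the "training" step: build a model from the examples

# Extrapolation: the model is applied to a combination it never saw during training
print(model.predict([[6, 2]]))   # the output depends on what was learned, not on explicit rules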

Issues with scrutinising decisions made by ML systems

Determining how an ML system actually made a decision may not be possible because it is a ‘black box’ – ‘a system [..] that can be viewed in terms of its inputs and outputs, without any knowledge of its internal workings‘.9 The risk is that a black box ML system lacks explainability – a phrase often interchanged with ‘interpretability’ and ‘intelligibility’ – the ability to present or explain an ML system’s decision-making process in terms that can be understood by humans.10

The black box problem should be expected with the current approaches, as there is a trade-off between technical performance and the ability of humans to explain how an ML system produces its outputs; the greater the performance, the lower the explainability, and vice versa.11

Regulations or standards may require a specified balance between, or a minimum level of, technical performance and explainability in particular for safety or liability reasons. For example, emerging regulation for automated vehicles points to specified data requirements for purposes including explainability of decision-making for legal purposes.

These are both drivers behind the field of Explainable AI (also known as XAI) – the ability to explain or to present in understandable terms to a human.12 There are ways to increase the explainability of ML systems, including: ML models which are by their nature interpretable; decomposable ML systems, where the ML’s analysis is structured in stages and interpretability is prioritised for the steps that most influence the output; or tools, such as proxy models which provide a simplified version of a complex ML system.13
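
As a hedged illustration of the proxy-model technique mentioned above (not drawn from the article; the data, models and library choice are assumptions), the sketch below fits a shallow, human-readable decision tree to the predictions of a more complex model, trading some fidelity for an explanation a person can follow.

# Hypothetical sketch of a proxy (surrogate) model used for explainability.
# Assumes scikit-learn and NumPy are installed; all data is invented.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                    # three invented input features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # invented outcome the models try to learn

black_box = RandomForestClassifier(n_estimators=200).fit(X, y)   # high performance, low explainability

# Proxy model: a shallow tree trained to mimic the black box's own outputs
proxy = DecisionTreeClassifier(max_depth=2).fit(X, black_box.predict(X))

# The proxy's rules can be read by a human, at the cost of some fidelity to the original
print(export_text(proxy, feature_names=["feature_0", "feature_1", "feature_2"]))

A proxy of this kind only approximates the underlying system, which is one reason the trade-off between performance and explainability does not disappear simply because an XAI tool is used.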

However, it is foreseeable that developers, purchasers and/or users of an ML system may prioritise technical performance over explainability because they prioritise the accuracy of the outcome over understanding how it was achieved.14 For example, it is reasonable for a patient to be more concerned with the accuracy of a cancer diagnosis than with understanding how it was reached: whether or not their cancer diagnosis is accurate is of significant and direct use to their ability to make an informed decision about what to do next; being able to explain how the ML system worked to arrive at a diagnosis does not have similar use for them.15 Also, it is foreseeable that in some circumstances XAI tools may provide only limited technical benefit and may not be a commercial option.

A court may question whether it was reasonable for an ML system developer or purchaser to have struck a particular balance between the ML system’s technical performance and explainability where it had a choice to do so (e.g. it was not prescribed in legislation or in contract). Relevant factors may include:

  • which stakeholders are involved and their respective experience, skills and resources available – factors already relevant to determining whether a Braganza duty should be implied;
  • the nature and magnitude of any potential harm or benefits resulting from the decision which require explanation;
  • the relationship between performance and explainability – what is the cost/benefit of increasing or decreasing either?

However, whether or not the choice of a black box ML system was reasonable does not change the issues faced when trying to scrutinise the decision made. The decision-maker will still risk being unable to meet the burden of proof upon it to explain why and how it made a decision; if it cannot do so, it risks the court concluding that the decision had been made by simply “throwing darts at a dart board” (Hills v Niksun Inc [2016] EWCA Civ 115).

If the ML system is still a black box, could a court approach the problem by looking at how the ML system was intended to work, either at the point of design or of decision? Consider the Singaporean case of B2C2 Ltd v Quoine Pte Ltd [2019]. Whilst B2C2 concerned (amongst other things) the approach to unilateral mistake, rather than contractual discretion, the court had to consider ‘intention or knowledge […] underlying the mode of operation of a machine‘ – that is, why it did what it did.

In B2C2 the court held:

‘in circumstances where it is necessary to assess the state of mind of a person in a case where acts of deterministic computer programs are in issue, regard should be had to the state of mind of the programmer of the software of that program at the time the relevant part of the program was written.’

However, potential issues remain with such an approach.

First, the nature of ML. Certain AI systems, such as the one in B2C2, are ‘deterministic’, meaning that when given a particular input they will produce the same output each time. However, other ML systems may be ‘non-deterministic’ meaning that the same inputs may not always result in the same outputs.
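
The distinction can be made concrete with a short, purely illustrative sketch (the models and data are assumptions, not taken from B2C2 or the article): two training runs of the same stochastic model on the same data, without a fixed seed, may disagree on the same input, whereas fixing the seed restores deterministic behaviour.

# Hypothetical sketch of deterministic vs non-deterministic behaviour in ML training.
# Assumes scikit-learn and NumPy are installed; data and model are invented.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = (X[:, 0] - X[:, 3] > 0).astype(int)

x_new = [[0.1, -0.2, 0.05, 0.12]]   # the same input is given to every trained model below

# Two runs with unseeded (random) weight initialisation may yield different predictions
model_a = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300).fit(X, y)
model_b = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300).fit(X, y)
print(model_a.predict(x_new), model_b.predict(x_new))   # not guaranteed to match

# Fixing random_state makes training reproducible: identical runs give identical outputs
model_c = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0).fit(X, y)
model_d = MLPClassifier(hidden_layer_sizes=(8,), max_iter=300, random_state=0).fit(X, y)
print(model_c.predict(x_new) == model_d.predict(x_new))   # elementwise True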

Also, ML systems do not require all instructions to be explicitly programmed. The ML system may ‘learn’ throughout its lifecycle, for example because it analyses more data and identifies patterns with greater precision. As a result, how the ML system was intended to work may be far removed from how it actually did work.

Second, determining a party’s intention may be difficult. Documentation may not be available or may be insufficient. Technical instructions may be of little value in determining how a decision was intended to be made, for example if the instructions have not been updated. Also, the ML system may not maintain sufficient logs of how it worked.

Further, it may not be possible for one person (as in B2C2), or a few people, to explain how an ML system was intended to work. Instead, there may be many people involved in the ML system lifecycle (including third parties) who each had differing degrees of influence over the ML system and can only explain separate parts of how it was supposed to work.

Difficulties in scrutinising the decision-making process may instead place greater emphasis on the outcome – was “the result so outrageous that no reasonable decision-maker could have reached it”?

However, ML systems may identify patterns which would otherwise not have been identified by humans. This may be because of the vast amounts of data that ML systems are able to analyse, which a human simply could not. That may result in the ML system identifying patterns which ‘no reasonable decision-maker‘ could have identified when that decision-maker is human, but which are reasonable for an ML system to have identified.

These potential issues are important. Those procuring, developing and deploying ML systems to exercise contractual discretion will be concerned about being in a position where they cannot evidence or justify their decisions and the court concluding they, in effect, threw darts at a dart board.

What practical steps can organisations take to address the legal risks?

Organisations procuring, developing and deploying ML systems to assist with or exercise contractual discretion need to consider, amongst other things, the legal issue. For contractual discretion this can be summarised as:

  • were only relevant factors taken into consideration and given appropriate weight; and
  • was the outcome reasonable.

What that analysis looks like will vary on a case-by-case basis, but asking certain questions will help, including:

  • When are ML systems used to exercise contractual discretion?
  • What are the contractual (and other) requirements (explicit and implied) about how that discretion is exercised?
  • What evidence is available as to how decisions should be made? For example, what technical documentation, impact assessments, risk management and governance reports are available?
  • What evidence is available as to how decisions are actually made? Did the ML system produce event logs (see the sketch after this list)?
  • Who is in a position to explain how decisions should be and are made? Are any of those third parties and are they required to co-operate, if needed?
  • Are there factors in favour of choosing/designing an ML system, and/or using Explainable AI tools, which can help the decision-maker explain their decision?
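
For the event-log question above, a hedged sketch of what such a record might look like follows (the field names, values and JSON Lines format are illustrative assumptions, not a prescribed standard): each exercise of discretion is logged with its inputs, the model version and the output, so that the decision can be evidenced later.

# Hypothetical sketch of a decision audit log; everything here is an assumption.
import json
from datetime import datetime, timezone

def log_decision(model_version, inputs, output, explanation, path="decision_log.jsonl"):
    """Append one decision record to a JSON Lines audit file."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "inputs": inputs,
        "output": output,
        "explanation": explanation,   # e.g. feature contributions from an XAI tool
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Example usage with invented values
log_decision(
    model_version="discretion-model-1.3.0",
    inputs={"years_as_customer": 6, "missed_payments": 2},
    output={"benefit_granted": True},
    explanation={"years_as_customer": 0.7, "missed_payments": -0.3},
)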

Many of these are questions organisations are already asking. This may be because of existing regulations requiring risk management, such as in financial services. Additionally, it may be part of preparations to comply with future AI regulations, such as the EU AI Act, which, amongst other things, specifies a range of obligations for high-risk AI systems. In any event, these questions (and more) should be asked – and kept under review – wherever ML systems are being deployed, as part of good governance and risk management.

Tom Whittaker is a Senior Associate and solicitor advocate in Burges Salmon’s Dispute Resolution and Technology Teams.

Notes and references

1 https://www.bankofengland.co.uk/prudential-regulation/publication/2022/october/artificial-intelligence

2 https://www.gov.uk/government/publications/software-and-ai-as-a-medical-device-change-programme/software-and-ai-as-a-medical-device-change-programme-roadmap#wp-9-ai-rig-ai-rigour

3 Various regulations may also be applicable but they are not discussed here.

4 Braganza paragraph 24

5 Braganza v BP Shipping Limited and another [2015] UKSC 17

6 Braganza

7 UBS v Rose [2018] EWHC 3137 (Ch)

8 Watson v Watchfinder [2017] EWHC 1275 (Comm)

9 https://ico.org.uk/for-organisations-guide-to-data-protection/key-dp-themes/guidance-on-ai-and-data-protection/glossary/

10 see ‘POST Interpretable machine learning’

11 https://docs.aws.amazon.com/whitepapers/latest/model-explainability-aws-ai-ml/interpretability-versus-explainability.html. Also, see ‘POST Interpretable machine learning’: https://researchbriefings.files.parliament.uk/documents/POST-PN-0633/POST-PN-0633.pdf. Though see https://wired.co.uk/article/psychology-artificial-intelligence for a different view.

12 https://storage.googleapis.com/cloud-ai-whitepapers/AI%20%Explainability%20Whitepaper.pdf

13 see ‘POST Interpretable machine learning’

14 Assuming that there are no applicable laws or regulations which affect what is required. This is not to say that explainability is never required but instead that the balance between accuracy and explainability can vary depending on context and stakeholders.

15 In contrast to the patient example, a doctor or an NHS Trust may place increased importance on explainability to understand how their systems are working and to improve their other diagnostic processes.

SCL Weekly Wellbeing: The Cold Shower
https://www.scl.org/10901-scl-weekly-wellbeing-the-cold-shower/ – Thu, 07 May 2020

To help support its members, SCL is publishing a Weekly Wellbeing blog, offering insights and inspiration to help you be kinder to yourself and to support the wellbeing of others. #SCLweeklywellbeing

The Weekly Wellbeing blog has been specially created for SCL by Nick Watson and Gary Waters.

Unlike most Legal Technology entrepreneurs, Nick Watson comes from a development background with a history of developing large, bespoke projects for a variety of industries including the Law. Launching in April 2016, Nick co-founded Ruby Datum, a user experience-driven, pioneering Virtual Data Room company. He also has a passion for wellbeing and is working towards a vision of a more mindful legal industry.

Gary Waters is a respected coach and entrepreneur. Gary’s background includes building a #1 ranking hospitality business in one of the UK’s most competitive tourism markets in the South West of England. Gary decided to sell that business so that he could channel all his energy and focus into his passion, which is helping people through coaching. Gary specialises in helping clients develop an Empowered Mindset, which enables them to navigate life’s challenges on their own terms. He’s passionate about helping people create a life of purpose and fulfilment through Personal Coaching, Business Coaching and Consulting programmes.

For the first week, we discussed the importance of a morning routine. We then expanded on one of the key areas for growth, gratitude and the profound impact it can have on our lives.

This week is all about the cold shower. It may seem a bit far-fetched to isolate the cold shower to one article, but the benefits it produces are well worth the focus.

Firstly, it’s an easy, quick win. You don’t have to spend time making lists, reflecting or exercising. The cold shower only takes a minute, with the impact being rapidly noticeable. If you’re struggling to motivate yourself in other areas, this is a great place to start.

Why?
Cold Showers will make you feel great.

Endorphins are hormones that, to put it quite simply, make you feel great.

Clinical studies show that just 5 minutes of a cold shower can help relieve symptoms of depression and increase alertness, clarity and energy levels through the release of these endorphins.

Alertness
It probably goes without saying, but plunging yourself into a cold shower in the morning will awaken you from the most groggy of states, increasing alertness and, without a doubt, kickstarting your day in style.

Boost your immune system
A randomised clinical study found a 29% reduction in sickness absence among those who took cold showers, compared with a control group. Plus, it makes you feel great (perhaps the reason why). A clear, optimistic mindset often leads to a much healthier person. Stress is a great producer of illness.

Improved circulation
When you hit the cold shower, your body starts working to try and regulate its temperature. As your heart beats faster, your blood starts to circulate quicker. This will improve your circulation over time, if taken regularly.

Improved circulation aids with cell regeneration, such as muscle recovery or improved, smoother skin (as reported by many) when fresh, oxygenated blood flows to these areas.

Metabolism benefits
Cold showers cause increased activity of the sympathetic nervous system. In tandem with other efforts to change lifestyle habits, research shows that cold showers can increase metabolism and so help with weight loss. We still don’t know exactly why, but it works.

Reduced Pain
One study showed that cold water can produce effects similar to anaesthetics for pain relief. Nerve conduction slows and blood vessels constrict, reducing swelling and oedema. In turn, this reduces the rate at which nerves transmit pain signals to the brain.

How do I begin?
If you can manage just 10 seconds of a cold shower, it’s something to celebrate. We like to aim for at least 30 seconds. A minute at 20°C is what you should be aiming for; however, many people have reported further benefits from cold showers lasting as long as 5-10 minutes.

Next time you’re in the shower, turn it to cold and breathe slowly, drawing your focus to your chest as it expands and contracts.

Some people like to use the Wim Hof breathing method as an approach. Wim Hof is known as The Iceman and holds Guinness World Records for prolonged time spent under ice. The method is quite simple:

  • Get comfortable, relax
  • Breathe in fast and deep, don’t hold, let it go
  • Repeat 20 to 40 times
  • With your last breath, make it a big one and exhale, squeezing all the air out of your lungs
  • Hold for as long as you can while exhaled
  • When gasping for air, inhale deeply, hold for 10-15 seconds
  • Repeat if you wish, perhaps 2-3 times
  • Hop into the shower

Some people like to hold their breath in for longer too. It’s about finding what works best for you.

Just one final note
Stressed during the day? Struggling to break the cycle of anxiety or anger? Hop in a cold shower! You’ll be forced to shift focus into the present moment and often see things from an alternative perspective.

This does all come with the caveat that those with heart problems or medical issues should consult a doctor before taking cold showers, to be safe, particularly as the cold shower will increase your heart rate. **This article is not medical advice**

Studies:
https://www.sciencedirect.com/science/article/abs/pii/S030698770700566X
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5025014/
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4049052/  

Also see:
SCL Weekly Wellbeing: https://www.scl.org/blog/10886-scl-weekly-wellbeing
SCL Weekly Wellbeing: The Ultimate Life Hack: https://www.scl.org/blog/10892-scl-weekly-wellbeing-the-ultimate-life-hack

SCL Weekly Wellbeing is created by Nick Watson and Gary Waters. #SCLweeklywellbeing

Ada Lovelace Institute warns on possible loss of public trust in UK contact tracing app
https://www.scl.org/10898-ada-lovelace-institute-warns-on-possible-loss-of-public-trust-in-uk-contact-tracing-app/ – Tue, 05 May 2020

The Institute outlines suggested provisos for the UK app, public health initiatives and UK government strategy more broadly.

The Ada Lovelace Institute has issued a paper which considers what milestones any UK contact tracing app would have to meet before roll-out to ensure the app is safe, fair and equitable. The paper builds on the rapid evidence review Exit through the App Store? and articulates the technical and practical limitations that would have to be overcome, and the policy and scrutiny measures that would have to be in place, before a contact tracing app is rolled out in the UK. It argues that if the government launches an ineffective or untrustworthy app, it will not be adopted, it is unlikely to be effective and it could even be actively harmful to people’s health and trust.

The Institute says that, to date, the government has not answered the concerns raised in the Institute’s initial report, nor is the Institute changing its view that there is not yet the evidence and justification for an imminent national roll-out. It currently sees no evidence for a scenario where the app will be able to trace contacts to a high degree of accuracy and command the high levels of use and adherence needed for it to be a central pillar of the government’s public health strategy, to be relied on to keep people safe.

A key theme running through all the Institute’s suggested steps is the need for greater transparency and honesty with the public about the ethical concerns and technical limitations. 

The first suggested steps are that the government, with support of Parliament, must build the legislative and policy structures to underpin and surround the app, including: publicly setting success criteria and outcomes; and articulating the broader strategy and policy framework.

It must also implement primary legislation and oversight mechanisms. Legal and technical sunset clauses must be built into the design of new powers and technologies. The government must advance primary legislation regulating the processing of data by both public and private sector actors in the use of technology to transition from the crisis. It must further encourage privacy-by-design in technical implementations and must choose privacy-preserving protocols to underscore technical measures. 

Legislation must limit scope creep, by setting out precise purposes for data processing, who has access to data and for what purpose; require the deletion of data after specified periods, with exemptions from deletion of anonymised data for use in research; and prevent discrimination.

Legislation must also prohibit the development of third-party contact tracing apps and uses of contact tracing data. The government must also ensure that the ICO has the appropriate remit and capacity to oversee data use, and to require the performance, publication and approval by the ICO of a Data Protection Impact Assessment.

In addition, the efficacy of the app must be demonstrated, and the government must be transparent about the technical measures under consideration in advance of their deployment. The paper highlights that the risk is not just that the public lose faith in this app to support the health crisis; it could also undermine trust in public health initiatives and government strategy more broadly.

SCL Tech Law Essentials: From Blockchain to Smart Contracts **New Module**
https://www.scl.org/10897-scl-tech-law-essentials-from-blockchain-to-smart-contracts-new-module/ – Sun, 03 May 2020

Please click on the links at the bottom of the page to view the session and download a PDF version of the slides

Speaker:
Mark Weston, 
Partner, Head of Commercial (London) and IT Law, Hill Dickinson LLP

This session focuses on:

  • Distributed Ledger Technology (DLT)
  • Where blockchain came from
  • The four main characteristics of blockchain
  • The benefits of blockchain
  • Use cases
  • Legal issues
  • Bitcoin
  • Smart Contracts and their relationship to blockchain

SCL Weekly Wellbeing: The Ultimate Life Hack
https://www.scl.org/10892-scl-weekly-wellbeing-the-ultimate-life-hack/ – Thu, 30 Apr 2020

To help support its members, SCL is publishing a Weekly Wellbeing blog, offering insights and inspiration to help you be kinder to yourself and to support the wellbeing of others. #SCLweeklywellbeing

The Weekly Wellbeing blog has been specially created for SCL by Nick Watson and Gary Waters.

Unlike most Legal Technology entrepreneurs, Nick Watson comes from a development background with a history of developing large, bespoke projects for a variety of industries including the Law. Launching in April 2016, Nick co-founded Ruby Datum, a user experience-driven, pioneering Virtual Data Room company. He also has a passion for wellbeing and is working towards a vision of a more mindful legal industry.

Gary Waters is a respected coach and entrepreneur. Gary’s background includes building a #1 ranking hospitality business in one of the UK’s most competitive tourism markets in the South West of England. Gary decided to sell that business so that he could channel all his energy and focus into his passion, which is helping people through coaching. Gary specialises in helping clients develop an Empowered Mindset, which enables them to navigate life’s challenges on their own terms. He’s passionate about helping people create a life of purpose and fulfilment through Personal Coaching, Business Coaching and Consulting programmes.

Did you do your morning routine last week? We’re not judging you if you didn’t!

For something in your life to change, you need to change something. It can be uncomfortable to start making adjustments for something you don’t yet see working, but like the rush of winning a case or closing a deal, it becomes more addictive as you start to build reference points for it working. We may as well get hooked on something positive, right?

This week we want to dive deeper into a significant part of the morning routine we introduced to you last week: gratitude.

Gratitude is one of the highest energetic emotional states that we can access as humans and it’s one of the fastest ways of shifting your perspective and raising your energy.

Most people only experience gratitude as a fleeting state that we drift in and out of temporarily. The goal of this practice is to make it a more prominent feeling in your life.

When we access a grateful state we activate our parasympathetic nervous system which is responsible for helping us to relax. As a result of lowering our sympathetic nervous system response, our hearts and brain get into rhythm and balance with one another (heart-brain coherence). The opposite of this is mass incoherence, or stress – we can probably all relate here!

We are able to measure the rise in energy and frequency that we radiate in a state of gratitude, as reputable studies (see the links at the end of this article) have shown many times. Some unverified studies even claim that accessing a state of gratitude for just ten minutes a day can improve immune system performance by up to 50%.

Putting the stats aside, let’s take a look at ourselves. You cannot feel negative during a genuine state of gratitude. This is something easily practised, and knowing you cannot feel negative in this true state of gratitude can be empowering, even life-changing. It’s one of the fastest ways to change how you are feeling and to lift yourself from feeling low and down to feeling high, whatever the day brings. This is the ultimate life hack that will accelerate the way you’re able to deal with pressure.

The key with gratitude is to not just think about what you are grateful for but to truly get into a feeling state of gratitude and make that a default way through which you look at your world.

ACTION: Create a gratitude list

1. Hand-write it, because doing so makes it something physical.

2. Write out at least ten things you are grateful for in your life.

Don’t worry if the pen moves a little slowly to begin with. You have been conditioned to focus on what’s wrong or what’s lacking in your world rather than what’s positive.

Now you have your list, read through it and try to feel it. Like creating a movie in your mind, we’re going to engage with a process called mental rehearsal.

Using just one item on your list (something present in your life today, ideally involving a person), create a mind movie sharing the best moments you’ve had – create some upcoming ones if you want. Really make it real (even add a soundtrack). If it’s a person, perhaps you share a song together. Create something like a music video you might watch on YouTube, personal to you.

ACTION: Close your eyes, and do that now. Think of all the most amazing memories you have together. Make the movie as real as you can in your mind. Continue for a couple of minutes, or the length of a short song. Perhaps even put your earphones in and actually play the song.

Now ask yourself, what if you never got to add another scene to that movie? What if that was the last time you ever got to experience that memory? Sit with that for a few seconds and really contemplate it.

Doesn’t that truly make you feel like you’re so lucky to have these memories in your life, and to be able to continue creating them? The feeling of this gratitude is so high and powerful that whatever challenges show up during your day, they seem really insignificant in comparison. If COVID-19 has taught us anything, it’s that the comfort zone can be swept from under our feet in an instant. Some of the things we are most grateful for can be taken away so easily, and it’s with this empowered knowledge that we can truly live in the moment and remain unaffected by the curveballs life continues to throw at us daily.

There are so many people in this world that would do anything to live the memory you’ve just experienced, perhaps even with someone they have lost. Why do we only choose to feel that way when the memory has faded for a while? We can access this state in an instant and feel a sense of true gratitude every day.

We have been conditioned to focus on what is lacking in our world and to chase what’s missing, thinking it will make us feel fulfilled. But if we just took a few minutes a day to stop chasing our tails and focus on what we do have, we would realise that we already have what we want and need.

Just to recap, make your gratitude list and:

1. Read it. Create a mind movie.

2. Do this first thing when you wake up and last thing before bed.

3. Keep it on your bedside table, or somewhere you can see it. If you see it, you will do it.

4. Practice this daily for 2-3 weeks because consistency and repetition is the key to building a new habit.

5. Add to your list daily. It keeps you constantly scanning your life for things to be grateful for.

6. If you ever feel tested during the day, practise this exercise and put the significance of the challenge at hand back into perspective.

Studies:

https://www.researchgate.net/publication/288932385_The_effects_of_gratitude_expression_on_neural_activity

https://www.frontiersin.org/articles/10.3389/fnhum.2017.00599/full

https://www.health.harvard.edu/healthbeat/giving-thanks-can-make-you-happier  

Announcing – 2nd phase of ‘Remote Courts Worldwide’
https://www.scl.org/10893-announcing-2nd-phase-of-remote-courts-worldwide/ – Thu, 30 Apr 2020

The next stage of the global initiative to help public court services cope with coronavirus.

Are remote courts working well or badly?

Background – Remote Courts Worldwide (www.remotecourts.org) was launched five weeks ago to help the global community of justice workers (judges, lawyers, court officials, litigants, court technologists) to share news about the video and audio hearings which are now being conducted instead of traditional court hearings in physical buildings. As law courts have closed, this website has rapidly become the definitive worldwide source of information about remote courts in more than 40 countries.

Announcing 2nd phase of Remote Courts Worldwide – the next phase of the initiative will focus on inviting and presenting feedback from around the world – from court users, lawyers and judges – about how remote courts are working in practice: what is working well and what is not. This is intended to help court services to improve their existing remote courts and to inform policymakers when they come to consider the long-term implications of the current changes for the future of their courts.

Insights so far – analysis of the reports in Remote Courts Worldwide suggests that:

  1. Technology has enabled courts to stay open – access to justice is being maintained around the world during the crisis by the wide deployment of video hearings and audio hearings. 
  2. The technologies being used are widely accessible to all – for example, Zoom and Skype, along with conventional telephone conferencing.
  3. There are variations in formality – contrast a laid-back Chilean arraignment hearing with the insistence of a senior Chinese judge that a sense of ritual must be maintained.
  4. Judges are taking a robust approach – a Court of Protection case in England went ahead on Skype, because the judge felt that it would have been extremely risky to convene conventionally. An Australian judge refused an application for an adjournment, rejecting the applicants’ argument that a fair trial could not in that case be held by video.
  5. The work of the courts has become more transparent – the pandemic has accelerated a trend of making proceedings more widely available to the wider public via the Web. 

Note – Remote Courts Worldwide is a collaborative project, involving the Society for Computers and Law, the UK LawTech Delivery Panel, and Her Majesty’s Courts & Tribunals Service. It is being led by Professor Richard Susskind, President of the Society for Computers and Law, and an expert in online courts – “We are delighted by the popularity of our site and thank contributors from around the globe. This second phase of our service is vital – to find out what is working well and what is not. Remote courts are here to stay and we must work hard, in light of concrete experience, to improve their performance.”

For more information please email enquiries@remotecourts.org or visit www.remotecourts.org

This week’s Techlaw News Round-Up
https://www.scl.org/10896-this-week-s-techlaw-news-round-up/ – Thu, 30 Apr 2020

Joint Statement on Digital Contact Tracing published

Alessandra Pierucci, Chair of the Committee of Convention 108, and Jean-Philippe Walter, Data Protection Commissioner of the Council of Europe, have published a joint statement on digital contact tracing. The statement says that because contact tracing tools use personal data, it is crucial to ensure that the measures and related data processing are necessary and proportionate in relation to the legitimate purpose pursued and that they reflect, at all stages, a fair balance between all interests concerned, and the rights and freedoms at stake. The statement considers, among other things, the effectiveness of the tools, trust, impact assessment and privacy by design, purpose specification, sensitivity, quality and minimisation of data, automated decision-making, de-identification, security, architecture, interoperability, transparency, and oversight and audit.

IPO publishes feasibility study on AI-assisted patent prior art searching

The Intellectual Property Office has published a research study. The IPO sought to understand the feasibility, technical complexities and effectiveness of using AI solutions to improve operational processes of registering IP rights. In particular, the IPO was interested in a proof of concept for an AI-powered prior art search/due diligence check that could form part of the online patent filing and patent examiner prior art searching processes.  The study concludes that it is not possible to provide a fully automated solution as part of the patent application filing process.

RUSI publishes paper on AI and UK national security

The Royal United Services Institute for Defence and Security Studies was commissioned by GCHQ to conduct an independent research study into the use of AI for national security purposes. The aim was to establish an independent evidence base to inform future policy development regarding national security uses of AI. The findings show that AI offers numerous opportunities for the UK national security community to improve efficiency and effectiveness of existing processes. AI methods can rapidly derive insights from large, disparate datasets and identify connections that would otherwise go unnoticed by human operators. However, in the context of national security and the powers given to UK intelligence agencies, use of AI could give rise to additional privacy and human rights considerations which would need to be assessed within the existing legal and regulatory framework. For this reason, enhanced policy and guidance is needed to ensure the privacy and human rights implications of national security uses of AI are reviewed on an ongoing basis as new analysis methods are applied to data. The research highlights three ways in which intelligence agencies could seek to deploy AI: automation of administrative organisational processes; cybersecurity; and intelligence analysis, including natural language processing and audio-visual analysis, filtering and triage and behavioural analytics.

CAA announces delay to the applicability date of EU UAS Regulation on drones

Because of the disruption caused by the COVID-19 outbreak, the CAA has announced that it will postpone the introduction of the new EU UAS Regulation within the UK, so the earliest that it will come into force will be 1 November 2020. The CAA says that it is aware that the European Commission is also considering a postponement, but the decision on this may not be made for some time. The CAA has decided to make the decision to delay now to provide some certainty. It is possible that the EC may subsequently delay the applicability beyond 1 November, in which case the CAA will update its deadline. The CAA is updating its guidance to reflect the change.

National surveillance camera strategy for England and Wales updated

The Surveillance Camera Commissioner has published its updated national surveillance camera strategy for England and Wales. It contains information about the Surveillance Camera Commissioner’s objectives and plans, including in relation to technology and data protection.

Black Lion Marketing Ltd fined £171,000 for making unsolicited direct marketing calls

The ICO has issued Black Lion Marketing Limited with a monetary penalty under section 55A of the Data Protection Act 1998. This was in relation to a breach of regulations 21 and 24 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR).  240,576 calls were made to subscribers who had been registered with the Telephone Preference Service. There were 233 complaints made regarding unsolicited direct marketing calls and no evidence of consent from subscribers. When considering the level of fine, the ICO also took into account the evidence that BLM Ltd used fictitious trading names in the course of its direct marketing; an action which would be likely to contravene the requirements of Regulation 24 of the PECR. 

The European Parliament issues study on enforcement and co-operation between member states, e-Commerce and the future Digital Services Act

The European Parliament has issued a study on Enforcement and co-operation between member states, E-Commerce and the future Digital Services Act. The study presents an overview of possible options for an effective model of enforcement for a future Digital Services Act. Four key areas of regulatory design are emphasised: the failure of self-regulation in relation to platforms; the importance of correct regulatory framing; the necessity of focusing on the internal operations of platforms; and that the scope of a DSA should be limited but include robust transparency and enforcement measures. A range of enforcement strategies are then evaluated across a suite of Digital Single Market legislation, alongside barriers to member states’ cooperation and effective enforcement. The paper sets out several options for enforcement and concludes with a recommendation of a specific enforcement model for a new DSA.

Ofcom publishes plan of work 2020/21 and extends fibre networks consultation

Ofcom has published its plan of work for 2020/21 – setting out its priorities and work programme for the new financial year. Ofcom has adapted its plan to take account of the exceptional circumstances that have unfolded since its consultation. As well as supporting people and businesses through the current challenges, the strategic themes of its programme of work are: better broadband and mobile; fairness for customers; supporting UK broadcasting; ensuring online communications work for people and businesses; enabling strong, secure networks; sustaining the universal postal service; continuing to innovate in regulation and data to help people and businesses; increasing diversity and inclusion; and support through the EU exit transition period and continued international relationships. Ofcom’s focus is on delivering the work it had planned in full over the course of this year. However it will monitor the evolving coronavirus situation closely and keep plans flexible. To reflect this, it will publish an updated plan in September, as well as quarterly updates on its progress against the plan. 

Separately, it has extended the deadline for responses to its proposals for promoting investment and competition in fibre networks to 22 May 2020.
