Trends in information technology law: looking ahead to 2017

As we go into 2017, the incipient ‘technologisation’ or ‘IT-isation’ of our lives, if you’ll excuse the terms, is gathering pace and becoming much plainer to see. The first few days of December 2016 alone have seen a number of significant developments:

  • Michigan, the home of the US motor industry, became the first state to enact a comprehensive set of state regulations for autonomous vehicles;[1]
  • Amazon announced its ‘Just Walk Out’ technology-enabled physical shop that does away with the check-out;[2] and
  • Google’s DeepMind subsidiary open-sourced its core AI research platform.[3]

Whilst there is no easily observable overarching theme to unite these developments, they all centre on big data and can be grouped into building blocks around the cloud, AI and digital, which in combination constitute what has become known as the fourth industrial revolution:

Figure 1: building blocks of the fourth industrial revolution

  • the cloud: computer processing is migrating to the data centre, the cloud’s engine room, fuelled by growth in social and mobile and generating an explosion in big data volumes;
  • artificial intelligence (AI): big data is enabling rapid advances in machine learning. Combining machine learning with billions of internet-connected sensors enables machine perception – advances in implantable and wearable devices, personal digital assistants, the Internet of Things, connected homes and smart cities. Add actuation – the ability to navigate the environment – to static machine learning and perception and you get to machine control – autonomous vehicles, domestic robots and drones;
  • digital: other related digital developments on the cusp of adoption at scale as we head into 2017 include augmented reality, gene editing, 3D manufacturing, and blockchain and smart contracts.

AI and deep learning are worth calling out for particular attention. In research consultancy Gartner’s ‘Top 10 Strategic Technology Trends for 2017’ survey,[4] Gartner Vice-President and Fellow David Cearley said: “over the next 10 years, virtually every app, application and service will incorporate some level of AI. This will form a long-term trend that will continually evolve and expand the application of AI and machine learning for apps and services.”

Deep learning, a machine learning technique, is emerging as AI’s ‘killer app’ enabler. It works by using large training datasets to teach AI software to recognise patterns in images, sounds and other input data; as training progresses, the software’s error rate falls and its predictions become increasingly accurate. Deep learning is the core technology behind the current rapid uptake of AI in a wide variety of business sectors, from due diligence and e-discovery by law firms to the evolution of autonomous vehicles. By way of illustration, Microsoft in October 2016 released an updated version of the Microsoft Cognitive Toolkit, its deep learning acceleration software, and in the accompanying blog[5] gave an example of how the toolkit used training sets to improve speech recognition accuracy. This is reproduced at Figure 2, which shows how prediction error was rapidly halved in the pre-training phases and then reduced by a further quarter in the more extended fine-tuning phases.

Figure 2 – Microsoft Cognitive Toolkit: achieving increasing speech recognition accuracy through iterations of training set use
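To make the training loop concrete, the short Python sketch below shows the same principle at toy scale. It is illustrative only: it bears no relation to the Microsoft Cognitive Toolkit and uses a simple statistical model rather than a deep neural network, but prediction error on a synthetic training set falls as the software iterates over the data, just as Figure 2 shows for speech recognition.

```python
# Minimal, illustrative sketch only: a toy model trained by gradient descent on
# synthetic data. It is not deep learning and has nothing to do with the
# Microsoft Cognitive Toolkit; it simply shows prediction error falling as the
# software iterates over a training set.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: 1,000 examples, 2 features, binary label.
X = rng.normal(size=(1000, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

w = np.zeros(2)      # model weights, learned from the training data
b = 0.0              # bias term
learning_rate = 0.5

for epoch in range(1, 21):
    # Forward pass: predicted probability of the positive class (logistic model).
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    # Error rate on the training set before this update step.
    error_rate = np.mean((p > 0.5) != y)
    if epoch == 1 or epoch % 5 == 0:
        print(f"epoch {epoch:2d}: error rate {error_rate:.3f}")
    # Gradient of the cross-entropy loss, averaged over the training set.
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    # Update step: adjust the model to reduce its error on the training data.
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b
```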

What will all these advances mean for IT lawyers in 2017?

AI, Robotic Process Automation (RPA) and smart contracts projects

AI projects will come across our desks increasingly frequently next year. Here are two initial ‘dos’ and ‘don’ts’ for AI contracts:

  • don’t be blinded by the glare of the new and do work from first principles: in the AI era, legal change in the areas of intellectual property (IP), contract, regulation and tort will continue to evolve from established principles.
  • don’t anthropomorphise AI and do think of AI as a thing: software and data. AI is personal property, not a person (the ‘I, Robot’ fallacy); AI systems aren’t ‘agents’ in any legal sense (the ‘agency fallacy’); and AI platforms don’t possess separate legal personality in their own right (the ‘entity fallacy’).

Increasing use across business of AI systems, software and algorithms means that more innovations, software and processes capable of IP protection will be generated and implemented by computers. The underlying copyright and patent law in this area is unclear, so do make sure the agreement addresses the parties’ expectations.

The emergence of Robotic Process Automation (RPA) – a software technique for digitising and automating previously labour-intensive business processes – has significant contractual implications. RPA software will interact with a growing range of third-party systems, and it is important for contracting parties to ensure that the customer’s use case matches the contract’s licence scope: disputes around software ‘over-deployment’ – where the licensor says that the licensee is using software beyond what the licence permits – will continue to increase in 2017. As ever, this is an area where ‘contract is king’, and licence permissioning and scope issues will need to be reviewed and fine-tuned.

We heard a lot in 2016 about the blockchain – in the words of The Economist, ‘allowing strangers to make fiddle-proof records of who owns what’[6] – and how it paves the way for smart contracts: software code representing chains of self-executing agreements that the computer can make, verify, execute and enforce automatically under conditions set in advance. Smart contracts promise lower costs, latency and error rates and are likely to enable new operating models in areas as diverse as financial clearing and settlement, insurance claims processing and electronic patient records. For the IT lawyer, it will be important to distinguish between what the software itself does (how it makes and operates the smart contracts as set out in the specification) and how the agreements between developer, smart contract platform operator and customer regulate the use of that software.
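To show the idea of a self-executing agreement in miniature, here is a purely illustrative Python sketch: code that checks a condition agreed in advance and then executes the transfer automatically. Real smart contracts run on blockchain platforms and are written in languages such as Solidity; the EscrowContract class and its fields are invented for this example.

```python
# Illustrative sketch only: a Python stand-in for the idea of a smart contract.
# Real smart contracts are deployed on blockchain platforms (e.g. Ethereum);
# the EscrowContract class and its fields are invented for this example.
from dataclasses import dataclass


@dataclass
class EscrowContract:
    """A self-executing agreement: payment releases automatically once the
    condition agreed in advance by the parties is verified by the code."""
    buyer: str
    seller: str
    amount: float
    goods_delivered: bool = False   # the pre-agreed condition the code checks
    settled: bool = False

    def confirm_delivery(self) -> None:
        # In a real deployment the confirmation might come from an oracle or
        # an IoT sensor feed rather than a manual call.
        self.goods_delivered = True
        self._execute()

    def _execute(self) -> None:
        # 'Enforcement' step: no human intervention once the condition is met.
        if self.goods_delivered and not self.settled:
            print(f"Releasing {self.amount} from {self.buyer} to {self.seller}")
            self.settled = True


contract = EscrowContract(buyer="Alice", seller="Bob", amount=100.0)
contract.confirm_delivery()   # condition met, so payment executes automatically
```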

Data law

Legal aspects of data – data rights, security, protection and sovereignty – will become an even more central part of IT law practice in 2017. In AI, input and output training and operating data sets will need to be checked for consistency with inbound and outbound licensing terms. We will hear more about ‘linked data’ – standards and practices for connecting the data generated by the internet – as organisations vie to mine and refine this raw material of the fourth industrial revolution.

Information security (IS), the combination of legal, IT and business requirements, policies and procedures designed to prevent unintended access to and use of data assets, will continue to rise up the corporate agenda in 2017. As IT moves to the heart of business administration, regulated sectors like banking and insurance continue to see growth in regulator-mandated requirements around all aspects of IT and data use, including audit, privacy, data breaches, data retention, archiving, encryption, IS standards, business continuity, disaster recovery, and flow-down to subcontractors. As a result, the policies that banks require IT vendors to comply with have proliferated, may run to hundreds of pages, and need careful scrutiny. RegTech – automating aspects of compliance and reporting to manage the regulatory burden – is emerging as a significant area of FinTech in its own right, with the UK Financial Conduct Authority (FCA) hosting a landmark TechSprint in November 2016[7] at which products like JWG’s RegDelta[8] financial regulation management product were showcased.

Early indications from the incoming Trump Administration suggest that regulated-sector compliance burdens in the USA may start to be relaxed from 2017 and, if so, this will affect IT vendor contracts in those sectors. In the UK, the added complication of Brexit means that managing IT in the financial sector, as in other sectors, is likely to get more complex before it gets easier, so 2017 will combine increasing regulatory burdens with greater uncertainty.

Assuming the UK serves its Article 50 two-year notice to leave the EU in March 2017, the EU General Data Protection Regulation (GDPR), which enters into direct effect in the UK on 25 May 2018, will be part of English law for almost a year before exit, meaning organisations will need to continue apace with their GDPR preparations next year.

More generally, the convergence of Brexit and the onset of the fourth industrial revolution looks like unfortunate timing: the need for high quality policy making in AI-related areas will be critical, but much of the civil service’s resource will inevitably be taken up by Brexit legal issues around the repeal of the European Communities Act 1972 and what to do with the mass of statute and case law that has come out of Brussels and Luxembourg since 1973. Stakeholders in IT-related policy debates will need to be persistent, attentive and vocal in 2017.

Regulation

Regulators around the world will grapple in 2017 with how to address AI. What happens when an autonomous car, bus and ambulance collide? When separate smart contract systems incorrectly record a negotiated loan agreement? When AIs and robots used in a complex supply chain, in construction, in smart city transportation or in the home, fail to work properly? The emerging approach involves two steps:

  • to establish advisory centres of AI excellence. The UK House of Commons Science and Technology Select Committee in October 2016 recommended the setting up of a Standing Commission on AI and a Leadership Council on Robotics and Autonomous Systems;[9] and
  • to adapt existing regulatory frameworks to cater for AI where practical.

In some regulated areas, like UK legal services for example, AI is in general terms already covered by the current regulatory framework. Here, the practical response is to adapt the engagement contract between law firm and client to set and detail expectations and outcomes.

In other areas, however, like autonomous vehicles, new regulation will need to be grafted onto the current structure. Here, as AI development moves quickly from the human driver ‘hands on/eyes on’ world to a fully ‘hands off/eyes off’ world, the UK’s approach has been to recognise that policy is aiming at a moving target and to break the regulatory task down into bite-sized chunks. So, the UK:

  • has clarified that the rules already permit testing of autonomous vehicles (2015);
  • has identified issues around insurance, vehicle construction and Highway Code adaptation as blockers that need to be removed (2016); and
  • is anticipating further waves of evidence-based reform as new technology develops.

As an aside, big data raises particular challenges for insurance, traditionally based on ‘top-down’ actuarial assessment and mutualisation of risk. Big data means that the premium charged to the individual insured can be calculated much more precisely ‘bottom up’, based on individual driving habits, domestic security, genetic predisposition to illness and so on. Aside from the data protection implications, this poses serious questions around the potential for discrimination and the risk that those assessed as high risk will find insurance cover difficult or impossible to obtain. These issues will continue to play out in 2017, not only in retail general insurance but increasingly in professional and business insurance as big data and AI enable risk to be calibrated ever more precisely.
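The contrast between the two pricing approaches can be shown with a short, purely illustrative Python sketch; the functions, risk factors and weightings below are invented for the example and have no actuarial basis.

```python
# Purely illustrative sketch: contrasting a 'top-down' mutualised premium with
# a 'bottom-up' premium calibrated from an individual's own data. The risk
# factors, weightings and figures are invented and have no actuarial basis.

def mutualised_premium(pool_expected_claims: float, pool_size: int) -> float:
    # Traditional approach: everyone in the pool shares the expected cost.
    return pool_expected_claims / pool_size

def individual_premium(base: float, risk_factors: dict) -> float:
    # Big-data approach: the price reflects this insured's own measured risk.
    loading = sum(risk_factors.values())
    return base * (1 + loading)

# Pooled price: 1,000 policyholders sharing expected claims of 500,000.
print(mutualised_premium(pool_expected_claims=500_000, pool_size=1_000))  # 500.0

# Individually calibrated price for a higher-risk driver and householder.
print(individual_premium(
    base=300.0,
    risk_factors={              # e.g. telematics and home-sensor derived scores
        "harsh_braking": 0.40,
        "night_driving": 0.25,
        "no_home_alarm": 0.10,
    },
))  # 525.0: a higher-risk individual pays more than the pooled average
```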

Competition policy continues to adapt to the online and AI-enabled world. With over 20% of UK retail sales taking place online in 2016, the UK Competition and Markets Authority (CMA) in early November 2016 launched a press campaign in the run-up to ‘Black Friday’ aimed at warning online sellers against price fixing.[10] More fundamentally, in a 2016 book called ‘Virtual Competition’,[11] the authors raise searching questions about the extent to which big data and AI in online selling platforms erode basic principles of competition law through instantaneous comparative price adjustments, concentration of platforms and data flows, and behavioural discrimination (determining the highest price the consumer will pay).

Tort law

Outside the area of regulation, and as a final example of the adaptability of established legal principles to technological change, it is perhaps the common law area of tort that is likely to see the most important developments influenced by the fourth industrial revolution. Negligence under English law centres on the common law duty ‘to be careful’ where, in the famous words of the UK House of Lords, ‘the categories of negligence are never closed’,[12] and it is hard to imagine that the common law duty of care will not arise in relation to many, or most, kinds of AI.

Richard Kemp, Kemp IT Law
December 2016
richard.kemp@kempitlaw.com

[1] ‘Michigan passes new laws for driverless car trials’, Financial Times, 9 December 2016, https://www.ft.com/content/58324438-be39-11e6-8b45-b8b81dd5d080

[2] ‘Amazon’s no-checkout store threatens the death of the cashier – Disruptive change equated to supermarket equivalent of self-driving car’, Financial Times, 9 December 2016, https://www.ft.com/content/e89f5c3e-bd55-11e6-8b45-b8b81dd5d080

[3] ‘Open-sourcing DeepMind Lab’, 3 December 2016, https://deepmind.com/blog/open-sourcing-deepmind-lab/

[4] ‘Gartner Identifies the Top 10 Strategic Technology Trends for 2017’, 18 October 2016, http://www.gartner.com/newsroom/id/3482617

[5] ‘Microsoft releases beta of Microsoft Cognitive Toolkit for deep learning advances’, 25 October 2016, http://blogs.microsoft.com/next/2016/10/25/microsoft-releases-beta-microsoft-cognitive-toolkit-deep-learning-advances/#sm.0000lt0pxmj5dey2sue1f5pvp13wh

[6] The Economist, 5–11 November 2016, page 10.

[7] https://www.fca.org.uk/firms/project-innovate-innovation-hub/regtech

[8] https://regdelta.com/

[9] http://www.publications.parliament.uk/pa/cm201617/cmselect/cmsctech/145/14502.htm?utm_source=145&utm_medium=fullbullet&utm_campaign=modulereports

[10] ‘CMA warns online sellers about price-fixing’, 7 November 2016, https://www.gov.uk/government/news/cma-warns-online-sellers-about-price-fixing

[11] ‘Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy’, Ariel Ezrachi and Maurice E Stucke, Harvard University Press, 2016

[12] Lord Macmillan in Donoghue v Stevenson [1932] A.C. 562 at p. 619
