Examining patent applications relating to artificial intelligence (AI) inventions: The Guidance - GOV.UK




This publication is licensed under the terms of the Open Government Licence v3.0 except where otherwise stated. To view this licence, visit nationalarchives.gov.uk/doc/open-government-licence/version/3 or write to the Information Policy Team, The National Archives, Kew, London TW9 4DU, or email: psi@nationalarchives.gov.uk.

Where we have identified any third party copyright information you will need to obtain permission from the copyright holders concerned.

This publication is available at https://www.gov.uk/government/publications/examining-patent-applications-relating-to-artificial-intelligence-ai-inventions/examining-patent-applications-relating-to-artificial-intelligence-ai-inventions-the-guidance

1. These guidelines set out the practice within the Intellectual Property Office (IPO) for the examination of patent applications for inventions relating to artificial intelligence (AI).

2. The relevant legislation is the Patents Act 1977, as amended by subsequent legislation, and the Patents Rules 2007. The interpretation of this legislation has been informed by case law in the UK courts. It also reflects the fact that judicial notice must be taken of international conventions (such as the European Patent Convention) and of decisions and opinions made under these conventions by the appropriate bodies. Accordingly, decisions made by the UK courts relating to the Patents Act 1977 are binding on our practice, whilst the European Patent Office (EPO) Board of Appeal decisions are considered strongly persuasive. Decisions of the UK Courts made under previous legislation may be persuasive, dependent on the extent to which that aspect of patent law was changed by the 1977 Act.

3. The government response to the call for views on artificial intelligence and intellectual property committed the IPO to publish these enhanced guidelines for the examination of patent applications for AI inventions in respect of the exclusions to patentability contained in the Act. These guidelines also touch briefly on the requirement for sufficiency of disclosure concerning AI inventions.

4. We have also provided an accompanying document that contains various scenarios involving AI inventions. Each scenario briefly describes an AI invention and includes a non-binding assessment of the invention in relation to the exclusions to patentability.

5. Together these guidelines and the accompanying scenarios should be read as a supplement to the comprehensive guidance concerning patent practice at the IPO set out in the Manual of Patent Practice.

6. Any comments or questions arising from these guidelines should be addressed to:

Phil Thorpe
Intellectual Property Office
Concept House
Cardiff Road
Newport
South Wales
NP10 8QQ

Nigel Hanley
Intellectual Property Office
Concept House
Cardiff Road
Newport
South Wales
NP10 8QQ

7. These guidelines may be summarised as follows:

in the UK, patents are available for AI inventions in all fields of technology

AI inventions are typically computer-implemented and may rely on mathematical methods and computer programs in some way. UK patent law excludes from patent protection inventions relating solely to a mathematical method “as such” and/or a program for a computer “as such”. However, these exclusions are applied as a matter of “substance not form” by considering the task or process an AI invention performs when it runs

when the task or process performed by an AI invention reveals a technical contribution to the known art, the AI invention is not excluded and is patent-eligible

an AI invention is likely to make a technical contribution if, when it runs on a computer, its instructions:

- embody a technical process which exists outside the computer; or
- contribute to the solution of a technical problem lying outside the computer; or
- solve a technical problem lying within the computer itself; or
- define a new way of operating the computer in a technical sense

as may happen when one or more of the five “signposts” [footnote 1] point to allowability

AI inventions are not excluded if they are claimed in hardware-only form, i.e. if they do not rely on program instructions or a programmable device for their implementation

an AI invention is excluded from patent protection only if it does not reveal a technical contribution. An AI invention is unlikely to make a technical contribution if its task or process:

- relates solely to items listed as being excluded (e.g. a business method) and there is no more to it; or
- relates solely to processing or manipulating information or data and there is no more to it; or
- has the effect of merely being a better or well-written program for a conventional computer and there is no more to it

the conditions set out above apply whether the invention is categorised as “applied AI” or “core AI” or it relates to training an AI invention in some way

patent protection is available for training datasets when they are used in inventions which reveal a technical contribution. However, claims to datasets characterised solely by the information content of the dataset are likely excluded as presentation of information as such

the sufficiency of disclosure of an AI invention or dataset is assessed, like any other invention, according to the principles set out in Eli Lilly v Human Genome Sciences [2008] RPC 2

8. There is no single agreed-upon definition of artificial intelligence. The government has defined AI as:

technologies with the ability to perform tasks that would otherwise require human intelligence, such as visual perception, speech recognition, and language translation.

9. In the absence of a universally accepted definition of AI, it may be helpful to think about AI inventions in terms of the simplified conceptual model illustrated below.

10. Artificial intelligence inventions are typically computer-implemented inventions based on computational models and algorithms such as neural networks, genetic algorithms, machine learning algorithms, and other similar approaches. These models and algorithms are essentially mathematical in nature. They operate upon input data and provide various forms of output data.

11. When an AI invention is “computer-implemented” we mean that it is implemented, at least in part, by a computer program that is executable on a computer, an arrangement of computers, a computer network, or some other programmable device. A “computer program” includes a sequence of computer-executable instructions.
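By way of illustration only (this sketch forms no part of the guidance), the conceptual model described above — a computational model that is mathematical in nature, operating on input data to produce output data — can be reduced to a few lines of code. The function, weights, and values below are entirely hypothetical:

```python
# Illustrative sketch only: a single artificial neuron, the simplest
# building block of the neural-network models mentioned above.
# All names and values here are hypothetical.

def neuron(inputs, weights, bias):
    """Compute a weighted sum of the inputs plus a bias, then apply
    a non-linear activation (here, a simple step threshold)."""
    weighted_sum = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 if weighted_sum > 0 else 0.0

# Input data in, output data out: the computation is purely mathematical.
output = neuron(inputs=[0.5, 0.8], weights=[0.4, -0.2], bias=0.1)
```

It is precisely because such models reduce to mathematics executed as program instructions that the “mathematical method” and “program for a computer” exclusions, discussed below, arise.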

12. Artificial intelligence inventions find application across all fields of technology, and patent protection might be sought for many aspects of them. For the purposes of these guidelines, we group these aspects into two broad categories: “applied AI” and “core AI”.

13. An “applied AI” invention applies AI techniques to a field other than the field of AI. An applied AI invention may be used either to:

(a) perform specific processes or solve specific problems lying outside the computer on which it runs, e.g. a specific process other than one relating to the internal operation of the computer itself; or

(b) make the computer work better generally, i.e. to perform processes or solve problems concerned with the internal workings of the computer itself.

An applied AI invention may be implemented as an application program for execution on a computing system or as part of the internal workings of a computing system.

14. A “core AI” invention does not specify any application or use-case for its AI features. Instead, a core AI invention defines an advance in the field of AI itself (e.g. an improved AI model, algorithm, or mathematical method). The IPO treats a computer-implemented core AI invention as an application program that carries out the claimed tasks of the core AI.

15. Regardless of whether they might be categorised as “applied AI” or “core AI”, certain types of AI invention require training. Some AI models or algorithms are trained using specific training datasets.

16. As an alternative to computer implementation, AI inventions may be implemented in a hardware-only form, such as special-purpose electronic circuitry, that does not rely on executable instructions.  

17. Section 1(1) of the Act sets out four conditions that an invention must satisfy for the grant of a valid patent: A patent may be granted only for an invention in respect of which the following conditions are satisfied, that is to say-

(a) the invention is new;
(b) it involves an inventive step;
(c) it is capable of industrial application;
(d) the grant of a patent for it is not excluded by subsections (2) and (3) or section 4A below;

and references in this Act to a patentable invention shall be construed accordingly.

18. These four conditions apply to all inventions in all fields of technology. Thus, a patent may be granted for an AI invention provided it is new, involves an inventive step, is capable of industrial application, and is not excluded from patent protection. These guidelines are concerned with the requirement of s.1(1)(d) that an AI invention must not relate to so-called excluded subject matter, as set out in s.1(2).

19. Section 1(2) [footnote 2] of the Act declares that certain things are not inventions for the purposes of the Act: It is hereby declared that the following (among other things) are not inventions for the purposes of this Act, that is to say, anything which consists of-

(a) a discovery, scientific theory or mathematical method;
(b) a literary, dramatic, musical or artistic work or any other aesthetic creation whatsoever;
(c) a scheme, rule or method for performing a mental act, playing a game or doing business, or a program for a computer;
(d) the presentation of information;

but the foregoing provision shall prevent anything from being treated as an invention for the purposes of this Act only to the extent that a patent or application for a patent relates to that thing as such.

20. AI inventions typically rely on mathematical models and algorithms for their implementation. This means the “mathematical method” and “program for a computer” exclusions must be considered carefully in the examination of AI inventions. The “program for a computer” exclusion is interpreted to encompass both the set of instructions comprising a program and those instructions when embodied in hardware form, so it is not possible to avoid the exclusion by claiming a program in hardware form (e.g. when stored in a computer-readable medium or a programmed computer) [footnote 3]. The words “as such” appearing at the end of section 1(2) qualify the extent of its exclusions. For example, it is only a program for a computer “as such” that is excluded from patent protection. A computer-implemented invention is not excluded if it relates to something more than a program for a computer “as such”.

21. The patentability of computer-implemented inventions is the subject of an extensive body of decided case law handed down by the UK courts. According to binding precedent of the UK courts, a computer-implemented invention avoids exclusion under section 1(2) if it makes a technical contribution to the state of the art [footnote 4], but a contribution consisting purely of excluded subject matter does not count as a technical contribution. [footnote 5]

22. Whether an invention makes such a relevant technical contribution is decided by following the approach approved by the UK Court of Appeal in Aerotel/Macrossan: [footnote 6]

(1) properly construe the claim;
(2) identify the actual contribution;
(3) ask whether it falls solely within the excluded subject matter;
(4) check whether the actual or alleged contribution is actually technical in nature.

23. An explanation of the four steps of the Aerotel approach is found in the Manual of Patent Practice. In Aerotel, the Court gave guidance as to how one identifies the actual or alleged contribution for the purposes of steps (2), (3) and (4):

[43] The second step – identify the contribution - is said to be more problematical. How do you assess the contribution? Mr Birss submits the test is workable - it is an exercise in judgment probably involving the problem said to be solved, how the invention works, what its advantages are. What has the inventor really added to human knowledge perhaps best sums up the exercise. The formulation involves looking at substance not form – which is surely what the legislator intended.

[44] Mr Birss added the words “or alleged contribution” in his formulation of the second step. That will do at the application stage - where the Office must generally perforce accept what the inventor says is his contribution. It cannot actually be conclusive, however. If an inventor claims a computer when programmed with his new program, it will not assist him if he alleges wrongly that he has invented the computer itself, even if he specifies all the detailed elements of a computer in his claim. In the end the test must be what contribution has actually been made, not what the inventor says he has made.

24. Further, it is the claim as a whole that must be considered when assessing the actual or alleged contribution that the invention has made [footnote 7].

25. The answer to this critical question is the subject of a lengthy line of authorities of the UK Court of Appeal, including its judgments in Merrill Lynch [footnote 8], Gale’s Application [footnote 9], Aerotel/Macrossan [footnote 10], Symbian [footnote 11], HTC v Apple [footnote 12] and Lantana [footnote 13]. The Court of Appeal has observed that this question is inherently difficult and that the boundary line between what is, and what is not, a technical contribution is imprecise. It has concluded that there is neither a comprehensive nor a precise test for determining when a computer-implemented invention makes a technical contribution. However, paragraphs [45] to [49] of the judgment in HTC v Apple provide a helpful starting point:

[45] How then is it to be determined whether an invention has made a technical contribution to the art? A number of points emerge from the decision in Symbian and the earlier authorities to which it refers. First, it is not possible to define a clear rule to determine whether or not a program is excluded, and each case must be determined on its own facts bearing in mind the guidance given by the Court of Appeal in Merrill Lynch and Gale and by the Boards of Appeal in Case T 0208/84 Vicom Systems Inc [1987] 2 EPOR 74, [1987] OJ EPO 14, Case T 06/83 IBM Corporation/Data processing network [1990] OJ EPO 5, [1990] EPOR 91 and Case T 115/85 IBM Corporation/Computer-related invention [1990] EPOR 107.

[46] Second, the fact that improvements are made to the software programmed into the computer rather than hardware forming part of the computer does not make a difference. As I have said, the analysis must be carried out as a matter of substance not form.

[47] Third, the exclusions operate cumulatively. So, for example, the invention in Gale related to a new way of calculating a square root of a number with the aid of a computer and Mr Gale sought to claim it as a ROM in which his program was stored. This was not permissible. The incorporation of the program in a ROM did not alter its nature: it was still a computer program (excluded matter) incorporating a mathematical method (also excluded matter). So also the invention in Macrossan related to a way of making company formation documents and Mr Macrossan sought to claim it as a method using a data processing system. This was not permissible either: it was a computer program (excluded matter) for carrying out a method for doing business (also excluded matter).

[48] Fourth, it follows that it is helpful to ask: what does the invention contribute to the art as a matter of practical reality over and above the fact that it relates to a program for a computer? If the only contribution lies in excluded matter then it is not patentable.

[49] Fifth, and conversely, it is also helpful to consider whether the invention may be regarded as solving a problem which is essentially technical, and that is so whether that problem lies inside or outside the computer. An invention which solves a technical problem within the computer will have a relevant technical effect in that it will make the computer, as a computer, an improved device, for example by increasing its speed. An invention which solves a technical problem outside the computer will also have a relevant technical effect, for example by controlling an improved technical process. In either case it will not be excluded by Art 52 as relating to a computer program as such.

26. The list of decided cases mentioned in paragraph [45] of HTC v Apple had been approved by the Court’s earlier judgment in Symbian:

In deciding whether the Application reveals a “technical” contribution, it seems to us that the most reliable guidance is to be found in the Board’s analysis in Vicom and the two IBM Corp. decisions T 0006/83 and T 0115/85 and in what this court said in Merrill Lynch and Gale, which should be followed unless there is a very strong reason not to do so. [footnote 18]

27. This list of cases (along with the judgment in Symbian itself and the decision of the EPO Board of Appeal in Hitachi T 0258/03) was studied by the High Court in AT&T/Cvon [footnote 19], where the Court tried to distil the essence of what they reveal into five so-called “signposts to a relevant technical effect”. [footnote 20] Subsequently, the fourth of these signposts was expressed less restrictively in HTC v Apple [footnote 21] so that the five signposts read as follows:

i) whether the claimed technical effect has a technical effect on a process which is carried on outside the computer;

ii) whether the claimed technical effect operates at the level of the architecture of the computer; that is to say whether the effect is produced irrespective of the data being processed or the applications being run;

iii) whether the claimed technical effect results in the computer being made to operate in a new way;

iv) whether the program makes the computer a better computer in the sense of running more efficiently and effectively as a computer;

v) whether the perceived problem is overcome by the claimed invention as opposed to merely being circumvented.

28. In AT&T/Cvon, the High Court also made clear that:

If there is a technical effect in this sense, it is still necessary to consider whether the claimed technical effect lies solely in excluded matter. [footnote 22]

Thus, in practice, the signposts reflect both steps (3) and (4) of the Aerotel approach, albeit in reverse order.

29. In HTC v Apple, the Court emphasised that although these signposts are useful in answering steps (3) and (4) of the Aerotel approach this does not necessarily mean they will be determinative in every case. [footnote 23] The Court went on to explain that the signposts should not be treated as prescriptive conditions, and an invention is not automatically patentable if only one of the signposts is found to exist. [footnote 24] This means that, as well as considering the “signposts”, these guidelines will also consider how decisions in the list recommended in Symbian show that AI inventions may reveal a technical contribution.

30. The principle of “substance not form” reflects the fact that the exclusion of “a program for a computer” is taken to cover both the instructions comprising a program and those instructions embodied in hardware form, as discussed earlier. It means all categories of claim should be considered in the same way when assessing whether their contribution is technical at steps (2), (3), and (4) of Aerotel. For example, claims to a computer program, a programmed computer, and a computer-implemented method are each assessed by considering what task [footnote 25] or process [footnote 26] it is that the instructions comprising the program, programmed computer, or method perform when they are run. The requirement to consider the claimed task or process of a computer-implemented invention in this way was explained by the High Court in Halliburton [footnote 27]:

[32] Thus when confronted by an invention which is implemented in computer software, the mere fact that it works that way does not normally answer the question of patentability. The question is decided by considering what task it is that the program (or the programmed computer) actually performs. A computer programmed to perform a task which makes a contribution to the art which is technical in nature, is a patentable invention and may be claimed as such. Indeed (see Astron Clinica [2008] RPC 14) in those circumstances the patentee is perfectly entitled to claim the computer program itself.

[33] If the task the system performs itself falls within the excluded matter and there is no more to it, then the invention is not patentable (see Symbian paragraph 53 above). Clear examples are from the cases involving computers programmed to operate a method of doing business, such as a securities trading system or a method of setting up a company (Merrill Lynch and Macrossan). Inventions of that kind are held not to be patentable, but it is important to see why. They are more than just a computer program as such. For example, they self-evidently perform a task which has real world consequences. As Fox LJ said in Merrill Lynch (p569 at line 27), a data processing system operating to produce a novel technical result would normally be patentable. However, that is not the end of the analysis. He continued: “however it cannot be patentable if the result itself is a prohibited item” (i.e. a method of doing business). When the result or task is itself a prohibited item, the application fails.

[34] The reasoning in Merrill Lynch means that the computer implemented invention claimed there would not have been excluded from patentability if it were not for the combined effect of two exclusions in s1(2) - computer programs and (in that case) business methods. The cases in which patents have been refused almost always involve the interplay between at least two exclusions …

[35] The business method cases can be tricky to analyse by just asking whether the invention has a technical effect or makes a technical contribution. The reason is that computers are self-evidently technical in nature. Thus, when a business method is implemented on a computer, the patentee has a rich vein of arguments to deploy in seeking to contend that his invention gives rise to a technical effect or makes a technical contribution. For example, the computer is said to be a faster, more efficient computerized book keeper than before and surely, says the patentee, that is a technical effect or technical advance. And so it is, in a way, but the law has resolutely sought to hold the line at excluding such things from patents. That means that some apparently technical effects do not always count. So a computer programmed to be a better computer is patentable (Symbian) but as Fox LJ pointed out in relation to the business method exclusion in Merrill Lynch, the fact that the method of doing business may be an improvement on previous methods is immaterial because the business method exclusion is generic.

[36] The Aerotel approach is a useful way of cutting through the cases like Merrill Lynch, Macrossan and Gale in which more than one exclusion is engaged. Take a patent claim consisting of a claim to a computer programmed to perform a business method. What has the inventor contributed? If the answer is a computer program and method of doing business and there is nothing more present, then the contribution falls solely within the excluded subject matter. It can be seen not to be patentable at step 3, before one gets bogged down in the argument about whether a book keeping system running more efficiently on a computer is a technical effect. Following Aerotel the question has answered itself.

[37] The “better computer” cases - of which Symbian is a paradigm example - have always been tricky however one approaches this area. The task the program is performing is defined in such a way that everything is going on inside the computer. The task being carried out does not represent something specific and external to the computer and so in a sense there is nothing else going on than the running of a computer program. But when the program solves a technical problem relating to the running of computers generally, one can see that there is scope for a patent. Making computers work better is not excluded by s1(2).

[38] What if the task performed by the program represents something specific and external to the computer and does not fall within one of the excluded areas? Although it is clear that that is not the end of the enquiry, in my judgment that circumstance is likely to indicate that the invention is patentable. Put in other language, when the task carried out by the computer program is not itself something within the excluded categories then it is likely that the technical contribution has been revealed and thus the invention is patentable. I emphasise the word “likely” rather than “necessarily” because there are no doubt cases in which the task carried out is not within the excluded areas but nevertheless there is no technical contribution at all.

[39] So in Merrill Lynch and Macrossan the computer programs were unpatentable because the task the program performed was a business method. In Gale the program was unpatentable because the task it performed was a mathematical method (albeit the reasoning was the other way round, starting from the mathematical method rather than the computer program aspect).

31. IPO practice is to examine whether an AI invention makes a contribution that is technical in nature by considering what task or process it performs when run on a computer. In the following sections we consider how the law and practice set out above applies to the various aspects of AI inventions for which patent protection might be considered.

32. In paragraph [38] of Halliburton, quoted above, the High Court observed that:

… if the task performed by the program represents something specific and external to the computer and does not fall within one of the excluded areas … that circumstance is likely to indicate that the invention is patentable.

33. In this context a computer-implemented invention makes a contribution that is technical in nature if its instructions:

embody (e.g. carry out or control) a technical process which exists outside the computer; [footnote 28] and/or

contribute to the solution of a technical problem lying outside the computer [footnote 29]

34. If an AI invention satisfies either (or both) of these conditions, then it likely reveals a technical contribution and is not excluded under s.1(2). AI inventions meeting either (or both) these conditions likely produce the sort of technical effects indicated by AT&T signposts (i) and (v).

35. The paradigm computer-implemented invention embodying a technical process which exists outside a computer is found in the decision of the EPO Board of Appeal in Vicom. [footnote 30] The invention in Vicom concerned a method of image convolution. (We note that image convolution operations are an inherent feature of convolutional neural networks (CNNs) and some computer vision tasks.) The inventive algorithm involved repeated selection and application of convolution filters (each having a smaller dimension compared to that used by a known method) using a mathematical error-minimisation technique. When the inventive image convolution was run on a conventional general-purpose computer it required far fewer calculations to give approximately the same result as the known method. Hence, the invention produced image convolutions with an increase in processing speed compared to the known method when its instructions were executed on a conventional computer. As explained by the High Court in AT&T/Cvon, the Board held that the inventive algorithm was more than a mathematical method “as such” because:

… the mathematical method which underlay the invention was being used in a technical process which was carried out on a physical entity by technical means. [footnote 31]

And it was more than a program for a computer “as such” because:

… what was claimed was not the computer program at all, but the process of manipulating the images. That process was a technical process and hence made a technical contribution. [footnote 32]

The computer-implemented invention in Vicom made a technical contribution because its instructions represented a specific technical process (image processing) lying outside (being external to) the computer itself. The essential reasoning of Vicom is reflected in signpost (i).
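For readers unfamiliar with the image-processing technique at issue in Vicom, the following sketch (illustrative only; it is not the patented algorithm, and the names, filter, and values are hypothetical) shows what an image convolution is: a small filter is slid over an image, and each output pixel is a weighted sum of the pixels under the filter:

```python
# Illustrative sketch only: the kind of image convolution at issue in
# Vicom. Not the patented algorithm; all names and values hypothetical.

def convolve2d(image, kernel):
    """'Valid' 2D convolution of a 2D list `image` with a smaller
    2D list `kernel`: slide the kernel over the image and take the
    weighted sum of the pixels beneath it at each position."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[r + i][c + j] * kernel[i][j]
                for i in range(kh) for j in range(kw))
            for c in range(out_w)
        ]
        for r in range(out_h)
    ]

# A 3x3 image smoothed by a 2x2 averaging filter gives a 2x2 result.
smoothed = convolve2d(
    [[1, 2, 3], [4, 5, 6], [7, 8, 9]],
    [[0.25, 0.25], [0.25, 0.25]],
)
```

The Vicom insight was computational: repeatedly applying small filters can approximate the effect of one large filter with far fewer multiplications, which is also why convolution with small kernels underpins modern CNNs.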

36. More recently, in Halliburton [footnote 33] the High Court considered the patentability of computer simulations through the lens of the modern-day Aerotel approach. The invention in Halliburton was concerned with improving the design of drill bits for drilling oil wells (and the like) to increase their drilling efficiency and operational life. [footnote 34] The claimed invention included iteratively modelling multiple designs of the drill bit (including its cutting elements) using finite element methods, and simulating drilling of earth formations using the drill designs to determine an optimised design for the drill bit [footnote 35]. The Court held that the invention was more than a computer program “as such” because it was:

… a method of designing a drill bit. Such methods are not excluded from patentability by Art 52/s1(2) and the contribution does not fall solely within the excluded territory. [footnote 36]

And it was more than a mathematical method “as such” because:

the data on which the mathematics is performed has been specified in the claim in such a way as to represent something concrete (a drill bit design etc.). [footnote 37]

Finally, the court confirmed the contribution was technical in nature because:

Designing drill bits is obviously a highly technical process, capable of being applied industrially … The detailed problems to be solved with wear and ability to cut rock and so on are technical problems with technical solutions. Accordingly finding a better way of designing drill bits in general is a technical problem. This invention is a better way of carrying that out. Moreover the detailed way in which this method works - the use of finite element analysis - is also highly technical.

Hence, Halliburton made a technical contribution because, as well as being a technical process, its instructions solved a technical problem (with drilling efficiency and operational lifespan) lying outside the computer. Halliburton indicates the presence of signposts (i) and (v).

37. Vicom and Halliburton are similar because the data processed by the computer represented a physical entity external to the computer, i.e. an image or a drill bit design. However, it is important to note that a computer-implemented invention may, more generally, embody a technical process or contribute to the solution of a technical problem lying outside a computer, even if the data being processed does not represent a physical entity.

38. For example, Protecting Kids the World Over (PKTWO)’s Application [footnote 38] concerned a system for monitoring electronic communications data (e.g. e-mail communications) for particular words and phrases to ensure that users (e.g. children) were not exposed to inappropriate content or language. The High Court held that the improved speed and reliability of the algorithm used for sampling and analysing expressions found in the electronic communications was reflected in an improved speed and reliability of an alarm notification for alerting a user to inappropriate communication. [footnote 39] The Court held that, when viewing the claim as a whole, providing the alarm in this way was a relevant technical process. Thus, the Court held:

The effect here, viewed as a whole, is an improved monitoring of the content of electronic communications. The monitoring is said to be technically superior to that produced by the prior art. That seems to me to have the necessary characteristics of a technical contribution outside the computer itself. [footnote 40]

Accordingly, the Court held that the contribution made by the invention

… does not reside wholly within the computer program as such exclusion. I think that conclusion is in accordance with the AT&T signposts. In particular I would say that the invention solves a technical problem lying outside the computer, namely how to improve upon the inappropriate communication alarm provided by the prior art. [footnote 41]

Thus, PKTWO solved a technical problem lying outside the computer, indicating the presence of signposts (i) and (v).
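Purely as an illustrative sketch (the judgment does not disclose the patented sampling and aggregation algorithm; `WATCHLIST` and `scan_message` are invented names), the basic monitor-and-alert idea might look like:

```python
import re

# Hypothetical watch-list of expressions; a real system would use a far
# richer model of inappropriate content than a flat word set.
WATCHLIST = {"badword", "meetup"}

def scan_message(text: str, threshold: int = 1) -> bool:
    """Return True (i.e. raise an alarm) if the message contains at
    least `threshold` watch-listed expressions."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = sum(1 for word in words if word in WATCHLIST)
    return hits >= threshold
```

The contribution found relevant in PKTWO lay not in this trivial matching step but in the improved speed and reliability with which the alarm was generated.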

39. The analysis presented above does not mean that every process (or solution of a problem) lying outside a computer reveals a technical contribution. In paragraph [33] of Halliburton, quoted above, the Court observed that:

If the task the system performs itself falls within the excluded matter and there is no more to it, then the invention is not patentable … Clear examples are from the cases involving computers programmed to operate a method of doing business, such as a securities trading system or a method of setting up a company (Merrill Lynch and Macrossan) … When the result or task is itself a prohibited item, the application fails.

40. Thus, if a computer-implemented invention (or AI invention) makes a contribution consisting purely of excluded matter, and there is no more to it, then it does not count as a technical contribution and falls to be excluded under s.1(2). A clear example is found in Merrill Lynch which related to a data processing system for making a trading market in securities. The Court acknowledged that the invention had the real-world effect (i.e. outside the computer) of producing an improved trading system. However, this did not count as a technical contribution because it consisted of nothing more than a method of doing business as such. Similarly, in Macrossan, the invention related to a method for producing the documents for use in the formation of a corporate entity such as a company. The court held that the task of the invention was “for the very business itself, the business of advising upon and creating an appropriate company”, [footnote 42] which consisted solely of a method of doing business “as such”.[footnote 43]

41. Certain tasks or processes performed by a computer-implemented invention are not regarded as being outside a computer or technical in nature. Any effect which is no more than one produced by merely running a program (such as the mere manipulation of data) does not count as a technical contribution because it falls solely within the program exclusion.[footnote 44] For example, in Autonomy [footnote 45] one aspect of the claimed invention involved automatically analysing the text in an active window of a computer and generating a list of links related to that content. [footnote 46] This aspect was held to be a paradigm example of what is caught by the exclusion to a program for a computer “as such”:

[40] In my judgment … automatic text analysis, comparison and results generation is a paradigm example of a case in which the contribution falls squarely within excluded matter, i.e. a program for a computer. The claimed contribution, so far as the first element is involved does not exist independently of whether it is implemented by a computer. On the contrary, it depends on a computer processing or displaying information in an active window, and on a search program to analyse it and to compare and generate results … The only effect produced by the invention is an effect caused merely by the running of the program, which consists of the manipulation of data. It is in short a claim to a better search program.

The invention in Autonomy (automatic text analysis) would not produce a technical effect external to a computer in accordance with signpost (i).

42. An AI invention is likely to reveal a technical contribution if its instructions:

embody or perform a technical process which exists outside a computer; or

contribute to the solution of a technical problem lying outside the computer;

as may be the case when signposts (i) and/or (v) are relevant.

43. A recent example is BL O/296/21 (Imagination Technologies). A combination of a general-purpose computer and a deep neural network (DNN) accelerator was found to provide a technical effect of processing image data for computer vision problems more efficiently, thereby solving a technical problem. Signposts (i) and (v) pointed to patentability.

44. Several examples of AI inventions performing external processes and/or solving external problems are illustrated in the scenarios, for example:

Scenario 1 - ANPR system for recognising a vehicle registration number

Scenario 2 - Monitoring a gas supply system for faults

Scenario 3 - Analysing and classifying movement from motion sensor data

Scenario 4 - Detecting cavitation in a pumping system

Scenario 5 - Controlling a fuel injector in a combustion engine

Scenario 6 - Measuring percentage of blood leaving a heart

45. However, an AI invention is unlikely to reveal a technical contribution, and is likely to be excluded from patent protection, if:

its task, process, or result relates to items listed as being excluded under section 1(2) (e.g. a business method) and there is no more to it; or

its task or process relates to processing information or data and there is no more to it

46. The scenarios illustrate several examples of AI inventions that are not patentable for these reasons, for example:

Scenario 7 - Automated financial instrument trading

Scenario 8 - Analysing patient health records

Scenario 9 - Identifying junk e-mail using a trained AI classifier

47. Paragraph [37] of Halliburton explains that making computers work better is not excluded by s1(2):

The “better computer” cases - of which Symbian is paradigm example - have always been tricky however one approaches this area. The task the program is performing is defined in such a way that everything is going on inside the computer. The task being carried out does not represent something specific and external to the computer and so in a sense there is nothing else going on than the running of a computer program. But when the program solves a technical problem relating to the running of computers generally, one can see that there is scope for a patent. Making computers work better is not excluded by s1(2).

48. In this context, decided case law shows a computer-implemented invention makes a technical contribution if its instructions:

solve a technical problem lying within a computer [footnote 47]; and/or

define a new way of operating a computer in a technical sense [footnote 48]

49. If a computer-implemented invention (or an AI invention) satisfies either (or both) of these conditions, then it likely reveals a technical contribution and is not excluded matter. Any invention meeting either (or both) of these conditions will likely produce the sort of technical effects indicated by one or more of AT&T signposts (ii), (iii), (iv), and (v).

50. The paradigm computer-implemented invention solving a technical problem lying within a computer is Symbian. It concerned the programming of a dynamic linked library (DLL) for storing functions common to the applications running on the computer’s operating system. [footnote 49] In certain circumstances, adding functions to the DLL meant that existing applications using the DLL were unable to link to the added functions correctly, causing a malfunction. The inventive program had an “extension part” for the DLL which ensured that any application could select and link correctly to the desired functions in the DLL. The invention was held to be a “program which makes the computer operate on other programs faster than prior art operating systems enabled it to do by virtue of the claimed features”. [footnote 50] This solved a technical problem lying within the computer itself “because it has the knock-on effect of the computer working better as a matter of practical reality.” [footnote 51] Thus, Symbian indicates the presence of signposts (iv) and (v) (and, arguably, signpost (ii) since the effect was achieved irrespective of the nature of the data being processed and the applications being run).
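As a toy model only (`ExtensibleDLL` is an invented name and this is not the actual claimed mechanism), the idea of an extension part that preserves existing links can be sketched as:

```python
class ExtensibleDLL:
    """Toy model of linking by ordinal with an extension part.

    Ordinals in the base part are fixed when the library is built; new
    functions are only ever appended to the extension part, so links
    already held by existing applications never shift.
    """

    def __init__(self, base_functions):
        self.base = list(base_functions)  # ordinals fixed at build time
        self.extension = []               # later additions go here

    def add_function(self, function):
        """Add a function without disturbing existing ordinals."""
        self.extension.append(function)
        return len(self.base) + len(self.extension) - 1  # its new ordinal

    def link(self, ordinal):
        """Resolve an ordinal to a function."""
        table = self.base + self.extension
        return table[ordinal]

dll = ExtensibleDLL(["open", "close"])   # strings stand in for functions
new_ordinal = dll.add_function("flush")  # existing links to 0 and 1 survive
```

The design choice being illustrated is the one the court regarded as technical: additions are isolated so that previously linked applications continue to work correctly.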

51. Symbian approved two IBM Corp. decisions of the EPO Board of Appeal, T 0006/83 and T 0115/85, which are two further examples of programs solving technical problems concerned with the internal workings of a computer itself. As helpfully summarised in AT&T (see [21] to [25] & [31]), the inventions in both IBM Corp. decisions provided technical effects in a computer system which operated irrespective of the nature of the data being processed or the applications being run. Their essential reasoning is reflected in signpost (ii).

52. In marked contrast to Symbian and the two IBM Corp. decisions, Gale’s Application is the paradigm computer program that does not solve a technical problem with the internal workings of a computer. The invention was a new way of calculating square roots which was sympathetic to the operation of the underlying architecture of the computer. For example, prior art methods of calculating square roots were said to rely on binary division operations. The problem with these prior methods was said to be that, conventionally, division operations were not directly provided within binary systems, so they had to be implemented by combining various binary “add”, “subtract”, “test” and “shift” functions. The inventive method of calculating square roots eschewed division operations and was instead implemented using multiplication operations that are inherently easier (faster) to implement using the registers of a conventional general-purpose computer, e.g. using binary “shift” operations.
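For illustration only, one well-known division-free routine computes an integer square root digit by digit using shifts, additions and comparisons. The judgment does not set out Gale’s actual instructions, so `isqrt` below is a generic example of the same general idea (tailoring arithmetic to operations a CPU performs cheaply), not the claimed method:

```python
def isqrt(n: int) -> int:
    """Integer square root by the binary digit-by-digit method,
    using only shifts, additions, subtractions and comparisons."""
    if n < 0:
        raise ValueError("n must be non-negative")
    if n == 0:
        return 0
    bit = 1 << ((n.bit_length() - 1) & ~1)  # highest power of 4 <= n
    result = 0
    while bit:
        if n >= result + bit:
            n -= result + bit
            result = (result >> 1) + bit
        else:
            result >>= 1
        bit >>= 2
    return result
```

For example, `isqrt(144)` returns `12`, and `isqrt(10)` returns `3` (the integer part of the root).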

53. Yet the Court of Appeal concluded the invention in Gale was no more than a program for a computer as such because its instructions did “not solve a “technical” problem lying within the computer.” The Court held that the instructions did no more than “prescribe for the cpu in a conventional computer a different [i.e. new] set of calculations from those normally prescribed when the user wants a square root.” Thus, the Court also held that the instructions did not “define a new way of operating the computer in a technical sense” (a point to which we will return shortly). So, the Court decided that the effect produced in Gale did not count as a technical contribution because it was no more than one produced by merely running the new program. [footnote 52] In other words, the effect of the invention in Gale extended no further than it being just a better or well-written program which is excluded as a program for a computer as such under section 1(2). Viewed through the modern-day signposts, the invention in Gale would not indicate any effect that is “technical” in the sense of signposts (ii), (iii), (iv), or (v).

54. As we have just seen, Gale’s Application was refused (in part) because its program for calculating square roots did “not define a new way of operating the computer in a technical sense”. This principle was distilled into the wording of signpost (iii) which asks:

whether the claimed technical effect results in the computer being made to operate in a new way.

55. Thus, for a technical contribution to be found following signpost (iii), a computer-implemented invention should cause the computer to operate in a new way (i.e. it must be a new program) and it must produce some sort of technical effect that is not caught by the program exclusion. As the Aerotel judgment explained in its discussion of Gale, a “technical effect which is no more than the running of the program is not a relevant effect” (and is caught by the program exclusion). While the new program for calculating square roots in Gale made the computer operate in a new way in one sense (i.e. it was a new program), it did not operate the computer in any relevant technical sense that avoided the program exclusion. Gale did not operate a computer in a new way beyond just being a better or well-written program for a conventional computer.

56. We note the wording of signpost (iii) is agnostic as to whether the invention works at the application level. This contrasts with the wording of signpost (ii), for example. Any technical effect produced by an application program is unlikely to meet signpost (ii) because it would hardly be “a technical effect produced irrespective of the data being processed or the applications being run”. In contrast, the wording of signpost (iii) is not so restrictive. Signpost (iii) is engaged by a computer-implemented invention (which may be an application program) provided it is a new way of operating a computer in a relevant technical sense. This flows from the judgment in Gale itself. While Gale was ultimately held not to define a new way of operating the computer in a new technical sense, the Court at least entertained the possibility that it might have done so under different circumstances. Similarly, in Hitachi T 0258/03, which was the basis for signpost (v), the Board held that method steps of a modified business scheme (embodied in an application program) circumvented a technical problem instead of solving it, so did not contribute to technical character. It may be inferred from Hitachi that if, under different circumstances, the method steps comprising the steps of the business scheme had solved a technical problem (instead of circumventing it) then they would have contributed to technical character.

57. Unfortunately, there are few (if any) explicit examples of the positive application of signpost (iii) in decided case law. The IPO acknowledges that this means there is uncertainty about the sorts of effect a computer-implemented invention (or AI invention) must reveal to be a positive example of what is meant by “the computer being made to operate in a new way” in a relevant technical sense. The judgment in Gemstar v Virgin suggests that suitable technical effects are those which involve making a computer “work better, faster or differently in that sort of performance sense”. [footnote 53]

58. We consider that a computer may be regarded as operating in a new way (in a technical sense) if it includes a functional unit (e.g. a bus, GPU or special-purpose accelerator, etc.) which is programmed to operate in a new way that is technical in nature, namely if the functional unit has a technical effect going beyond the mere execution of its new program (e.g. it is better, faster, or works differently in a performance sense). When this is true, we consider the newly programmed functional unit might be said to have a knock-on effect of operating the computer in a new way in a technical sense. Alternatively, it might be said that, by virtue of its new programming, the functional unit creates a new arrangement of hardware within the computer.

59. For example, we note the observation in Lenovo that the facts of Aerotel could be an example of signpost (iii). [footnote 54] On the facts of Aerotel, the new “special exchange” was a new application program, running on conventional computer hardware, which verified whether there was sufficient credit in a user’s pre-payment account and, if so, made a telephone call to the user’s intended recipient. [footnote 55] The incorporation of the new “special exchange” into the otherwise conventional computer system was held to create “a new physical combination of hardware” [footnote 56] within the computer system.

60. Another possible example of signpost (iii) might be BL O/066/06 (ARM Limited) in which the speed and accuracy of a compiler was improved by using performance data from a non-invasive trace unit (monitoring the execution of a compiled program) to control the workings of the compiler. The Hearing Officer found the contribution over the known art was technical and could not be regarded as merely a computer program as such. This outcome may be contrasted with the refusal in BL O/173/08 (Intel Corporation) where the invention was a vectorising (parallelising) compiler whose sole contribution lay in improving a program, like Gale.

61. An AI invention is likely to reveal a technical contribution if its instructions:

solve a technical problem lying within a computer; or

define a new way of operating a computer in a technical sense

62. The scenarios illustrate examples of AI inventions which solve technical problems lying within a computer system in accordance with one or more of signposts (ii), (iv) or (v), for example:

Scenario 10 - Cache management using a neural network

Scenario 11 - Continuous user authentication

Scenario 12 - Virtual keyboard with predictive text entry

63. Scenarios illustrating examples of AI inventions causing new operation of a computer in a relevant technical sense, according to signpost (iii), are for example:

Scenario 16 - Processing neural network on a heterogeneous computing platform

Scenario 17 - Special purpose processing unit for machine learning computations

Scenario 18 - A multiprocessor topology adapted for machine learning

64. On the other hand, scenarios illustrating AI inventions which do not solve a technical problem within the computer or cause it to operate in a new technical sense are:

Scenario 13 - Optimising a neural network

Scenario 14 - Avoiding unnecessary processing using a neural network

Scenario 15 - Active training of a neural network

65. The practice of the IPO is to treat computer-implemented or software-implemented core AI inventions in the same way as an application program running on a computer. Core AI inventions should, therefore, be examined in the same way as any other computer-implemented invention by considering each case on its own merits. The guidelines for AI inventions we have already set out above apply equally to core AI (as they do for applied AI). If a core AI invention reveals a relevant technical contribution, it will not be excluded under section 1(2). However, in contrast to applied AI inventions, the advance a core AI invention makes is necessarily limited to the field of AI itself, i.e. the advance will be in the models or algorithms constituting the core AI invention. This means that, unlike applied AI inventions, core AI inventions are unlikely to be directly concerned with the real-world application of those models and algorithms to technical problems external to, or lying within, a computer system. In contrast with applied AI inventions, we believe it is unlikely that signposts (i), (ii), and (iv) will point to allowability for core AI inventions.

66. However, as we have explained above, there is some uncertainty about when a computer-implemented invention (or AI invention) might meet signpost (iii). As discussed above, the facts of Gale’s Application, Hitachi T 0258/03, and Aerotel would seem to indicate that application programs (such as computer-implemented core AI) are not necessarily precluded from revealing technical effects in accordance with signposts (iii) and (v). For example, we consider that a core AI invention may make a technical contribution if its instructions define:

a functional unit of a computer (e.g. GPU or special-purpose accelerator, etc.) being made to work in a new way, or

a new physical combination of hardware within the computer (see paragraph 59 above),

provided that the instructions produce a technical effect within the computer that does not fall solely within the excluded subject matter of section 1(2). For example, the effect must be something more than an effect derived by the invention being just a better algorithm or well-written program running on a conventional computer (which would fall solely within the computer program exclusion). The core AI must bring about a change to the technical operation of a conventional computer, not just a change to the way a program or algorithm works.

67. At the time of writing, the sole example of an AI invention being considered by the UK courts in respect of excluded matter is the core AI invention which was held to be excluded in Reaux-Savonte v Comptroller General of Patents. [footnote 57] The invention was an “AI genome” which was understood to be a hierarchical or a modular arrangement of computer code facilitating its evolution over time so that the computer code was able to modify, adapt, change, and improve over time in the same way that biological code evolves. [footnote 58] The High Court upheld the Hearing Officer’s finding that the invention was a program for a computer “as such” and was not technical in nature. Amongst other things, the Court upheld the Hearing Officer’s findings that signposts (iii) and (v) were not shown. For example, in respect of signpost (iii), the Hearing Officer had found that even if the applicant’s program was new, it did not provide a technical effect resulting in a computer being made to operate in a new way:

a computer system operating on new code does not imply that the system works in any way differently to how it would with the old code. I have been unable to find anything in the application that suggests that a computer system is being made to operate in a new way. [footnote 59]

68. In BL O/390/22 (IBM Corp.) the invention was aimed at improving active learning of entity resolution rules to scale better over large data sets. (Entity resolution relates to finding records in a data set that refer to the same entity across different data sources, and it may be used for deduplication in a single database or for matching entities of different databases.) While the claimed invention made use of conventional hardware elements (distributed memory and disk cache hierarchy) to perform entity resolution more efficiently, the operation of these hardware elements was unchanged. Hence the computer itself did not operate in a new way and signpost (iii) did not assist.
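Entity resolution in this sense can be illustrated with a minimal sketch; the matching rule below is a toy example (with invented function names), not the rules at issue in the decision:

```python
from itertools import combinations

def normalise(record):
    """Lowercase every field and collapse runs of whitespace."""
    return {key: " ".join(str(value).lower().split())
            for key, value in record.items()}

def same_entity(a, b):
    """Toy resolution rule: two records refer to the same entity if
    name and postcode agree after normalisation."""
    a, b = normalise(a), normalise(b)
    return a["name"] == b["name"] and a["postcode"] == b["postcode"]

records = [
    {"name": "Jane  Smith", "postcode": "TW9 4DU"},
    {"name": "jane smith", "postcode": "tw9 4du"},
    {"name": "John Smith", "postcode": "TW9 4DU"},
]
duplicates = [(i, j)
              for (i, a), (j, b) in combinations(enumerate(records), 2)
              if same_entity(a, b)]
```

Here `duplicates` is `[(0, 1)]`: the first two records resolve to the same entity despite differing case and spacing. The pairwise comparison above is quadratic in the number of records, which is why scaling over large data sets, as in the application discussed, is the hard part.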

69. The scenarios illustrate examples of core AI inventions that might be seen to operate a computer in a new way in the technical sense of signpost (iii), for example:

Scenario 16 - Processing neural network on a heterogeneous computing platform

Scenario 17 - Special purpose processing unit for machine learning computations

Scenario 18 - A multiprocessor topology adapted for machine learning

70. Scenarios illustrating core AI inventions that do not cause new operation of a computer in a technical sense are, for example:

Scenario 13 - Optimising a neural network

Scenario 14 - Avoiding unnecessary processing using a neural network

Scenario 15 - Active training of a neural network

71. The model or algorithm underpinning an AI invention (e.g. a neural network) may need to be trained using a set of training data before it can be used for its intended application or purpose. For the purposes of these guidelines, methods of training and other machine learning methods may also be categorised as being either applied AI or core AI inventions, so they should be examined in the same way as applied AI or core AI using the guidelines above. Inventions involving the training of AIs are not excluded if they reveal a relevant technical contribution to the known art.

72. We believe that a useful analogy for thinking about inventions relating to training AIs is that of calibration. Technical devices or functions may require calibration before they can be used accurately for their intended technical purpose. For example, a device having a sensor, such as a thermometer or touch-sensitive display, may require calibration to provide an accurate estimate of a physical parameter, such as temperature, or to discriminate physical touch inputs correctly. This is true whether such devices are implemented in hardware or software (or some combination of both). Under the Aerotel approach, a computer-implemented method of calibration that makes a technical contribution to the art is a patentable invention. By analogy, it follows that a method of training an AI model or algorithm for a specific technical purpose may also make a technical contribution. Each case should be examined on its own merits.
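The analogy can be made concrete with a minimal sketch: fitting a linear correction (`reference = gain * raw + offset`) for a hypothetical uncalibrated thermometer against reference readings. The data and function names are invented for illustration:

```python
def fit_linear_calibration(raw, reference):
    """Least-squares fit of reference = gain * raw + offset, using
    closed-form formulas (no external libraries)."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    var = sum((x - mean_x) ** 2 for x in raw)
    gain = cov / var
    offset = mean_y - gain * mean_x
    return gain, offset

# Hypothetical readings from an uncalibrated thermometer vs a reference
raw = [10.0, 20.0, 30.0, 40.0]
reference = [12.1, 22.0, 31.9, 41.8]
gain, offset = fit_linear_calibration(raw, reference)
calibrated = [gain * r + offset for r in raw]
```

With these (exactly linear) readings the fit recovers a gain of 0.99 and an offset of 2.2, so `calibrated` reproduces the reference values. Training an AI model for a specific technical purpose plays an analogous role: parameters are adjusted against known data so the device measures or discriminates accurately.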

73. In practice, we note that there are many ways in which methods of training AIs and machine learning may be claimed. For example:

74. Methods of training AI models/algorithms rely on training data, often referred to as a “dataset”. There are several ways in which patent protection for the features comprising datasets might be considered.

75. Firstly, the use of the features of a dataset may be explicitly (or implicitly) claimed as a constituent feature of a training method. If the training method makes a technical contribution, as discussed above, then patent protection is afforded to the dataset by virtue of it being an integral feature of the patentable method. Secondly, innovation may lie in a method of generating or improving a dataset. If the method makes a technical contribution, then it is patent-eligible. Thirdly, the constituent features of a dataset itself may be claimed directly, e.g. as a dataset characterised by its content (i.e. the information the data represents and its organisation) and/or its delivery (e.g. on paper or in some computer-readable form). However, we consider it is unlikely that a claim to a dataset itself can be shown to meet all four requirements for a patentable invention, i.e. be new, non-obvious, industrially applicable, and not excluded. In respect of excluded subject matter, we consider that claims to datasets are likely excluded as presentation of information as such.

76. The scope of the exclusion to presentation of information as such was considered by the High Court in Gemstar v Virgin. The Court held that if the presentation of information has some features over and above the information and its delivery, then it might be patentable. The Court held there was a distinction between “the content or its mere delivery, on the one hand”, which is excluded from patentability, and “that material plus some additional technical aspect of its delivery, on the other” which may be patentable. [footnote 60] The Court concluded, “So what achieves patentability is some real world technical achievement outside the information itself.” [footnote 61] This conclusion is consistent with the subsequent judgment of the High Court in Garmin v Philips [footnote 62] which held that, “the key point is to ensure that the claimed feature is not in substance a claim to information content.” [footnote 63]

77. Thus, where a claim to a dataset is only characterised by its content (its information and how it is structured) and/or its mere delivery (e.g. a conventional means or process for presenting it) then it is unlikely that it provides any real-world technical achievement outside its information itself, and it is likely excluded as the presentation of information as such. The practice of the IPO is that any claim to a dataset will be treated on its own merits. However, unless there is a persuasive reason that the claimed dataset makes, as a matter of substance, a real-world technical achievement outside the information it holds, then such a claim is likely to be excluded under section 1(2)(d) as the presentation of information as such.

78. The scenarios illustrate examples of training AI models/algorithms that reveal a technical contribution, for example:

Scenario 4 - Detecting cavitation in a pumping system

Scenario 6 - Measuring percentage of blood leaving a heart

Scenario 11 - Continuous user authentication

Scenario 18 - A multiprocessor topology adapted for machine learning

79. An illustrative example of a training method that does not reveal a technical contribution is:

Scenario 15 - Active training of a neural network

80. An AI invention may be implemented in hardware-only form. By “hardware-only” we mean that the claimed invention does not, in substance, rely on executable instructions or programmable devices in any way, for example when the invention is implemented in dedicated or special-purpose electronic circuitry, or when embodied in hardware on an integrated circuit. When an AI invention is claimed in this way, it is likely the program for a computer exclusion of section 1(2) is not engaged. Moreover, the mathematical method exclusion is usually avoided because a circuit implementing a mathematical method is technical in nature and is more than a mathematical method as such. However, it is emphasised that merely drafting a claim to cover both software and hardware implementations is not, of itself, sufficient to avoid the exclusions set out in section 1(2) (see, for example, the decision in BL O/542/22 (Emotional Perception AI)). Where a claim covers both hardware and software implementations then the software implementation should be assessed according to the guidelines set out above in respect of computer-implemented AI inventions. If the software implementation covered by the claim is found to be excluded subject matter as such, then it follows that the claim is bad (see MoPP 1.04).

81. Recent examples of hardware-only implementations of mathematical methods being allowable are found in BL O/420/21 (Imagination Technologies Limited). In the context of the relevant descriptions and drawings, the Hearing Officer found that the phrases “fixed function circuitry” and “dedicated hardware” appearing in the claims should be construed as meaning an arrangement of gates, transistors and registers that achieve a specific function. Hence, the Hearing Officer found the claimed inventions to be protecting (new) specific pieces of hardware which were not programmed or programmable in any way, and concluded they were technical in nature and could not be a program for a computer or mathematical method as such. The Hearing Officer also distinguished the claimed inventions from the situation in Gale’s Application where computer-program instructions stored in hardware form (on a ROM) were held, as a matter of substance, to be caught by the program exclusion.

82. An AI invention is likely to reveal a technical contribution if it is claimed in hardware-only form, i.e. it does not rely on program instructions or a programmable device for its implementation.

83. Section 14(3) of the Patents Act 1977 requires that:

The specification of an application shall disclose the invention in a manner which is clear enough and complete enough for the invention to be performed by a person skilled in the art.

84. A summary of the relevant principles to be applied when determining whether a patent application satisfies this section of the Act is set out in Eli Lilly v Human Genome Sciences: [footnote 64]

The specification must disclose the invention clearly and completely enough for it to be performed by a person skilled in the art. The key elements of this requirement which bear on the present case are these: (i) the first step is to identify the invention and that is to be done by reading and construing the claims; (ii) in the case of a product claim that means making or otherwise obtaining the product; (iii) in the case of a process claim, it means working the process; (iv) sufficiency of the disclosure must be assessed on the basis of the specification as a whole including the description and the claims; (v) the disclosure is aimed at the skilled person who may use his common general knowledge to supplement the information contained in the specification; (vi) the specification must be sufficient to allow the invention to be performed over the whole scope of the claim; (vii) the specification must be sufficient to allow the invention to be so performed without undue burden.

85. These are the relevant principles to be applied when considering the sufficiency of disclosure of all inventions, including AI inventions. UK law does not impose any other requirements concerning the disclosure of AI inventions, beyond these principles. Whether an AI invention meets these disclosure requirements is decided by considering each case on its own merits.

86. Likewise, the extent to which a training dataset should itself be disclosed is a matter to be decided by considering each case on its own merits. We note, however, the recent decision of the EPO Board of Appeal in T 0161/18 (Äquivalenter Aortendruck/ARC SEIBERSDORF), which we believe both reflects, and is consistent with, the principles set out in Eli Lilly v Human Genome Sciences. The decision in T 0161/18 highlights that the disclosure of an AI invention relying upon relevant features of a training dataset should teach those details in a manner that enables the invention to be worked across its scope without undue burden. In T 0161/18 the Board stated that:

the application does not disclose which input data are suitable for training the artificial neural network according to the invention, or at least one dataset suitable for solving the present technical problem. The training of the artificial neural network can therefore not be reworked by the person skilled in the art and the person skilled in the art therefore cannot carry out the invention. [footnote 65]

HTC Europe Co Ltd v Apple Inc [2013] RPC 30 ↩

Section 1(2) is so framed to have, as nearly as practicable, the same effect as Art 52(2) & 52(3) EPC ↩

Aerotel, paragraphs [47] and [85]; HTC v Apple [2013] RPC 30, at paragraphs [35], [44] and [48] ↩

Aerotel Ltd v Telco Holdings Ltd & Macrossan’s Application [2007] RPC 7 ↩

Lantana v Comptroller General [2015] RPC 16, at [64] ↩

Merrill Lynch’s Application [1989] RPC 561 ↩

Aerotel Ltd v Telco Holdings Ltd & Macrossan’s Application [2007] RPC 7 ↩

Symbian Ltd v Comptroller-General of Patents [2009] RPC 1 ↩

HTC Europe Co Ltd v Apple Inc [2013] RPC 30 ↩

Lantana Ltd v Comptroller General [2015] RPC 16 ↩

AT&T Knowledge Ventures / Cvon Innovations Ltd [2009] EWHC 343 (Pat) ↩

AT&T, at paragraph [40] ↩

HTC v Apple, at paragraphs [51] and [150] ↩

AT&T, at paragraph [41] ↩

Halliburton v Comptroller General [2012] RPC 12, at [32] ↩

Halliburton v Comptroller General [2012] RPC 12 ↩

Vicom T 0208/84; Gale’s Application at page 327, line 52, and page 328, line 17 ↩

Halliburton v Comptroller General [2012] RPC 12 ↩

Protecting Kids the World Over (PKTWO)’s Application [2012] RPC 13 ↩

Autonomy Corporation Ltd.’s Patent Application [2008] RPC 16 ↩

Gale, at p. 328, ll. 1-2; HTC v Apple, at [49] and [51] ↩

Gale, at p. 328, ll. 12-13 ↩

Gemstar v Virgin [2010] RPC 10, at [42] ↩

Lenovo v Comptroller General of Patents [2020] RPC 18, at [30] ↩

Reaux-Savonte v Comptroller General of Patents [2021] EWHC 78 (Ch) ↩

Garmin v Philips [2019] EWHC 107 (Ch) ↩

Eli Lilly v Human Genome Sciences [2008] RPC 2 ↩
