Artificial intelligence (AI) cannot be legally named as an inventor to secure patent rights, the Supreme Court has ruled.
In a judgment on Wednesday, the UK’s highest court concluded that “an inventor must be a person” in order to apply for patents under the current law.
The ruling comes after technologist Dr Stephen Thaler took his long-running dispute with the Intellectual Property Office (IPO) to the country’s top court over its rejection of his bid to list an AI he created as the inventor for two patents.
The US-based developer claims that the AI machine, named DABUS, autonomously created a food or drink container and a light beacon, and that he is entitled to rights over its inventions.
But the IPO concluded in December 2019 that the expert was unable to officially register DABUS as the inventor in patent applications because it is not a person.
The decision was upheld by the High Court in July 2020 and by the Court of Appeal in July 2021.
Following a hearing in March, a panel of five Supreme Court justices have unanimously dismissed Dr Thaler’s case.
Lord Kitchin, with whom Lords Hodge, Hamblen, Leggatt and Richards agreed, said the IPO “was right to decide that DABUS is not and was not an inventor of any new product or process described in the patent applications”.
He continued: “It is not a person, let alone a natural person and it did not devise any relevant invention.
“Accordingly, it is not and never was an inventor for the purposes of… the 1977 Act.”
The judge said the IPO was entitled to find that Dr Thaler’s applications should be taken as “withdrawn” under patent rules because “he failed to identify any person or persons whom he believed to be the inventor or inventors of the inventions described in the applications”.
The Supreme Court also rejected Dr Thaler’s argument that he was entitled to apply for patents for DABUS inventions on the basis that he was the AI’s owner.
Lord Kitchin said DABUS was “a machine with no legal personality” and that Dr Thaler “has no independent right to obtain a patent in respect of any such technical advance”.
The judge said the IPO had been right to emphasise that the case was “not concerned with the broader question whether technical advances generated by machines acting autonomously and powered by AI should be patentable”.
He added: “Nor is it concerned with the question whether the meaning of the term ‘inventor’ ought to be expanded, so far as necessary, to include machines powered by AI which generate new and non-obvious products and processes which may be thought to offer benefits over products and processes which are already known.”
In a statement welcoming the ruling’s “clarification” of patent law, an IPO spokesman said the judgment did not alter the Government’s conclusion, reached after a public consultation, that “there should be no legal change to UK patent law now”, while noting that “many share the view that any future change would need to be at an international level”.
He said the Government was “keen to make the UK a global centre for AI and data-driven innovation”, adding that the IPO recognised that “there are legitimate questions as to how the patent system and indeed intellectual property more broadly should handle such (AI) creations”.
“The Government will nevertheless keep this area of law under review to ensure that the UK patent system supports AI innovation and the use of AI in the UK. We will continue to engage in AI inventorship discussions internationally to support UK economic interests,” the spokesman said.
Responding to the judgment, Mr Jehan, who represented Dr Thaler, said: “UK patent law is unfit and unable to offer protection for AI generated inventions and, as a consequence, it can be expected this could have a detrimental impact on industry in the UK.”
Mr Jehan said the “lack of patent protection for AI generated inventions” would push industries to look outside the UK and act as a “disincentive to disclose inventions created by AI systems”.
He added that “in a worst case scenario”, the Government’s decision not to update the law “could mean that inventions devised by AI systems cannot presently be protected and controlled by patents in the UK”.
Patents, which provide protective legal rights, are granted for inventions that must be new, inventive and either something that can be made or used, a technical process, or a method of doing something, according to Government guidance.
Dr Thaler’s case reached the Supreme Court amid recent scrutiny of AI developments – such as OpenAI’s ChatGPT technology – including their potential impact on education, the spread of misinformation and professions.
At the March hearing, the technologist’s lawyers argued that patent law does not “exclude” non-human inventors and contains no requirements over “the nature of the inventor”.
Mr Jehan told the judges in written arguments that the expert believes he “cannot truthfully be named as an inventor” of the DABUS creations.
“There is no prohibition in law that prevents patents from being granted for inventions generated by AI systems,” Mr Jehan said.
But Stuart Baran, for the IPO, said in written arguments that patent law required “identifying the person or persons” believed to be an inventor.
He said at the time that the IPO had received no other patent applications, like Dr Thaler’s, for inventions conceived by AI, labelling the dispute a “test case, rather than one which is motivated by any pressing need in the real world”.
But Mr Baran said the IPO recognised “this is a rapidly developing area of technology and the situation may change considerably in the future”.
He said that following a 2021 consultation, the Government decided there was no evidence that UK patent law was inappropriate for protecting inventions made using AI.