The UK Supreme Court has ruled that an artificial intelligence (AI) system called DABUS cannot be listed as an inventor under UK patent law. The ruling came after the system's creator, Stephen Thaler, tried to secure patents for a beverage container and a flashing light it created, amid growing concern about the implications of artificial general intelligence (AGI) for humans.
The top UK court ruled that “DABUS is not a person at all,” dismissing Thaler’s appeal. However, the court clarified that the ruling does not make other inventions by increasingly advanced AI ineligible for a patent.
UK Court Rules Against AI Authorship
The government’s lawyer argued that allowing Thaler’s appeal could open the door to arbitrary entities being listed as inventors in the future. Thaler had named himself as the patent applicant but listed DABUS as the inventor. His lawyers countered that the ruling would discourage innovation:
“[The] policy of prohibiting the grant of patents for AI-generated inventions acts as a major disincentive to innovation.”
Courts in Australia and Europe likewise refused to let Thaler list DABUS as an inventor. South Africa was the only country that allowed him to record DABUS as the inventor.
The UK ruling follows the position of the US Copyright Office on human authorship. The office has said that content created without human input cannot be protected by copyright law.
The Recording Academy has similarly said that only works humans had some role in creating can be eligible for a Grammy award. The statement by its CEO, Harvey Mason Jr., came after AI songs using vocals or music from prominent artists created a grey area for Grammy nominations.
How Will Laws Handle Immediate Questions?
Thaler’s case raises complex questions about the advance of AI into the realm of artificial general intelligence (AGI). OpenAI’s charter defines AGI as highly autonomous systems that outperform humans at most economically valuable work.
The Brookings Institution suggests that AI lacks the manual dexterity needed for many common jobs in the US, meaning AGI is still some way from completely replacing humans. Critics of OpenAI’s GPT-4 model likewise caution that people should not mistake the system’s performance for competence.
Read more: Will AI Replace Humans?
However, courts may soon have to brace for more cases like Thaler’s, since Google’s Bard tool can already generate code, a task traditionally performed by humans. Such code is usually the intellectual property of the software developer or their employer.
Read more: Top 7 Machine Learning Applications in 2024
If laws don’t recognize AI as an inventor, it also becomes unclear who is accountable for a catastrophic malfunction. EY’s recent foray into using AI to investigate fraud, for instance, raised questions about the technology’s ability to detect different types of fraud. If such a tool misses something, it is unclear with whom the blame would lie.
Disclaimer
In adherence to the Trust Project guidelines, BeInCrypto is committed to unbiased, transparent reporting. This news article aims to provide accurate, timely information. However, readers are advised to verify facts independently and consult with a professional before making any decisions based on this content.