Out-Law News

Expert warns of cybercrime potential of ‘hostile’ AI

‘Hostile’ artificial intelligence (AI) is the latest weapon being wielded by criminals, according to one legal expert.

Christian Toon, cybersecurity specialist at Pinsent Masons, warned that generative AI was being harnessed by online fraudsters at a “worrying pace”. He added: “A prime example of hostility within the generative AI scene is software that acts as an attacker’s best friend: a malicious version of ChatGPT. Unlike ChatGPT and other mainstream generative AI tools, these hostile GPT variants do not have in-built guardrails, safeguards, morality or ethics, and are only concerned with creating malicious outputs.”

“There are already reports of cybercriminals using these variants for social engineering – asking the AI to create a phishing campaign and then to respond to any subsequent queries it receives,” Toon said. His comments came after Ian Hogarth, the head of the UK’s AI taskforce, warned that the technology could pose a threat to the NHS and other state entities.

In an interview with the Financial Times, Hogarth likened the scale of the threat to that of the Covid-19 pandemic and the WannaCry ransomware attack in 2017, which cost the NHS an estimated £92 million and led to the cancellation of 19,000 patient appointments. “The kind of risks that we are paying most attention to are augmented national security risks,” he added.

The AI taskforce has received £100m in initial funding from the UK government to conduct independent AI safety research – the largest amount any country has yet committed to AI safety. Toon said: “The taskforce is right to be concerned about the development of hostile AI – but businesses should be too.”

“It is of particular concern for firms that have ‘hostile nation state activity’ in their threat profiles. The average cybercrime gang is still just exploring malicious AI use, and while generative AI tools are in their infancy, they will not be for long,” he said.
