UK's new AI Cyber Security Code of Practice
The question
What is the UK's proposed AI Cyber Security Code of Practice?
The key takeaway
The UK Government is establishing a voluntary AI Cyber Security Code of Practice (AI Cyber Code). The AI Cyber Code aims to protect end-users of AI and sets out steps covering the entire AI supply chain, with a particular focus on Developers and System Operators.
The background
On 15 May 2024, the UK Government published a call for views on the cyber security of AI (the Call for Views). This forms part of the Government's wider programme of work on AI, which aims to ensure the UK harnesses the power of AI effectively, but does so in a safe and secure way. The proposed voluntary AI Cyber Code was developed by the Department for Science, Innovation and Technology and is based on the National Cyber Security Centre's Guidelines for secure AI system development, published in November 2023 alongside the US Cybersecurity and Infrastructure Security Agency and other international cyber partners.
The Government intends to use the feedback gathered from the Call for Views to update the AI Cyber Code.
A key aim of the Government in establishing the voluntary AI Cyber Code is to create "baseline security requirements across various areas of technology". Establishing these baseline security requirements will have many benefits, including:
- enabling users of AI to verify that AI products are securely designed;
- encouraging good security practices within the AI sector, thereby creating a marketplace where security and safety are distinguishing factors among competitors;
- improving cyber security, thereby reducing cyber-attacks and protecting the data used within AI tools; and
- supporting the UK in becoming a leader in AI by enabling innovation and safety to develop together.
The development
The AI Cyber Code sets out 12 principles covering the AI supply chain and focuses on four groups of stakeholders, namely:
- developers: businesses and individuals that are responsible for creating an AI model and/or system;
- system operators: businesses responsible for embedding/deploying an AI model and system within their infrastructure;
- data controllers: "any type of business, organisation or individual that control data permissions and the integrity of data that is used for any AI model or system to function";
- end-users: "any employee within an organisation or business and UK consumers who use an AI model and system for any purpose, including to support their work and day-to-day activities".
The AI Cyber Code covers the different stages of an AI tool's lifecycle, including:
- secure design;
- development;
- deployment; and
- maintenance.
Key principles in the AI Cyber Code include:
- designing systems for security as well as functionality and performance;
- modelling the threats to a system;
- ensuring decisions on user interactions are informed by AI-specific risks;
- securing the supply chain;
- communication and processes associated with end-users;
- maintaining regular security updates for AI models and systems; and
- monitoring the system's behaviour.
Each principle notes which stakeholder (as outlined above) is primarily responsible for implementing the respective principle.
Why is this important?
The AI Cyber Code has been developed with a pro-innovation approach in mind and seeks to encourage the safe development and deployment of AI tools. By adhering to the voluntary AI Cyber Code, AI developers will be able to differentiate themselves from competitors through their commitment to the safe and secure development of AI. In turn, the focus on security aims to help promote the UK as a leader in the AI marketplace.
Any practical tips?
It goes without saying that few AI platforms will survive for long if they are not secure. The AI Cyber Code provides a useful reference point, not least because it reflects the UK Government's perspective. Indeed, the Government will be monitoring the application of the AI Cyber Code and working with stakeholders to determine what regulation may be needed in the future.
Autumn 2024