The EU has published its draft Standard Contractual Clauses for the procurement of AI (AI SCCs). These are drafted for public organisations (such as public hospitals) wishing to procure AI systems developed by an external supplier, and are based on the requirements for high-risk AI systems in the EU AI Act. The AI SCCs might be a good starting point, but dig a little deeper and you’ll find that the clauses rest on some fundamental (but questionable) assumptions. Use them with caution: they may assist with drafting clauses to address AI-related risk, but they are not the whole answer, even when it comes to mitigating risk under the AI Act…

We’ve set out our top 5 takeaways below.

  1. What are the AI SCCs? These are model clauses and there are two versions:
    • a ‘full’ version targeting AI systems classified as high-risk AI (HRAI) for the purposes of Article 6 of the AI Act (together with Annexes II and III). To take a few examples, this would include AI systems used for certain employment and HR-related purposes, AI as a medical device and certain biometrics-based systems.
    • a lighter version intended to be used in respect of non-HRAI. For example, this could be used for the procurement of foundation models, i.e. models designed for generality of output that can be adapted to a wide range of tasks.
  2. Who should use the AI SCCs? The AI SCCs are drafted for public organisations that are procuring AI systems developed by external suppliers. This may include procurement by public hospitals and other public health institutions. Their use is not mandatory, but they might be a good starting point in considering how to address the AI Act contractually (even for private sector customers).
  3. What do the clauses address? Both versions of the AI SCCs are based on the requirements and obligations for HRAI in the EU AI Act. For example, both versions address the supplier having: a risk management system in place; certain data governance measures; technical documentation and instructions for use; record-keeping; transparency; human oversight; and accuracy, robustness and cybersecurity. The ‘full’ version of the AI SCCs goes further, addressing the quality management system and conformity assessment. These obligations align with those on ‘providers’ of HRAI under the AI Act.
  4. What do the clauses not address? The introductory remarks state that the AI SCCs only address provisions specific to AI systems and the EU AI Act, and that they do not address the GDPR or wider legal or commercial issues, such as intellectual property rights, payment, applicable law or liability. Reading through the AI SCCs, though, it’s not clear that the drafters stuck to this brief (see below).
  5. So what are the big issues with the AI SCCs? The AI SCCs are by no means perfect. They are based on some fundamental assumptions that are not expressly called out, and this is likely to lead to confusion during their use:
    • Not clear on supply chain designation under the AI Act: The AI SCCs place obligations on the ‘supplier’ of the AI system and the public organisation acting as ‘customer’. But this simple delineation does not reflect the much more complex supply chain designations actually used under the EU AI Act. The obligations on a ‘customer’ or ‘supplier’ of an AI system under the EU AI Act depend entirely on whether that actor qualifies as a ‘deployer’, ‘provider’, ‘authorised representative’, ‘importer’ or ‘distributor’ for the purposes of the Act. Each of these actors is subject to very different obligations, and many of these designations and obligations are not captured at all by the AI SCCs. The AI SCCs appear to assume that the public organisation is acting as a ‘deployer’ and the supplier as a ‘provider’, but this fundamental assumption is not called out expressly. In practice, AI supply chains tend to be a complex web, and the reality may not be so straightforward. You should always conduct an in-depth analysis of your AI supply chain before drafting AI Act-related clauses, and the AI SCCs may not be appropriate for your scenario.
    • Commercial incentive for non-HRAI: It is not at all clear why a supplier of non-HRAI would contractually agree to assume obligations that go far beyond what legislation requires, i.e. obligations that mirror those imposed on providers of HRAI under the AI Act. We expect suppliers of non-HRAI to heavily negotiate any attempt to impose obligations equivalent to the stringent and heavy-handed obligations placed on providers of HRAI under the Act.
    • IP clauses: The introductory remarks state that the AI SCCs do not address matters that go beyond the Act, such as intellectual property. However, the AI SCCs go on to feature several clauses addressing intellectual property rights in datasets used for training, testing and validation. Further, these clauses are not as robust as we’d typically expect to see in these kinds of agreements, and will need to be adapted based on jurisdiction and the commercials of the procurement arrangement.
Author

Jaspreet Takhar is a senior associate in Baker McKenzie's London office and advises market-leading tech and healthcare companies on issues at the cutting edge of digital health.

Author

Julia Gillert is Of Counsel at Baker McKenzie's London office and has shaped her practice to focus exclusively on regulatory matters affecting the Healthcare & Life Sciences industry.

Author

Elina Angeloudi is an associate at Baker McKenzie's London office and specialises in regulatory advice to pharmaceutical and medical devices companies.