Andrew Woolsey is an associate and Sophie Ashcroft is a partner in the commercial dispute resolution team at law firm Browne Jacobson
Construction companies are using artificial intelligence tools to increase productivity and add new capabilities in areas such as contract management and project management. However, AI usage may also give rise to complex – and at times unexpected – disputes.
As a general point, the vast majority of construction disputes are settled out of court or through private arbitration proceedings in which details of the hearing and final award remain confidential. In the absence of real-world examples of AI solutions going wrong in a construction context, we use theoretical scenarios to set out the potential disputes that could arise on a project using AI.
AI complicates accountability
Construction projects are typically governed by bespoke, multifaceted agreements that evolve as the project progresses and involve several stakeholders.
In the project’s overall context, various contractual relationships exist. Developers have contracts with contractors who, in turn, have contracts with subcontractors and architects. The use of AI introduces an additional party to the chain – namely, the AI developer.
“The use of AI means that typical liability frameworks may not be suitable”
The question of who is liable when a construction project goes wrong is ultimately a factual one, but the introduction of AI makes it far harder to answer. The inherent complexity of AI tools may trigger a whole range of time-consuming and expensive disputes.
For instance, was the project’s failure due to flawed data, inadequate training methodology or hardware issues with the AI tool? Architects and contractors may be liable for oversight and implementation failures. If a construction project runs into difficulty, it is likely that there will be a cascade of claims, with stakeholders seeking to recover their losses from the next party in the contractual chain.
Claims usually arise from a breach of contract due to failures in service provision (eg, missed contractual milestones or an AI solution producing results that do not meet specifications). This may result in claims for damages, contractual remedies such as delay payments and, in extreme cases, contract termination.
Advice for contractors
It is essential that stakeholders are clear about the primary objectives of the project and the ways in which AI will be used to achieve them. Contracting parties should ensure that comprehensive agreements are in place at each stage of the contractual chain to provide certainty if something goes wrong. The roles, responsibilities, expectations and risks of the various parties should also be contractualised and defined in detail from the outset of the project, so as to apportion liability throughout the chain with maximum clarity.
The use of AI means that typical liability frameworks may not be suitable. Parties contracting to use an AI tool in a construction project should ensure that the relevant agreements include AI-specific warranties, indemnities and limitation provisions. These terms should be tailored to the specific context in which the AI tool will be deployed and should be based on standards that are clearly measurable. This would likely involve drafting warranties which, while based on common service standards such as reasonable care and skill, respond to the fact that an AI tool is being used.
Where the AI’s outputs are subject to human oversight, the scope of such obligations should be clearly drafted, including identifying the required skills/experience of the individuals concerned, the nature of any training required, as well as the processes to be followed in testing, monitoring and reviewing the AI’s outputs. Record-keeping in relation to the review of decisions made by AI solutions may also be helpful in managing the risks associated with their use.
It is also important for parties to recognise and mitigate the risk that the AI developer (often a start-up) might not have sufficient assets available or sufficient insurance coverage to satisfy a claim, which could result in increased liability for the other parties in the chain.
Managing risk is crucial to avoiding disputes. Contractors should therefore take a proactive approach to risk management by reviewing and amending standard-form agreements that, for the reasons set out above, are unlikely to be suitable for projects using AI solutions. Contractors should collaborate with partners, consultants and lawyers (whether in-house or external advisers) to produce agreements that provide contractual certainty in an area that is inherently uncertain because of the complexity of the technology involved.