Local governments are struggling with AI procurement, lacking clear guidelines.
- The Ada Lovelace Institute reports significant gaps in government guidance since 2010.
- Central government’s enthusiastic AI adoption leaves local authorities trailing.
- Key safety and ethical concerns in AI procurement remain unaddressed.
- Procurement is crucial for safe and effective AI integration in public services.
The Ada Lovelace Institute has highlighted that local governments face significant difficulties when procuring new AI technologies for public services. While central government has pursued AI adoption with confidence, local authorities have been left at a disadvantage by insufficient guidance.
The report analysed 16 pieces of AI-related legislation and guidance issued by the Conservative government between 2010 and 2024. It concluded that local authorities lack a clear framework for integrating AI responsibly and effectively, and that this absence of direction has left many without the confidence to make safe procurement decisions.
Imogen Parker, associate director at the Ada Lovelace Institute, emphasised that procurement should ensure AI tools are safe, fair, and aligned with the public interest. That task, however, is daunting in the face of reduced budgets and the complexity of AI technologies.
The report warns of risks comparable to the Post Office Horizon scandal if ethical procurement processes are not embedded. Parker underscored the stakes involved, noting the potential for disaster if AI is not integrated with care.
Anna Studman, a senior researcher at the Ada Lovelace Institute, pointed out that AI systems can significantly damage public trust. She stressed that procurement offers a vital opportunity for local authorities to scrutinise suppliers about the societal impact of their technologies.
Local authorities must enhance their procurement strategies to ensure AI is implemented safely and ethically.