Palantir’s newest UK contract takes the AI and information analytics firm into the coronary heart of considered one of Britain’s greatest industries: monetary companies, which accounts for 9% of the economic system.
The Miami-based firm embedded its know-how in the NHS in 2023, the police in 2024 and the army in 2025. Land and broaden, they are saying in the tech business. Palantir has adopted the script, constructing contracts value greater than £500m.
Now in 2026, its deal with the Financial Conduct Authority (FCA) to dive into the terabytes of information it gathers provides it yet one more unparalleled view of the internal workings of the British authorities. It additionally provides it sight of a trove of information about the workings of considered one of the most essential world centres of finance, the Metropolis of London.
The attraction of firms comparable to Palantir to public authorities is pushed by three forces: the push to discover extra environment friendly methods to use human sources amid strained public funds; the existence of lakes of information swollen by society’s elevated tendency to digitise transactions and communications; and the daybreak of AI and the Labour authorities’s unbridled enthusiasm for its potential to unlock elusive financial progress.
However its former use of Peter Mandelson’s lobbying firm, World Counsel, Palantir has change into an influential voice in Whitehall. With earnings of $1.4bn in the final three months of final yr alone, it might afford prime expertise and its AI-enabled information evaluation methods impress many who see them, in demonstrations at the very least. Marketing campaign teams rail towards Palantir’s work with the US Division of Homeland Safety and its ICE operations, and its service to the Israel Protection Forces, however the contracts maintain coming.
Its technologists will arrive at the FCA headquarters in east London and discover a regulator anxious it is devoting an excessive amount of power to pursuing potential monetary crime circumstances that go nowhere. It needs to use AI to higher detect indicators of wrongdoing so it might crack down on the severe crime of cash laundering, which underpins social ills comparable to human trafficking and the medicine commerce, in addition to fraud, which impacts many individuals and accounts for about 40% of all crimes in the UK.
Its workplan for 2025-26 set out an ambition to “broaden the use of information and intelligence to determine and act on the riskiest corporations and/or people” and use “community analytics to determine dangerous networks of corporations and/or people”. However because it strikes to AI detection of monetary wrongdoing, criminals might nicely reply with their very own methods of beating the bots.
“If the FCA depends on an AI-based detection mannequin, a foul actor might take steps to affect that system when it critiques materials,” mentioned Christopher Houssemayne du Boulay, a accomplice and barrister at the regulation agency Hickman & Rose who specialises in severe and complicated monetary crime.
For instance, they could use invisible “white text” in paperwork to instruct the AI to ignore something in that doc that could be incriminating. “You may completely see that being utilized in a monetary crime context as a result of developments in technological capabilities for good can equally nicely be exploited by criminals and continuously are exploited very nicely,” he mentioned.
The arrival of AI as a weapon to combat cash laundering has been lengthy anticipated. “Folks have talked about utilizing machine studying and earlier types of synthetic intelligence to spot patterns of cash laundering] since the Nineteen Nineties,” mentioned Prof Michael Levi, an internationally recognised professional in cash laundering at Cardiff College. “Now that know-how is accessible, now we have to make selections about how to use it, what the dangers are.”
He mentioned it was comprehensible that some individuals may concern the penalties of information firms having the ability to combine totally different datasets in a manner that would threaten privateness.
However he added: “Criminals are additionally afraid of it [and] additionally some elites could be afraid, as a result of company holdings by way of shell firms and thru actual firms with obscured possession needs to be a part of the goal for these sorts of applied sciences.”
Disclaimer: This article is sourced from external platforms. OverBeta has not independently verified the information. Readers are advised to verify details before relying on them.