Big Brother is watching the poor and disabled

While claiming to target fraud, Labour’s snooping Bill strips benefit recipients of privacy rights and presumption of innocence, writes CLAUDIA WEBBE, warning that algorithms with up to 25 per cent error rates could wrongfully investigate and harass millions of vulnerable people

WAR ON CLAIMANTS: Liz Kendall outside the Department for Work and Pensions, March 2025

THE government’s Public Authorities (Fraud, Error and Recovery) Bill forces banks and other financial institutions to act as an extension of the state in snooping on the population of Britain. As is usually the case, the poor and vulnerable are the particular victims of this discriminatory proposed legislation, but the Bill fits into a wider landscape of “Big Brother” surveillance-state government that should worry every British citizen, regardless of wealth or health.

Work and Pensions Secretary Liz Kendall, who has put forward the Bill as part of the Labour government’s wider war on benefit claimants, in particular the disabled and those with mental health issues, claims that the legislation is designed to combat fraud and organised crime. However, it will compel banks to spy on the account activity of anyone in receipt of any form of state benefit, and to report anyone whose transactions they consider even potentially suspicious.

The mass nature of this surveillance means that it will inevitably be carried out by computers using algorithms and AI rather than by human beings, and as with all algorithmically driven decisions there will be a significant error rate, with investigations and potentially penalties aimed at people who have done nothing wrong and who had no idea they were in the government’s crosshairs until the process was already well under way.

This model of “snoopers’ charter” is inherently discriminatory, targeting the poor and vulnerable. Neil Duncan-Jordan, one of the Dorset Labour MPs and a critic of the whole project, has rightly pointed this out, asking: “Why should someone in receipt of benefits have fewer rights to privacy? And why are we asking banks to become an arm of the state? These new powers strip those who receive state support of a fundamental principle of British law: the presumption of innocence. By default, welfare recipients would be treated as suspects, simply because they need support from the state.”

Critics have pointed out that the potential scale of injustice in Kendall’s plan dwarfs anything that arose from the notorious Post Office Horizon scandal, with even a 1 per cent error rate meaning as many as 237,000 of Britain’s 23.7 million benefit claimants being wrongly spied on and targeted for investigation.

But the commonly used 1 per cent figure vastly underestimates the scale of errors in algorithm-driven processes. A 2021 peer-reviewed study by Rebitschek et al for the Harding Centre for Risk Literacy at the University of Potsdam and the Max Planck Institute for Human Development found error rates in algorithmic processes of around 25 per cent.

Even if that finding were assumed to be an outlier, computer science expert Moritz Hardt has said that finding an algorithmic system that manages to keep errors down to 5 per cent is extremely unusual. Based on these expert studies, the scale of injustice in Kendall’s new snooping system is enormous, with well over a million potential victims at a minimum, all of them among the most vulnerable people in our country.
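The arithmetic behind these figures is straightforward. Here is a minimal illustrative sketch, assuming the 23.7 million claimant figure cited above and treating the 1, 5 and 25 per cent rates purely as hypothetical scenarios drawn from the studies quoted, not as DWP projections:

```python
# Illustrative arithmetic only: how many of Britain's 23.7 million benefit
# claimants would be wrongly flagged at the error rates discussed above.
# The rates are hypothetical scenarios, not DWP projections.
CLAIMANTS = 23_700_000

for error_rate in (0.01, 0.05, 0.25):
    wrongly_flagged = int(CLAIMANTS * error_rate)
    print(f"{error_rate:.0%} error rate: {wrongly_flagged:,} wrongly flagged claimants")
```

Even at the 5 per cent rate Hardt describes as exceptionally rare, that is nearly 1.2 million people.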

The consequences for individuals of being targeted are also potentially dramatic. In the 2023-24 financial year, before the introduction of this new legislation, almost 700,000 benefit claimants suffered deductions from their benefits, not because of fraud but because of HMRC “clawback” of overpayments arising from DWP errors, often years old and only discovered when claimants were forced to move from legacy benefits onto universal credit, costing them thousands of pounds when they could least afford it.

As the Big Issue recently pointed out, the DWP’s errors have already hit innocent claimants hard: the disabled woman whose disability benefits stopped when she was wrongly accused of owing the government £28,000, or the single mother accused of owing £12,000 when the DWP actually owed her money.

And this huge sledgehammer is being used to crack a relatively tiny nut. According to official statistics published by the Department for Work and Pensions as of May 15 2025, the sum total of benefit fraud and error in Britain amounts to around £9.5 billion a year. Almost one-third of that is offset by underpayments of £3bn annually, according to the Office for Budget Responsibility and the National Audit Office (once corrections for state pension underpayments are taken into account), while a further £23bn in benefits and support goes unclaimed each year, none of which will, of course, be picked up by the new surveillance Bill.

In stark contrast, according to Tax Justice UK and Tax Research UK, the amount lost each year to tax evasion and avoidance by corporations and the wealthy is at least £36bn and may be as high as £58bn. None of this will be identified by the snooping Bill, or punished in any other way by HMRC, which issued precisely zero fines to those who set up tax evasion schemes in the most recent five-year period.

In case anyone is tempted to think that because they don’t claim benefits it’s not their problem, the Public Authorities (Fraud, Error and Recovery) Bill is only one part of the surveillance state that is not creeping but mushrooming under the government of Keir Starmer, who is building eagerly on a foundation laid during 14 years of Tory or Tory-led government.

Facial recognition is increasingly being used by police forces from London to Northamptonshire to south Wales, in many cases as permanent installations in constant use, such as in Croydon and Dalston in London and in Cardiff.

The surveillance state cannot even be avoided by staying home. The Starmer government’s demand for access to Apple users’ encrypted data forced Apple to withdraw Advanced Data Protection from British users of its devices, while US spy-tech firm Palantir has been given control over NHS patients’ health data. Palantir’s contracts with policing bodies such as Leicestershire Police have also raised public concern over civil liberties, surveillance and data privacy.

These intrusions are not merely theoretical threats. A recent University of Zurich study on the ability of “AI chatbots” to influence political opinions has been heavily criticised for failing to disclose the experiment while it was under way, but its findings are striking: consistent use of these “bots” in online discussions proved highly effective at changing users’ beliefs, and, crucially, the more information the AI system was given about the likes, preferences and habits of those whose opinions it was trying to change, the more rapidly and effectively it could achieve the desired shift.

Such bots were found to have been heavily used to push political discourse and opinion further right during the 2024 general election. In the 2019 general election, thousands of “Bots for Boris” pushed messages of support to influence the outcome.

The 2019 usage was clumsy, with cloned accounts reciting identical messages, but these systems are becoming ever more sophisticated, and the Zurich study shows that the saying “knowledge is power” is even more true when it is applied to clandestine means for keeping populations docile, distracted or compliant.

Socialism, at a time when more people than ever, in Britain and beyond, need it and need those who understand and practise it, faces a war on its beliefs and on its ability to organise, waged by forces with no scruples, forces more than happy to leave ever more of us poor in both money and time, unable to organise or to gather the information needed to resist.

The state’s drive to gain access to our lives and to control what we hear, see and do would need to be resisted even if it were not impoverishing millions. That it is makes the imperative all the more undeniable.

Claudia Webbe was previously the MP for Leicester East (2019-24). You can follow her at www.facebook.com/claudiaforLE and x.com/claudiawebbe.
