Disabled benefits claimants are being unfairly investigated

Cruel system — The government is using an algorithm that forces people into gruelling and invasive benefit fraud investigations. Now, disabled people who say they have been unfairly targeted are mounting a fightback.

Disabled benefit claimants believe they are being unfairly targeted by a secret fraud detection algorithm. People are receiving notices, without warning, informing them that they have been flagged as potential fraudsters, with no explanation of why they are under suspicion.

Rick Burgess, who is part of the Greater Manchester Coalition of Disabled People (GMCDP), has been investigated by the Department for Work and Pensions (DWP) twice. Burgess was able to establish his innocence, but the process he describes is still one of distress and confusion.

Burgess claims Employment and Support Allowance (ESA) and carer’s allowance. He recalls receiving a brown envelope from the DWP, and immediately feeling anxious. When he opened the letter, Burgess was horrified to discover that his benefit claims were being investigated. “They want you to fill in a form, clarify your financials, reconfirming everything that you’ve already told them,” he says. “My first reaction was, well, I’ve done nothing, my claim has not been different for two, three years.”

Once this information was submitted, all Burgess could do was wait. Trying to “get on with normal life” amid the investigation was understandably no easy feat. “Whatever your normal level of anxiety, there’s 20 per cent on top until it’s sorted. It’s not good for your health.” Worse still, Burgess says that the stress of being investigated imperilled his ability to look after his mum, for whom he was a full-time carer.

After an agonising wait, Burgess got a letter to confirm there would be no further investigation. But that wasn’t the end of the corrosive effects of the process. “It still goes through your head: why did it happen? Obviously, you’re relieved it’s over,” he says. “But it also chills your public confidence, you think: ‘Oh, if I go out and have a coffee, does that look decadent? Does that look like I’m living beyond my means?’”

In his role with the GMCDP, Burgess has seen firsthand the serious impact of these investigations on many others. Testimony reveals how people are forced to make long and frustrating calls to call centres, dealing with operators who have no training in assisting disabled and vulnerable people. Others have been made to fill in forms of over 80 pages that ask the same questions repeatedly, as if designed to catch the claimant out. “It absolutely has pushed some people into harming themselves, up to and including suicide,” says Burgess.

Anecdotal evidence collected by the GMCDP suggests a “huge percentage” of the group has been flagged by the DWP system and subjected to the humiliating and invasive fraud investigation process. But the DWP won’t divulge any information about how the algorithm works, or why it appears to be targeting disabled people.

The GMCDP, completely run and staffed by disabled people, is now fighting back, with the help of legal campaigners Foxglove. They hope to force the government to release crucial information about how the whole process works. 

Foxglove and GMCDP sent a letter to DWP in 2021, asking for more transparency about the algorithm, but they did not receive any satisfactory answers. They have just ramped up the pressure by sending another legal letter. “This isn’t up to the GMCDP to prove that there is an algorithm in use, it’s up to the DWP to prove that there isn’t discrimination going on,” Martha Dark, Director and Co-Founder of Foxglove, stresses. 

DWP did not respond to Huck’s request for comment.

Campaign group Privacy International (PI) first uncovered the DWP’s use of fraud algorithms when it found references to ‘cutting-edge artificial intelligence’ in a 1,000-page training manual in 2019. Realising these tools could target individuals as well as criminal gangs, PI tried to get more information through a Freedom of Information request, but got nowhere, according to PI’s Legal Officer Laura Lazaro Cabrera.

When an individual is being investigated for benefit fraud, they may be asked to provide or confirm information, to prove that they are entitled to the support they receive, and may have to attend an interview under caution. Without an advocate or lawyer for support, this stage in particular can be terrifying. 

If more investigation is deemed necessary, DWP officers can use undercover surveillance, monitor social media accounts, look at people’s browsing history and check bank statements, among many other measures. 

Burgess describes how disabled people have severely limited their lives, fearing being reported and investigated. “It is thoroughly debilitating,” he says. “I know people who’ve basically ended having any public life as they are terrified of being spied on and reported.” Benefits can also be stopped while an investigation is ongoing, compounding this stress.

As Burgess points out, having to prove yourself and your right to support as a disabled person is exhausting. “You should be innocent until proven guilty,” Burgess says. “This system is more guilty until proven innocent.” 

Digitisation and automation can be beneficial. When thousands of people had to claim Universal Credit at the start of the pandemic, online rather than paper-based claims undoubtedly helped people to receive essential benefits quickly. But automating decisions that directly and deeply affect people’s lives is also fraught with challenges. The data used by an algorithm might be biased, and the way it is analysed and combined with other data can introduce further unfairness.

Humans make decisions at every stage of developing and using an algorithm, potentially introducing bias, says Jessica Wulf of European research organisation AlgorithmWatch. “Sometimes [people] are not even aware that they are putting their opinions into [systems],” says Wulf. She points to how automated decision-making systems tend to put groups such as disabled people most at risk, because they “reproduce the power structures we have in society”. Although the harms caused are normally unintentional, steps need to be taken to mitigate the risk of unfair outcomes.
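A toy simulation makes Wulf’s point concrete. The sketch below is purely illustrative and assumes nothing about the DWP’s actual system, which remains secret: every feature, rate and label is invented. It shows how a model trained on records of who was historically investigated, rather than on who actually committed fraud, simply learns to reproduce an old skew against one group.

```python
import random

random.seed(0)

# Purely illustrative: every feature, rate and label here is invented,
# and this is not the DWP's actual system (which is secret). We assume,
# hypothetically, that true fraud is equally rare in both groups, but
# that disabled claimants were historically investigated far more often.

def make_claimant():
    disabled = random.random() < 0.30    # hypothetical group share
    fraud = random.random() < 0.02       # same base rate for everyone
    investigated = random.random() < (0.20 if disabled else 0.05)
    return disabled, fraud, investigated

population = [make_claimant() for _ in range(100_000)]

def group_rate(is_disabled, index):
    """Average of one boolean field within a group."""
    rows = [p for p in population if p[0] == is_disabled]
    return sum(p[index] for p in rows) / len(rows)

# A naive "risk model" trained to predict who got investigated learns
# the historical skew, not the (equal) underlying fraud rate.
for is_disabled, label in [(True, "disabled"), (False, "non-disabled")]:
    print(f"{label:>12}: true fraud rate {group_rate(is_disabled, 1):.3f}, "
          f"learned risk score {group_rate(is_disabled, 2):.3f}")
```

Running this prints near-identical true fraud rates for both groups, but a “risk score” roughly four times higher for the disabled group: the bias comes entirely from the training labels, not from anything the claimants did.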

A 2021 report by Big Brother Watch (BBW) looked at automated fraud detection tools in use by UK public bodies. It found evidence of invasive and unfair systems that, like the DWP algorithm, lack public transparency. “We are deeply concerned that these secret algorithms could be disadvantaging and discriminating against Britain’s poor,” Jake Hurfurt, BBW’s Head of Research and Investigations, says. “The growth in algorithms and predictive models, used by DWP and local councils, risks treating the poor with suspicion by default. There must be transparency when decisions around welfare are influenced by automated processes.”

There are also debates about how effective these kinds of systems actually are. According to Dr Morgan Currie, lecturer in Data and Society at Edinburgh University, there is a lack of rigorous testing to find out if systems designed to flag risky benefit claims actually deliver on their promises. “There is no sense that [DWP] know whether it is helping or not,” she says. 

There are tools and processes available to improve fairness and transparency, and to check whether systems are performing as intended: equality impact assessments, algorithmic audits and data protection impact assessments. The trouble is that, in the GMCDP case, we don’t know whether any of these tools have been used, or whether they identified any problems.
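To make “algorithmic audit” less abstract, here is a minimal sketch of one check an auditor might run: comparing the rates at which two groups are flagged for investigation. All of the figures are hypothetical, since no real DWP numbers are public, and the 0.8 threshold is a rough benchmark borrowed from US employment law (the “four-fifths rule”), not anything the DWP is known to use.

```python
# Minimal sketch of one check an algorithmic audit might run. All
# figures are hypothetical: no real DWP numbers are public.

def flag_rate(flags: list[bool]) -> float:
    """Share of a group flagged for investigation."""
    return sum(flags) / len(flags)

# Hypothetical audit sample: 18% of disabled claimants flagged,
# versus 5% of non-disabled claimants.
disabled = [True] * 180 + [False] * 820
non_disabled = [True] * 50 + [False] * 950

# Ratio between the better- and worse-treated groups. The 0.8 cut-off
# is the US "four-fifths rule", borrowed here only as a rough benchmark.
ratio = flag_rate(non_disabled) / flag_rate(disabled)
print(f"disparate impact ratio: {ratio:.2f}")  # ~0.28
if ratio < 0.8:
    print("flag rates differ enough to warrant further scrutiny")
```

A check this simple only surfaces a disparity; explaining or justifying it would require exactly the transparency campaigners are demanding.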

The psychological burden of fighting a fraud accusation can be huge. When you don’t know why you’ve been targeted, it’s even worse. For Burgess of the GMCDP, this technology represents yet another barrier to participation in mainstream society for disabled people. “Because disabled people are over-represented in poverty, they have to rely on social security more, so when [the system] becomes unfair it’s particularly unfair on them,” he stresses. 

Although on paper claimants have legal rights to further information about a fully automated decision, in practice these rights are hard to exercise. Many people aren’t aware of them, and there are lots of grey areas. Understandably, many people on benefits are terrified of being seen as causing trouble in case their income is stopped, or they are investigated again. Fighting back through legal action is costly and slow, but is the only real means available to claimants at present. 

Many organisations are calling for a register of public sector algorithms, so that it’s easy to find out if important decisions have been automated, as well as independent audits of data and algorithms to determine if they are producing biased results. Burgess would like to see disabled people involved in co-design of new technologies that affect them, as well as the greater transparency from government that is at the heart of many groups’ calls.
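As a sketch of what such a register might contain, the hypothetical schema below records the basics campaigners ask for: who runs a system, what it decides, what data it draws on, and what assessments and audits it has been through. The fields are illustrative assumptions, not a description of any official standard.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one entry in a public register of government
# algorithms. Every field is an illustrative assumption, not a
# description of any official standard.

@dataclass
class AlgorithmRegisterEntry:
    name: str
    owning_body: str                    # e.g. a department or local council
    purpose: str                        # what decisions the system informs
    fully_automated: bool               # or is there a human in the loop?
    data_sources: list[str] = field(default_factory=list)
    impact_assessments: list[str] = field(default_factory=list)
    audit_findings: str | None = None   # published results of independent audits

# Example entry (hypothetical, for illustration only):
entry = AlgorithmRegisterEntry(
    name="Benefit fraud risk flagging",
    owning_body="DWP",
    purpose="Prioritise benefit claims for fraud investigation",
    fully_automated=False,
    data_sources=["claim history", "reported circumstances"],
    impact_assessments=["data protection impact assessment"],
)
print(f"{entry.owning_body}: {entry.name} ({entry.purpose})")
```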

“These systems are often rolled out and thought about later,” adds Dark. “There isn’t sufficient public consultation to see if the public want these systems in use.”

The GMCDP and Foxglove are hoping for a response from the DWP by the end of March 2022. If they are successful in obtaining more details about the secret algorithm, it could be a huge win. “While this claim is being taken by a disability rights group in Manchester, the impact of the case is potentially huge,” says Dark. “It demands transparency for everybody.”

Support GMCDP’s legal case here.

Follow Anna Dent on Twitter
