The Home Office has rebuffed Public Law Project’s (PLP) latest attempt to find out more about the secret algorithmic criteria used to decide whether a proposed marriage should be investigated as a “sham”.
Sham marriage investigations can be invasive and unpleasant, and it appears that they are targeted at some nationalities more than others. PLP is concerned about the lack of transparency and possible discrimination involved in the automated triage system, and we would like to make contact with people who may be affected, as well as organisations that support them. If you know anyone who is being investigated or at risk of investigation, please get in touch.
The sham marriage algorithm
Documents previously obtained by PLP under the Freedom of Information Act 2000 indicate that a couple who have given notice to the registrar are referred to a triage system if one or both of them comes from outside the European Economic Area, is not settled in the UK, or lacks a valid visa.
An algorithm processes the couple’s data, applies secret criteria, and allocates the couple a green or red light. A red light indicates that an investigation is required to identify or rule out sham activity. The couple is asked to provide more information and, often, to attend an interview and cooperate with home visits. This can be a highly intrusive process, to which many couples may be reluctant to agree.
If they refuse, the couple will not be allowed to marry. If they do comply, immigration officers will use the new information to determine if the marriage is a sham. If the decision goes against the couple, they can still marry, but their immigration status will be at risk and one or other party may face removal from the UK.
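Based on the documents described above, the referral test and the red/green allocation can be sketched in code. To be clear, this is purely illustrative: the referral conditions and the two possible outcomes come from the disclosed documents, but the criteria the algorithm actually applies are secret, so they are modelled here as an arbitrary, undisclosed predicate.

```python
# Illustrative sketch only. The referral conditions and the red/green
# outcome reflect the disclosed documents; the "secret_criteria" logic
# is a stand-in, because the real criteria have not been published.

def needs_triage(party_a, party_b):
    """A couple is referred to triage if either party comes from outside
    the EEA, is not settled in the UK, or lacks a valid visa."""
    return any(
        p["outside_eea"] or not p["settled"] or not p["valid_visa"]
        for p in (party_a, party_b)
    )

def triage(couple_data, secret_criteria):
    """Allocate a red or green light. The Home Office's actual criteria
    are undisclosed, so they are modelled as an arbitrary predicate."""
    return "red" if secret_criteria(couple_data) else "green"
```

A red light means the couple faces the investigation process described above; a green light means no investigation is triggered.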
Despite PLP’s repeated requests, the Home Office has refused to disclose the criteria used by the algorithm.
As we currently understand it, there are three major concerns about the system.
First, because the criteria used by the algorithm remain a secret, there are concerns about procedural fairness. The Home Office insists that publication of the criteria would be likely to prejudice its ability to investigate possible sham marriages, and would not be in the public interest. We consider that public law standards require disclosure of how the system works, and that a refusal to publish is unlikely to be justifiable on public interest grounds. When decisions are made by an algorithm, there must be transparency and accountability.
Second, there is a concern that the algorithm may be discriminatory. This is because some nationalities – including Bulgarian, Greek, Romanian, and Albanian people – seem more likely to be targeted for investigation than others.
It may be that nationality is included in the algorithm’s criteria. If this is the case, then the algorithm may be directly discriminatory, contrary to sections 13 and 29 of the Equality Act 2010. Even if the criteria do not include nationality, the algorithm may nonetheless be indirectly discriminatory if it is having a systemic negative impact on people of a particular nationality, contrary to sections 19 and 29 of the Equality Act 2010.
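The indirect discrimination point can be illustrated with a small, entirely invented example: a rule that never mentions nationality can still flag one nationality at a much higher rate if the criterion it does use (here, a hypothetical "visa type") happens to correlate with nationality.

```python
# Hypothetical illustration of indirect discrimination. All data and
# criteria here are invented; nothing is known about the real system's
# inputs beyond what the disclosed documents describe.

from collections import Counter

applicants = [
    {"nationality": "X", "visa_type": "visitor"},
    {"nationality": "X", "visa_type": "visitor"},
    {"nationality": "X", "visa_type": "work"},
    {"nationality": "Y", "visa_type": "work"},
    {"nationality": "Y", "visa_type": "work"},
    {"nationality": "Y", "visa_type": "visitor"},
]

def red_light(applicant):
    # Facially neutral rule: flag short-term visa types only.
    return applicant["visa_type"] == "visitor"

flagged = Counter(a["nationality"] for a in applicants if red_light(a))
totals = Counter(a["nationality"] for a in applicants)
rates = {n: flagged[n] / totals[n] for n in totals}
# Nationality X is flagged at twice the rate of Y, even though
# nationality plays no explicit role in the rule.
```

This kind of disparate outcome from a neutral-looking criterion is exactly what sections 19 and 29 of the Equality Act 2010 are concerned with, and it is why secrecy about the criteria makes the discrimination question so hard to assess from the outside.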
A third concern is that decision-making at the investigation stage may be flawed due to “automation bias”: that is, over-reliance on automated decision support systems. If the official conducting a sham marriage investigation is aware that the couple has been given a red light by the algorithm, they may be predisposed to conclude that the relationship is a sham. In other words, the system may encourage reliance on irrelevant considerations.
PLP is keen to hear from anyone who feels they have been unfairly targeted by a sham marriage investigation, and from any organisations or practitioners working on this issue. You can contact Tatiana Kazim at email@example.com.