Trusting government computer systems to be infallible can result in serious harm to ordinary people. I’ve learned this during the many years in which I have been supporting victims of the Post Office scandal in the face of defensive and intransigent public bodies.
That is why I feel obliged to speak out now, as I see the risk that history repeats itself. The government is working on another ambitious IT project: the digital identities framework. Its reach extends across every government department. It will be used by hundreds of thousands of private businesses and millions of individuals. The benefits – in business savings, government efficiency, technological innovation and reducing bureaucratic burdens for citizens – could be immense. If this project is a success it will position the UK as a world leader in digitising personal identity and unlocking economic growth while protecting individuals’ privacy and preventing identity theft and fraud.
But the downsides if the government gets the digital identity infrastructure wrong are equally immense. I was therefore mystified to learn that many of the public bodies that are to provide the fundamental personal data attributes to the system do not keep accurate records of individuals’ sex. There are no safeguards to stop them providing ambiguous and outright wrong data. I fear that unless the government builds in these safeguards now, as a matter of urgency, we will see the same tragic mix of personal suffering and public waste that followed the Post Office scandal.
Whether someone is male or female is a material, unalterable fact about them. That fact needs to be recorded reliably at birth, and for the rest of their lives. Like all personal data, it can be kept private in some situations. But accurate data is also routinely needed in everyday situations such as healthcare, sport, law enforcement and single-sex services, both to ensure that everyone is treated appropriately, and to keep people safe.
A person whose sex is recorded wrongly may receive the wrong medical diagnosis and treatment, with potentially catastrophic results. A care agency relying on government digital identities that misrecord sex may unwittingly send a man who identifies as a woman to provide intimate care to an elderly or disabled woman alone at home – even if that woman has specified that she wants a female carer. Transgender people will be ill-served too, since different government sources, all of them treated as authoritative – completely reliable – may contradict each other concerning a single individual’s sex. Such a person may be locked out of the system altogether, flagged as a “synthetic data” risk – that is, as a suspected fraud. This is the opposite of what was intended, which is to accommodate everyone, enable them to keep their data private and let them share it when they choose.
The case of Baby Lilah described in this report is a rare instance of administrative error in recording sex at birth. But it illustrates a fundamental point: that it is the duty of the state to ensure that everyone’s foundational personal information is accurate and usable.
Together with Lord Lucas, I helped to add three safeguards to the Data (Use and Access) Bill. These would prevent unreliable data sources being given a top score for reliability. They are sensible, practical and essential. They will ensure that the system works as it should for everybody, including transgender people, and forestall another IT disaster.
The government says it will seek to remove those clauses, claiming that they breach human rights. This is a serious assertion that deserves a serious response.
This paper from Sex Matters shows that, far from this being the case, the three safeguards are necessary to support everyone’s human rights. Unless the government resolves the fundamental problems with personal data on sex before building this system, individuals’ human rights will be harmed and a great deal of taxpayers’ money will be wasted. The true breach of human rights would be ploughing ahead and creating a system that cannot reliably verify one of the most fundamental facts about every citizen, and which systematically marks false information as true.
I urge the government to treat the problems I and others raised in the House of Lords with the utmost seriousness. If it does not, I fear the country is sleepwalking into another high-cost computer-system scandal.
Safeguards to protect data integrity in the digital identity system
The government is seeking to build a secure, privacy-preserving digital identity system that will allow people to prove their identity and facts about themselves securely, privately and remotely.
Leading on this for the government is the Cabinet minister the Rt Hon Peter Kyle MP. His department, the Department for Science, Innovation and Technology, is responsible for delivering this ambitious and important project. The Data (Use and Access) Bill, which is currently passing through Parliament, will establish the legislative framework.1
The crucial job for this framework is to ensure that wrong or unreliable data is not mislabelled as true and reliable. This requires quality controls, and clarity and consistency about definitions and metadata.
The government bill was introduced and passed through the House of Lords and is now going through the Commons. Among the amendments made by peers were three that strengthen the law to ensure that any public authorities that provide people’s personal information to the system only provide information that is reliable and trustworthy.
The Secretary of State must assess whether public authorities handle personal data accurately and reliably (clause 28(4)).
Public authorities can only share data that is accurate, and they must know what it refers to (clause 45(6)).
The Secretary of State may establish a “data dictionary” so that different items of data are not muddled up (clause 140).
These three safeguards would work together to solve the legacy problem of public authorities being ambiguous about what information they are collecting under “sex” and whether they are keeping it accurate and reliable. Currently organisations confuse three different things and often record them interchangeably:
the immutable material reality of a person’s biology
legally modifiable “certified sex”, which can be changed by law in the UK with a gender-recognition certificate (GRC)
self-declared “gender identity” based on subjective internal feelings.
The government says ensuring data accuracy would breach human rights
The government is seeking to remove these three clauses, which safeguard data integrity, from the bill during the next stage of the debate.2
Section 19 of the Human Rights Act requires government ministers to tell Parliament whether or not each law they seek to introduce is compatible with human rights. When introducing the Data (Use and Access) Bill to the Commons, the Rt Hon Peter Kyle MP, Secretary of State for Science, Innovation and Technology, made a statement on the front page of the bill that clause 45(6) is incompatible with human rights.
EUROPEAN CONVENTION ON HUMAN RIGHTS
Secretary Peter Kyle has made the following statement under section 19(1)(b) of the Human Rights Act 1998:
I am unable (but only because of clauses 45(6) and 141(2)) to make a statement that, in my view, the provisions of the Data (Use and Access) Bill are compatible with the Convention rights but the Government nevertheless wishes the House to proceed with the Bill.
Clause 45 concerns the establishment of the “information gateway” through which public authorities such as the NHS, HM Passport Office and DVLA will provide information directly to digital identity apps without individuals having to upload a picture of their paper documents. This clause empowers public authorities to disclose pieces of your personal information directly to a certified service at your request (in practice, this will happen by you clicking “yes” to consent to an app retrieving specific pieces of your personal data from government sources).
Clause 45(6) imposes some simple, straightforward quality controls on this:
Public authorities and the information gateway
In introducing the bill for its second reading in the Commons, the Secretary of State explained why he thinks this clause breaches human rights:
“People will use digital identities to buy a house, to rent a car and to get a job. The intention of clause 45(6) is to force public authorities to share whether someone’s information, such as their sex, has changed when disclosing information under clause 45 as part of a digital verification check. That would mean passing on an excessive amount of personal data.”
Sir Chris Bryant MP, Minister of State for Data Protection and Telecoms, made a similar point in response to questions from Labour MP Tonia Antoniazzi and Conservative MP Dr Caroline Johnson:
“We are getting a bit more technical than I am able to answer precisely, but my bottom line is that if somebody is applying to rent a property, the landlord should not have to know both sex at birth and gender. That is an inappropriate invasion of people’s privacy.”
European Convention on Human Rights Article 8
Everyone has the right to respect for his private and family life, his home and his correspondence.
There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
The government is considering the wrong use cases
Both ministers raised the spectre of people being forced to share information about their sex in situations where that information is not needed – when buying or renting a property or car, for example – and said that this would force them to share an excessive amount of information. This will simply not happen.
By principle and design, the digital identity system is privacy-protecting. It is designed to share each piece of an individual’s information only with their consent, and for a specific purpose. As John Peart, Assistant Director for Market Oversight and Integrity, Office for Digital Identities and Attributes at DSIT, explains:
“The last time I tried to get into a nightclub, I got asked for ID. To prove my age, I handed over my driving licence to the person on the door. They looked at it, then scanned it to keep a digital copy of it.
“In handing over my driving licence, I exposed my full name, my full date of birth, my home address, my sex, and my signature. Not only had I handed this information over to a complete stranger, but there is now an electronic copy of my driving licence and all my personal data stored on someone else’s IT system (hopefully securely).
“It doesn’t need to be this way. The venue staff only needed to confirm one attribute of my identity; that I was over 18. If I was able to use a digital identity, that could have been all they were given access to. Using a digital identity service – like an app on my phone – would likely have been a safer and more privacy-preserving way to prove who I am.”3
Similarly, for an individual to prove their right to rent, they need to show only that they meet the legislated criteria (which relate to citizenship or immigration status). A digital identity service which checks whether someone meets these criteria will not ask for, record, check or pass on information about the individual’s sex, as this is neither relevant nor necessary. No-one will question why someone male has adopted a female name and title, or vice versa. The app will use a fingerprint or facial-biometric check to confirm the user’s identity, without needing to record anyone’s sex for this purpose.
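The selective-disclosure principle described above can be sketched in a few lines of code. This is purely illustrative: the wallet contents and the function names are assumptions for the sketch, not the real GOV.UK digital identity interfaces.

```python
from datetime import date

# Hypothetical wallet of verified attributes held on the user's phone.
WALLET = {
    "full_name": "Jane Example",
    "date_of_birth": date(1990, 5, 17),
    "address": "1 Example Street",
    "sex": "F",
}

def prove_over_18(wallet: dict, today: date) -> dict:
    """Answer an age check by releasing a single derived boolean.

    The verifier (the nightclub door, in the example above) never sees
    the date of birth itself, only the yes/no answer it actually needs.
    """
    dob = wallet["date_of_birth"]
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return {"over_18": age >= 18}  # nothing else is disclosed

disclosure = prove_over_18(WALLET, date(2025, 1, 1))
assert disclosure == {"over_18": True}
# Name, address, date of birth and sex all stay in the wallet.
assert "date_of_birth" not in disclosure and "sex" not in disclosure
```

The same pattern applies to a right-to-rent check: the app would release only the derived eligibility answer, never the underlying attributes.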
Use cases where sex matters
But digital identity apps will also have many uses where information about a person’s sex is relevant to the service being accessed. A digital identity and attributes system that cannot supply this information reliably would be seriously inadequate.
For example, a person might use a digital identity to join a dating service or a gym; to apply for a job in social care or at a rape-crisis centre; to sign up for sporting competitions and to register with a sports governing body to track their sporting achievements; to share information about themselves with a healthcare provider; when applying for vetting to join the police; or when being questioned as a suspect in a crime. Your sex is not relevant for the general “right to rent”, but it is likely to be relevant to your landlord if you are seeking to rent a room in a house-share or hall of residence, or booking a homestay via an online app that involves sharing facilities with other people (where a female host might specify female guests only).
The government has recently won a case at the Court of Appeal defending the decision not to give a Californian man a legal “non-binary” identity in the UK. Anna Thompson, the Deputy Director of the Equality Hub in the Cabinet Office, said that the government had undertaken a scoping exercise and found that male/female identifiers “are intrinsic to systems that departments use to function and provide services to the public”. The court summarised that she gave evidence that:
“Sex is… an important factor in the provision of a wide variety of public sector services: the prison estate is exclusively split into male and female accommodation; hospitals may have single sex wards; and local authorities may fund rape crisis centres and domestic abuse refuges that offer their services to females only. She says that in so far as some government services recognise that some people may prefer not to be referred to as either male or female: ‘This tends to be the exception rather than the rule and in no circumstance amounts to legal recognition.’ By way of example, the Department for Work and Pensions (‘the DWP’) uses the title ‘Mx’ if individuals ask for it, but this does not affect their entitlement to sex-specific benefits.”4
A person may be called Mr, Ms, Mrs, Mx or another honorific without their title needing to match their sex. If a person does not want to share data recording their sex, they may not always have to. In some situations there can be an option to respond “prefer not to say”, and a person may elect not to hold information on their sex in a reusable digital identity app at all, while still holding other data such as their name and date of birth.
Where sex really is a relevant eligibility criterion for a service or job, someone who does not wish to share that data may simply not be able to use that particular service or apply for that job. What the digital identity system should not do is provide false information and attest that the information is true.
Ensuring data accuracy does not force information to be shared
The Secretary of State said that the safeguard in Clause 45 would “force public authorities to share whether someone’s information, such as their sex, has changed… as part of a digital verification check”. This is a misunderstanding. The digital identity system will share information only with consent. The safeguards do not force data to be shared; they simply ensure that when a person asks for a piece of data to be shared it is accurate.
The safeguards included in 45(6) are practical requirements for data accuracy. They would ensure that information is clearly defined and accompanied by metadata. For example, if a public authority shares that a person’s name is “Joyce”, this needs to be accompanied with metadata tagging the field as either first name or last name. If it shares a person’s date of birth, this has to be accompanied by metadata clarifying the date format, so that 1st December does not get confused with 12th January.
Adequate metadata would also make clear whether the data refers to a type of attribute that can change (such as a person’s name, address, age or certified sex) or a type that is immutable (such as place of birth, date of birth or actual sex). This is important for error-checking. A person may have records showing two different addresses over time. But if a person appears to have two different dates of birth, then either there has been an error in their data, or these are records for two different people. This does not mean that their address or date of birth or sex needs to be shared every time they use their digital identity; it only means when this information is shared as verified data, it should be accurate.
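The metadata and error-checking ideas in the two paragraphs above can be made concrete with a small sketch. The field names and metadata shape here are assumptions for illustration, not the bill’s actual data dictionary: each value carries a tag saying what attribute it records and whether that attribute can change, so that a history of addresses is distinguished from a contradiction in dates of birth.

```python
# Hypothetical records held about one person, each tagged with metadata.
RECORDS = [
    {"attribute": "address", "value": "1 Example Street", "mutable": True},
    {"attribute": "address", "value": "2 New Road", "mutable": True},
    # ISO 8601 format metadata stops 1st December being read as 12th January.
    {"attribute": "date_of_birth", "value": "1990-12-01",
     "format": "ISO 8601", "mutable": False},
    {"attribute": "date_of_birth", "value": "1990-01-12",
     "format": "ISO 8601", "mutable": False},
]

def consistency_errors(records: list[dict]) -> list[str]:
    """Two values for a mutable attribute is a history; two values for an
    immutable attribute means a data error, or records for two people."""
    values: dict[str, set] = {}
    mutable: dict[str, bool] = {}
    for r in records:
        values.setdefault(r["attribute"], set()).add(r["value"])
        mutable[r["attribute"]] = r["mutable"]
    return sorted(a for a, vs in values.items()
                  if len(vs) > 1 and not mutable[a])

# The two addresses pass; the two dates of birth are flagged.
assert consistency_errors(RECORDS) == ["date_of_birth"]
```

Tagging sex in the same way – distinguishing immutable actual sex from changeable “certified sex” – would let the same kind of check catch contradictory records.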
Clause 45(6)(b) says that a public authority has to be able to attest that the information it shares is accurate (at the time of recording, for those attributes that can change), and 45(6)(c) says that if information has been corrected, this has been done by lawful means. If a public authority cannot attest to accuracy, then it cannot share the data. It will not be forced to share whether someone’s information has changed; it will simply respond that it cannot reliably provide or verify that piece of information.
Since we are talking about large computer databases, in most cases the question of whether a public authority or other data collector can attest to the accuracy of a piece of information is not a question about the data it holds on a specific individual, but a question about the adequacy of the rules it uses for collecting and storing that category of information for everyone.
For example, a public authority may hold your mobile phone number because it has asked for that number (for example, when you apply for a passport you are asked for your mobile number in order to receive a text when your new passport is dispatched). But unless the authority has verified that the phone number is correct (such as by texting you a security code and asking you to enter it on a website), the authority is not able to know that the phone number it holds is accurate – you could have entered a wrong digit by mistake. So even if most of the phone numbers the Passport Office holds are likely to be correct, it cannot share anyone’s number as authoritative data because it does not know which ones are wrong.
Similarly, if an authority has stored information on your sex in a field along with other people’s self-identified gender, it will be unable to attest that any specific individual’s “M” or “F” actually relates to their sex. The safeguards mean that it would have to respond that it does not hold the information on anyone’s sex sufficiently reliably to share that information via the information gateway.
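The attestation logic described in the last three paragraphs can be sketched as follows. The source descriptions and the response format are assumptions for illustration, not the real information-gateway API: whether an authority can attest to a field depends on its collection rules for that category of data, not on any one individual’s record.

```python
# Hypothetical collection rules for three categories of data.
SOURCES = {
    # Phone numbers collected but never confirmed by a verification code.
    "mobile_number": {"verified_on_collection": False, "single_meaning": True},
    # A field that mixes actual sex with self-declared gender identity.
    "sex": {"verified_on_collection": True, "single_meaning": False},
    "date_of_birth": {"verified_on_collection": True, "single_meaning": True},
}

def gateway_response(field: str, value: str) -> dict:
    """Share a value as attested only if the collection rules guarantee
    both that it was verified and that it has a single, clear meaning."""
    rules = SOURCES[field]
    if rules["verified_on_collection"] and rules["single_meaning"]:
        return {"field": field, "value": value, "attested": True}
    # The refusal is category-wide: the authority cannot know which
    # individual values are wrong, so it cannot attest to any of them.
    return {"field": field, "attested": False,
            "reason": "cannot reliably provide or verify this information"}

assert gateway_response("date_of_birth", "1990-12-01")["attested"] is True
assert gateway_response("mobile_number", "07700 900000")["attested"] is False
assert gateway_response("sex", "F")["attested"] is False
```

Note that nothing here forces data to be shared; the sketch only refuses to mark unreliable data as reliable.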
Public authorities should provide accurate data
The principles of checking data quality and attesting to each attribute separately are built into the digital identity system through a scoring framework for information.5 The big flaw with the current design of the system is that conformity with this standard will be assessed only for the private-sector service providers seeking certification to be part of the scheme. Public authorities are presumed to provide authoritative information and their data will be given a top score automatically.
This mirrors what happened in the Post Office scandal, when private-sector sub-postmasters were held to stringent standards while the poorly designed government computer system was simply assumed to be correct.
The government’s digital identity scoring framework says that to be authoritative for a particular piece of information, a public source must “make sure the integrity of the information is protected”. During the Lords stage of the debate, the Minister of State for Science, Research and Innovation, Sir Patrick Vallance, said:
“We must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important. I know from my background in scientific research that, to know what you are dealing with, data is the most important thing to get. Making sure that we have a system to get this clear will be part of what we are doing.”
But there is no safeguard to ensure that information from public bodies is reliable, and we know that public authorities such as the NHS, DVLA and HM Passport Office have not ensured that the integrity of sex data is protected. These public authorities record immutable sex, “certified sex” and self-declared gender identity in the same field interchangeably, which means they cannot reliably say which piece of information they hold for any individual. Wrong and inconsistent information has been recorded for tens of thousands of people.6
What is wrong with the data sources?
HM Passport Office allows people to change their recorded sex with as little as a statement from themselves and one other person.7
The DVLA allows people to change their recorded sex by making a declaration that they “solemnly and sincerely declare” they wish to live in the opposite gender.8
The NHS allows people to change their registered sex on request.9 It then gives them a new NHS number.
The safeguards do not breach human rights
The Secretary of State said that the result of clause 45(6) would be:
“…passing on an excessive amount of personal data. Sharing such changes by default would be an unjustifiable invasion of people’s privacy.”
The government’s assessment of compatibility with the European Convention on Human Rights said:
“Digital Verification Services and data sharing by public authorities (clauses 45-49): The Department considers that, but for clause 45(6), any interference with Article 8 rights pursues a legitimate aim, being in the interests of the economic wellbeing of the country, and is proportionate (see paragraphs 35-46). However, given that clause 45(6) forms part of the Bill following a non-Government amendment, the Department considers that the Secretary of State cannot make a statement that the provisions of the Bill are compatible with Convention rights.”10
This is simply wrong. If someone asks a data controller to share a piece of their personal data (such as their biological sex), then it is not an “unjustifiable invasion of their privacy” to do just that. This is true for everyone, including trans people, since like everyone else they need to be able to share accurate information on their actual sex (for instance, with healthcare providers).
Similarly, expecting people to provide accurate data in answer to the question “What is your sex (we mean biological sex)?” is not requiring excessive information; the correct answer is precisely the right amount of information to answer the question. This is true for everyone, including trans people.
The slightly different question, “What is your actual sex as recorded at birth, or if applicable as modified by a UK gender-recognition certificate?” is also capable of being answered truthfully and unambiguously, and is sometimes necessary, for example for the purpose of marriage or pensions.
But the two questions are not the same and the metadata for the field needs to be clear about which of the two questions the data relates to (since the second attribute can change and the first cannot).
In either case, collecting, recording and sharing information that a person has consented to share is a “proportionate means to a legitimate aim” and not a breach of Article 8 (the right to respect for private life).
Objective facts about a person can be verified. Examples include what sex they are, whether they have a valid driver’s licence, whether they are married and to whom, that they have parental authority for a particular child, what qualifications they have, and that they live at a particular address.
A category of data that has no objective means of verification and can be changed by the user (such as dietary preferences, gender identity or how happy they are feeling) cannot be verified. Individuals can of course assert such personal information about themselves. A public authority might wish to collect such self-asserted data as part of a survey (the census includes several such questions), but it cannot provide it as authoritative.
The human-rights arguments about privacy have been made before, in relation to the census. In the run-up to the 2021 Census in England and Wales, the Office for National Statistics issued guidance stating that people could answer the “sex question” on the census with whatever was stated on any official document (such as a passport or driving licence).
The campaign group Fair Play For Women challenged this guidance in a judicial review. Sir James Eadie KC for the ONS argued that sex was an “umbrella term”, and that asking about a person’s actual sex risked a breach of Article 8 of the European Convention on Human Rights. Mr Justice Swift disagreed and found that Fair Play For Women had a “strongly arguable case”. He said:
“I doubt there would be any breach of article 8(1) rights [from asking for a particular definition of sex] but, if there were, it would be justified. The question would be posed in pursuit of legitimate objectives… and any interference would be justified on the fair balance principle, in particular, given the careful and confidential way in which census information is used.”11
Around the same time the Equality and Human Rights Commission also clarified in relation to the Scottish Census that:
“Collecting information on sex assigned at birth can be, but will not always be, an interference with a trans person’s right to a private life.”12
The Office for Statistics Regulation has also issued guidance for collectors of official statistics (and on good practice for others). It does not say that it breaches human rights to collect accurate data on sex or to differentiate this from collecting data on self-identified gender.13
The information regulator has failed to protect data accuracy
We have written to the Information Commissioner’s Office about the problem of data accuracy and received a disappointing and complacent response:
“Government departments are responsible for setting out how they record someone’s sex, the purposes for which the information is processed, and how an individual may change this recording in limited circumstances. From the perspective of the accuracy principle, if these departments set out that an individual can record their sex in a particular way after meeting certain conditions (whether this is a gender recognition certificate or another method) then this would be considered accurate for the purposes of processing. This is because the purposes for which government departments and public sector bodies (as data controllers) process this personal data are not limited to identifying a person’s sex as recorded at birth.”
This response sidesteps a central question: for what purpose can public authorities possibly be recording people’s “sex” if they define the term so inconsistently?
What is undeniable is that in some situations, what is needed is the clear fact of a person’s actual sex. Examples include for that person’s healthcare, for establishing eligibility for competitive sport, when seeking access to a single-sex service that requires registration, and when applying for a job that requires someone to be of a particular sex.
Public authorities that mix up objective facts and subjective, unverifiable identity claims should not be allowed to provide the resulting “data” through the information gateway to others, who may unwittingly use it in situations where it can cause real-world harm.
Accurate personal information is a human right
A person’s actual biological sex is a fact about them that does not change throughout their lifetime. But genuine administrative errors can occur and need to be corrected, with careful controls.
The case of Baby Lilah has already been mentioned in both the House of Lords and the House of Commons debates on the Data Bill. Lilah is a baby girl who was born in November 2024. When her parents went to register the birth, the registrar accidentally wrote the wrong sex on the birth register, recording her as “male”.14
This is a rare but not unknown occurrence. The Registration of Births, Deaths and Marriages Regulations 1968 set out precise instructions on lawful rectification of different kinds of errors.15 This involves a strict sign-off procedure and a handwritten note in the margin of the birth register. Correcting an error like this is in no way comparable to recording a transgender person’s “gender identity” in place of their sex.
This system needs careful translation into the digital age, but so far this has been overlooked. Birth records are not only used to create paper birth certificates but are also shared as automated data feeds that will flow into digital identity data.16 As things stand, a child whose sex has been recorded wrongly will have that wrong sex on their official digital record for life. Baby Lilah will have an official digital record that says “male”, with the fact that she is female recorded only in the “MarginalNote” field.17 But this marginal note is likely to go unread by IT applications, causing anomalies and administrative problems for her as she grows up.
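The risk described above is easy to demonstrate. The record layout below is a simplification; “MarginalNote” is the field name cited in this report, but the parsing behaviour is an assumption about how a naive downstream application is likely to work.

```python
# Simplified birth record as it might arrive in an automated data feed.
birth_record = {
    "Forename": "Lilah",
    "Sex": "M",  # registrar's error: never overwritten in the register itself
    "MarginalNote": "Error in original entry: sex corrected to female",
}

def naive_sex_lookup(record: dict) -> str:
    """A typical consumer reads the structured Sex field and ignores
    free-text marginal notes, so the correction is silently lost."""
    return record["Sex"]

# The correction never reaches the application.
assert naive_sex_lookup(birth_record) == "M"
```

Unless the digital feed promotes lawful corrections into the structured field itself, every system built on it will repeat the original error.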
The right to a means of personal identification has been found by the European Court of Human Rights to be protected under Article 8. We all have a right to have the fact of our sex correctly and reliably recorded as part of our foundational identity (on the birth register and its digital counterparts) and wherever it is needed.
This applies to Baby Lilah, to people who identify as transgender and to all of us.
With thanks to Grace Bingham and Ewan Murray for permission to use their daughter Lilah’s photograph and tell her story.