The UK’s Supreme Court has ruled that “man”, “woman” and “sex” in the Equality Act 2010 refer to biological sex, not to self-identification or paperwork (gender-recognition certificates). This accords with our legal interpretation. We have published new guidance and are updating our publications to reflect the judgment. We are also working to answer the questions we’re hearing from supporters and the media, and will publish those answers as soon as possible.

This post is part of the Digital ID can’t be gender self-ID campaign

Government removes safeguards from digital identity bill

Minister says apply “common sense” – because the data can’t be trusted

The government is passing legislation to establish a “trustmark” for digital identity services, and to allow these services to access individuals’ personal information from public bodies, based on their consent. The aim is to give people a way to prove who they are and facts about themselves, without having to show, or have organisations keep copies of, documents such as passports, exam certificates, bills or bank statements.

People will be able to have a government-endorsed app on their phone that they can use to prove things about themselves, for instance that they are over 18, without revealing other information. They will also be able to use digital identity services to help do things like renting a flat, applying for a job or registering on a gig-economy platform. People who want to use paper documents such as a passport to prove their identity will still be able to do so.

The idea is that these digital services won’t share or have access to all your personal information all the time – only the items of data you choose to share at any particular moment. With online, automated and at-a-distance interactions becoming ever more important, these trustmarked services are set to become the “identity layer” of UK economic and civic life, underpinning interactions with public and private services, and allowing us to reveal or keep our personal information private when we choose to.

How will digital identity services work?

Tracy Edwards

To be able to reliably prove facts about you, a digital identity service will first need to ascertain that you are who you say you are. For example, if your name is Tracy Edwards it needs to confirm not only that you are “Tracy Edwards”, but that you are that Tracy Edwards. 

It needs to be able to differentiate Tracy Edwards, the prize-winning yachtswoman who is a member of Sex Matters’ advisory group and was born in September 1962 in Berkshire, from the textile artist, the university lecturer, the two registered nurses and the dozens of other Tracy Edwardses whose data may be held.

The digital identity service can then verify other facts (called “attributes”) associated with this Tracy Edwards’ unique identity. Tracy Edwards has a valid driving licence and a certain bank account with a certain amount of money in it; she owns the intellectual property from certain books and is entitled to collect royalties from certain publishers; this is her mobile phone number; she owns a particular property, and so on. 

All that information isn’t in one big database. When she uses a digital identity service for a particular purpose, it will ask permission to access the specific data she consents to share, from various sources. For example, a job application service should be able to verify qualifications from a range of different exam boards and professional bodies, and employment history from different employers. Once the service has verified these facts, the data can be reused for multiple job applications.
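This verify-once, reuse-many-times model can be sketched in a few lines. Everything here – the `VerifiedAttribute` record, the exam board and employer names – is a hypothetical illustration of the idea, not part of any real trust framework:

```python
# Hypothetical sketch: each verified attribute records its value and
# the source that vouched for it, so a job-application service can
# reuse one round of verification for many applications.
from dataclasses import dataclass


@dataclass(frozen=True)
class VerifiedAttribute:
    name: str    # e.g. "degree"
    value: str   # e.g. "BSc Mathematics"
    source: str  # the body that vouched for it


def build_profile(attributes: list[VerifiedAttribute]) -> dict:
    """Collect verified attributes into a reusable profile keyed by name."""
    return {a.name: a for a in attributes}


profile = build_profile([
    VerifiedAttribute("degree", "BSc Mathematics", "ExampleUniversity"),
    VerifiedAttribute("employment_2019_2023", "Analyst", "ExampleEmployer"),
])
# The same profile can now back multiple job applications without
# re-contacting each exam board or employer.
```

The point of keeping the `source` alongside each value is that a relying party can judge whether the vouching body is one it trusts – which is exactly where unreliable sources undermine the whole scheme.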

Tracy Edwards may want to prove she is over 60 to get a railcard for discounted travel tickets. The digital identity service needs to provide no more information about her age than whether or not she is over 60 (for example by producing a QR or share code that encodes this information). She can then prove she is over 60 without sharing her name or date of birth with the seller. The system has been designed this way in order to preserve the maximum amount of privacy consistent with sharing the information needed for the specific purpose.
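The over-60 check above can be sketched as a derived claim: a hypothetical `over_60_claim` helper that computes only the boolean result from a date of birth, so the attestation never carries the birth date itself. (In a real service the claim would also be signed by the identity provider and encoded as a QR or share code.)

```python
from datetime import date


def over_60_claim(date_of_birth: date, today: date) -> dict:
    """Derive a minimal 'over 60' attestation from a date of birth.

    Only the boolean result goes into the claim; the date of birth is
    never included, so the relying party (e.g. a railcard seller)
    learns nothing else about the holder.
    """
    sixtieth_birthday = date_of_birth.replace(year=date_of_birth.year + 60)
    return {
        "claim": "age_over_60",
        "value": today >= sixtieth_birthday,
    }


claim = over_60_claim(date(1962, 9, 1), date(2025, 6, 1))
print(claim["value"])  # True: born September 1962, so over 60 in June 2025
```

This is the privacy-by-design trade the post describes: the seller gets exactly the fact it needs and nothing more.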

A digital identity service does not need to record someone’s sex at all to confirm who they are, or other facts about them. But sometimes sex is one of the pieces of information the person wants to share.

Perhaps Tracy Edwards wants to share certain health information in registering with a pharmacy or private healthcare provider. This includes her sex.

Perhaps she wants to register with an agency that provides personal care at home. It needs to know her sex. She also wants to know the sex of any carers it sends.

Perhaps she wants to join a gym. It wants to record the sex of each member in case it has to provide information to an emergency service. It also uses it for a practical purpose: to create a membership app with a barcode that can be used to swipe for entry to 24-hour unsupervised male, female and unisex changing rooms.

Perhaps she wants to compete in the female elite category of a sport. The athletics club and the sport’s governing body need to record her sex accurately, as well as the sex of everyone else applying to be in that category.

Perhaps she wants to join a dating site. It asks individuals to provide their sex. 

Perhaps she wants to volunteer as a counsellor for vulnerable women. Again, sex matters.

But she may well not want every Amazon delivery driver to be able to access data that shows she is a woman living alone. She may want to enter a writing competition that is judged without information on the writers’ personal characteristics. She wants to take out car insurance; the insurance company shouldn’t take her sex into account in calculating the premium (that has been unlawful sex discrimination since 2012). 

Using digital identity services doesn’t necessarily mean she has to share the fact that she is female in every interaction. But when she does want to share her sex, the data shared needs to be accurate and reliable, for her own safety and security and those of others. 

When she declares her sex she is not declaring her “gender identity”. She does not necessarily want to get into a conversation with a pharmacist’s assistant about why she doesn’t wish to be classified as a “cis-woman” or a “person with a vagina”. The single letter “F”, stored reliably in a data field that is labelled as referring to the immutable, biological characteristic, is precisely the right amount of data for accurately recording her sex. 
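As a sketch of that single-letter field, a schema can label the data so its meaning is unambiguous. This is a hypothetical illustration, not any real system’s data model:

```python
from enum import Enum


class Sex(Enum):
    """Sex recorded at birth – an immutable biological characteristic."""
    F = "female"
    M = "male"


# Labelling the field "sex" and typing it with this enum documents
# exactly what the single letter means, leaving no room for it to be
# conflated with "gender identity".
record = {"name": "Tracy Edwards", "sex": Sex.F}
print(record["sex"].name)   # the stored letter: F
print(record["sex"].value)  # what it denotes: female
```

A field defined this way can only hold values whose meaning the schema itself states – which is what makes the data reliable enough to share.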

What if a person does not want to share their sex? Often they will not have to. If there is no record of a person’s sex, then any pharmacist or radiologist (or computer) looking at their records will need to consider the risks and benefits of any procedure, treatment or medication without knowing that data. The trade-off for not sharing data on sex in such a situation may be that the person accepts higher risk and potentially less benefit. In some cases an individual who chooses not to share their sex will have to accept that means they cannot use a service: a care agency or hospital may not be able to provide care to someone who doesn’t want their sex recorded. A service provider that has accurate data about a transgender person in situations where it needs that data can also be responsive to that person’s wishes concerning their care and privacy.

What is the problem?

The Data Protection Act and the UK General Data Protection Regulation should protect our data, requiring organisations to keep it accurate, label it clearly, not get confused about it and process it only for lawful purposes. But for many years now, many organisations have conflated sex with “gender identity” and enabled people to change their recorded sex to reflect the sex they wish they were.

This means these organisations – which include the Passport Office, the Driver and Vehicle Licensing Agency and the NHS central Personal Demographics Service (which holds NHS numbers and other core data about us) – cannot be used as reliable sources of information about what sex people are.

Digital identity services have one job: to identify reliable pieces of personal information. If they cannot do this they will fail. They will fail to secure trust and as a result, will fail to save money. They will fail to keep people safe and they will fail to comply with the law.

Sex Matters has been trying (so far with no success) to get decision-makers in government to understand this basic concept.

Last week a committee of MPs removed two clauses from the Data (Use and Access) Bill that had been added by peers during the bill’s passage through the House of Lords. These would have required the government to consider whether these data sources are reliable. On Tuesday, in the next debate on the bill, the government is likely to succeed in removing a third safeguard clause.

Government plans to rely on citizens and business to use “common sense” 

In the debate last week, DSIT Minister Chris Bryant rejected the arguments for ensuring that data about sex is accurate, saying:

“I simply do not buy this argument that we need to make this provision in relation to all digital verification services.”

He recognised that sex matters in prisons, the health service and many other areas, but said: 

“Simple common sense should apply in relation to female-only spaces and wanting to make sure that women are safe.”

But in a letter to Dr Caroline Johnson MP he said:

“Digital verification services can be used to prove sex or gender, in the same way that individuals can already prove their sex using their passport, for example.”

Johnson had to set the record straight about this in a point of order in the House of Commons. We know that “passport sex” is not reliable or accurate. HM Passport Office (HMPO) even explicitly allows men who identify as “crossdressers” to have “female” passports.

The minister himself appears not to know this, yet he says it will be fine to feed this unreliable data into the new digital identity and attributes verification system, and to allow people to use it to prove their sex. The government has not done its homework, but is expecting millions of individuals – frontline workers, elderly women who need care – to apply “common sense”: that is, to know when they cannot trust the data, even after the government has spent millions of pounds and passed a law to develop an official system offering a “trustmark” to signify that the data can be trusted.

The Data (Use and Access) Bill will eventually go back to the House of Lords, without the safeguards the government has taken out. We hope that peers will be more assiduous in considering this flaw in the bill. 

If the government won’t fix it and Parliament won’t fix it, then, as with the For Women Scotland case on the definition of sex in the Equality Act, the matter of whether it is lawful to create a national digital identity framework that cannot accurately verify sex will have to be determined by the courts.