In February, ChinaFile published an article about the Wisconsin-based company Promega selling DNA analysis equipment to the police in China’s Xinjiang Uighur Autonomous Region. Not long after we posted the story, Yves Moreau contacted ChinaFile to discuss his experiences communicating with Promega. Moreau, a professor of engineering specializing in human clinical genomics at the University of Leuven, Belgium, had been emailing with Promega since 2016, warning its communications department first about how Promega’s products might be used in a proposed DNA databasing project in Kuwait, and later alerting the company to his concerns about Xinjiang. My interview with Moreau became a wide-ranging conversation about genetic privacy, the true meaning of “free, informed” consent, inconsistent ethics standards for transnational businesses, and the scientific community’s concerns over the United States government’s use of DNA samples.
Can you tell me a bit about your day job, to help explain the expertise that you bring to this issue?
I’m a data-cruncher in clinical genetics. I do analysis of patient genomes to identify mutations that cause rare genetic disorders. I got interested in genetic privacy about a decade ago. With my team, I’ve been building infrastructures that allow us to do research on genetic data while protecting patient data privacy.
It’s from that background that I eventually got interested in human rights and human genetics, or forensic genetic “ethicology.” It happened by accident. Somebody forwarded me an article about the mandatory DNA databasing of the entire population of Kuwait. I was very surprised because I was already working on genetic privacy, and this law I hadn’t heard about had passed a year earlier. I asked around: “What is our community doing about this?”
Doing DNA profiling of an entire population, and all the visitors that would travel to Kuwait, seemed like quite a poor idea. And it turned out that there was really no reaction from the scientific community. So I got in touch with a colleague who said, “Well, actually, our Society of Human Genetics does take social standpoints, and that’s something we could take a standpoint on.” They wrote a very nice letter to the Prime Minister of Kuwait. I thought it was so nice that it wouldn’t be very useful, but I was wrong, because it was then picked up by the press. New Scientist wrote a piece, and then The Washington Post wrote a piece. And at the same time, there were a number of things happening in Kuwait. Within three months, because there was international pressure and also a lot of national pressure, the parliament had to repeal the law.
There were some very limited reports on how people in Xinjiang had to provide fingerprints, voice prints, 3D facial scans, and DNA samples to get a passport. I reached out to Human Rights Watch and asked if they would be interested in having somebody with some technical expertise connect with them on this. They were interested, and they went on to uncover some important documents that I reviewed. I then reached out to colleagues who are expert forensic geneticists, and they said, “Oh, that’s big, because it means that the Xinjiang authorities are building very large-scale DNA profiling infrastructure in Xinjiang.” And so that’s when the reporting on DNA profiling [in Xinjiang] started, back in 2017.
Why do you think the scientific community wasn’t initially engaged about the proposed DNA database in Kuwait? Are there not many scientists concerned with privacy issues?
There’s quite a bit of work on how to do research while protecting privacy and guaranteeing confidentiality. The community overall is fairly responsible, but in a first-world sense. So Europe, the U.S.—that’s the context. There’s not a lot of thought about what it means once you go to other regions of the world. It’s really a classical scientific context.
Across scientific communities, there is a variable level of consciousness of the implications of using this kind of data. Physicians are usually quite conscious that behind the data they’re looking at are patients. When you get to biologists, they might be less aware. And then when you get to computer scientists—of course, it’s a whole landscape, not just one thing—but usually people are even less aware and tend to consider whatever they have in front of them as just “data.” A distancing from the data occurs very quickly when you do data analysis.
Do you believe that true informed consent is possible in Xinjiang right now?
The requirement for human subject research is free and informed consent. If I’m going to do, let’s say, a genome-wide association study trying to see how some disease is associated with genetic variants in the population of Xinjiang, I believe that for that research you could talk about free, informed consent. I would worry about how the data might be reused, but for the initial project where it’s clear that you’re going to do this [particular] analysis, I believe that free, informed consent is possible.
When you talk about the kind of techniques that can be directly used by the police to identify individuals and therefore to repress populations, then I think that informed consent would entail saying, “Do you agree to participate in a study whose results are directly relevant for the development of technology that can be used to discriminate against your ethnic population, against your family, or against yourself?” In these situations, I think free, informed consent is an oxymoron. You could imagine the consent being free because you didn’t tell the participants everything—but then it was not informed. Or you could imagine it being informed because you told the participants that they could be abused as a result of the research—but then, you know, they’re not free to say no. For this type of research, in this context, free, informed consent is an oxymoron.
The second dimension is that free, informed consent is a necessary but not a sufficient condition for ethical research. You cannot use free, informed consent as a free pass to bypass all the other ethics requirements. So [for the requirements of] beneficence and nonmaleficence, you cannot do research where there is a clear risk that the results would be used to abuse an individual or a population. [For the requirement of] justice, people should share both the burdens and the rewards of research in a fair way. Again, [in Xinjiang,] these requirements are not met. This type of research fails on essentially all ethical dimensions that you can imagine. Informed consent could not possibly be a solution to these problems.
There is a very strong trend now to say, “OK, here is a piece of research that people are criticizing, let’s see whether there was free, informed consent, and let’s see whether there was ethical approval, and if these requirements are met, then the criticism was not relevant.” I think this is wrong. I think it’s happening because these things are much more tangible, much less subject to opinion. So, basically you can say, “Did you have the proper form or the proper signatures, and did you have the proper documents with the stamp of the institutional review board?” Questioning the ethics of a study in general is trickier because, indeed, it touches on values. What is ethically acceptable will have some cultural variation.
At the same time, there are quite a few publications where the study was approved by the ethics committee of the Ministry of Public Security’s Institute of Forensic Science. That doesn’t make sense. The police having their own ethics committee to approve their own studies? That doesn’t work. So in those cases, you have to question how the ethical approval was given. There is space for some cultural variability, but these kinds of situations are really not all that complicated. They require that you, as a journal for example, second-guess an ethics committee. And, actually, guidelines on ethical publishing foresee that possibility, [the possibility] for a journal or a publisher to decline to publish work that has been approved by an ethics committee because they don’t feel comfortable with it. Even if publishers are not sure about the cultural standard where the research was done, if they feel too uncomfortable with [the ethics of the research], they can deem it unacceptable.
In the case of Xinjiang, it’s more clear-cut. When the police approve the development of technology that is being used on a massive scale to repress a certain population, you can really question whether the ethical approval makes sense at all. I don’t think it does.
You told me you first contacted Promega in 2016 about the possibility of its equipment being used in Kuwait. How did Promega respond?
Well, they were initially very responsive. But when the questions got more pointed, I got boilerplate answers like, “Yes, we are concerned about things like genetic privacy.” But in terms of really reacting and doing something about it . . . Of course, the DNA database did not go through in Kuwait, but I didn’t have the impression that [Promega was] at all willing to take it seriously and intervene.
Actually, at the time, Promega had important leverage. First, they were a technology supplier [to two of the companies that were bidding for the Kuwait contract, Sorenson and Bode Cellmark]. So they could have told Sorenson or Bode Cellmark that they weren’t going to supply the technology because they thought there was a problem [with the database project].
[Thermo Fisher was a third company also bidding for the Kuwait contract.] Second, and this is quite complicated because a lot of it is secret, but there was a lawsuit between Promega and Life Technologies [a company that changed names and became part of Thermo Fisher] because Promega holds important intellectual property on DNA profiling technology. And from what I understand of the settlement terms, the products that Thermo Fisher was selling could only be used for forensic and paternity testing use. So Promega could have said, for example, “When you do mass DNA profiling of an entire population, this is not forensic genetics anymore. This is surveillance.” Obviously it would have taken a lot of guts, but Promega could have told Thermo Fisher that using the technology in that way might infringe on the terms of the settlement. Now, the settlement is secret, so what I’m saying is rather speculative, but you can see that several of Thermo Fisher’s products are specified as being for forensic case use or paternity testing only. I’m quite sure that comes from the settlement between Promega and Thermo Fisher. So that means that—at least for some time, because the patents might have since expired—Promega did have a lot of leverage on forensic DNA technology that it apparently chose not to use.
You first contacted Promega about Xinjiang in 2017, is that correct?
Yes. But from then on they were unresponsive. Since 2017, I’ve been sending them regular emails with updates about the Xinjiang situation or issues of genomic surveillance, just to keep them updated and hoping this would be fed back to their corporate organization so that they would realize that it was important to be careful about how this technology is used.
Did you have specific information about the use or potential use of Promega’s equipment in Xinjiang, or was this just a generalized concern, knowing what you already knew about what was going on in Kuwait?
What was in [ChinaFile’s Promega article] were the kinds of things that I saw as well. We saw a number of winning bids explicitly mentioning products from Promega sold in Xinjiang. Promega is less high-profile than Thermo Fisher, because Thermo Fisher was the dominant supplier of sequencers to the market. [Editor’s note: Promega does not manufacture DNA sequencers; it manufactures “kits” that amplify certain aspects of the DNA information, allowing that information to be put into a sequencer.] Also, in terms of kits, Thermo Fisher is an important player. So I didn’t put as much focus on Promega because they were a second-line player in this market. But I still communicated with them and insisted that there was a problem with their technology.
Since 2017, they have been clearly aware of the risk of abuse of their technology in Xinjiang and in the rest of China—because, of course, we focus a lot on Xinjiang, but that is not to say that there aren’t problematic things happening in the rest of China. There are lots of problems with DNA databasing across the whole of China, because the government tends to focus on target groups. There was a good report from Human Rights Watch on this. They tend to focus on sex workers, on drug addicts—on groups that are seen as criminal but that are also extremely vulnerable social groups. Political opponents, protesters, petitioners: all these groups have been disproportionately targeted by DNA profiling across China.
So it’s important that Promega, from 2017 on, could not say that they did not know something was wrong. That’s a position companies like to take, saying that they blinded themselves and therefore didn’t know what was going on. Well, in this case, because they were given proper notice, they cannot say that they were not aware of it.
In your opinion, how should companies be responding to the information that their products are being used to abet human rights abuses? How should foreign governments be responding?
When Thermo Fisher made the announcement that they were going to stop selling their human identification solutions in Xinjiang, they actually made a very important statement. They said that they recognized the importance of considering how their products are used or may be used by their customers. It’s really quite unique for a company to acknowledge that, and I think it’s very, very important.
The key element in this is foreseeability. For example, the basis to decide whether somebody is complicit in a crime is whether or not the action they contributed was essential for the offense to happen in the way it happened. If you’re talking about supplying the technology [in Xinjiang], that would fulfill the requirement. But the second criterion is that they knew, or should have known, what was going to happen. So let’s say I run a hardware store and I sell knives. If a man buys a knife in my shop and uses it to murder his wife a year later, then the police are not going to come to my shop and say I should go to prison because I sold a knife to this man. But the situation would be really quite different if the man had burst into the store, shouting that his wife was cheating on him and that he wanted to buy the biggest knife that the store had. In that case, my responsibility could be called into question.
This kind of responsibility does apply to companies when they do business in their own country, but somehow, when they go to the other side of the world, it suddenly seems to disappear. It seems that only the local regulations apply, and of course when you go and do business with the Chinese police, what you’re doing is not illegal in China. But I think that there is a fairly straightforward solution: a company should be accountable both to the legal strictures of the country in which it does business—which is already the case—and also of its home country. The idea is that if you’re an American business, when you do business in China, you should abide by American standards of liability.
Currently, there is an effort by the United Nations to draft a treaty on the responsibility of transnational corporations. [If it were to pass as written,] it could definitely be something that would have an immense impact on our civilization. This is not hyperbole. Today, most human rights abuses involve a technological component, so [we should] hold companies responsible for their actions when they actually know what’s going on. Thermo Fisher sells sequencers in Xinjiang, and Thermo Fisher has offices all across China; it cannot say that it has no knowledge whatsoever of what is happening in different regions of China. That is not defensible, in my opinion.
Where else in the world do you see issues like this?
Much of my effort is on China, because I only have 24 hours in a day, and I’m still running a regular lab, as best or as badly as I can. But there are other very concerning situations. Nothing on the scale of Xinjiang.
In the U.S., at the U.S.-Mexico border, there are technologies being deployed for rapid DNA profiling. This is a slightly more advanced technology that allows people with less technical skill to do DNA profiling quickly—a minimally-trained police officer can generate a DNA profile. That is being used against highly vulnerable populations: people who arrive at the border. Their DNA profiles are being put in the U.S.’s national DNA database, the CODIS database, which is used for criminals. Immigrants who are entering the country illegally, but also immigrants who are actually applying for asylum at a legal port of entry, are being put in this database. Minors from these populations are put in these databases. This is saying that these people are almost criminals, while sociological studies actually show that both legal and illegal immigrants have a lower crime rate than the general population. The idea that someone who arrived illegally at the border is a potential criminal is a very deep form of stigmatization that has no basis whatsoever.
This technology is also used to try to check whether a child is the child of the claimed parent. So a parent arrives at the border with a child, and then the argument is these children are being trafficked by people who are unrelated to them—and that might actually sometimes happen—and that the traffickers are trying to use the child to cross the border, because the child will receive better protection. However, checking family relations based on DNA is extremely problematic. It essentializes family relations to biological relations, which is really wrong. Even the Romans had a saying: the mother is always obvious, but the father is the one the marriage points to. Paternity is actually as much a social construct as it is a biological construct, and in the 21st century, maternity is also, to some extent, a social construct. Essentializing family relations to biological relations is extremely disruptive to families. A parent might not be the biological parent of a child and still be the parent of that child. And it’s not acceptable to simply say, “It’s too bad we make mistakes in two, three, four percent of cases.”
The other ongoing issue is consumer genomic databases, such as 23andMe, Ancestry.com, and GEDmatch. These companies, in particular the bigger ones, 23andMe and Ancestry.com, have enough genetic information to have a virtual DNA database of the entire U.S. population, and probably even the European population. Because you don’t have to be in the genetic database to be traceable—you only need to have a relative in the database and have genealogical information. A company like Ancestry not only has the genetic profiles, it also has all the genealogical information. It has billions of genealogical records from all over the world. That means that the only thing that is protecting the U.S. population from being in a national [government criminal] DNA database is the fact that the police still need a warrant to access that information, and that these companies vowed to fight those warrants. But there’s no guarantee they will win.
However, GEDmatch, which is a smaller company, allows people to use their profiles to do matches across other suppliers. So if you’re a customer of 23andMe, you cannot find a relative who has a profile at Ancestry, but you can both upload your profiles to GEDmatch to find each other. There is at least one reported case of a judge issuing a warrant against GEDmatch.
So the U.S. is very close to having a population-wide DNA database in the hands of these two companies, accessible to the police through a warrant. That’s also concerning because people never signed up for this. There was never any social debate about whether these databases could be used by the police. Also, strikingly, in the CODIS criminal database, the police are not allowed to search for relatives. So we’re in this very strange situation where the police are not allowed to use their own criminal database in a certain way, and therefore they’re trying to use a general population database in a manner that they’re not even allowed to use against criminals. This is extremely problematic.