Since reports of mass detentions first emerged in 2017, the Xinjiang region of western China has become nearly synonymous with detainment camps. But while the camps were the most galling example of Beijing’s crackdown on Uyghurs, an ethnic group native to Xinjiang, they were only part of a larger system of surveillance, made possible by cutting-edge technologies in the hands of an authoritarian government.
In his new book “In the Camps,” Darren Byler, an assistant professor in the School for International Studies at Simon Fraser University in Vancouver, British Columbia, draws on interviews with both detainees and those who worked in the camps, as well as vast troves of government documents, to paint a detailed picture of life in Xinjiang since 2017.
In this interview with The Diplomat’s Shannon Tiezzi, Byler – who has researched Xinjiang and the Uyghurs for over a decade – explains the reality of life in “China’s high-tech penal colony,” where surveillance is everywhere and the camps are a constant reminder of what awaits anyone who checks the wrong box in an invisible algorithm.
One of the common themes in your interviews is that people didn’t believe the crackdown would impact them – the sense that “I’m safe because I’m Kazakh, not Uyghur” or “because I’m not religious” or “because I’m well-educated and speak fluent Chinese.” Their confidence, it turns out, was tragically misguided. Given the lived reality in Xinjiang today, do you think there are still members of Turkic ethnic groups who have that sense of safety?
Among the most insulated minority state officials and security personnel, there may be some feeling of safety. Since they have taken an active role in the mass internment campaign, and they themselves have not yet been targeted, they may feel as though they are “safe” in some ways. But they also know quite clearly which lines they cannot cross, because they have seen what happens to people who fail to fervently support the campaign or who resist it even in minor ways. So yes, some people may view themselves as safe, but not as invincible.
Other Turkic people, particularly those whose families are not within the state apparatus, have actively sought out forms of protection. In my interviews for the book, Kazakhs and Uyghurs told me about women in their communities who had divorced their husbands after the men were detained and then sought to marry others who were politically protected. Others publicly denounced friends and relatives as a way of showing their loyalty to the state project. Over time, this protection-seeking appears to have increased.
A Han resident of the region who recently visited their family in northern Xinjiang told me that, since the campaign began, it has become relatively common for young Turkic people in urban settings to embrace a more assimilated way of life. For many this means doing things like going to Han restaurants, speaking and writing only in Chinese, and dressing in ways that appear cosmopolitan. For some, particularly young women, there is an increased prevalence of interethnic romantic or economic relationships, something that my interviewee said was widely viewed as a form of protection.
Similarly, most Han I’ve spoken to are convinced that they, too, are not impacted by what’s happening in Xinjiang. Those detained must be guilty, the thinking goes. But China’s surveillance network is not limited to Xinjiang. Are there any signs or indications that this sort of high-tech predictive policing – up to the detaining of “pre-criminals” – is being deployed in other parts of China?
The parameters of the surveillance and detention system in Xinjiang are largely unique to that region. Most of the people who were assessed as “untrustworthy” and sent to detention facilities for “training” were deemed guilty of terrorism or religious extremism crimes that were “not serious” or “not malicious.” These detentions were tied quite specifically to the enforcement of China’s sweeping counter-terrorism laws, which apply quite specifically to religious minorities in China — namely Uyghurs, and occasionally Tibetans and other groups, such as the Falun Gong. So this type of population-level assessment and detention of ordinary individuals is unlikely in most of China. That said, throughout the country such tools have been used, or could be used, to target and assess community leaders who are deemed troublemakers.
The digital forensics tools that are used to scan smartphones throughout Xinjiang — often called “counterterrorism swords” — have been bought by border agencies in places like international airports across the country. Domestic police departments in ethnic minority areas in Ningxia, Sichuan, Yunnan, and elsewhere have also purchased them. These assessment tools are plugged into phones using a USB cable and scan the phone’s storage for more than 50,000 markers or patterns of illegal activity. This indicates to me that in border situations and criminal investigations, tools that were developed and battle-tested in Xinjiang have been added to the repertoire used by state security elsewhere in the country.
Over 500 cities and municipalities across China have developed smart city systems that use forms of biometric surveillance. In most other contexts, these tools are used to enforce traffic laws and facilitate economic infrastructure. In some cases, they support social credit assessment pilot programs and grassroots policing. But so far it appears as though disfavored populations like the Uyghurs and Tibetans are the most dramatically affected by such systems — with the police being alerted to their presence in communities across the nation. Most protected citizens, it appears, are less affected in their daily life.
China cracks down on any number of groups that embrace an identity outside of what the state defines as acceptably “Chinese” – Tibetans, for example, or Christians. Why did the government adopt such extreme methods in Xinjiang in particular?
The Uyghurs, like the Tibetans, live in their own ancestral homeland, speak their own language, and are ethno-racially distinct from the Han population. These ties to sacred land, the knowledge system that is carried by their language, and their ethno-racial difference together mean that they carry claims to autonomy or collective self-determination that are difficult for the Chinese state to capture. As place-based peoples at the periphery of the native lands of the Han people, Uyghurs and Tibetans (like Mongols, Kazakhs, and others) occupy a position that is similar to those of other Indigenous peoples in Asia.
However, in distinction from the Tibetans, Uyghurs are also a much larger group (around 12 million people), their region possesses a greater amount of natural resources (coal, oil, natural gas) and arable land, and they are positioned on a core node of China’s Belt and Road Initiative. Perhaps most importantly, Uyghurs are a Turkic Muslim group with strong affinities to the peoples of Central Asia and Turkey.
Initially, when state authorities began to relabel Uyghur non-violent and violent protests as terrorism in the early 2000s, there did not appear to be much credible evidence of political Islam as a motivating factor. In the mid-2010s, as Uyghurs became more tightly linked with the broader Muslim world through the arrival of smartphones and the internet, several isolated and unconnected suicide attacks carried out by a small number of Uyghurs did appear to meet international definitions of terrorism. And it appears that some state authorities and Han settlers really did begin to believe their own fears of a rising Uyghur insurgency. I was living in the region at the time and would often hear from Han interviewees about these perceived threats of “extremism.” Despite the fact that only several hundred people were involved in such attacks, they felt as though the entire population was suspect.
Some Han people, though, particularly those who understood the history of Xinjiang, knew that the issue at stake was not simply that Uyghurs were “prone to terrorism,” but that Uyghurs were experiencing systematic discrimination and dispossession coupled with pervasive state violence in the form of police brutality, surveillance, and control throughout all aspects of life. Many Uyghurs I interviewed at that time complained about the lack of freedom and opportunity available to them, but the vast majority I spoke with were not interested in violent resistance. They hoped simply to find a better life for themselves and their autonomous community within the Chinese system.
The state authorities I’ve spoken with and state documents I reviewed talk about the mass surveillance and internment system as a long-term strategy to produce “permanent stability” in the region and take care of the “Xinjiang problem” once and for all. There are many economic and political factors that contributed to the calculus of the campaign, but in general I think the mass surveillance and internment project in Xinjiang should be viewed as a major test of Chinese capacities to conduct a sophisticated invasion, occupation, and transformation of spaces that were at the margins of Chinese control. The lessons they have learned and technologies they have developed in Xinjiang will likely be adapted to a range of security and tactical situations as China takes a greater role on the world’s stage. This is not to say that I anticipate “new Xinjiangs” emerging elsewhere on China’s frontiers, but that the Xinjiang experience will likely inform decision making and technology deployment.
The title of the book is “In the Camps,” but you show that the oppression is prevalent outside the camps as well. The surveillance net extends into every facet of life: facial scans at mosque doors, tracking apps on smartphones, police checkpoints that use facial recognition software. As you put it: “In a general sense, state authorities and private manufacturers now control significant aspects of everyday Muslim life.” Is this level of control sustainable over the long term? Will there be a generation of Uyghurs, 20 years from now, for whom that degree of surveillance is just internalized as normal?
The control I refer to is contingent on political will and economic factors. It costs a great deal of money to build and maintain these systems. State documents show that China has invested as much as $100 billion to build the camps and related material and digital infrastructure. The authorities have also hired around 60,000 low-level police to work as grid workers, in addition to tens of thousands of other officers. Maintaining a security workforce of 100,000, along with updating and maintaining software and hardware systems, will require significant spending going forward. While some of these costs may be recouped through assigned labor schemes, land and asset seizures, and increased access to natural resources and post-campaign tourism, it is likely to be quite some time before the systems pay for themselves in concrete terms. Already, in northern Xinjiang in particular, there is some evidence that checkpoints are no longer being used due to a combination of malfunctioning equipment, a lack of urgency, and a lack of funding.
The lack of political will to maintain the intensity of the system is likely also at least partially a result of increasing international pressure. Regional and national state authorities have largely moved from a phase of active mass detainment to formal mass incarceration and job assignments in securitized factories. Since 2017 over 533,000 individuals have been formally prosecuted in Xinjiang.
State authorities are also actively attempting to erase both material and digital evidence of the camp system — hiding former detainees in prisons and factories and pretending that nothing has happened. The removal of some visible surveillance equipment from urban spaces open to international travelers likewise appears to be an effort to hide obvious elements of control.
Despite this retraction of some visible forms of control, the general technologies of biometric assessment — face and voice recognition — and dataveillance — scanning digital histories — are now highly fine-tuned. The base datasets that technology companies and police work from are expansive and highly symmetrical. In internal police documents obtained by The Intercept, I saw over and over again that the probability readings of facial imagery were at 95 percent or higher. This means that individuals registered in Xinjiang really can be tracked and made searchable in real time. Likewise, most individuals have had their smartphones scanned no fewer than 10 times over the course of a year.
So that is to say that the next generation of Uyghurs will grow up with an awareness that their movement and digital speech are being tracked and that they can always be deemed untrustworthy. The system is really the first settler colonial process of dispossession — taking the land and labor of a colonized people — that has been attempted in a fully digitized environment. My sense is that the psychological trauma of this system of unrelenting and intimate domination will likely be felt over generations.
Another key theme in your book is the complicity of U.S. tech firms in China’s crimes against humanity. In fact, the Chinese tech companies involved learned from facial recognition and surveillance tools in the United States, either through observation or direct partnerships. As a broader global debate about the ethics of cutting-edge technology unfolds, what lessons can we take from Xinjiang to prevent the next “high-tech penal colony” from taking shape?
The technologies used in Xinjiang are largely the same as technologies used in border contexts in North America. When a traveler goes through a border crossing here, they often have their face or irises scanned and matched to the image on their passport. If their digital file raises an alarm, their smartphone may be taken from them and scanned using a digital forensics tool. The difference in Xinjiang is that these technologies have been generalized across the entire region — so it is as though Uyghur citizens are crossing half a dozen or more international borders every day. Likewise, since the entire Muslim population of 15 million people has been deemed potentially untrustworthy, their phones are scanned on a regular basis by the police and their employers.
The dataset I examined for The Intercept was built using open-source Oracle software. The surveillance companies that built the facial recognition capacities of the system have multiple former and active links to Microsoft, as well as academic institutions, journals, and conferences in North America and Europe. Indeed, many of the technologists who designed this software worked for companies like Adobe before joining surveillance firms, and have now gone back to Adobe since their companies have been placed on no-trade lists in the United States.
My point here is not to say that technologists who work for Chinese surveillance firms are doing something unusual; on the contrary, their work is quite similar to tech development elsewhere in the computer vision industry. Chinese technologists I have spoken to view their work as following nearly the same ethical standards as U.S.-based firms, which actively assist the U.S. military and police.
Writing this book forced me to think about the relationship between automated surveillance technology and state power. Currently there is very little regulation, other than existing privacy laws and citizen protections, concerning this relationship. This means that the only way to penalize companies for abusing privacy is through consumer and worker advocacy. At the same time, the benefits that companies accrue from working with state security are enormous. Not only do they receive state capital to develop new technologies, but they are also able to access enormous datasets that allow them to fine-tune algorithms. In the absence of robust and carefully targeted regulation and penalties regarding surveillance, the protection of vulnerable populations like the Uyghurs — and also undocumented populations in the United States — falls to the good will of technologists and their shareholders.
“In the Camps” shows that within policing and camp systems, pervasive automated technologies have the effect of further normalizing immense cruelty. Because the technology systems are taken to produce a kind of truth when it comes to crime prediction, and because this truth cannot be questioned due to the black box effects of advanced technologies, the banality of unthinking bureaucratized procedures increases in exponential ways. Ultimately, reversing automated crimes against humanity will require a rethinking of technology design and penalties for harmful design.