Huawei patent mentions use of Uighur-spotting technology

  • By Leo Kelion
  • Technology desk editor

A Huawei patent for a system to identify people of Uyghur origin among images of pedestrians has been brought to light.

The filing, discovered by US research firm IPVM and shared with BBC News, is one of many involving leading Chinese technology companies.

Huawei has previously said none of its technologies are designed to identify ethnic groups.

Huawei says it is now taking steps to amend the patent.

Forced labor camps

Video caption: Human Rights Watch: "This is horrific racial profiling"

The Uyghur people are a predominantly Muslim ethnic group who live mainly in the Xinjiang region of northwestern China.

Government officials are accused of using high-tech surveillance against them and holding many in forced-labor camps, where children are sometimes separated from their parents.

Beijing says the camps provide voluntary education and training.


Image caption: China's tech companies deny selling software used to single out Uyghurs from other people based on their appearance.

"One technical requirement of the Chinese Ministry of Public Security's video-surveillance networks is to detect ethnicity, particularly of Uyghurs," said Maya Wang of Human Rights Watch.

"While targeting and persecuting a people based on their ethnicity would be completely unacceptable in other parts of the world, persecution and severe discrimination in many aspects of Uyghurs' lives has gone unchallenged because Uyghurs have no power in China."

Body movements

The patent describes ways to use deep-learning artificial-intelligence techniques to recognize various features of pedestrians photographed or filmed on the street.

It focuses on the fact that different body postures – for example whether one is sitting or standing – can affect accuracy.

But the document also lists attributes by which a person may be targeted, including "ethnicity (Han [China's biggest ethnic group], Uighur)".

Video caption: BBC News visited China's Muslim "conversion" camps in 2019.

A Huawei spokesman said the reference should never have been included.

"Huawei opposes all forms of discrimination, including the use of technology to implement racial discrimination," he said.

"Identifying the race of individuals has never been part of the research and development program.

"It should never have become part of the application.

"We are taking proactive steps to correct it.

"We are constantly working to ensure that new and emerging technology is developed and used with the utmost care and integrity."

A 'confidential' document

IPVM previously flagged a separate "confidential" document on Huawei's website that referred to work on a "Uyghur alert" system.

In that case, Huawei said the page referred to a test rather than real-world use, and denied selling systems designed to identify people by their ethnicity.

On Wednesday, Tom Tugendhat, who chairs the UK Parliament's Foreign Affairs Select Committee and the Conservative Party's China Research Group, told BBC News: "Chinese tech giants' support for a brutal attack on the Uighur people shows why we as consumers and as a society should be careful about whom we buy our products from and do business with.

"Creating racial-labeling technology for use by a repressive regime is clearly not behavior up to our standards."

Face recognition software

IPVM also found references to the Uighur people in patents filed by Chinese artificial intelligence company SenseTime and image-recognition specialist Megvii.

SenseTime's filing, dated July 2019, discusses ways facial-recognition software can be used for more efficient "security protection", such as searching for a "middle-aged Uyghur with sunglasses and a beard" or a Uyghur wearing a mask.

A SenseTime spokesman said the references were "regrettable".

"We understand the importance of our responsibilities, which is why we started developing our AI code of ethics in mid-2019," he said, adding that the patents predate the code.

Ethnic-labeling solutions

Image caption: Like Huawei, Megvii is now planning to withdraw the original version of its patent.

The Megvii patent said subjects could be categorized by ethnicity, including "Han, Uyghur, non-Han, non-Uyghur and unknown".

Megvii told BBC News it was withdrawing the patent application.

"Megvii recognizes that the language used in our 2019 patent application is open to misunderstanding," it said.

"Megvii has not developed, and will not develop or sell, racial or ethnic-labeling solutions.

"Megvii acknowledges that, in the past, we have focused on our business development and lacked proper control over our marketing, sales and operational products.

"We are taking steps to rectify the situation."

Attribute-recognition model

IPVM also flagged image-recognition patents filed by two of China's biggest tech companies, Alibaba and Baidu. These refer to the classification of people by ethnicity but do not specifically mention the Uyghur people by name.

Alibaba responded: "Racial or ethnic discrimination or profiling in any form violates our principles and values.

"We never intended our technology to be used, and we will never allow it to be used, to target specific ethnic groups."


Image caption: Protests are taking place around the world against China's treatment of the Uyghurs.

And Baidu said: "When filing for a patent, the document references are an example of a technical description, in this case describing what the attribute-recognition model is, rather than representing the expected implementation of the invention.

"We do not and will not allow our technology to be used to identify or target specific ethnic groups."

But Human Rights Watch said there were still concerns.

"Any company selling video-surveillance software and systems to the Chinese police must ensure they meet the police's requirements, including the ability to detect ethnicity," Ms Wang said.

"The right thing for these companies to do is to immediately stop the sale and maintenance of surveillance equipment, software and systems to the Chinese police."

