Worldcoin, a project co-founded by Sam Altman, CEO of OpenAI (the company behind ChatGPT), has repeatedly run into obstacles from privacy regulators. Most recently, Worldcoin was ruled to have violated Hong Kong's Personal Data (Privacy) Ordinance and has been ordered to stop collecting users' iris and facial images.
Worldcoin uses iris scanning technology for human identity verification, and according to its official website, over 5 million people from more than 160 countries have already undergone verification.
Due to the serious privacy concerns associated with collecting iris data, the Office of the Privacy Commissioner for Personal Data (PCPD) in Hong Kong conducted ten undercover operations between December 2023 and January 2024 at six Worldcoin operating locations in Hong Kong: Yau Ma Tei, Kwun Tong, Wan Chai, Cyberport, Central, and Causeway Bay. On January 31, 2024, the PCPD obtained a court order to enter the operating locations for investigation. After two rounds of inquiries, all relevant investigations have now been completed.
Based on the investigation results, the Privacy Commissioner for Personal Data, Ada Chung, ruled that Worldcoin’s operations in Hong Kong violated the Privacy Ordinance.
Further reading:
Foreign media: Worldcoin may collaborate with OpenAI and PayPal! If true, will it raise more regulatory concerns?
Worldcoin’s collection of facial and iris images deemed unnecessary
The PCPD considers the Worldcoin project's collection of facial and iris images unnecessary, because verifying whether a participant is human does not require iris scanning; staff at the operating locations could simply make that assessment in person.
Furthermore, biometric data is sensitive personal information, and improper use or disclosure of it can lead to serious consequences. The Office holds that collecting facial and iris images is not necessary when less privacy-intrusive verification methods are available.
Key documents lacked a Chinese version; participants not proactively informed of risks
In addition, Worldcoin's Privacy Statement and Biometric Data Consent Form were not available in Chinese. Staff at the operating locations also did not explain these documents to participants or confirm that they understood their content, nor did they inform participants of the risks of providing biometric data to the project or answer their questions.
Excessive retention of personal data
The investigation also revealed that Worldcoin retains personal data for up to 10 years in order to train the AI models used in its identity verification programs. The PCPD considers this retention period excessively long and an over-retention of personal data.
The Privacy Commissioner has issued an enforcement notice to Worldcoin, requiring the project to cease using iris scanning devices to collect the iris and facial images of Hong Kong residents. The PCPD added that anyone who finds Worldcoin still operating iris scanning devices at its locations can report it directly so that the Office can take enforcement action. As a result, Worldcoin is currently in an effectively "banned" state in Hong Kong.
Source:
CoinDesk, PCPD, Reuters