Personal Information Protection Law in China: Technical Considerations for Companies
China recently released the full text of the long-anticipated Personal Information Protection Law. As the date of effectiveness looms ever closer, we provide an overview of the key technical considerations for companies to be compliant with the new law, including potential pitfalls and advice on what functionalities must be designed and applied to existing IT systems.
On August 20, China published the full finalized text of the Personal Information Protection Law (PIPL), the first such law ever to be passed in the country. Effective from November 1, 2021, this new law, together with the Data Security Law and the Cybersecurity Law, completes China’s own extensive legal framework for data security and personal information protection.
We anticipate that these laws will have a profound effect on business operations in China with regard to security and privacy management, much as the European Union’s General Data Protection Regulation (GDPR) has had in the rest of the world. They will also bring more challenges to foreign companies conducting business in China.
In this article, we provide a technical perspective of the law’s core concepts and important stipulations and provide some suggestions for building compliant IT systems for foreign companies to consider.
Technical considerations for compliance with the PIPL
For people familiar with the GDPR, the stipulations of the PIPL will largely come as no surprise, as the main concepts of the two laws are basically the same. In other words, the PIPL has ‘borrowed’ some concepts from the GDPR, though there are still some minor differences. To be compliant with the PIPL, companies must weigh many technical considerations, especially in IT infrastructure and application design. Below we list a few noteworthy considerations for making IT systems compliant with the law.
Considerations for IT infrastructure design
Many foreign companies conducting business in China have already established a mature and universal IT infrastructure, either on-premises or in the cloud, before entering China. Therefore, using the same platform for China’s business operations is often a natural choice.
However, Article 40 of the PIPL requires that personal data collected and generated by “critical information infrastructure (CII) operators and personal information processors who process personal information reaching an amount designated by the Cyberspace Administration of China” must be stored in China. This data localization requirement means foreign companies must consider deploying standalone IT infrastructure for their business in China.
Although the PIPL indicates that passing “a security assessment organized by the Cyberspace Administration of China” can act as a green light for cross-border personal information transfer, according to our reading, there will still be big challenges in practice as no operational guide or procedure has been publicized yet. The recent security review of Didi, China’s ride-hailing behemoth, is the first case of such a security assessment in China; however, this security review process did not focus solely on personal information protection.
Regarding cross-border data transfer, it is important to note that even if data is stored in China on standalone IT infrastructure, it would still be treated as a cross-border transfer if a user outside of China has remote access to the data. It is critical that a company’s IT department bears this in mind when designing the IT infrastructure.
Considerations for the application design
The PIPL grants several rights to users for the use of their personal information, some of which will require companies to make special considerations when designing and applying their IT systems.
Below we outline some of the key articles to note.
Automatic decision-making and profiling: Article 24 requires data processors to provide users with an alternative option or the ability to refuse the use of their personal characteristics for marketing and push information delivered through automated decision-making mechanisms. This means the system needs to be able to receive recipients’ feedback and exclude certain users from automated decision-making, which requires special consideration at the design stage. We anticipate that striking a balance between collecting massive amounts of personal information for analysis, implementing automatic decision-making, and protecting individuals’ rights will be a big challenge for large data-driven marketing services.
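One way to honor such refusals is to maintain an opt-out registry and filter it out of every profiling-based push. The sketch below is illustrative only; the function and registry names are our assumptions, not terms from the PIPL:

```python
# Illustrative opt-out filter for automated marketing decisions (Article 24).
# All names (user IDs, registry structure) are hypothetical examples.

def eligible_for_automated_marketing(user_id, opt_out_registry):
    """A user is eligible only if they have not refused
    automated decision-making."""
    return user_id not in opt_out_registry

def select_push_recipients(user_ids, opt_out_registry):
    """Exclude opted-out users before any profiling-based push."""
    return [u for u in user_ids
            if eligible_for_automated_marketing(u, opt_out_registry)]

opt_outs = {"user_2"}  # users who refused automated decision-making
recipients = select_push_recipients(["user_1", "user_2", "user_3"], opt_outs)
print(recipients)  # → ['user_1', 'user_3']
```

In a production system this check would sit at the entry point of every recommendation or push pipeline, so that no downstream component ever sees an opted-out user.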
Personal data inquiry, copy, correction, and deletion: Articles 45 to 47 stipulate the rights of individuals to inquire about what personal data is being collected and stored by the data processor. They also allow the users to request a copy of their personal data, correct any inaccurate personal information, and delete their personal information when withdrawing consent or terminating the use of the product or service. Companies therefore need to consider how to quickly locate each user’s personal information within the IT system and predefine a way of exporting a copy and delivering it to the user. Companies also need to consider ways of making each user’s record ‘independent’ to ensure that the deletion of one user’s record will not impact other existing or in-use data.
A few things to note:
- The right to deletion requires the company to consider deploying a universal platform for saving related personal data, so that the data can be easily located and deleted from all locations. A common issue in practice is data being deleted only from the live system, with another copy kept in the backup system. A predefined retention policy that deletes the data automatically once it has expired is a good way to comply with the requirements of Article 47(1). The data processor should delete the data proactively once the agreed storage period is up or the purpose of the data processing has been achieved.
- Companies also need to plan for a reasonable authentication mechanism to accurately recognize the user who makes an inquiry or requests a copy, update, or deletion. ‘Reasonable’ means striking a balance between collecting enough personal identification information to authenticate the user and hedging against the increased risks that come with holding larger amounts of potentially sensitive data. In a recent case, a hacker was able to request an update to another user’s contact information, changing the registered phone number to their own. The hacker then used that phone number to ‘authenticate’ the victim’s identity, reset the account password, and ultimately obtain full access to the victim’s data. This case illustrates the challenge of authenticating an individual when receiving a request.
- Article 49 also stipulates that the rights of an individual shall be exercised by his or her next of kin when the natural person dies. This presents an even bigger challenge for data processors, who also need to be able to recognize and authenticate a user’s next of kin.
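The retention sweep described in the notes above can be sketched as follows. The record fields and the two-year retention period are assumptions chosen for illustration, not figures from the PIPL; the agreed storage period will differ per service:

```python
from datetime import datetime, timedelta

# Hypothetical retention period; the PIPL does not fix a number of days.
RETENTION = timedelta(days=730)

def expired(record, now):
    """A record expires once the agreed storage period has elapsed."""
    return now - record["collected_at"] >= RETENTION

def purge_expired(records, now):
    """Return only unexpired records. In practice, the same sweep must
    also cover backup copies, per the note above."""
    return [r for r in records if not expired(r, now)]

now = datetime(2021, 11, 1)
records = [
    {"user": "a", "collected_at": datetime(2019, 1, 1)},  # past retention
    {"user": "b", "collected_at": datetime(2021, 6, 1)},  # still retained
]
print([r["user"] for r in purge_expired(records, now)])  # → ['b']
```

Running such a sweep on a schedule turns Article 47(1) deletion from a manual task into an automatic property of the system.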
Data separation and masking: Companies should consider techniques for separating sensitive personal information into different systems or databases, or at least into different tables in the same database. This is to reduce the risk that full and complete records of personal information are shared or accessed when the purpose for processing the data may only require access to a part of the record.
For example, an employee in the client service department who is responsible for a customer survey would only need access to a customer’s phone number or email address. They would not need to access a customer’s entire record, which could contain sensitive information such as home address and credit card information.
Data masking is another good way of hiding sensitive information while still allowing staff to access other non-sensitive data. Both data masking and separation of personal information are methods that should be considered and planned for when designing and applying an IT system, as making changes once the system has been deployed may be difficult and costly.
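Data masking of the kind described above can be as simple as replacing all but a fragment of each sensitive field before it reaches a staff-facing view. The masking formats below are illustrative choices, not requirements from the law:

```python
# Illustrative masking helpers for staff-facing views of customer data.

def mask_phone(phone):
    """Keep only the last 4 digits, enough for a service agent to
    confirm a number with the customer."""
    return "*" * (len(phone) - 4) + phone[-4:]

def mask_email(email):
    """Keep the first character of the local part and the full domain."""
    local, _, domain = email.partition("@")
    return local[0] + "***@" + domain

print(mask_phone("13812345678"))            # → *******5678
print(mask_email("zhang.wei@example.com"))  # → z***@example.com
```

Applying masking in the data-access layer, rather than in each client application, ensures that the unmasked value never leaves the database tier for roles that do not need it.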
Considerations for designing a privacy interface
A friendly and useful privacy interface is important for implementing privacy principles and protecting user rights. A privacy interface makes the data lifecycle transparent to users, allowing them to control what data is being used and how it is being processed, and access a copy of the collected data. The PIPL’s unique stipulations require companies to take special consideration when designing privacy interfaces.
Below are some of the core requirements:
Opt-in instead of opt-out: Several clauses of the PIPL require the data processor to obtain the user’s explicit consent, and some even require separate consent in special situations. This means the privacy interface should use an opt-in strategy, leaving the choice and control with the individual. When designing the system, a pop-up window providing an explanation and requesting the user’s consent can be considered where separate consent is needed for a specific service.
Refusal of service: Article 16 stipulates that if a user does not consent to the use of their personal information, or withdraws consent, the data processor may not refuse access to the product or service, unless the processing of the personal information is necessary to provide that product or service. This article tackles a common practice among mobile applications of requesting excessive privileges, such as access to a smartphone’s microphone, camera, GPS, files, address book, and even messages, even though only one or two basic privileges are needed to deliver the core service and the others are only occasionally used for non-core services. Under the new rules, mobile apps cannot refuse a user access to core services if the user does not consent to the use of additional personal information that is not required to fulfill the core service. In short, the ‘all-or-nothing’ method that many apps have employed is not compliant with the PIPL. The design of the privacy interface therefore needs to include separate privacy notices and choices for users based on what kind of service is being offered.
Design for separate consent: Article 23 requires data processors to obtain a user’s “separate (nonbundled) consent” before sharing the personal data with a third party. Article 29 requires data processors to obtain separate consent from a user when processing sensitive personal information. The scope of ‘sensitive personal information’ in the PIPL is much broader than in the GDPR – financial information, transaction records, and location tracking are all regarded as sensitive personal information. Separate consent is also required when sharing personal information with a party outside of China, as specified in Article 39. To be compliant with these requirements, companies need to consider designing a standalone consent option or window in the privacy interface for the above-mentioned circumstances, in addition to the general consent request needed before the user starts using the service.
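Behind the interface, separate consent implies a consent record keyed by purpose, where a general grant never implies a purpose-specific one and the opt-in default is "no consent". The purpose names below are our illustrative labels for the cited articles, not statutory terms:

```python
# Illustrative per-purpose consent ledger. Purpose names are hypothetical
# labels mapping to PIPL Articles 23, 29, and 39.
SEPARATE_CONSENT_PURPOSES = {
    "share_with_third_party",   # Art. 23
    "process_sensitive_data",   # Art. 29
    "cross_border_transfer",    # Art. 39
}

class ConsentLedger:
    def __init__(self):
        self._granted = {}  # (user_id, purpose) -> bool

    def grant(self, user_id, purpose):
        self._granted[(user_id, purpose)] = True

    def withdraw(self, user_id, purpose):
        self._granted[(user_id, purpose)] = False

    def allowed(self, user_id, purpose):
        # Opt-in default: absence of a recorded grant means no consent.
        if purpose in SEPARATE_CONSENT_PURPOSES:
            # General consent never covers a separate-consent purpose.
            return self._granted.get((user_id, purpose), False)
        return self._granted.get((user_id, "general"), False)

ledger = ConsentLedger()
ledger.grant("u1", "general")
print(ledger.allowed("u1", "share_with_third_party"))  # → False
ledger.grant("u1", "share_with_third_party")
print(ledger.allowed("u1", "share_with_third_party"))  # → True
```

Keeping each purpose as its own ledger entry also makes withdrawal of one consent (say, cross-border transfer) possible without disturbing the user's other grants.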
Withdrawal of consent: Article 15 requires data processors to “provide a convenient way to let the user withdraw their consent”. This clause did not appear in the first draft of the PIPL but was added to the second draft and kept in the final version. Accordingly, the data processor should consider designing a clear and easy way for users to withdraw their consent, such as letting the user easily de-register their service account. This has been a key focus for the Ministry of Industry and Information Technology (MIIT) in its compliance inspections of mobile apps in recent years. Many mobile apps have been asked to make corrections or have even been forced to delist from mobile app stores as a result of compliance failures.
Considerations for surveillance measures
Biometric data, such as that used for facial and fingerprint recognition, is regarded as sensitive personal information. It therefore requires special protection and processing procedures, including separate consent as described in the section above. The data processor should take special considerations when implementing surveillance measures.
Below are some key aspects to consider:
Facial recognition: This is an area of huge importance to China’s legislators. On July 27, 2021, the Supreme People’s Court published a judicial interpretation on the use of facial recognition technologies for processing personal information, which requires companies to “disclose rules for the processing of facial information and expressly indicate the processing purpose, method, and scope”. The judicial interpretation also prohibits the use of “bundling consent (for processing the user’s facial information) with any other authorization”. The violation of this clause would be regarded as an “infringement upon the personality rights and interests of a natural person”. Data processors therefore need to consider creating standalone privacy notices for disclosing information related to facial recognition and obtaining the explicit separate consent for facial information processing, as previously described in the section on designing a privacy interface. In addition, an alternative option should be designed into the system if facial recognition is currently the sole option for authentication. Companies that have forced users to log on to a system, enter an office, or log attendance using facial recognition without offering any alternative have come under fire in recent years, and this clause seeks to address such misuses of personal information.
Fingerprints: Compared to facial recognition, the use of fingerprints for authentication has a much broader scope and is widely used for entrance into buildings and offices. As with facial recognition, fingerprint information falls under the category of sensitive personal information and is therefore subject to the same measures and considerations as facial recognition.
CCTV: It is common practice to deploy CCTV cameras around or inside office spaces, factories, and other business locations for security reasons. Monitoring data from CCTV cameras should be well managed, with access authorization given only to a limited number of people. More importantly, data collected from CCTV cameras should only be used for express purposes, such as security, and cannot be used for other purposes, such as marketing services. Data processors should adopt predefined policies to regulate CCTV data usage and access, especially for CCTV systems that upload data to an external vendor over the air.
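A predefined CCTV access policy of this kind can be expressed as an explicit allow-list of role-and-purpose pairs, so that any request outside the declared purpose fails by default. The roles and purposes below are hypothetical examples:

```python
# Illustrative CCTV access policy: only listed (role, purpose) pairs
# may touch the footage; everything else is denied by default.
AUTHORIZED = {
    ("security_officer", "security"),
    ("facility_manager", "security"),
}

def may_access_cctv(role, purpose):
    """Allow access only to authorized roles, and only for the declared
    purpose (e.g. security) — never for uses such as marketing."""
    return (role, purpose) in AUTHORIZED

print(may_access_cctv("security_officer", "security"))    # → True
print(may_access_cctv("marketing_analyst", "marketing"))  # → False
```

The deny-by-default shape matters: adding a new use of the footage requires an explicit policy change, which is the point at which purpose limitation can be reviewed.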
Considerations for data collection from third parties
The Data Security Law (DSL), effective on September 1, 2021, requires data processors to be responsible for the legitimacy of the data obtained from a third party.
There is a common practice for companies to ‘call-up’ or integrate existing SDKs from other parties into their own Android mobile application to provide better services to users, such as using third party authentication SDKs to enable single sign-on (SSO).
This is a simple way to obtain new functionalities or enhance the features of an app without having to spend more time and money on in-house development. However, this practice also leaves open the risk that the third-party SDK collects personal information and transfers it out, sometimes without the user or mobile app operator even knowing.
The data processor should conduct careful due diligence of a third-party SDK to guarantee its security and compliance before adopting it. Information on the third-party SDK, the purpose of its use, and the scope of personal information it collects, should also be disclosed to the users.
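One practical way to support that disclosure is to keep a machine-readable inventory of embedded SDKs and generate the privacy-notice entries from it, so the notice cannot silently drift out of sync with what is actually integrated. The SDK names and fields below are invented for illustration:

```python
# Hypothetical inventory of embedded third-party SDKs, used to generate
# the disclosure section of a privacy notice. All entries are examples.
THIRD_PARTY_SDKS = [
    {"name": "ExampleAuthSDK", "purpose": "single sign-on",
     "personal_info": ["phone number"]},
    {"name": "ExampleCrashSDK", "purpose": "crash reporting",
     "personal_info": ["device identifier"]},
]

def disclosure_lines(sdks):
    """Render one disclosure line per SDK: name, purpose, and the scope
    of personal information it collects."""
    return [
        f"{s['name']}: used for {s['purpose']}; "
        f"collects {', '.join(s['personal_info'])}"
        for s in sdks
    ]

for line in disclosure_lines(THIRD_PARTY_SDKS):
    print(line)
```

Reviewing this inventory during due diligence — and updating it whenever an SDK is added or upgraded — gives the compliance team a single source of truth for what third parties can see.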
Protecting users’ rights through compliant IT systems
The PIPL greatly limits many of the data misuses that have plagued Chinese consumers for years and goes to great lengths to protect users’ rights to privacy and control over their personal information.
As we have seen from recent crackdowns on mobile applications and online service providers, the PIPL is likely to be strictly implemented.
Compliance is therefore critical to remaining on the right side of the law. Given the public backlash against data misuse in China, maintaining fair and transparent data practices is also essential to a healthy relationship with your users and customers.
Building compliance into IT infrastructure and systems is key to achieving this goal. We hope that listing some of the common issues companies may come up against during operations helps raise awareness of the requirements, and that companies will take prompt action to protect the personal information of their users.
In the coming months, we will continue to publish new articles with suggestions on processes and best practices for complying with the PIPL and other data security laws.
China Briefing is written and produced by Dezan Shira & Associates. The practice assists foreign investors into China and has done so since 1992 through offices in Beijing, Tianjin, Dalian, Qingdao, Shanghai, Hangzhou, Ningbo, Suzhou, Guangzhou, Dongguan, Zhongshan, Shenzhen, and Hong Kong. Please contact the firm for assistance in China at email@example.com.
Dezan Shira & Associates has offices in Vietnam, Indonesia, Singapore, the United States, Germany, Italy, India, and Russia, in addition to our trade research facilities along the Belt & Road Initiative. We also have partner firms assisting foreign investors in the Philippines, Malaysia, Thailand, and Bangladesh.