As AI becomes commonplace in our daily lives, the Personal Data Protection Commission (PDPC) has released a set of guidelines on the use of personal data in AI systems to strike a balance between the development of AI systems and the protection of consumer rights to privacy, effective 1 March 2024.
The Personal Data Protection Act 2012 (PDPA) regulates the collection, use and disclosure of personal data, balancing individual privacy rights with organisational needs to collect and use data for reasonable purposes.
In today’s digital economy, AI analytics requires vast amounts of data, making it impractical for organisations to seek fresh consent for every new purpose. The PDPA accommodates this by providing exceptions to consent and mechanisms for deemed consent.
The Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems clarify when organisations can use personal data for AI, ensuring consumer trust in AI-driven decisions and recommendations.
The guidelines are organised according to the typical stages of AI system implementation:
- Development, Testing and Monitoring: using personal data for training and testing AI systems, and monitoring performance post-deployment
- Deployment (B2C provision of AI systems): collecting and using personal data in systems that are already deployed
- Procurement (B2B provision of AI systems): service providers using personal data in organisations’ possession
Development, Testing and Monitoring Stage
Is Consent Required to Train AI?
Organisations may use personal data to train AI systems only if consent has been obtained, unless an exception under the PDPA applies. The two main exceptions are the Business Improvement Exception and the Research Exception.
Business Improvement Exception
This applies when organisations are developing a new product, service, process or enhancing an existing one. This exception also caters for intragroup and intracompany sharing of personal data.
For example:
- Social media recommendation engines that suggest personalised content
- Job assignment systems that automatically assign jobs to platform workers
Research Exception
This applies when organisations conduct commercial research to develop new AI systems that advance science and engineering and have public benefit, even if there is no immediate application. It also caters for intragroup and intracompany sharing of personal data, as well as sharing between unrelated companies that jointly conduct commercial research to develop new AI systems. For example:
- AI System that improves understanding and development of science and engineering
- AI System to increase innovation in products or services that benefit society by improving the quality of life
Best Practices
When personal data is used, organisations should ensure that it is sufficiently protected. Steps to do so may include:
- Accountability Obligations
- Establishing and updating policies and practices for using personal data
- Technical, Process and Legal Controls
- Implementing appropriate controls when using personal data
- Use Anonymised Data Sets
- Anonymising datasets as far as possible with controls to prevent re-identification
- Security Measures
- Implementing protection measures within the development environment
- Privacy by Design
- Adopting a proactive approach in design and assessment to mitigate privacy risks and attacks
- Data Minimisation
- Using only personal data necessary for training and improving AI systems to reduce security risk
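The anonymisation and data minimisation practices above can be illustrated with a minimal sketch. This is a hypothetical Python example, not from the guidelines: the field names, the `pseudonymise` helper and the salt are all illustrative, and salted hashing is pseudonymisation rather than full anonymisation, so organisations would still need controls against re-identification (for example, secure salt management and aggregation).

```python
import hashlib

def pseudonymise(record, id_fields, needed_fields):
    """Replace direct identifiers with salted hashes and keep only
    the fields needed for training (data minimisation)."""
    salt = "rotate-this-salt-regularly"  # illustrative only; store and rotate securely
    out = {}
    for field in needed_fields:
        value = record[field]
        if field in id_fields:
            # One-way hash so the raw identifier never enters the training set
            value = hashlib.sha256((salt + str(value)).encode()).hexdigest()
        out[field] = value
    return out

user = {
    "user_id": "U1042",
    "email": "a@example.com",
    "viewing_history": ["Movie A", "Movie B"],
    "payment_card": "4111-xxxx",
}

# Keep only what the model needs; email and payment details are dropped entirely
training_row = pseudonymise(
    user, id_fields={"user_id"}, needed_fields=["user_id", "viewing_history"]
)
print(training_row)
```

The design point is that minimisation happens before the data reaches the development environment: fields that are never copied cannot leak, and hashed identifiers still allow records to be linked across datasets without exposing the underlying identity.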
Deployment Stage (B2C)
When organisations deploy AI systems that collect and use personal data to provide new functionalities or enhance product features, they must comply with the Consent, Notification and Accountability Obligations under the PDPA.
Consent and Notification Obligations
Consent is required, unless an exception under the PDPA applies. When seeking consent, organisations must notify users of the types of personal data that will be collected, and the purpose and intended use of that data.
To ensure informed and meaningful consent, notifications should not be overly technical or detailed. Organisations can consider layering information and displaying the most relevant information prominently, whilst additional details can be linked elsewhere so that users can read up more if desired.
Example Case Study: Movie Streaming Service
Organisations should consider the perspective of consumers and create notifications that help users understand how their personal data will be processed to achieve the intended purpose.
A movie streaming service could include the following information in a notification to its users:
- Function of the product that requires use of personal data
- Recommendation of Movies
- General description of types of personal data to be collected
- Movie Viewing History
- Explanation of product data relevance to product feature
- Analysis of Viewing History to make movie recommendations
- Specific features of personal data more likely to influence product feature
- Whether the movie was viewed completely/multiple times
Legitimate Interest Exception
An exception to consent applies when personal data is collected, used or disclosed in the legitimate interest of an organisation, and that interest outweighs any adverse effect on the individual. When this exception applies, the organisation may process personal data without obtaining consent. Examples where this exception applies:
- Personal data as input in an AI system to detect or prevent illegal activities
- Personal data used for evaluative purposes, such as the managing or terminating of an employment relationship.
Accountability Obligation
The guidelines also encourage organisations to ensure that they fulfil their responsibility for personal data that they have collected or obtained for processing, or that is under their control. Examples of how organisations can do so include:
- Develop written policies to demonstrate internal structures and operational practices that ensure responsible use of personal data
- Inform users about the use of AI systems, with the level of detail provided proportional to the risks associated with each specific use case
- Make written policies available upon request. Consider pre-emptively publishing the policies on your organisation’s website in a simple and concise format.
- Provide more information on data quality and governance during AI System Development.
Procurement Stage (B2B)
The guidelines provide guidance to service providers engaged by organisations to develop and deploy bespoke or fully customisable AI systems. These service providers are considered data intermediaries when they process personal data on behalf of their customers. The guidelines also outline the obligations of the engaging organisations; however, they are not relevant to organisations that develop AI systems in-house or that use commercially available off-the-shelf solutions.
Obligations of Service Providers
Service providers must also comply with applicable obligations under the PDPA, including the Protection and Retention Obligations, and are required to notify the organisation if they suspect that a data breach has occurred. They are also encouraged to support their clients in meeting their Consent, Notification and Accountability Obligations. Some examples of how they may provide support include:
- Being familiar with the types of information the commissioning organisation is likely to need, based on its needs and the impact the AI system will have on its end-users
- Adopting practices such as data mapping and data labelling to keep track of the data used to form the training dataset during the pre-processing stage
- Maintaining a provenance record to document how the data has been transformed during data preparation
- Providing training to ensure a thorough understanding of the system’s operations, including technical clarifications on how decisions or recommendations are reached
- Designing systems that facilitate the extraction of information relevant to clients’ PDPA obligations
While service providers support organisations in meeting their Consent, Notification and Accountability Obligations, the organisations themselves bear primary responsibility for ensuring that the AI systems they use comply with their obligations under the PDPA.
If you need clarification on how these guidelines apply to your business or want to know if you can rely on an exception for your use case, please reach out to us for a discussion.