THALES BLOG

The Relevance of Privacy-Preserving Techniques and Generative AI to DORA Legislation

October 29, 2024

Ollie Omotosho | Director, Global Strategic Partnerships

The increasing reliance on digital technologies has created a complex landscape of risks, especially in critical sectors like finance. To address these challenges, the European Union introduced the Digital Operational Resilience Act (DORA) in 2022, designed to ensure that financial entities can withstand and recover from cyber threats while maintaining operational continuity. At that time, generative AI was not a major consideration, and novel privacy-preserving techniques (PPT) did not feature heavily in a CSO's five-year budget plan. The world has changed. The responsible use of GenAI and the adoption of PPT now play a crucial role in aligning with DORA while safeguarding sensitive data. Balancing resilience with data privacy has become more complicated, and vendor and advisor partner ecosystems have never been more valuable.

DORA and Its Focus on Operational Resilience

DORA establishes a comprehensive framework for digital operational resilience in the financial sector. It aims to ensure that financial institutions, ranging from banks to payment processors, can manage and mitigate risks associated with information and communication technology (ICT). Key areas covered by DORA include incident reporting, regular ICT risk assessments, third-party risk management, and maintaining robust governance frameworks.

The legislation is built to fortify the financial sector against a wide range of threats, from cyberattacks to ICT failures. According to McKinsey, 94 percent of financial institutions are fully engaged in understanding the detailed requirements of the legislation; most are doing so through a dedicated DORA program, with DORA as a board-level agenda item. The global CrowdStrike incident saw thousands of flights grounded, surgeries cancelled, and customers locked out of online banking, all due to ICT outages. However, DORA also requires financial entities to process significant amounts of sensitive data, such as personal information and financial records. This introduces the challenge of balancing operational resilience with privacy compliance, especially in the context of data protection laws like the General Data Protection Regulation (GDPR). Privacy-preserving techniques and integration of AI technologies can help organizations navigate this regulatory landscape.

The Role of Privacy-Preserving Techniques in DORA Compliance

Privacy-preserving techniques allow organizations to protect sensitive information while still utilizing data for essential operations, analytics, and compliance efforts. As DORA emphasizes both resilience and cybersecurity, these techniques are indispensable in ensuring that the financial sector can uphold privacy standards while meeting operational resilience goals.

1. Encryption

Under DORA, encryption plays a key role in securing financial and personal data, particularly during the transmission and storage of data in ICT systems. It ensures that in the event of a cyber incident or breach, the underlying data remains protected, minimizing the risk of data leaks and ensuring compliance with both DORA and GDPR.

Recommendation: apply encryption according to the risk profile of the data and in line with company policy. A balance of cloud-native encryption and third-party tools such as CipherTrust Transparent Encryption will be required.
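The key-hierarchy pattern behind centrally managed encryption can be sketched in a few lines: a per-record data key (DEK) encrypts the data, and a master key (KEK, ideally held in an HSM or key manager) wraps the DEK. The sketch below is illustrative only; it substitutes a SHA-256 counter-mode keystream for the vetted AEAD cipher (e.g. AES-GCM) a real deployment would use.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data against a SHA-256 counter-mode keystream.
    Illustrative only -- use a vetted AEAD cipher (e.g. AES-GCM) in production."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Envelope encryption: a per-record data key (DEK) encrypts the data,
# and a master key (KEK) wraps the DEK.
kek = secrets.token_bytes(32)
dek = secrets.token_bytes(32)
nonce_data, nonce_key = secrets.token_bytes(16), secrets.token_bytes(16)

record = b"IBAN: DE89 3704 0044 0532 0130 00"
ciphertext = keystream_xor(dek, nonce_data, record)   # encrypt the record
wrapped_dek = keystream_xor(kek, nonce_key, dek)      # wrap the DEK under the KEK

# To decrypt: unwrap the DEK with the KEK, then decrypt the record.
plaintext = keystream_xor(keystream_xor(kek, nonce_key, wrapped_dek),
                          nonce_data, ciphertext)
print(plaintext == record)  # True
```

The value of the pattern is operational: rotating or revoking the KEK in a central key manager changes access to every wrapped DEK without re-encrypting the underlying data.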

2. Anonymization and Pseudonymization

Anonymization irreversibly transforms data so it can no longer be traced back to an individual, while pseudonymization replaces identifiers with pseudonyms that can be re-linked only with separately held information. These techniques are particularly useful for sharing data with third-party ICT providers or conducting incident reporting and assessments as required by DORA. By ensuring that no personally identifiable information is exposed, organizations can comply with DORA’s transparency requirements without infringing on GDPR’s data protection rules.

Recommendation: apply tokenization according to the risk profile of the data and in line with company policy. Be sure to understand whether format preservation is required for applications to continue to perform without disruption.
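A minimal sketch of deterministic pseudonymization, with a toy format-preserving variant, can make the distinction concrete. The secret key below is a hypothetical placeholder; it would be held separately from the data store, and real format-preserving encryption would use a standardized mode such as FF1 rather than this illustration.

```python
import hashlib
import hmac

# Hypothetical pseudonymization key -- keep it separate from the data store.
SECRET = b"demo-key-rotate-in-production"

def pseudonymize(value: str) -> str:
    """Deterministic pseudonym: the same input always maps to the same token,
    so joins across datasets still work, while re-linking to the original
    requires the separately held secret."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

def pseudonymize_digits(value: str) -> str:
    """Toy format-preserving variant for digit strings: the output keeps the
    same length and character class, so downstream validation still passes."""
    digest = hmac.new(SECRET, value.encode(), hashlib.sha256).digest()
    return "".join(str(digest[i % len(digest)] % 10) for i in range(len(value)))

print(pseudonymize("jane.doe@example.com"))     # 16-hex-character token
print(pseudonymize_digits("4111111111111111"))  # 16-digit token
```

The format-preserving variant is what keeps legacy applications working: a 16-digit token still passes length and type checks where a hex string would not.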

3. Homomorphic Encryption and Zero-Knowledge Proofs (ZKP)

Advanced privacy techniques like homomorphic encryption and zero-knowledge proofs (ZKP) are becoming more relevant under DORA’s ICT risk management framework. Homomorphic encryption allows financial institutions to perform computations on encrypted data, which is useful for running risk analyses or audits while maintaining the confidentiality of sensitive information. ZKP enables one party to prove they possess certain information (such as compliance with a security protocol) without revealing the actual data, which is ideal for secure audits and compliance reporting in line with DORA.

Recommendation: evaluate the use of these innovative techniques within research environments. Then combine with deployment of classic encryption and tokenization. Underpin technologies with a unified centralized key management regime where appropriate. Remember that no one size fits all and a robust regime will be a blend of techniques.
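To make the homomorphic idea concrete, here is a textbook Paillier sketch: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so an auditor could total encrypted exposures without ever decrypting them. The tiny primes are for illustration only; production schemes need moduli of at least 2048 bits and a vetted library.

```python
import math
import secrets

# Textbook Paillier with tiny demo primes -- illustrative only.
p, q = 1117, 1123
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)  # since L(g^lam mod n^2) = lam mod n when g = n + 1

def encrypt(m: int) -> int:
    """Encrypt m < n with fresh randomness r coprime to n."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Standard Paillier decryption: L(c^lam mod n^2) * mu mod n."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: ciphertext multiplication = plaintext addition.
c1, c2 = encrypt(425), encrypt(310)
print(decrypt((c1 * c2) % n2))  # 735
```

Note this scheme is additively homomorphic only; fully homomorphic encryption, which also supports multiplication of encrypted values, remains far more computationally expensive.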

The Impact of Generative AI on DORA Compliance

Generative AI, a subset of artificial intelligence that can create new content such as text, images, and even code, is becoming increasingly prevalent in the financial sector. Applications of generative AI range from automating customer service to generating predictive models for risk management. However, the introduction of such powerful tools presents unique challenges in the context of DORA compliance, especially concerning operational resilience and privacy.

1. Generative AI for Cybersecurity and Risk Management

Generative AI can significantly enhance cybersecurity and risk management efforts, both of which are central to DORA’s operational resilience requirements. AI models can identify patterns in vast datasets to detect potential vulnerabilities and anticipate cyber threats. AI-driven systems can continuously monitor network activities, recognizing anomalies that could signal an impending attack. This proactive approach to cybersecurity aligns directly with DORA’s emphasis on managing ICT risks and responding swiftly to operational disruptions.
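The statistical baselining such monitoring relies on can be illustrated with a simple z-score detector; real AI-driven systems learn far richer, per-entity baselines, but the principle of flagging deviations from normal behavior is the same. The traffic figures are made-up sample data.

```python
from statistics import mean, stdev

def flag_anomalies(samples, threshold=2.0):
    """Flag observations more than `threshold` standard deviations from the
    mean -- a minimal stand-in for the baselining that AI-driven monitoring
    performs at scale. Thresholds would be tuned per signal in practice."""
    mu, sigma = mean(samples), stdev(samples)
    return [x for x in samples if abs(x - mu) > threshold * sigma]

# Requests per minute to an internal API; the spike is the anomaly.
traffic = [120, 118, 125, 122, 119, 121, 124, 980, 123, 120]
print(flag_anomalies(traffic))  # [980]
```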

Generative AI also plays a key role in conducting advanced risk assessments. AI algorithms can simulate potential cyberattacks, model their impact, and help organizations develop robust incident response strategies. This predictive capability ensures that financial entities are better prepared for emerging risks, strengthening compliance with DORA’s resilience mandates.
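A Monte Carlo loop is one simple way to model the impact of simulated incidents; the sketch below draws a number of ICT incidents per year and a loss per incident, then reports a 95th-percentile annual loss. The frequencies and severity parameters are hypothetical placeholders, not calibrated figures.

```python
import random

def simulate_annual_loss(runs=10_000, seed=42):
    """Monte Carlo sketch of scenario-based ICT risk assessment: sample
    incident counts and per-incident losses, return the 95th-percentile
    annual loss across all simulated years."""
    rng = random.Random(seed)
    losses = []
    for _ in range(runs):
        # Hypothetical frequency distribution: most years see 0-1 incidents.
        incidents = rng.choices([0, 1, 2, 3], weights=[50, 30, 15, 5])[0]
        # Hypothetical heavy-tailed loss per incident.
        losses.append(sum(rng.lognormvariate(11, 1.0) for _ in range(incidents)))
    losses.sort()
    return losses[int(0.95 * runs)]

print(f"95th-percentile annual loss: EUR {simulate_annual_loss():,.0f}")
```

In practice the frequency and severity distributions would be fitted to incident history and threat intelligence, which is where AI-assisted modeling adds value over hand-picked parameters.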

Recommendation: utilize threat analytics, data monitoring, and compliance auditing tools such as those provided by Thales Imperva Data Security Fabric. Use other platforms to simulate incident response, and automate manual processes where possible to feed predictive incident response strategies.

2. Privacy Concerns with Generative AI

While generative AI holds significant promise, it also introduces new privacy risks, particularly when models are trained on sensitive financial and personal data. AI models can unintentionally expose underlying data through outputs, potentially revealing confidential information. For instance, a language model trained on customer interaction data might generate responses that include personally identifiable information (PII).

To address these concerns, organizations need to adopt privacy-preserving AI techniques. One option is to process data and train models within a confidential computing secure enclave: data and models can be encrypted before entering the enclave and decrypted only inside an enclave that has passed attestation, underpinned by cryptographic keys owned by the financial institution. Differential privacy, a technique that ensures the outputs of generative AI models do not reveal individual data points from the training set, can further mitigate privacy risks. By embedding end-to-end confidential computing and differential privacy in generative AI training processes, financial institutions can comply with both DORA’s operational requirements and GDPR’s data protection rules, ensuring that AI models are both effective and privacy-conscious.
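The core differential privacy mechanism can be sketched concretely: clamp each value to bound the query's sensitivity, then add Laplace noise calibrated to sensitivity divided by the privacy budget epsilon. The account balances below are made-up sample data.

```python
import math
import random

def laplace_noise(scale: float, rng: random.Random) -> float:
    """Sample from Laplace(0, scale) via the inverse CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))

def dp_mean(values, lower, upper, epsilon, rng):
    """Differentially private mean: clamping to [lower, upper] bounds how
    much any single record can shift the result (the sensitivity), and the
    Laplace noise is calibrated to that bound over epsilon."""
    clamped = [min(max(v, lower), upper) for v in values]
    sensitivity = (upper - lower) / len(values)
    true_mean = sum(clamped) / len(clamped)
    return true_mean + laplace_noise(sensitivity / epsilon, rng)

rng = random.Random(7)
balances = [1200, 800, 15000, 430, 2200, 990, 310, 5600]
print(dp_mean(balances, 0, 20000, epsilon=1.0, rng=rng))
```

Smaller epsilon means stronger privacy but noisier answers; choosing and accounting for the total privacy budget across queries is the hard operational part.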

Recommendation: Thales End-to-End Data Protection, a collaboration with Intel and Microsoft that creates a privacy-preserving generative AI pipeline using a cloud-based secure enclave, encryption, and attestation with enterprise-owned encryption keys, is appropriate for confidential and responsible generative AI usage.

3. AI-Driven Automation and Incident Reporting

Generative AI can streamline DORA’s incident reporting process, enabling financial institutions to automatically generate comprehensive reports on ICT-related incidents. This includes assessing the impact, identifying vulnerabilities, and recommending remediation steps. AI-driven automation not only accelerates the reporting process but also ensures accuracy, reducing the risk of human error in critical compliance tasks.
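One way to keep AI-assisted reporting auditable is to generate the report skeleton from structured incident fields and reserve free-text expansion for the model. A minimal templated sketch, with hypothetical incident data:

```python
from datetime import datetime, timezone

def draft_incident_report(incident: dict) -> str:
    """Assemble a draft ICT incident report from structured fields. In a
    generative-AI pipeline an LLM would expand each section, but keeping the
    skeleton templated keeps the output traceable for regulators."""
    lines = [
        f"ICT INCIDENT REPORT -- {incident['id']}",
        f"Detected: {incident['detected_at']}",
        f"Classification: {incident['classification']}",
        f"Impact: {incident['impact']}",
        "Remediation steps:",
    ]
    lines += [f"  {i}. {step}" for i, step in enumerate(incident["remediation"], 1)]
    lines.append(f"Drafted: {datetime.now(timezone.utc).isoformat()} "
                 "(requires human review)")
    return "\n".join(lines)

report = draft_incident_report({
    "id": "INC-2024-0173",
    "detected_at": "2024-10-12T03:41Z",
    "classification": "major (reporting threshold met)",
    "impact": "Payment API degraded for 47 minutes; no data loss confirmed.",
    "remediation": ["Rolled back faulty configuration",
                    "Rotated affected API keys"],
})
print(report)
```

The explicit "requires human review" marker reflects the governance point below: automation drafts, but accountability stays with a named human.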

However, transparency is critical. Organizations must ensure that AI-generated reports remain transparent and interpretable to human regulators and auditors. This aligns with DORA’s governance standards, which emphasize accountability in ICT risk management and resilience measures.

Navigating the Future of Resilience with AI and Privacy

The future of financial services will undoubtedly be shaped by AI and data-driven innovations. For financial institutions adopting AI to enhance their operational resilience, the integration of privacy-preserving methods, such as encryption, anonymization, and differential privacy, ensures that AI does not become a privacy liability. Further techniques, such as federated learning and fully homomorphic encryption, are maturing, and it is important to track these developments.

However, as the use of generative AI expands, maintaining compliance with DORA’s resilience requirements and privacy regulations like GDPR will require a thoughtful balance of innovation and privacy. By implementing robust privacy-preserving techniques alongside responsible AI practices, financial institutions can build a secure, resilient, and compliant digital future.

Learn more about DORA compliance and how Thales can help you prepare.