April 9, 2025, 8:33 AM
The Spring Meeting is the largest gathering of competition, consumer protection, and data privacy professionals in the world, drawing lawyers, academics, economists, enforcers, journalists, and students from around the globe. During the Spring Meeting, Axinn associates attended thought leadership panels to capture key insights. Below are the top takeaways from the “Health, Privacy, and AI” panel that businesses should keep on their radar.
To read all the articles in the series, click here.
The February 2024 ransomware attack on Change Healthcare was the largest healthcare data breach in U.S. history. The attack disrupted operations—impacting patient care and provider finances—and potentially exposed the personal, health, and financial information of approximately 190 million people. The exposed information includes names, addresses, dates of birth, phone numbers, email addresses, health insurance details, medical records, billing information, and Social Security numbers.
Cyberattacks of this magnitude raise questions about compliance with applicable laws and regulations and about how to mitigate the risks associated with collecting, storing, and using personally identifiable information (PII) and protected health information (PHI). At the same time, businesses are under intense pressure to continue innovating, including through the use of AI, which requires vast amounts of data for training and inference.
- Having a plan for your data can help mitigate risks. Companies that collect, store, and use large amounts of PII or PHI should have a clear plan, tailored to their needs and risk profile, for where the data is stored (in the cloud or on premises), how it is transferred, how much is collected, and whether it is de-identified. Companies should also maintain well-documented data governance policies, which can mitigate legal risks in the event of an investigation or litigation.
- Data privacy due diligence is imperative during M&A. Merging parties, as well as companies acquiring another company's assets, should conduct data privacy due diligence to identify potential legal risks related to the sale or transfer of PHI and PII. Those risks include noncompliance with applicable laws and regulations, breach of contractual obligations, and exposure to private causes of action.
- Risks include algorithmic disgorgement. The financial cost of skipping data privacy due diligence can be substantial; among the legal risks is the possibility that a company's investments in artificial intelligence are wiped out by algorithmic disgorgement, which involves the removal of data from algorithms and AI models, or the forced destruction of algorithms and AI models trained on improperly obtained data. The FTC has employed algorithmic disgorgement as a remedy, first in its settlement with Cambridge Analytica and most recently in its settlement with Rite Aid Corporation.
