April 9, 2025, 8:33 AM
The Spring Meeting is the largest gathering of competition, consumer protection, and data privacy professionals globally, with lawyers, academics, economists, enforcers, journalists, and students from around the world. During the Spring Meeting, Axinn associates attended thought leadership panels to capture key insights. Below are top takeaways from the “Health, Privacy, and AI” panel that businesses should keep on their radar.
To read all the articles in the series, click here.
The February 2024 ransomware attack on Change Healthcare was the largest healthcare data breach in U.S. history. The attack disrupted operations—impacting patient care and provider finances—and potentially exposed the personal, health, and financial information of approximately 190 million people. The exposed information includes names, addresses, dates of birth, phone numbers, email addresses, health insurance details, medical records, billing information, and social security numbers.
Cyberattacks of this magnitude raise questions related to compliance with applicable laws and regulations, and the mitigation of risks associated with the collection, storage, and use of personally identifiable information (PII) and protected health information (PHI). At the same time, businesses are under extreme pressure to continue to innovate, including through the use of AI, which requires vast amounts of data for training and inference.
- Having a plan for your data can help mitigate risks. Companies that collect, store, and use large amounts of PII or PHI should have a clear plan, based on their needs and risk profile, for where they store the data (in the cloud or on-premises), how that data is ported, how much data is collected, and whether data is de-identified. Additionally, companies should have well-documented data governance in place, which can mitigate legal risks in the event of an investigation or litigation.
- Data privacy due diligence is imperative during M&A. Merging parties, as well as companies acquiring another's assets, should complete data privacy due diligence to identify any potential legal risks related to the sale or transfer of PHI and PII. These risks include noncompliance with applicable laws and regulations, breach of contractual obligations, and exposure to private causes of action.
- Risks include algorithmic disgorgement. The financial cost of skipping data privacy due diligence may be substantial; among the legal risks to companies is the possibility that investments in artificial intelligence are lost to algorithmic disgorgement, a remedy that requires the removal of improperly obtained data from algorithms and AI models, or the destruction of models trained on that data. The FTC has employed algorithmic disgorgement as a legal remedy, first in its settlement with Cambridge Analytica and most recently in its settlement with Rite Aid Corporation.

To subscribe to our publications, click here.
