April 9, 2025, 8:33 AM
The Spring Meeting is the largest gathering of competition, consumer protection, and data privacy professionals globally, with lawyers, academics, economists, enforcers, journalists, and students from around the world. During the Spring Meeting, Axinn associates attended thought leadership panels to capture key insights. Below are top takeaways from the “Health, Privacy, and AI” panel that businesses should keep on their radar.
To read all the articles in the series, click here.
The February 2024 ransomware attack on Change Healthcare was the largest healthcare data breach in U.S. history. The attack disrupted operations, impacting patient care and provider finances, and potentially exposed the personal, health, and financial information of approximately 190 million people. The exposed information includes names, addresses, dates of birth, phone numbers, email addresses, health insurance details, medical records, billing information, and Social Security numbers.
Cyberattacks of this magnitude raise questions related to compliance with applicable laws and regulations, and the mitigation of risks associated with the collection, storage, and use of personally identifiable information (PII) and Protected Health Information (PHI). At the same time, businesses are under extreme pressure to continue to innovate, including through the use of AI, which requires vast amounts of data for training and inference.
- Having a plan for your data can help mitigate risks. Companies that collect, store, and use large amounts of PII or PHI should have a clear, risk-based plan covering where the data is stored (whether in the cloud or on premises), how it is transferred, how much is collected, and whether it is de-identified. Companies should also have well-documented data governance practices in place, which can mitigate legal risks in the event of an investigation or litigation.
- Data privacy due diligence is imperative during M&A. Merging parties, as well as companies acquiring another company's assets, should conduct data privacy due diligence to identify any potential legal risks related to the sale or transfer of PHI and PII. These risks include noncompliance with applicable laws and regulations, breach of contractual obligations, and exposure to private causes of action.
- Risks include algorithmic disgorgement. The financial cost of skipping data privacy due diligence can be substantial; among the legal risks is the possibility that a company's investments in artificial intelligence are lost to algorithmic disgorgement. Algorithmic disgorgement involves the removal of data from algorithms and AI models, or the forced destruction of algorithms and AI models trained on certain data. The FTC has employed algorithmic disgorgement as a legal remedy, first in its settlement with Cambridge Analytica and most recently in its settlement with Rite Aid Corporation.
