The healthcare sector is adopting AI faster than almost any other industry — and facing a compliance landscape that hasn't fully caught up. Two trends are creating a dangerous gap for small and mid-size practices: the rollout of HHS's new Cybersecurity Performance Goals (CPGs) and the explosive adoption of AI ambient scribes. Together, they represent the most significant HIPAA compliance exposure healthcare providers have faced in years.

If your practice has adopted any AI tool — ambient scribing, clinical decision support, patient communication, or documentation assistance — you need to understand what changed and what's at stake.

Note: This article provides general guidance on AI security and HIPAA-adjacent cybersecurity requirements. It is not legal advice. For specific HIPAA compliance decisions, consult your healthcare attorney and compliance officer.

What Are the HHS Cybersecurity Performance Goals?

In January 2025, HHS published voluntary Cybersecurity Performance Goals (CPGs) for healthcare organizations, developed with CISA. In early 2026, proposed rulemaking began to make several Essential CPGs mandatory under HIPAA's Security Rule. The comment period closed in February 2026, and enforcement guidance is expected before the end of Q2 2026.

The CPGs divide into two tiers:

| Goal | Description | Tier |
| --- | --- | --- |
| Email Security (MFA) | Multi-factor authentication on email and core systems | Essential |
| Basic Cybersecurity Training | Annual security awareness training for all staff | Essential |
| Basic Incident Response | Documented IR plan, tested annually | Essential |
| Unique Credentials | No shared login credentials for any system containing ePHI | Essential |
| Patch Management | Critical patches applied within 30 days | Essential |
| Vendor/Supply Chain Risk Management | Security assessments of third-party vendors handling ePHI | Enhanced |
| Third-Party Assessment | Annual independent security review | Enhanced |
| Cybersecurity Insurance | Documented cyber coverage appropriate to organization size | Enhanced |

The Essential CPGs are where small practices are most exposed. Most independent practices already have some version of MFA and training — but the "Unique Credentials" and "Basic Incident Response" requirements are failing conditions in the majority of small practice audits. And critically: AI tools are now explicitly part of the vendor risk management requirement.
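The Essential goals above amount to a five-item checklist a practice can track directly. As a minimal sketch, here is one way to record and report gaps in Python (the pass/fail answers below are hypothetical placeholders, not an assessment of any real practice):

```python
# Self-assessment against the Essential CPGs listed in the table above.
# The True/False values are illustrative placeholders; replace them with
# your practice's actual answers.
ESSENTIAL_CPGS = {
    "Email Security (MFA)": True,
    "Basic Cybersecurity Training": True,
    "Basic Incident Response": False,   # a common gap in small practices
    "Unique Credentials": False,        # a common gap in small practices
    "Patch Management": True,
}

def cpg_gaps(answers: dict) -> list:
    """Return the Essential goals the practice does not yet meet."""
    return [goal for goal, met in answers.items() if not met]

gaps = cpg_gaps(ESSENTIAL_CPGS)
print(f"{len(gaps)} Essential CPG gap(s): {', '.join(gaps)}")
# prints: 2 Essential CPG gap(s): Basic Incident Response, Unique Credentials
```

Even a spreadsheet version of the same checklist works; the point is that each Essential goal should have a documented, dated yes/no answer before an audit asks for one.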

The AI Ambient Scribe Problem

Ambient scribing tools — AI that listens to patient-provider conversations and generates clinical documentation — have become the fastest-growing AI category in healthcare. Tools like Suki, Nabla, Nuance DAX, and a growing number of EHR-integrated solutions are now in use at tens of thousands of practices.

The appeal is obvious: ambient scribes reduce documentation time by 30-50%, decrease after-hours charting, and improve note quality. The HIPAA risk is equally obvious — these tools process Protected Health Information (PHI) in real-time audio form, which is among the most sensitive data categories HIPAA regulates.

The BAA Gap

A Business Associate Agreement (BAA) is legally required under HIPAA any time a third party processes, stores, or transmits PHI on your behalf. Every ambient scribe vendor must have a signed BAA with your practice before you activate the tool.

In a 2025 survey of practices using ambient scribes, 31% had not completed a BAA with their scribe vendor. This is a per-incident HIPAA violation. Fines for missing BAAs start at $100 per violation and can reach $50,000 per violation in cases of willful neglect, subject to annual caps per violation category that run into the millions of dollars — and every day of non-compliant use can count as a separate violation.

Action item: If you are using an ambient scribe or any AI tool that processes patient conversations, billing data, or clinical records — pull up your vendor contracts today and confirm a signed BAA exists. If you can't find one, contact your vendor immediately. Most can execute a standard BAA within 24 hours.

Where Ambient Scribe Data Goes

Not all ambient scribe vendors handle data the same way. At a minimum, ask any vendor:

- Is the audio recording stored, or processed transiently and discarded?
- Is patient data used to train or improve the vendor's models?
- Where is the data stored, and how long is it retained?
- Which subprocessors touch the data, and are they covered in the BAA chain?

Shadow AI in Healthcare: The Specific Risks

Beyond ambient scribes, the shadow AI problem is particularly acute in healthcare. Clinical staff routinely use personal AI tools to assist with:

- drafting patient letters and portal message replies
- summarizing or rewording visit notes
- answering coding and billing questions
- simplifying or translating discharge instructions

Each of these scenarios potentially constitutes an unauthorized disclosure of PHI — a HIPAA violation regardless of whether harm results. And unlike a traditional data breach, there's no clear notification mechanism: the data went into a third-party system, and neither you nor your patient may ever know.

The June 2026 Compliance Cliff

HHS's proposed timeline puts enhanced HIPAA Security Rule enforcement — including the AI vendor requirements — in effect in mid-2026. Here's what that means practically:

- The Essential CPGs shift from voluntary goals to audit criteria.
- A missing BAA or a shared login goes from a best-practice gap to a citable violation.
- Every AI vendor that touches PHI falls under the vendor risk management requirement, with documentation expected on request.

Small practices are specifically targeted in HHS's enforcement prioritization. The reasoning: large health systems have compliance teams that respond to regulatory changes; small practices typically don't, creating a population of predictable non-compliers whose enforcement actions serve as a deterrent for the rest of the industry.

What to Do Before June 2026

1. Complete a BAA audit

List every third-party vendor that accesses, processes, or stores patient data. Confirm a signed BAA exists for each. Ambient scribes, AI clinical decision tools, telehealth platforms, and billing software that touches PHI all require BAAs. This should take one week.
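One lightweight way to run this audit is a vendor inventory with two columns — does the vendor touch PHI, and is a BAA on file. A sketch under assumed column names (the CSV content and vendor names here are hypothetical):

```python
import csv
import io

# Hypothetical vendor inventory; in practice this would be exported from
# your contract management system or maintained as a shared spreadsheet.
inventory_csv = """vendor,touches_phi,baa_signed
AmbientScribeCo,yes,no
TelehealthPlatform,yes,yes
BillingSoftware,yes,yes
OfficeScheduler,no,no
"""

def missing_baas(csv_text: str) -> list:
    """Vendors that touch PHI but have no signed BAA on file."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [r["vendor"] for r in rows
            if r["touches_phi"] == "yes" and r["baa_signed"] != "yes"]

print(missing_baas(inventory_csv))  # prints: ['AmbientScribeCo']
```

Every vendor the function flags is a remediation item for this week, not this quarter.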

2. Conduct an AI tool inventory

Use the anonymous survey approach from our AI security checklist. You need to know what AI tools your clinical and administrative staff are actually using — not just what you've approved. Clinical staff in particular are high-risk for shadow AI use because documentation pressure is high and the tools are genuinely useful.

3. Document vendor security assessments

For each AI vendor you're using, obtain their security documentation: SOC 2 Type II report, HITRUST certification if available, penetration test summary, and their HIPAA compliance attestation. File these in your compliance records. If a vendor can't provide any of these, that's a signal to find a replacement.
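The same inventory approach works for the security documentation itself. A minimal sketch that checks each vendor's file against the documents named above (the vendors and their records here are hypothetical):

```python
# Documents the text above says to request from each AI vendor.
# HITRUST is treated as optional ("if available"), so it is not required.
REQUIRED_DOCS = {
    "SOC 2 Type II",
    "penetration test summary",
    "HIPAA compliance attestation",
}

# Hypothetical compliance records for two example vendors.
vendors = {
    "ScribeVendorA": {"SOC 2 Type II", "HITRUST",
                      "HIPAA compliance attestation",
                      "penetration test summary"},
    "ScribeVendorB": {"HIPAA compliance attestation"},
}

def doc_gaps(on_file: set) -> set:
    """Required documents not yet in the compliance file for a vendor."""
    return REQUIRED_DOCS - on_file

for name, docs in vendors.items():
    gaps = doc_gaps(docs)
    status = "complete" if not gaps else f"missing: {sorted(gaps)}"
    print(f"{name}: {status}")
```

A vendor with persistent gaps after a documented request is, as the text notes, a signal to find a replacement.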

4. Update your staff training

Annual HIPAA training should explicitly cover AI-specific rules: what tools are approved, what patient data cannot go into personal AI tools, and what to do if they inadvertently share PHI with an unapproved system. Frame it as protection, not punishment.

5. Review your cyber insurance policy

Pull your cyber policy and check whether AI-related breaches are explicitly covered or excluded. Many 2024-vintage policies predate AI governance requirements and are ambiguous on AI incidents. If your policy renews before June, negotiate explicit AI breach coverage as a condition of renewal.

6. Conduct or commission a gap assessment

An independent security assessment against the CPGs and your current AI tool usage will identify specific gaps and prioritize remediation. For small practices, a focused assessment takes 1-2 days and costs significantly less than a single HIPAA fine.

The Bottom Line

Healthcare providers who have adopted AI tools without updating their HIPAA compliance posture are sitting on unreported violations. The question isn't whether the rules apply — they do — but whether an audit will surface the exposure before or after an incident forces the issue.

The practices that get ahead of this in the next 30-60 days will be compliant when enforcement ramps up. The ones that don't will be the examples in the next OCR press release.

Get Your HIPAA AI Compliance Gap Analysis

AICyberNav's free assessment includes a healthcare-specific compliance path — covering BAA gaps, AI tool exposure, CPG alignment, and prioritized remediation steps tailored to practice size and specialty.

Start Free Compliance Assessment →
No signup. Results in under 10 minutes. Healthcare-specific risk scoring.

Also see: AI Security Checklist for Small Businesses for the broader governance framework that applies beyond the healthcare context.