Incident Response

FFIEC 36-Hour Incident Notification Rule: What Banking Organizations Must Report, When, and to Whom

May 14, 2026 · Rebecca Leung

TL;DR

  • Banking organizations must notify their primary federal regulator within 36 hours of determining a qualifying “notification incident” — not of first detecting a potential incident.
  • The threshold is a “notification incident”: an event that has materially disrupted — or is reasonably likely to materially disrupt — banking operations, product delivery, or financial stability.
  • Bank service providers face a separate, narrower obligation: notify affected banking organization customers (not regulators) as soon as possible when disruptions last four or more hours.
  • This is an operational resilience rule, not a consumer breach notification rule — it runs parallel to your state breach notification obligations, not instead of them.

Your CISO just walked into your office. There’s been a ransomware attack. Backup systems are down. Core banking operations are impaired. Three things need to happen in the next few hours: containment, executive escalation, and — somewhere on a checklist that probably isn’t short enough — notifying your federal regulator.

That last one is exactly what the computer-security incident notification rule requires. And whether it’s the first thing your team reaches for or the last, it has a hard deadline.

Background: What the Rule Is and Where It Came From

In November 2021, the OCC, Federal Reserve, and FDIC jointly finalized the Computer-Security Incident Notification rule. The rule took effect April 1, 2022, with full compliance required by May 1, 2022. It is codified at 12 CFR Part 53 (OCC), 12 CFR Part 225 (Fed), and 12 CFR Part 304 (FDIC).

The agencies framed the purpose plainly: regulators need early awareness of significant cyber and operational events so they can assess systemic risk, deploy examiner resources, and coordinate responses before problems compound. The rule wasn’t designed to punish banks for getting attacked. It was designed to ensure regulators aren’t the last to know.

This is not your state breach notification rule. It’s not the SEC’s cybersecurity disclosure requirement. It runs parallel to those obligations — and in a real incident, you’ll likely be managing multiple notification timelines simultaneously.

The Two-Tier Framework: Banking Organizations vs. Bank Service Providers

The rule creates two distinct notification obligations that practitioners frequently conflate.

Tier 1: Banking Organizations

A banking organization supervised by the OCC, Federal Reserve, or FDIC must notify its primary federal regulator as soon as possible and no later than 36 hours after the banking organization determines that a notification incident has occurred.

Two elements carry the most weight here.

What is a “notification incident”? Not every cyberattack or system failure qualifies. Under the rule, a notification incident is a computer-security incident that has materially disrupted or degraded — or is reasonably likely to materially disrupt or degrade — any of the following:

  1. The banking organization’s ability to carry out banking operations, activities, or processes, or deliver banking products and services to a material portion of its customer base
  2. A business line that, upon failure, would result in a material loss of revenue, profit, or franchise value
  3. Operations whose failure would pose a threat to the financial stability of the United States

The agencies provided examples in the final rule preamble: a major computer-system failure, a DDoS attack that disrupts customer account access for an extended period, ransomware hitting a core banking system. What’s excluded: phishing attempts that don’t result in successful compromise, scheduled maintenance, minor outages affecting a small subset of customers, incidents contained before materially impacting operations.

When does the 36-hour clock start? This is where most compliance gaps emerge. The agencies were explicit: the clock starts when the banking organization determines that a notification incident has occurred — not when the incident is first detected.

A bank that detects a ransomware alert at 2am but needs time to assess whether core systems are materially impacted doesn’t start the 36-hour clock until it concludes they are. The agencies acknowledged that a “reasonable amount of time” is needed to investigate before a determination can be made.

This matters operationally. But “reasonable” has limits. Once the facts reasonably support a determination that a notification incident has occurred, the investigation period ends and the clock runs. The determination can’t be indefinitely deferred to extend the notification window.
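The clock arithmetic is worth making explicit, because the detection timestamp and the determination timestamp produce very different deadlines. The sketch below is purely illustrative — the timestamps and the `notification_deadline` name are hypothetical, not anything prescribed by the rule:

```python
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=36)

def notification_deadline(determined_at: datetime) -> datetime:
    # The 36-hour window runs from the determination that a notification
    # incident has occurred -- not from first detection of the incident.
    return determined_at + NOTIFICATION_WINDOW

# Alert fires at 2am; triage concludes at 9:30am that core systems are
# materially impacted. Only the second timestamp starts the clock.
detected_at = datetime(2026, 5, 14, 2, 0, tzinfo=timezone.utc)
determined_at = datetime(2026, 5, 14, 9, 30, tzinfo=timezone.utc)

print(notification_deadline(determined_at).isoformat())
# -> 2026-05-15T21:30:00+00:00
```

In the example, the seven and a half hours of investigation between detection and determination do not consume the window — but, per the preamble, that investigation period must itself be reasonable.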

Tier 2: Bank Service Providers

Bank service providers (BSPs) — cloud platforms, core banking systems, payment processors, technology vendors — operate under a different framework.

When a BSP determines it has experienced a computer-security incident that has caused, or is reasonably likely to cause, a material service disruption or degradation for four or more hours, it must notify each affected banking organization customer as soon as possible.

Key differences from the banking organization rule:

  • No specific hour deadline. The standard is “as soon as possible,” not 36 hours.
  • Four-hour threshold. Disruptions lasting less than four hours don’t trigger the notification obligation.
  • Banking organizations, not regulators. BSPs notify their banking customers directly — the regulator notification responsibility stays with the bank.
  • Scheduled maintenance is excluded. Disruptions from previously communicated maintenance, testing, or software updates don’t count.

This framework creates an important dependency: a bank relying on a BSP for critical services may not receive notification until the BSP has made its own determination — which could be hours into a disruption that is already consuming the bank’s 36-hour window. Closing that gap in vendor contracts is a practical necessity.
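The BSP trigger described above reduces to a small predicate. This is a hypothetical sketch of that logic, assuming the four-hour threshold and the maintenance carve-out as summarized here; note that the rule's "reasonably likely to cause" prong means the real assessment can trigger before four hours have actually elapsed, which a duration check alone can't capture:

```python
from datetime import timedelta

BSP_THRESHOLD = timedelta(hours=4)

def bsp_must_notify(disruption: timedelta,
                    material: bool,
                    previously_communicated_maintenance: bool) -> bool:
    # Disruptions from previously communicated maintenance, testing,
    # or software updates are excluded outright.
    if previously_communicated_maintenance:
        return False
    # Otherwise: a material disruption or degradation lasting (or, in the
    # rule's fuller formulation, reasonably likely to last) four or more
    # hours obligates notice to each affected banking organization customer.
    return material and disruption >= BSP_THRESHOLD

print(bsp_must_notify(timedelta(hours=5), material=True,
                      previously_communicated_maintenance=False))  # -> True
print(bsp_must_notify(timedelta(hours=3), material=True,
                      previously_communicated_maintenance=False))  # -> False
```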

What to Actually Say When You Call the Regulator

The rule is intentionally minimal on format requirements. The initial notification does not need to be a formal written report. A telephone call or email to the bank’s primary federal supervisor is sufficient. The purpose is early awareness, not a complete incident disclosure.

Contact information by regulator:

  • OCC-supervised banks: 1-800-613-6743 (24-hour supervisory information line, confirmed in OCC Bulletin 2021-55)
  • Federal Reserve-supervised banks: SR 22-4 provides district-specific contact information
  • FDIC-supervised banks: The FDIC Regional Office and your case manager

The follow-up — a more detailed written account of the incident, your response, and remediation — typically comes during subsequent examiner dialogue. But the 36-hour notification is the gate event.

A practical recommendation: keep regulator notification contact information directly in your incident response playbook, alongside state breach notification deadlines and SEC disclosure timelines. Under a real incident, spending 20 minutes searching for a phone number is 20 minutes you don’t have.

How This Intersects With Other Notification Regimes

The 36-hour rule covers operational disruption — the degradation of a bank’s ability to function. It is separate from — and runs parallel to — several other notification obligations.

Consumer breach notification: If the incident involves compromise of customer personal information, state breach notification laws apply on their own timelines (typically 30–90 days, with some states tighter). Assessing consumer data exposure and operational disruption must happen simultaneously during incident triage, not sequentially.

SEC 8-K disclosure: Public companies must assess whether the incident constitutes a “material” cybersecurity incident under SEC rules and, if so, disclose on Form 8-K within four business days of making that determination. The materiality standard and the notification incident standard are different assessments by different functions.

FTC Safeguards Rule: Non-bank financial institutions covered by GLBA must notify the FTC within 30 days of discovering a breach affecting 500 or more customers. If you’re a fintech operating outside the banking organization definition but within GLBA’s scope, this is your primary federal notification obligation.

Managing all of these timelines in a real incident requires pre-built decision trees that run in parallel — assigning ownership, documenting the determination process, and tracking each notification timeline independently.
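One way to picture those parallel tracks is as independent records, each with its own owner, trigger event, and window. The structure below is a hypothetical sketch of that bookkeeping, not a compliance tool — regime names, owners, and the `NotificationTrack` type are all invented for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class NotificationTrack:
    regime: str        # e.g. "36-hour rule", "SEC 8-K", "State breach law"
    owner: str         # function accountable for this regime's determination
    trigger: datetime  # when this regime's own clock started
    window: timedelta  # regulatory window measured from that trigger

    @property
    def deadline(self) -> datetime:
        return self.trigger + self.window

def by_urgency(tracks: list[NotificationTrack]) -> list[NotificationTrack]:
    # Each regime runs on its own trigger; sort by whichever deadline is nearest.
    return sorted(tracks, key=lambda t: t.deadline)

t0 = datetime(2026, 5, 14, 9, 30, tzinfo=timezone.utc)
tracks = [
    NotificationTrack("36-hour rule", "compliance", t0, timedelta(hours=36)),
    # SEC measures four *business* days; calendar days used here for simplicity.
    NotificationTrack("SEC 8-K", "legal", t0, timedelta(days=4)),
    NotificationTrack("State breach law", "privacy", t0, timedelta(days=30)),
]
print([t.regime for t in by_urgency(tracks)])
# -> ['36-hour rule', 'SEC 8-K', 'State breach law']
```

The point of the sketch is the shape, not the numbers: each regime's clock can start from a different determination, so the deadlines must be tracked independently rather than derived from a single incident timestamp.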

The BSP Gap: What Contracts Need to Address

One underappreciated implication of the rule is what it means for your vendor contracts.

If a critical vendor experiences an outage that impairs your banking operations, that BSP must notify you “as soon as possible” — but that’s measured from when they make their own determination, which may not happen until after your 36-hour clock has started running.

Best practice: negotiate notification timelines into critical vendor contracts that are tighter than the regulatory floor. If a vendor’s service goes down, you want to know within one to two hours — not “as soon as possible” measured from a determination timeline you can’t control. This is especially important for vendors providing core banking infrastructure, payment processing, and customer-facing systems.

Also build into your vendor breach response playbook an explicit decision point: “Does this vendor incident constitute a notification incident for our banking organization?” A vendor outage that impairs your operations may trigger your 36-hour obligation even if the incident originated externally. Waiting for the vendor to tell you that is not a defensible approach.

Building a 36-Hour-Ready Program

Most banking organizations that struggle with this rule aren’t missing the policy. They’re missing the operational infrastructure to execute it under pressure.

What ready looks like:

  • Contact directory: Regulator phone numbers and emails in the IR playbook — not a website URL you’ll search at 3am
  • Determination standard: Written criteria for what constitutes a “determination,” signed off by legal and compliance
  • Escalation path: Who makes the notification determination? Does it require legal sign-off? How does compliance get looped in during a fast-moving incident?
  • Vendor contract language: Critical BSPs required to notify within your internal window, not just “as soon as possible”
  • Tabletop testing: At least one exercise per year where a scenario explicitly tests the notification determination — when does this become a notification incident?

The incident response team owns containment. Compliance or legal typically owns the notification determination. In a fast-moving incident, both tracks must run simultaneously — not sequentially.

Common Gray Areas

“Has it materially disrupted operations — or is it likely to?” The “reasonably likely” standard creates a forward-looking obligation. If a ransomware attack has encrypted 40% of your systems and your security team believes core banking will be affected within hours, the notification obligation may arise before core systems are actually down.

System upgrade failures: The agencies confirmed in the final rule preamble that a planned system upgrade that fails and leads to widespread customer and employee access outages qualifies as a notification incident. It doesn’t need to be a cyberattack.

Third-party-caused outages: If your cloud provider goes down and your banking operations are materially affected, you may have a notification incident — even though the source was your vendor. Assess the impact on your operations, not the origin of the event.

So What?

If your incident response plan doesn’t explicitly address the notification determination — who makes it, based on what evidence, and how they contact the regulator — you have a program gap that examiners are actively probing. The rule has been in effect since May 2022, and supervisory teams are now asking about notification procedures as a standard part of cybersecurity reviews.

The answer isn’t just having the right contact number on file. It’s a documented determination standard, a tested escalation path, and vendor contracts that give you the notification lead time you need to meet your own deadlines.

The Incident Response & Breach Notification Kit includes a severity classification matrix that maps incident types to notification obligations — including the 36-hour rule, state breach notification timelines for all 50 states, and SEC disclosure triggers — in a single decision framework your team can use under real-time pressure, not after the fact.

Frequently Asked Questions

What exactly counts as a 'notification incident' under the 36-hour rule?
A notification incident is a computer-security incident that has materially disrupted or degraded — or is reasonably likely to materially disrupt or degrade — a banking organization's ability to carry out banking operations, deliver banking products and services to a material portion of its customer base, or operate business lines whose failure would result in material loss of revenue, profit, or franchise value. Examples include ransomware attacks on core banking systems, DDoS attacks that disrupt customer account access, and major system failures that impede banking operations. Phishing attempts that don't result in successful compromise and minor outages affecting a small subset of customers generally don't qualify.
When exactly does the 36-hour clock start — detection or determination?
The 36-hour clock starts when the banking organization determines that a notification incident has occurred — not when the incident is first detected. The agencies explicitly anticipated that banking organizations would need a 'reasonable amount of time' to investigate before making that determination. There is no prescribed investigation window, but the determination must be made reasonably promptly once the facts are clear. Indefinitely deferring a determination to extend the clock is not an acceptable strategy.
Are non-bank fintechs covered by this rule?
No — the rule applies to banking organizations supervised by the OCC, Federal Reserve, or FDIC. Non-bank fintechs are not directly subject to this specific rule. However, if your fintech operates with an FDIC-insured bank sponsor, that bank may pass notification obligations down contractually, and your own incident could trigger your sponsor bank's notification requirement. Separate notification requirements also apply under the FTC Safeguards Rule, SEC cybersecurity disclosure rules, and state breach notification laws.
What are bank service providers required to do under this rule?
Bank service providers (BSPs) must notify each affected banking organization customer as soon as possible when they experience a computer-security incident that has caused, or is reasonably likely to cause, a material service disruption or degradation lasting four or more hours. BSPs notify their banking organization customers — not the regulators directly. The BSP rule has a four-hour threshold the banking org rule lacks, and 'as soon as possible' rather than a specific deadline.
What does the actual notification to a regulator involve?
The initial notification does not require a formal report or specific format — a telephone call or email to the bank's primary federal supervisor is sufficient. The purpose is early regulatory awareness, not a complete incident disclosure. OCC contact: 1-800-613-6743 (24-hour line). Federal Reserve: SR 22-4 provides district-specific contact information. FDIC: notifications go to the Regional Office. A more detailed written account of the incident and response typically follows in subsequent supervisory dialogue.
How does the 36-hour rule interact with other notification requirements?
The 36-hour rule runs parallel to other obligations, not instead of them. State breach notification laws still apply if customer personal information was compromised (typically 30–90 days depending on the state). The SEC's 4-day 8-K disclosure requirement applies to public companies for material cybersecurity incidents. The FTC Safeguards Rule requires non-bank financial institutions to notify the FTC within 30 days of a breach affecting 500+ customers. In a real incident, your team will likely be managing all of these simultaneously.
Rebecca Leung

Rebecca Leung has 8+ years of risk and compliance experience across first and second line roles at commercial banks, asset managers, and fintechs. Former management consultant advising financial institutions on risk strategy. Founder of RiskTemplates.

Related Framework

Incident Response & Breach Notification Kit

Step-by-step incident response playbooks and breach notification templates for all 50 states.
