Attack Research

Scoring DDoS Attack Complexity: From Simple Floods to Sophisticated L7 Campaigns

Not all DDoS attacks are created equal. A volumetric SYN flood and a slow-rate application-layer campaign require fundamentally different defenses. Understanding attack complexity is the first step toward prioritizing what to protect against.

April 13, 2026 | 15 min read | Security Research

When security teams think about DDoS attacks, they often think about volume: gigabits per second, packets per second, requests per second. Bigger numbers mean bigger threats. This mental model is intuitive, widely taught, and dangerously incomplete.

A 100 Gbps volumetric flood aimed at saturating a network link is loud, visible, and well-understood by modern scrubbing services. A 500 Mbps application-layer attack that targets a specific API endpoint with valid-looking requests, rotates its fingerprint every 30 seconds, and adapts to the defenses it encounters is a fundamentally different problem. The first attack requires bandwidth. The second requires intelligence.

At DDactic, we have spent months cataloging, classifying, and scoring DDoS attack types. The result is a taxonomy of 233 distinct attack techniques and a matrix of 211 attack vectors mapped across six defense architectures. This article explains the framework we use to score attack complexity and why that score matters more than raw volume for defense planning.


The Three Layers of DDoS

Before discussing complexity, it helps to establish the playing field. DDoS attacks operate across three layers of the network stack, each with different characteristics, different tooling, and different defense requirements.

Layer 3

Network / Volumetric

Floods the network pipe with raw traffic. SYN floods, UDP amplification (DNS, NTP, memcached), ICMP floods. Goal: saturate bandwidth or exhaust stateful device capacity. Measured in Gbps and Mpps.

Layer 4

Transport / Protocol

Abuses protocol mechanics. TCP state exhaustion, fragmentation attacks, SYN-ACK reflection, RST floods. Goal: overwhelm connection tables in firewalls, load balancers, and servers. Measured in connections per second.

Layer 7

Application

Targets application logic with valid-looking requests. HTTP floods, Slowloris, API abuse, cache bypass, login brute-force. Goal: exhaust backend compute, database, or business logic capacity. Measured in requests per second.

Most organizations have some protection at the network layer, typically through their ISP or a cloud scrubbing service. Fewer have effective protection at the transport layer. And at the application layer, where rate limits may not count traffic accurately and business logic is exposed, the gap between assumed and actual protection widens dramatically.

Layer 7 DDoS attacks now represent more than 70% of all DDoS incidents tracked by major vendors. This shift matters because L7 attacks are inherently more complex, harder to distinguish from legitimate traffic, and less dependent on raw volume to succeed.

Attack Complexity Scoring

DDactic classifies every attack technique on a 1-10 complexity scale. This score reflects how difficult the attack is to execute, how hard it is to detect, how resistant it is to standard mitigations, and how much expertise is required to deploy it effectively. The scale breaks into four tiers.

Simple

Score 1-3

Volumetric floods, SYN floods, UDP amplification, DNS reflection. High bandwidth requirements but straightforward to filter. Relies on volume over technique. Detectable by rate and signature. Mitigated by any competent scrubbing service.

Moderate

Score 4-6

Protocol abuse, IP fragmentation, Slowloris, HTTP floods with basic header randomization, cache-busting queries. Requires some understanding of target architecture. May bypass simple rate limits but still uses recognizable patterns.

Complex

Score 7-9

Application-layer attacks targeting business logic, API abuse with valid authentication, credential stuffing combined with DDoS, multi-vector campaigns that layer L3+L4+L7 simultaneously. Requires target reconnaissance and adaptive tooling.

Sophisticated

Score 10

Adaptive attacks that fingerprint the defense stack and modify behavior in real time. Zero-day protocol abuse (HTTP/2 Rapid Reset, QUIC reflection). Attacks that target specific vendor weaknesses. Requires deep knowledge of both the target and its protection vendor.

The scoring is not arbitrary. Each level corresponds to specific technical characteristics that determine how the attack interacts with defense layers.
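As an illustrative sketch (not DDactic's internal tooling), the four tiers map onto the 1-10 scale like this:

```python
def complexity_tier(score: int) -> str:
    """Map a 1-10 attack complexity score to its tier name.

    Tier boundaries follow the four bands described above:
    1-3 Simple, 4-6 Moderate, 7-9 Complex, 10 Sophisticated.
    """
    if not 1 <= score <= 10:
        raise ValueError("complexity score must be between 1 and 10")
    if score <= 3:
        return "Simple"
    if score <= 6:
        return "Moderate"
    if score <= 9:
        return "Complex"
    return "Sophisticated"
```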

What Each Score Level Actually Means

| Score | Detection Difficulty | Mitigation Approach | Example Techniques |
|-------|----------------------|----------------------|--------------------|
| 1-2 | Trivial. Volume spike is immediately visible. | Blackhole routing, upstream scrubbing | ICMP flood, UDP flood, DNS amplification |
| 3-4 | Easy. Pattern matches known signatures. | Signature-based WAF rules, SYN cookies | SYN flood, NTP amplification, basic HTTP GET flood |
| 5-6 | Moderate. Requires behavioral analysis. | Rate limiting, JavaScript challenges, connection limits | Slowloris, HTTP POST flood, cache bypass, header manipulation |
| 7-8 | Hard. Mimics legitimate traffic closely. | ML-based anomaly detection, application-aware filtering | API abuse with valid tokens, multi-vector L3+L7, geographic rotation |
| 9-10 | Very hard. May be indistinguishable from real users. | Vendor-specific tuning, manual intervention, protocol-level patches | HTTP/2 Rapid Reset, defense-aware adaptive campaigns, zero-day protocol abuse |

Why Separate Complexity from Impact?

A simple attack (score 2) against an unprotected target can be devastating. A sophisticated attack (score 10) against a well-defended target may fail. Complexity measures the attack itself, not its outcome. Impact depends on the interaction between attack complexity and defense posture. Both dimensions matter, but they measure different things.

Why Complexity Matters More Than Volume

The DDoS industry has trained organizations to think in bandwidth. Vendor marketing highlights record-breaking attacks: the 3.47 Tbps flood against Azure, the 5.6 Tbps attack mitigated by Cloudflare. These numbers are real, but they are misleading as a threat model for most organizations.

Here is the reality: a 1 Gbps application-layer attack can be more damaging than a 100 Gbps volumetric flood.

This is not a theoretical claim. It is a mathematical consequence of how modern infrastructure processes traffic at different layers.
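A back-of-envelope calculation makes the point. All capacity figures below are illustrative assumptions, not measurements of any specific environment:

```python
# Volumetric flood: absorbed if scrubbing capacity exceeds attack bandwidth.
flood_gbps = 100
scrubbing_capacity_gbps = 400           # assumed cloud scrubbing headroom
volumetric_overload = flood_gbps / scrubbing_capacity_gbps   # 0.25x -> absorbed

# Application-layer attack: what matters is requests/s vs backend capacity.
attack_gbps = 1
bytes_per_request = 2_000               # assumed size of a small, valid-looking HTTP request
requests_per_second = attack_gbps * 1e9 / 8 / bytes_per_request   # 62,500 rps
backend_capacity_rps = 5_000            # assumed capacity of a dynamic, uncached endpoint
l7_overload = requests_per_second / backend_capacity_rps          # 12.5x -> outage

print(f"volumetric load factor: {volumetric_overload:.2f}x")
print(f"L7 load factor: {l7_overload:.1f}x")
```

Under these assumptions, the 100 Gbps flood runs at a quarter of scrubbing capacity while the 1 Gbps request stream hits the backend at 12.5 times its capacity. The pipe survives; the application does not.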

The False Confidence Trap

Organizations that test only against simple attacks (score 1-3) develop false confidence in their protection. They see their CDN absorb a volumetric flood and conclude they are protected. Then a targeted L7 campaign at a fraction of the bandwidth takes their application offline. The defense worked perfectly against the attack it was tested for. It was simply never tested against the attack that actually came.

This is why DDactic's DDoS testing methodology emphasizes complexity escalation rather than volume escalation. Starting at score 1 and incrementally increasing through the complexity spectrum reveals the actual breaking point of each defense layer.

The Attack Vector Matrix

A single attack technique can manifest differently depending on the target's architecture. An HTTP/2 multiplexing attack behaves differently against Cloudflare than against AWS ALB, and differently again against an on-premises Radware AppWall. The technique is the same, but the vector, meaning the specific path the attack takes through the target's defense stack, varies.

DDactic maintains a matrix of 211 attack vectors mapped across six common defense architectures:

  1. CDN + Cloud WAF (e.g., Cloudflare, Akamai, Fastly)
  2. Cloud-native protection (e.g., AWS Shield + WAF, Azure DDoS Protection + Front Door)
  3. On-premises appliance (e.g., Radware DefensePro, F5 BIG-IP, Arbor AED)
  4. Hybrid (cloud scrubbing + on-prem appliance)
  5. DNS-only protection (e.g., NS1, Neustar UltraDNS)
  6. Unprotected / minimal (direct-to-origin, ISP-only)

Each cell in the matrix contains three data points: the complexity score for that specific vector-architecture combination, the expected mitigation effectiveness of default configurations, and the escalation path if the initial technique is blocked.
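A minimal sketch of how such a matrix cell might be represented in code. The field names, vector identifiers, and numbers here are hypothetical, not DDactic's actual data:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class VectorAssessment:
    """One cell of the vector-architecture matrix (illustrative structure)."""
    complexity: int            # 1-10 score for this vector against this architecture
    default_mitigation: float  # expected effectiveness of default configs, 0.0-1.0
    escalation_path: str       # next technique to try if this one is blocked

# A hypothetical sliver of the matrix, keyed by (vector, architecture).
matrix = {
    ("http2_rapid_reset", "cdn_cloud_waf"): VectorAssessment(10, 0.85, "cache-bypass flood"),
    ("http2_rapid_reset", "unprotected"):   VectorAssessment(10, 0.00, "none needed"),
    ("udp_amplification", "cdn_cloud_waf"): VectorAssessment(2, 0.99, "basic HTTP GET flood"),
}

def likely_to_succeed(vector: str, architecture: str, threshold: float = 0.5) -> bool:
    """Flag vector-architecture pairs where default mitigation is weak."""
    cell = matrix[(vector, architecture)]
    return cell.default_mitigation < threshold
```

Querying the matrix this way is what lets an assessment start from the vectors most likely to penetrate a given architecture, rather than walking all 211 blindly.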

This matrix is not theoretical. It is built from DDactic's internal research into vendor behavior, protocol handling, and defense bypass techniques. When we assess an organization's DDoS posture, the matrix tells us which attack vectors are most likely to succeed against their specific architecture, allowing us to prioritize testing and recommendations accordingly.

Correlation with OPI Scores

Our OPI benchmarking data shows a clear inverse correlation between an organization's OPI score and the highest complexity level that can penetrate its defenses. Organizations with OPI scores above 80 (Grade A) typically withstand attacks up to complexity 7-8. Organizations below 40 (Grade F) often fail at complexity 3-4. The matrix quantifies exactly where each architecture begins to break down.

Anatomy of a Multi-Vector Campaign

Real-world attacks rarely stay at a single complexity level. Modern DDoS campaigns are multi-phase operations that escalate through the complexity spectrum, probing defenses and adapting in real time. A typical sophisticated campaign follows a pattern like this:

// Phase 1: Reconnaissance (Complexity 1-2)
probe target DNS records, identify CDN vendor
probe HTTP headers, identify WAF vendor and version
probe TLS fingerprint, map load balancer configuration
duration: 5-15 minutes

// Phase 2: Volumetric Distraction (Complexity 3)
launch UDP amplification at 50 Gbps
// Purpose: trigger scrubbing, force defense into volumetric mode
duration: 10-30 minutes

// Phase 3: Protocol Pressure (Complexity 5-6)
launch Slowloris + HTTP POST flood while volumetric continues
// Purpose: consume connection table capacity, test rate limits
duration: 15-45 minutes

// Phase 4: Application Targeting (Complexity 7-8)
switch to authenticated API abuse with valid session tokens
target endpoints identified during reconnaissance
// Purpose: bypass WAF, exhaust application resources
duration: 30-120 minutes

// Phase 5: Adaptive Evasion (Complexity 9-10)
rotate TLS fingerprint every 30 seconds
mirror legitimate browser behavior profiles
target vendor-specific weaknesses identified in Phase 1
// Purpose: evade ML-based detection, sustain pressure
duration: hours to days

Each phase tests a different layer of defense. An organization that handles Phase 2 confidently may crumble in Phase 4. One that survives Phase 4 may be unprepared for the fingerprint rotation in Phase 5. The campaign's strength is not any single phase. It is the combination, and the attacker's ability to shift tactics based on what they observe.

Testing Across the Complexity Spectrum

If your DDoS testing only covers simple attacks, you are validating the locks on the front door while leaving the windows open. The problem is not that simple-attack testing is wrong. It is that it gives incomplete information, and incomplete information creates false confidence.

The Escalation Testing Approach

DDactic's simulation engine covers the full complexity spectrum using a methodology we call escalation testing. The process works like this:

  1. Baseline (Complexity 1-3): Start with known, well-understood attack types. SYN floods, UDP amplification, basic HTTP floods. This establishes whether foundational defenses are in place and functioning. Most organizations pass this stage.
  2. Pressure (Complexity 4-6): Introduce protocol-level attacks and evasion techniques. Slowloris, fragmented packets, cache-busting queries, header randomization. This tests whether defenses can distinguish between unusual traffic and malicious traffic. Many organizations begin to show gaps here.
  3. Targeted (Complexity 7-8): Move to application-aware attacks that require knowledge of the target's specific infrastructure. Multi-vector combinations, API endpoint targeting, authenticated request abuse. This tests whether application-layer defenses are tuned for the target's actual attack surface.
  4. Adaptive (Complexity 9-10): Deploy attacks that modify their behavior based on the defense response. Fingerprint rotation, challenge-solving, vendor-specific evasion techniques. This tests the upper bound of the defense stack's capability.

The escalation stops when defenses fail or when all complexity levels have been tested. The result is a clear picture of the highest complexity level each defense layer can handle, and the specific point where protection breaks down.
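The escalation loop can be sketched in a few lines. Here `defense_holds` is a hypothetical probe standing in for the actual simulation engine: it launches a test attack at the given level and reports whether the target stayed healthy.

```python
def escalation_test(defense_holds, max_complexity: int = 10):
    """Run simulated attacks of increasing complexity.

    Returns the first complexity level at which the defense fails,
    or None if the target withstands the full spectrum.
    """
    for level in range(1, max_complexity + 1):
        if not defense_holds(level):
            return level            # breaking point found; stop escalating
    return None                     # all complexity levels passed

# Example: a stack that handles everything up to complexity 6.
breaking_point = escalation_test(lambda level: level <= 6)
print(breaking_point)  # 7
```

The return value maps directly onto the diagnosis table below: the breaking point, not the pass/fail of any single attack, is the output of the test.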

Why This Matters for Defense Planning

Knowing your breaking point changes how you allocate budget. If your infrastructure handles complexity 1-6 but fails at 7, you know exactly what to invest in: application-layer detection, API-specific rate limiting, behavioral analysis. You do not need to buy more bandwidth. You need smarter filtering.

If your infrastructure fails at complexity 3, the answer is different: you need foundational DDoS protection before worrying about advanced evasion. The complexity score prevents organizations from over-investing in sophisticated defenses while basic protections are missing, and prevents them from under-investing in L7 defenses while volumetric protection gives false comfort.

| Breaking Point | Diagnosis | Priority Investment |
|----------------|-----------|---------------------|
| Complexity 1-2 | No DDoS protection or misconfigured scrubbing | Cloud DDoS protection service, CDN deployment |
| Complexity 3-4 | Basic protection present but WAF rules are default | WAF tuning, rate limit configuration, origin protection |
| Complexity 5-6 | Good fundamentals but protocol-level gaps | HTTP/2 hardening, connection limits, challenge mechanisms |
| Complexity 7-8 | Strong perimeter but application layer exposed | API security, behavioral detection, application-aware WAF rules |
| Complexity 9-10 | Comprehensive defense, advanced tuning needed | Vendor-specific hardening, zero-day response plan, active threat hunting |

Building a DDoS Threat Model Around Complexity

A DDoS threat model that only considers volume is like a building code that only considers earthquakes. Floods, fires, and wind exist too. An effective DDoS threat model considers all three dimensions: the volume an attacker can generate, the complexity of techniques they can deploy, and the duration they can sustain the campaign.

For most organizations, complexity is the dimension where the gap between assumed and actual protection is widest. They have bandwidth to absorb volume. They have timeouts to outlast duration. But they do not have the application-layer intelligence to survive a targeted, complex, adaptive attack.

The attack taxonomy and vector matrix exist to close this gap. By mapping every known attack technique to a complexity score and every vector to a specific defense architecture, DDactic provides security teams with a concrete, testable framework for understanding their DDoS threat model. Not in the abstract, but against specific attacks, at specific complexity levels, targeting their specific infrastructure.

Find Your Breaking Point

DDactic's free infrastructure scan reveals which DDoS attack complexity levels your defenses can handle and where they begin to fail. See your actual breaking point before attackers discover it for you.

Tags: DDoS Attack Complexity, DDoS Attack Vectors, DDoS Attack Types, Application Layer DDoS, L7 DDoS, DDoS Threat Model, Attack Taxonomy, Multi-Vector DDoS, DDoS Simulation