Layer 7 DoS: Dissecting Application Layer DDoS Attacks

Distributed denial-of-service attacks zeroing in on popular websites are typically launched from thousands of compromised devices. Most of these onslaughts aim to overwhelm the target systems with high-volume traffic that saturates the communication channel. They are categorized as Layer 3 DoS/DDoS attacks: they hit the network layer of the OSI model by firing a large number of packets at a host. In contrast, Layer 7 (application layer) DoS/DDoS attacks target the weak links of web applications themselves.

To set the stage, here is a statistic based on the findings of the Incapsula application delivery platform: since 2016, application layer DoS/DDoS attacks have been prevailing over classic network layer incursions.

One of the obstacles to identifying these attacks is that a web application cannot easily differentiate them from regular traffic. There are many factors underlying this hurdle, but perhaps the key one is that IP addresses cannot be considered clear-cut indicators of compromise. When a network layer attack is underway, it is possible to detect the rogue traffic and block the offending IP addresses (provided the attackers are not hiding behind VPN services). In the case of Layer 7 DoS, though, the task is harder: the malicious clients must be singled out without blocking legitimate users. Furthermore, routine, non-malicious use of a host can exhaust its resources as well.


Main types of DoS/DDoS attacks

Volumetric attacks aim to overwhelm the bandwidth of the infrastructure hosting a web application by sending an abnormally high volume of traffic its way, usually in the form of a UDP or ICMP flood.

Layer 3 attacks typically exploit weaknesses in the TCP/IP protocol stack architecture. The adversary sends packets that overwhelm, distort, or destroy connection state information, which puts extra load on the target host's network processing and degrades its overall responsiveness. The most common vectors of these attacks include TCP SYN flood, TCP fragmentation, and teardrop.

Layer 7 attacks abuse web application logic, exhausting a web server's resources by forcing it to process "heavy" queries, run intensive processing functions, or allocate excessive memory.


Application capacity

Most web servers can simultaneously serve several hundred regular users. The problem is that a single attacker on a single host can generate enough traffic to deny service to a web application, and load balancing is of little help in this scenario.

The main resource concerns here are as follows:

  • CPU utilization – once ~99% of CPU capacity is used up, other critical processes stall;
  • RAM – runaway memory allocation, leaks, and memory exhaustion cause other critical processes to stop responding;
  • processes and threads – deadlocks, uncontrolled forks, race conditions;
  • disk – the disk overflow issue.


RAM is one of the most important resources of a web server. The following attacks seek to exhaust it:

Recursion. A textbook example of recursive inclusion is a script that includes itself, e.g. include('current_file.php') placed inside current_file.php. PHP allocates a new portion of memory for each inclusion and repeats the process until there is no memory left. This vulnerability is essentially a classic LFI (local file inclusion) scenario.

Zip bomb. Web applications that allow uploading compressed files and extracting their contents can be susceptible to such an attack, especially if the application (or library that processes the decompression) doesn’t check the file properly.
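One way such a check can be sketched in Python is to inspect the sizes an archive declares before extracting anything. The 10 MB budget and 100:1 ratio threshold below are hypothetical values, not recommendations:

```python
import io
import zipfile

MAX_TOTAL = 10 * 1024 * 1024   # hypothetical 10 MB expansion budget
MAX_RATIO = 100                # hypothetical compression-ratio ceiling

def check_archive(data: bytes) -> None:
    """Inspect declared sizes before extracting anything from a ZIP."""
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        infos = zf.infolist()
        # Total uncompressed size the archive claims it will expand to
        if sum(i.file_size for i in infos) > MAX_TOTAL:
            raise ValueError("archive expands beyond the size budget")
        for i in infos:
            # A huge uncompressed/compressed ratio is a zip-bomb tell
            if i.compress_size and i.file_size / i.compress_size > MAX_RATIO:
                raise ValueError("suspicious compression ratio")
```

Note that the sizes in ZIP headers can lie, so a robust implementation also caps the number of bytes actually written out during streamed extraction.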

XML bomb. XML entities can expand not only into character strings but also into sequences of other entities. While the standard prohibits recursive entity definitions, it places no restrictions on the permissible nesting depth. This allows very long text to be encoded compactly – much like an archiver does – and forms the core of the so-called "billion laughs" attack.
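A cut-down version of the classic payload looks like this (only three nesting levels here; the full attack uses around ten, so the final entity expands to roughly a billion copies of "lol", about 3 GB of text from under a kilobyte of input):

```xml
<?xml version="1.0"?>
<!DOCTYPE lolz [
  <!ENTITY lol "lol">
  <!ENTITY lol2 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
  <!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
]>
<lolz>&lol3;</lolz>
```

Each level multiplies the expansion by ten, which is why a naive parser that resolves entities eagerly runs out of memory.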

Deserialization. Although this is a relatively new type of attack, it made it into the OWASP Top 10 2017 as A8: Insecure Deserialization. The technique boils down to restoring the initial state of a data structure from a byte sequence; with poor control over user input, it can exhaust memory resources.
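Python's pickle module illustrates the danger: whoever authors the byte stream controls what happens on load. This sketch (with the allocation deliberately scaled down to a 10,000-element list for the demo) shows how a crafted payload dictates how much memory deserialization consumes:

```python
import pickle

class Expander:
    # __reduce__ tells pickle how to rebuild the object on loads(),
    # so the author of the byte stream controls what gets executed
    # and how much memory gets allocated.
    def __reduce__(self):
        # Demo payload: allocate a list whose size the attacker picks.
        # Scaled down here; a real payload would use billions of items.
        return (list, (range(10_000),))

payload = pickle.dumps(Expander())
obj = pickle.loads(payload)  # allocates attacker-chosen memory
```

The standard advice applies: never deserialize untrusted input with pickle; use a constrained format such as JSON with schema validation instead.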

File headers. Manipulating the values in file headers can exhaust a server's resources, too. This applies when processing depends on metadata stored in the input file's header – images, video files, documents, etc. The pixel flood attack is a good example: an image header declares enormous dimensions, and the decoder tries to allocate a buffer to match.
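A lightweight defense is to read the declared dimensions before handing the file to a decoder. This Python sketch parses a PNG's IHDR chunk directly; the 25-megapixel budget is a hypothetical value:

```python
import struct

PNG_SIG = b"\x89PNG\r\n\x1a\n"
MAX_PIXELS = 25_000_000  # hypothetical decode budget (~25 megapixels)

def png_dimensions(data: bytes):
    """Pull width/height out of the IHDR chunk without decoding pixels."""
    # Layout: 8-byte signature, 4-byte chunk length, b'IHDR',
    # then big-endian 4-byte width and height.
    if not data.startswith(PNG_SIG) or data[12:16] != b"IHDR":
        raise ValueError("not a valid PNG")
    return struct.unpack(">II", data[16:24])

def safe_to_decode(data: bytes) -> bool:
    w, h = png_dimensions(data)
    return w * h <= MAX_PIXELS
```

A 100,000 x 100,000 "pixel flood" image fails this check immediately, before any decoder tries to allocate tens of gigabytes for the bitmap.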

Reading infinite data streams. This technique boils down to making the server read an endless or enormous source: /dev/zero or /dev/urandom via LFI, a 1TB Speedtest download, etc.
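The generic countermeasure is to never trust a stream to end. A minimal Python sketch (the 1 MB default is a hypothetical limit):

```python
import io

def read_capped(stream, max_bytes: int = 1_000_000) -> bytes:
    """Read at most max_bytes from a stream; refuse anything longer.

    Reading one extra byte cheaply distinguishes 'exactly at the
    limit' from 'over the limit' without consuming the whole source.
    """
    data = stream.read(max_bytes + 1)
    if len(data) > max_bytes:
        raise ValueError("stream exceeds the configured limit")
    return data
```

Against /dev/zero this returns an error after max_bytes + 1 bytes instead of reading forever.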


CPU is another critical resource of a web server; attacks that exhaust the available processing capacity can take a web application down.

ReDoS – Regular Expression Denial of Service. This is a relatively new type of attack, and its most famous real-world case happened at Stack Overflow. It wasn't pulled off by a rogue player; it was triggered by a user whose post included 20,000 consecutive whitespace characters in a fragment of code. The regular expression involved was written in such a way that checking the 20,000-character line required a very large number of backtracking steps (20,000 + 19,999 + … + 2 + 1). If a web application runs regular expressions over user input, that input deserves extra scrutiny.
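The classic textbook trigger is a pattern with nested quantifiers. This Python sketch shows a vulnerable pattern and a safe rewrite of the same language:

```python
import re
import time

# Nested quantifiers make this pattern vulnerable: on a non-matching
# input the engine retries every way of splitting the run of 'a's
# between the inner and outer quantifier, so the work roughly
# doubles with each added character.
evil = re.compile(r"^(a+)+$")

def match_time(n: int) -> float:
    s = "a" * n + "b"  # the trailing 'b' forces full backtracking
    t0 = time.perf_counter()
    evil.match(s)
    return time.perf_counter() - t0

# The same language without nested quantifiers -- linear time.
safe = re.compile(r"^a+$")
```

With the evil pattern, each extra 'a' in the input roughly doubles the matching time; around 30 characters is already enough to stall a typical backtracking engine for seconds.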

SQL injection. SQL injection can degrade a web application's performance considerably, especially via time-delay functions like sleep() and benchmark().
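The standard mitigation is parameterized queries. In this Python/sqlite3 sketch a MySQL-style time-delay payload is bound as a literal value, so the injected SLEEP() can never execute (sqlite3 stands in here for whatever database the application actually uses):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# Attacker-supplied value; against MySQL, a payload like this one
# injected via string concatenation would stall the query with SLEEP().
payload = "' OR SLEEP(5)--"

# Parameterized query: the payload is bound as a literal string and
# is never parsed as SQL, so the time-delay function cannot run.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (payload,)
).fetchall()
```

The query simply finds no user with that literal name and returns an empty result set.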

Fork bomb. This attack spawns processes that replicate themselves endlessly, using up all resources of the system. The best-known specimen is the Bash one-liner ‘:(){ :|:& };:’, which defines a function named ‘:’ that pipes a call to itself into another call to itself in the background, and then invokes it.

Resource/function abuse. A perpetrator can spot a resource-heavy operation in a web application and fire numerous requests at it in order to exhaust the available resources. Abusing password hashing functions, which are deliberately expensive to compute, is a good example.
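A common defense is to rate-limit expensive endpoints per client. This is a minimal token-bucket sketch in Python; the rate and burst values are hypothetical:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Minimal per-client rate limiter for expensive endpoints,
    e.g. a login route that calls a password hashing function."""

    def __init__(self, rate: float = 1.0, burst: int = 5):
        self.rate = rate                            # tokens refilled per second
        self.burst = burst                          # maximum stored tokens
        self.tokens = defaultdict(lambda: burst)    # per-client token count
        self.updated = defaultdict(time.monotonic)  # per-client last refill

    def allow(self, client_ip: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.updated[client_ip]
        self.updated[client_ip] = now
        # Refill proportionally to elapsed time, capped at the burst size
        self.tokens[client_ip] = min(
            self.burst, self.tokens[client_ip] + elapsed * self.rate
        )
        if self.tokens[client_ip] >= 1:
            self.tokens[client_ip] -= 1
            return True
        return False
```

Each client starts with a full burst; once it is spent, requests are refused until tokens trickle back at the configured rate, which caps how often any one source can invoke the expensive code path.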

SSRF. By exploiting Server-Side Request Forgery vulnerabilities, a threat actor can make the server issue requests on their behalf and exhaust the resources of the targeted server in the process.
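A basic SSRF guard resolves the requested host and rejects anything that lands in private, loopback, or link-local ranges. This Python sketch covers only the first-order checks (it does not address redirects or DNS rebinding):

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_safe_url(url: str) -> bool:
    """Reject URLs that resolve to internal address ranges."""
    host = urlparse(url).hostname
    if host is None:
        return False
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        # Block internal targets: RFC 1918, loopback, link-local
        if ip.is_private or ip.is_loopback or ip.is_link_local:
            return False
    return True
```

Production-grade filtering also has to pin the resolved IP for the actual connection and re-check after every redirect, otherwise the guard can be bypassed.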


Disk space is a critical characteristic of a web server, too.

Uploading large files onto a server. This is the most obvious method to flood a system with data. If a web application doesn’t have appropriate restrictions in place, an adversary can keep uploading data onto the system until the web server runs out of resources.
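Such restrictions are usually enforced at the edge before a request ever reaches the application. In nginx, for example, the real client_max_body_size directive rejects oversized bodies with a 413 response (the 10 MB value below is an arbitrary example):

```nginx
server {
    # Reject request bodies over 10 MB before they reach the app;
    # nginx answers with 413 Request Entity Too Large.
    client_max_body_size 10m;
}
```

The application should still enforce its own per-file and per-user quotas, since not every deployment sits behind such a proxy.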

Flooding system logs. In a scenario where no log rotation function is in effect, an attacker can flood system logs or cause a huge number of these logs to be generated, which will ultimately exhaust disk space.
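On Linux hosts the usual countermeasure is logrotate. A sketch of a policy for a hypothetical application log path (daily, size-triggered rotation with a bounded history):

```
# Hypothetical app log path; adjust to your application
/var/log/myapp/*.log {
    daily
    rotate 7
    maxsize 100M
    compress
    missingok
    notifempty
}
```

With maxsize in place, a log that balloons between scheduled runs is rotated early, so a flood of log entries fills a bounded, compressed history instead of the whole disk.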


Web application testing tools

We will deliberately skip highly specialized utilities like LOIC (Low Orbit Ion Cannon) and HOIC (High Orbit Ion Cannon), which are aimed at destabilizing the operation of specific web applications.

Malicious use of these tools is illegal and may be subject to prosecution in your country of residence. Be sure to only use them for stress testing your own servers, or servers whose owners have given their explicit consent.

  • Slowloris is a well-known denial-of-service tool that exhausts a server's connection pool by holding many connections open with partial HTTP requests. There is also a corresponding NSE script for Nmap.
  • HULK (HTTP Unbearable Load King) generates a sizeable flow of unique requests designed to consume as much of the web server's resources as possible. To evade traffic filtering, HULK uses a different user agent for each request, obfuscates the referrer, sets no-cache and keep-alive attributes, and requests unique URLs.
  • OWASP DoS HTTP POST is a tool by the OWASP community that generates "slow" HTTP requests.
  • The GoldenEye HTTP Denial of Service Tool is a Python app that exploits the HTTP Keep-Alive + NoCache attack vector.



One of the most effective ways to protect a web application is to stress test it. The goal of this technique is to evaluate system response under high or peak load that exceeds its routine parameters. This practice allows you to analyze how the system performs in abnormal load scenarios, identify the weak links of your web application, and reduce the risk of application downtime in the future.


