Validating Findings

Fuzzing often yields false positives – harmless anomalies that trigger the fuzzer's detection mechanisms but pose no real threat. This is why validation is a crucial step in the fuzzing workflow.

Why Validate?

Validating findings serves several important purposes:

  - Weeding out false positives before they reach a report
  - Assessing the real-world impact and severity of each finding
  - Gathering the evidence needed to convince stakeholders to act

Manual Verification

The most reliable way to validate a potential vulnerability is through manual verification. This typically involves:

  1. Reproducing the Request: Use a tool like curl or your web browser to manually send the same request that triggered the unusual response during fuzzing.
  2. Analyzing the Response: Carefully examine the response to confirm whether it indicates a vulnerability. Look for error messages, unexpected content, or behavior that deviates from the expected norm.
  3. Exploitation: If the finding seems promising, attempt to exploit the vulnerability in a controlled environment to assess its impact and severity. This step should be performed with caution and only after obtaining proper authorization.

When responsibly validating and exploiting a finding, it is crucial to avoid actions that could harm the production system or compromise sensitive data. Instead, focus on creating a proof of concept (PoC) that demonstrates the existence of the vulnerability without causing damage. For example, if you suspect a SQL injection vulnerability, you could craft a harmless SQL query that returns the SQL server version string rather than trying to extract or modify sensitive data.
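As an illustration, a benign SQL injection PoC of this kind could be sent with curl. This is only a sketch: the /item?id= parameter and the MySQL backend are assumptions for the example, not part of the module above.

```shell
# Hypothetical example: /item?id= is an assumed vulnerable parameter.
# @@version asks a MySQL backend for its version string instead of
# touching any real data.
PAYLOAD="1 UNION SELECT @@version-- -"
# Percent-encode the spaces so the payload survives as a query-string value
ENCODED=$(printf '%s' "$PAYLOAD" | sed 's/ /%20/g')
echo "curl \"http://IP:PORT/item?id=${ENCODED}\""
```

If the response contains a version banner instead of the normal page content, that is enough evidence for the report; there is no need to go further.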

The goal is to gather enough evidence to convince stakeholders of the vulnerability's existence and potential impact while adhering to ethical and legal guidelines.

Example

Imagine your fuzzer discovered a directory named /backup/ on a web server. The response to this directory returned a 200 OK status code, suggesting that the directory exists and is accessible. While this might seem innocuous at first glance, it's crucial to remember that backup directories often contain sensitive information.

Backup files are designed to preserve data, which means they might include:

  - Database dumps containing user records or credentials
  - Application source code and configuration files
  - API keys, passwords, and other secrets

If an attacker gains access to these files, they could potentially compromise the entire web application, steal sensitive data, or cause significant damage. However, as a security professional, you will need to interact with this finding carefully, proving the issue exists without compromising the integrity of the target or opening yourself up to potential blowback.

Using curl for validation

First, we need to confirm whether this directory is truly browsable. We can use curl to check.

m4cc18@htb[/htb]$ curl http://IP:PORT/backup/

Examine the output in your terminal. If the server responds with a list of the files and directories contained within /backup/, you've successfully confirmed the directory listing vulnerability. The listing could look something like this:

<!DOCTYPE html>
<html>
<head>
<title>Index of /backup/</title>
<style type="text/css">
[...]
</style>
</head>
<body>
<h2>Index of /backup/</h2>
<div class="list">
<table summary="Directory Listing" cellpadding="0" cellspacing="0">
<thead><tr><th class="n">Name</th><th class="m">Last Modified</th><th class="s">Size</th><th class="t">Type</th></tr></thead>
<tbody>
<tr class="d"><td class="n"><a href="../">..</a>/</td><td class="m">&nbsp;</td><td class="s">- &nbsp;</td><td class="t">Directory</td></tr>
<tr><td class="n"><a href="backup.sql">backup.sql</a></td><td class="m">2024-Jun-12 14:00:46</td><td class="s">0.2K</td><td class="t">application/octet-stream</td></tr>
</tbody>
</table>
</div>
<div class="foot">lighttpd/1.4.76</div>

<script type="text/javascript">
[...]
</script>

</body>
</html>
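One low-risk way to confirm the listing programmatically is to search the response body for the index title. This sketch assumes the response above was saved to a local file (response.html is our own name for it, not from the module):

```shell
# Save a minimal stand-in for the response body shown above
cat > response.html <<'EOF'
<title>Index of /backup/</title>
EOF
# A directory listing page advertises itself with an "Index of" title
if grep -q "Index of /backup/" response.html; then
  RESULT="confirmed"
  echo "Directory listing confirmed"
fi
```

The same grep works against a live response piped straight from curl, which avoids saving anything sensitive to disk.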

To responsibly confirm the vulnerability without risking exposure of sensitive data, the optimal approach is to examine the response headers for clues about the files within the directory. Specifically, the Content-Type header often indicates the type of file (e.g., application/sql for a database dump, application/zip for a compressed backup).

Additionally, scrutinize the Content-Length header. A value greater than zero suggests a file with actual content, whereas a zero-length file, while potentially unusual, may not pose a direct vulnerability. For instance, if you see a dump.sql file with a Content-Length of 0, it's likely empty. Although its presence in the directory might be suspicious, it doesn't automatically indicate a security risk.

Here's an example using curl to retrieve only the headers for a file named password.txt:

m4cc18@htb[/htb]$ curl -I http://IP:PORT/backup/password.txt

Output:

HTTP/1.1 200 OK
Content-Type: text/plain;charset=utf-8
ETag: "3406387762"
Last-Modified: Wed, 12 Jun 2024 14:08:46 GMT
Content-Length: 171
Accept-Ranges: bytes
Date: Wed, 12 Jun 2024 14:08:59 GMT
Server: lighttpd/1.4.76
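The two headers of interest can be pulled out with standard text tools. This sketch parses a saved copy of the response above (headers.txt is our own name for the file, an assumption for the example):

```shell
# Stand-in for the `curl -I` output shown above
cat > headers.txt <<'EOF'
HTTP/1.1 200 OK
Content-Type: text/plain;charset=utf-8
Content-Length: 171
EOF
# Split each header line on ": " and keep the value
TYPE=$(awk -F': ' '/^Content-Type:/ {print $2}' headers.txt)
LEN=$(awk -F': ' '/^Content-Length:/ {print $2}' headers.txt)
echo "type=$TYPE length=$LEN"
```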

These header details and the directory listing's existence provide strong evidence of a potential security risk. We've confirmed that the backup directory is accessible and contains a file named password.txt with actual content, which is likely sensitive.

By focusing on headers, you can gather valuable information without directly accessing the file's contents, striking a balance between confirming the vulnerability and maintaining responsible disclosure practices.
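That decision rule (non-empty file: worth reporting; zero-length file: probably benign) can be expressed as a small helper. The filenames and lengths below are stand-ins for illustration:

```shell
# Decide whether a file's Content-Length makes it worth reporting
check_length() {
  # $1 = filename, $2 = Content-Length value taken from the headers
  if [ "$2" -gt 0 ]; then
    echo "$1: non-empty ($2 bytes), worth reporting"
  else
    echo "$1: empty, likely not a direct risk"
  fi
}
check_length password.txt 171
check_length dump.sql 0
```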


Exercise

TARGET: 94.237.52.164:43491

Challenge 1

Fuzz the target system using directory-list-2.3-medium.txt, looking for a hidden directory. Once you have found the hidden directory, responsibly determine the validity of the vulnerability by analyzing the tar.gz file in the directory. Answer using the full Content-Length header, e.g. "Content-Length: 1337"

Start with a recursive directory fuzz using ffuf with the following command:

┌──(macc㉿kaliLab)-[~/htb]
└─$ ffuf -w ~/SecLists/Discovery/Web-Content/DirBuster-2007_directory-list-2.3-medium.txt -ic -v -u http://94.237.52.164:43491/FUZZ -e .tar.gz -recursion

Output:

...
[Status: 301, Size: 0, Words: 1, Lines: 1, Duration: 77ms]
| URL | http://94.237.52.164:43491/backup
| --> | /backup/
    * FUZZ: backup

[INFO] Adding a new job to the queue: http://94.237.52.164:43491/backup/FUZZ

[Status: 301, Size: 0, Words: 1, Lines: 1, Duration: 76ms]
| URL | http://94.237.52.164:43491/ur-hiddenmember
| --> | /ur-hiddenmember/
    * FUZZ: ur-hiddenmember

[INFO] Adding a new job to the queue: http://94.237.52.164:43491/ur-hiddenmember/FUZZ

[INFO] Starting queued job on target: http://94.237.52.164:43491/backup/FUZZ
...
[INFO] Starting queued job on target: http://94.237.52.164:43491/ur-hiddenmember/FUZZ

[Status: 200, Size: 210, Words: 1, Lines: 2, Duration: 77ms]
| URL | http://94.237.52.164:43491/ur-hiddenmember/backup.tar.gz
    * FUZZ: backup.tar.gz

:: Progress: [441094/441094] :: Job [3/3] :: 518 req/sec :: Duration: [0:14:13] :: Errors: 0 ::
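To see why the .tar.gz file was found at all, it helps to understand how ffuf's -e flag expands each wordlist entry into candidate URLs. The three words below are stand-ins; the real run used directory-list-2.3-medium.txt:

```shell
# Sketch of ffuf's -e expansion: each word is tried bare and with the extension
URLS=$(
  for word in backup ur-hiddenmember admin; do
    for ext in "" ".tar.gz"; do
      echo "http://94.237.52.164:43491/${word}${ext}"
    done
  done
)
printf '%s\n' "$URLS"
```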

Now let's analyze a request to this file using curl:

┌──(macc㉿kaliLab)-[~/htb]
└─$ curl -I http://94.237.52.164:43491/ur-hiddenmember/backup.tar.gz

Output:

HTTP/1.1 200 OK
Content-Type: application/x-gtar-compressed
ETag: "2730773173"
Last-Modified: Thu, 01 Aug 2024 13:38:21 GMT
Content-Length: 210
Accept-Ranges: bytes
Date: Fri, 07 Nov 2025 20:03:25 GMT
Server: lighttpd/1.4.76

flag: Content-Length: 210