HackTheBox: Intelligence
Intelligence is a Windows machine that showcases a number of common attacks in an Active Directory environment. After retrieving internal PDF documents stored on the web server (by brute-forcing a common naming scheme) and inspecting their contents and metadata, which reveal a default password and a list of potential AD users, password spraying leads to the discovery of a valid user account, granting initial foothold on the system. A scheduled PowerShell script that sends authenticated requests to web servers based on their hostname is discovered; by adding a custom DNS record, it is possible to force a request that can be intercepted to capture the hash of a second user, which is easily crackable. This user is allowed to read the password of a group managed service account, which in turn has constrained delegation access to the domain controller, resulting in a shell with administrative privileges.
Reconnaissance
Nmap
DNS — UDP/TCP/53
First, we will try to retrieve all DNS records using the following query:
As you can see, the DNS server returns various records, including SOA, A, and AAAA. Furthermore, we can see a new hostname, dc.intelligence.htb, which corresponds to the name server. Let's add it to our hosts file.
Scenario: we have a name server and a domain name. With these two pieces of information, we can try to perform a DNS zone transfer using this query:
dig axfr intelligence.htb @dc.intelligence.htb
Unfortunately, the zone transfer failed.
HTTP — TCP/80
In this section, we're going to enumerate the web server using different techniques. First, let's take a quick look at the website's main page:
The website is static; it does not generate any dynamic content.
Let’s now try to crawl the website for interesting attack vectors.
Crawling
In this section, we are going to use ReconSpider, a crawler built by HackTheBox. A web crawler follows links from one page to another, collecting information such as emails, documents, JavaScript files, form fields, and images.
Once done, the crawler generates a JSON file, results.json, that contains all the collected data.
Note: Crawling the website is really useful, as it saves a huge amount of time by providing a global view of the website (URLs, form fields, emails, external files) in just a few seconds.
From the output above, we retrieved two PDF documents. Let's take a look at their contents to see if we can find anything valuable.
Unfortunately, their content is not meaningful. Nevertheless, when we take a close look at the file nomenclature, we can spot a pattern: /documents/YYYY-MM-DD-upload.pdf. You may wonder what happens if we replace YYYY-MM-DD with a date of our choice. We can assume that we may be able to access other PDF files, but we can't be 100% sure until we test it. Let's do it:
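The idea can be sketched in a couple of lines of shell (the two dates below are purely illustrative):

```shell
# Candidate documents follow the /documents/YYYY-MM-DD-upload.pdf scheme,
# so any date can be substituted into the URL
for d in 2020-01-01 2020-01-02; do
  echo "http://intelligence.htb/documents/${d}-upload.pdf"
done
```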
Guess what? By changing the date to January 2nd, we were able to access another file. That's great! Nevertheless, doing this manually would be time-consuming, so we're going to use this little Python script to generate a date range:
import datetime

start = datetime.date(2020, 1, 1)
timelapse = 1096  # number of days from 2020-01-01 through 2022-12-31

dates = []
for day in range(timelapse):
    d = (start + datetime.timedelta(days=day)).isoformat()
    dates.append(d)

with open("dates.txt", "w") as f:
    for d in dates:
        f.write(d + "\n")

print("Dates successfully generated!")
The script above generates a range of dates between 2020-01-01 and 2022-12-31, then saves the output to a file called dates.txt in your current working directory. Here is how you can use it:
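For reference, a date range covering 2020-01-01 through 2022-12-31 can also be generated in pure shell (this assumes GNU date, which is what Kali ships):

```shell
# Generate one ISO date per line with GNU date, then sanity-check the output
for i in $(seq 0 1095); do
  date -d "2020-01-01 + $i days" +%F
done > dates.txt

head -n 2 dates.txt   # first two dates
wc -l < dates.txt     # total count of generated dates
```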
Fantastic! Once the dates are generated, we use ffuf to enumerate the ones that correspond to a valid PDF file, applying a filter based on the HTTP status code:
ffuf -ic -c -w dates.txt -u http://intelligence.htb/documents/FUZZ-upload.pdf -mc all -fc 404
Awesome! Now that we have enumerated the PDF files, we need to examine their contents for anything of value. Since reviewing almost 100 PDFs by hand would be cumbersome, we'll automate the process.
That said, let's first download the PDF files, then convert them into text files using a tool called pdftotext:
Here, I used awk to isolate the dates from the rest of the ffuf output. After that, I used a for loop and curl to download the PDF files:
for d in $(cat ../vdates.txt); do curl -s -O "http://intelligence.htb/documents/$d-upload.pdf"; done
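The awk step itself isn't shown above; a minimal sketch, assuming ffuf's default result format (the sample line and output path are illustrative), could look like this:

```shell
# Keep only the first field of each ffuf result line (the date)
# and save it to vdates.txt for the download loop
printf '2020-01-02 [Status: 200, Size: 11632, Words: 166, Lines: 84]\n' \
  | awk '{ print $1 }' > vdates.txt

cat vdates.txt
```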
Once the PDF files are downloaded, we'll convert them into plain-text files so that we can use grep to check whether they store any sensitive information. Here is the command we will use:
for pdf_file in *.pdf; do pdftotext "$pdf_file"; done
Great! Now that the PDF files are converted to text, we can use grep to search for sensitive information:
grep -iE '(pass|pwd|token|secret|api|login|user)' *.txt
Here is a quick breakdown of the command above:
- -i: ignores case
- -E: uses extended regular expressions
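As a quick sanity check, here is the same filter applied to two sample lines (the content is illustrative; only the matching line survives):

```shell
# Lines containing none of the keywords are dropped by the filter
printf 'Welcome to Intelligence Corp!\nYour default password is: NewIntelligenceCorpUser9876\n' \
  | grep -iE '(pass|pwd|token|secret|api|login|user)'
```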
Awesome! As you can see, 2020-06-04-upload.txt seems to store juicy information. Let's take a look at its content:
Initial Access
In the previous section, we found the company's default password. However, to successfully authenticate to the target system, we still need valid usernames. PDF metadata can hold valuable information at times, so let's extract it and see if there is anything interesting:
We found the name of the PDF's creator. Let's see if we can perform the same operation on the remaining PDF files:
exiftool *.pdf | grep Creator | awk -F': ' '{ print $2 }'
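For illustration, here is what the awk part of that pipeline does to two sample lines of exiftool output (names and spacing are illustrative; I've also added sort -u here to deduplicate, which is handy before building a user list):

```shell
# Split each line on ': ' and keep the value field; sort -u removes duplicates
printf 'Creator                         : William.Lee\nCreator                         : Jose.Williams\n' \
  | awk -F': ' '{ print $2 }' | sort -u
```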
Guess what? It worked: we were able to extract usernames from the PDF files. Let's now save these usernames to a file, then use kerbrute to check which ones are valid:
kerbrute userenum -d intelligence.htb --dc dc.intelligence.htb users.txt
As you can see, all users are valid.
Fantastic! We can now proceed with a password-spraying attack. A well-known tool for this is netexec; however, it is always good practice to know how to perform the same attack with different tools, so I decided to use kerbrute:
kerbrute passwordspray -d intelligence.htb --dc dc.intelligence.htb users.txt 'NewIntelligenceCorpUser9876'
Well, we got a successful login as Tiffany.Molina. Let's enumerate her shares using smbmap:
Let's mount the Users share on our system. I generally prefer this over downloading the share outright, since we may sometimes come across very large shares:
After successfully mounting the share, we can take a look at its contents using ls:
This seems to be Tiffany’s working directory.
We should be able to retrieve the user flag from her Desktop folder:
Awesome! We got the user flag. Let's now go for the root flag.
Lateral Movement (Tiffany -> Ted -> svc_int$)
In this section, we're first going to collect the domain information, then use BloodHound to find interesting attack paths.
Collecting domain information
Here, we are going to use bloodhound-python to collect the domain information:
Let's now visualize all this data in BloodHound.
One interesting feature of BloodHound is its pre-built analytics queries: predefined queries you can use to discover interesting attack paths.
For instance, let's try to enumerate the Domain Admins:
I looked for things like Kerberoastable users, principals with DCSync rights, and AS-REP roastable users, but found nothing interesting. Therefore, I decided to focus on SMB share enumeration.
Note: I checked the SYSVOL and NETLOGON shares but found nothing.
Knowing that Tiffany has read access to the IT share, we can take a look at its content using the following smbmap command:
Well, let's mount the share on our attacker machine using mount.cifs:
Once done, we can see that the share contains a PowerShell script, downdetector.ps1:
Every five minutes, this script checks the status of the web servers by sending each one an HTTP request with the default credentials of the user running the script. If the returned status code is different from 200, it considers the server down and reports it to the user Ted Graves via email.
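The decision logic described above can be stubbed out in a few lines of shell (the real downdetector.ps1 is PowerShell, and the status code here is a hard-coded stand-in rather than the result of an authenticated request):

```shell
# Stubbed check: any status other than 200 counts as "down"
# (the real script would email Ted Graves at this point)
status=503
if [ "$status" -ne 200 ]; then
  echo "server down - notifying Ted Graves"
else
  echo "server up"
fi
```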
To take advantage of this script and obtain the credentials of the user executing it, we are going to add a new web server record to the domain's DNS zone. To do that, we can rely on dnstool, a Python script used to add, modify, and delete Active Directory Integrated DNS (ADIDNS) records via LDAP.
Fantastic! Our record web-exploit has been successfully added to the DomainDNSZones. We can verify that using the following command:
Note: As you can see, web-exploit is an A record that points to my IP address, 10.10.14.22. This means that when downdetector.ps1 tries to resolve web-exploit, it will get my IP address, and I will therefore receive the request.
To test that, we can launch an HTTP server and see if we get any requests:
python3 -m http.server 80
Wonderful! We indeed got a request from the target machine. Therefore, to capture the credentials of the user running the script, we are going to use Responder.
Note: Make sure the HTTP server is enabled in Responder's configuration file, located at /usr/share/responder/Responder.conf.
Once done, you can launch Responder, specifying the interface that corresponds to the IP address you used when creating your DNS record:
Excellent! We captured Ted.Graves' Net-NTLMv2 hash. Let's now try to crack it using hashcat:
We successfully cracked the Net-NTLMv2 hash. Let's now check whether we can authenticate as Ted.Graves.
Great! The authentication succeeded. As usual, let's enumerate the shares:
Ted seems to have the same permissions as Tiffany. After taking a look at his shares, I found nothing noteworthy, so I went back to BloodHound.
Here, I first enumerated Ted's group memberships:
As you can see, Ted is part of the DOMAIN USERS group as well as the ITSUPPORT group.
Let's now look for objects controlled by Ted. To find them, I clicked on Group Delegated Object Control:
Here, the ITSUPPORT group can retrieve the password of the group managed service account svc_int$. Since Ted is part of the ITSUPPORT group, he can also retrieve svc_int$'s password.
Group Managed Service Accounts (gMSA) are a special type of Active Directory object, where the password is managed by and automatically changed by domain controllers on a set interval. The intended use of gMSA is to allow certain computer accounts to retrieve the password for the gMSA, then run local services as the gMSA.
To retrieve svc_int$'s password, we're going to use a tool called gMSADumper. This tool reads any gMSA password blobs the user can access and parses the values. Here is how we can use it to retrieve svc_int$'s password:
Here again, we will check whether we can authenticate to the svc_int$ account using the NT hash returned by gMSADumper:
Fabulous! The authentication succeeded.
Let's enumerate svc_int$'s shares:
Again, I found nothing meaningful. Therefore, I continued with BloodHound by taking a look at svc_int$'s properties:
svc_int$'s properties
As you can see, svc_int$ has the trustedtoauth property enabled, which means there might be some kind of delegation.
After performing more enumeration, I found that svc_int$ has the constrained delegation privilege over the computer account DC, as you can see in the image below:
svc_int$ has AllowedToDelegate privilege to DC
Awesome! Let's now try to escalate our privileges.
Privilege Escalation (svc_int$ -> Administrator)
In this section, we will exploit a constrained delegation privilege to escalate our privileges to Domain Admins.
The constrained delegation privilege allows a principal to authenticate as any user to specific services (listed in its msDS-AllowedToDelegateTo attribute) on the target computer. Therefore, an object with this privilege can impersonate any domain principal (including Domain Admins) to the specified service on the target host. One important caveat: the impersonated user cannot be a member of the "Protected Users" security group or otherwise be protected against delegation.
That being said, let's first find the target service principal name (SPN) set in svc_int$'s msDS-AllowedToDelegateTo attribute:
As you can see, the SPN is www/dc.intelligence.htb.
Next, let's verify that the impersonated user won't be blocked by checking the "Protected Users" security group membership via DC's groups:
Great! DC is only part of the ENTERPRISE DOMAIN CONTROLLERS and DOMAIN CONTROLLERS groups, so we are good to go.
Well, let's now use Impacket's getST to request a service ticket for the Administrator user:
This provides us with a service ticket that we can then use with psexec to authenticate as Administrator:
Finally, we can retrieve the root flag and complete the challenge \0/
Key Techniques Used
Here are the main techniques covered in this box:
- Writing a custom Python script to brute-force valid PDF file names
- Converting PDF files to plain text (pdftotext), then using grep with a regex to search for sensitive information
- Extracting PDF metadata using exiftool
- Adding a new DNS record to the DomainDNSZones using dnstool
- Capturing and cracking a Net-NTLMv2 hash using Responder and hashcat
- Visualizing domain information in BloodHound to find interesting attack paths for lateral movement and privilege escalation
- Reading a gMSA password using gMSADumper
- Exploiting a constrained delegation privilege using BloodHound and Impacket's getST script
Wrap Up
That’s it guys. Congrats on having made it so far 👏
I hope you enjoyed the writeup.
If so, do not forget to click on the little clap icon below and subscribe to my newsletter to keep up with my latest articles !
Resources
https://www.baeldung.com/linux/pdf-metadata-terminal
https://github.com/micahvandeusen/gMSADumper
https://www.thehacker.recipes/ad/movement/mitm-and-coerced-authentications/adidns-spoofing
https://www.thehacker.recipes/ad/movement/dacl/readgmsapassword
Contact
GitHub : https://github.com/0liverFlow