Introduction

During the past weekend, I had the pleasure of exploring HackTheBox's latest Sherlock offerings. With Sherlocks, you are asked to dive into the aftermath of a targeted cyber attack and unravel the dynamics behind it, based on the evidence provided.

Among the labs, Nubilum-1 stands out, leading you through the challenges of investigating a breach in the Cloud.

Our investigation will involve leveraging expertise in Splunk, AWS, and incident response to unpack the threat actor's tactics, techniques, and procedures (TTPs).

Link: Nubilum-1

Background

Our cloud administration team recently received a warning from Amazon that an EC2 instance deployed in our cloud environment is being utilised for malicious purposes. Our sysadmin team has no recollection of deploying this EC2; however, our sysadmin does recall logging onto the AWS console and finding in excess of six EC2 instances running that he did not recall deploying. You have been provided with the interview with our Cloud sysadmin within the triage.

Pre-Requisites

The only prerequisite, which is a common theme in most of my write-ups, is an innate hunger for learning.

Information Provided

└───nubilum_1_int
    ├───CloudTrail
    │   └───949622803460
    │       ├───CloudTrail
    │       │   └───//<Logs in GZ format from all regions>
    │       └───CloudTrail-Digest
    │           └───//<Audit of CloudTrail logs written to S3 buckets>
    ├───forela-storage //Contents of the S3 Bucket which were publicly accessible
    │   └───backups
    └───unusual-directory //Copy of the suspicious DIR from the EC2 instance
        └───poshc2
            └───money
                ├───downloads
                ├───payloads
                └───reports

Setting the Stage

The CloudTrail folder contains 13,271 files across 337 folders, which is the standard layout AWS uses when dumping trails to S3. The decompressed logs are plain-text JSON, so grepping your way to the answers is a perfectly valid way to investigate, and I would be really impressed if you managed it.
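If you fancy the grep route, here's a minimal sketch: the trail files are gzip-compressed JSON, so zgrep can search them in place. The "RunInstances" pattern is just an illustrative example event name, and note that CloudTrail writes compact JSON, so there's no space after the colon.

# Search every gzipped trail file for EC2 launch events (example pattern only)
find CloudTrail/949622803460/CloudTrail/ -name '*.json.gz' \
  -exec zgrep -l '"eventName":"RunInstances"' {} +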

Instead, we're going to process the CloudTrail logs within CloudTrail\949622803460\CloudTrail into one easy-to-digest file for Splunk using /usr/local/sof-elk/supporting-scripts/aws-cloudtrail2sof-elk.py, which ships with a SOF-ELK instance.

For ease of use, I’ve attached the file below:

aws-cloudtrail2sof-elk.py

python3 aws-cloudtrail2sof-elk.py -f -r ./CloudTrail/949622803460/CloudTrail/ -w ReadyToIngest.json
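Before feeding anything to Splunk, a quick sanity check on the merged output doesn't hurt. This sketch assumes the converter writes one flattened CloudTrail record per line with eventName at the top level, which matches my experience with it:

# Rough event count, assuming one JSON record per line
wc -l ReadyToIngest.json

# With jq installed, confirm the records parse and preview the event mix
jq -r '.eventName' ReadyToIngest.json | sort | uniq -c | sort -rn | head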

The ~60 MB ReadyToIngest.json is now ready to be fed into Splunk. To avoid inflating this writeup, I'll assume you're well versed in the relatively straightforward process of uploading the JSON to Splunk.
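If you'd rather skip Splunk Web entirely, a oneshot ingest from the Splunk CLI gets the same data in. A hedged sketch; the cloudtrail index name is hypothetical and you'd need to create it beforehand:

# One-off ingest via the Splunk CLI; create the index first (Settings > Indexes)
/opt/splunk/bin/splunk add oneshot ReadyToIngest.json -sourcetype _json -index cloudtrail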

Regardless, please feel free to explore the sections below if you require assistance with uploading the JSON to Splunk and adding the Final Touches before we commence our investigation!