Fusing global artefact collection into a Tanium Threat Hunting programme

19 Jul 2021

The challenge

In the search for evidence of actor activity, threat hunting analysts seek a broad view of an organisation's IT estate by collecting and combining data from various sources. This is often a logistical challenge: forensic artefacts of interest to a threat hunting analyst, and not typically collected by EDR solutions by default, may sit on endpoints in remote sites on the other side of the globe, slowing both collection and analysis.

The solution

We address this challenge using Tanium’s Threat Response module to collect a wealth of behavioural telemetry from our clients’ endpoints. Our bespoke Atlas capability supplements this telemetry with additional data from rich forensic artefacts that provide adjacent threat hunting opportunities.

A global presence - Project Atlas

We have used Google Cloud Platform to build a global, resilient, high-throughput platform for processing these varied forensic artefacts, enabling analysts to hunt for threats at scale across the data.

Google has a globally available presence, with over 142 geographically dispersed access points, ensuring we can rapidly collect data into our platform for processing from wherever our client sites are. Leveraging high-throughput global load balancing and Google Cloud Storage’s multi-region buckets, analysts can rapidly retrieve forensic artefacts of interest from any endpoint across a client’s environment, using Tanium to coordinate the collection and transmission over the S3 protocol.
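To illustrate the collection leg, the sketch below uploads a single artefact into a Google Cloud Storage multi-region bucket via GCS’s S3-compatible XML API, which accepts standard S3 clients configured with GCS HMAC credentials. The bucket, object key and credentials here are illustrative placeholders, not our production configuration.

```python
# A minimal sketch, assuming GCS HMAC interoperability credentials and an
# illustrative bucket name; not the production collection tooling.
import boto3

# Google Cloud Storage exposes an S3-compatible XML API at
# storage.googleapis.com, so a standard S3 client can write into a
# multi-region bucket when pointed at that endpoint with HMAC keys.
s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.googleapis.com",
    aws_access_key_id="GOOG1EXAMPLEKEY",          # GCS HMAC access key (placeholder)
    aws_secret_access_key="example-hmac-secret",  # GCS HMAC secret (placeholder)
)

# Push one collected artefact into the landing bucket for processing.
s3.upload_file(
    Filename="ConsoleHost_history.txt",
    Bucket="atlas-artefact-landing",              # hypothetical multi-region bucket
    Key="client-a/host-0042/ConsoleHost_history.txt",
)
```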

Whether it’s a single log file from a nearby on-premises location, or thousands of machine images from a large data centre, Google’s cloud infrastructure scales with us, ensuring the data our analysts need is never far away. Once an artefact is persisted to the relevant Google Cloud Storage bucket, Google Cloud Dataflow parses it and writes the data into multiple event processing pipelines. Dataflow is a serverless data processing service built on Apache Beam; because it is highly parallelised, it feeds our analytics platforms as fast as we collect data.
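A simplified Apache Beam pipeline in the spirit of this stage is sketched below. The bucket paths and parsing logic are illustrative assumptions; a real job would carry richer parsing per artefact type, and with a project, region and `runner="DataflowRunner"` set in the options, the same code runs as a managed, auto-scaled Dataflow job.

```python
# A hedged sketch of the Dataflow parsing stage, under assumed bucket names.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_artefact_line(line: str) -> str:
    """Wrap one raw artefact line as a structured JSON event (placeholder logic)."""
    return json.dumps({"artefact": "consolehost_history", "raw": line})


def run() -> None:
    options = PipelineOptions()  # add project/region/runner for Dataflow
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Read every collected copy of the artefact from the landing bucket.
            | "ReadArtefacts" >> beam.io.ReadFromText(
                "gs://atlas-artefact-landing/*/*/ConsoleHost_history.txt")
            | "Parse" >> beam.Map(parse_artefact_line)
            # Write structured events for the routing layer to pick up.
            | "WriteEvents" >> beam.io.WriteToText(
                "gs://atlas-events/consolehost", file_name_suffix=".jsonl")
        )


if __name__ == "__main__":
    run()
```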

Intelligent event routing built on top of Google Cloud Pub/Sub ensures that every event is fed into the systems that want it, and guarantees no event is lost: should a large amount of data arrive in a short period, events are simply queued. Each event is passed to our analytics platform in near real-time, presenting our analysts with enriched and contextualised telemetry to hunt through.
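One way such routing can work is attribute-based filtering on Pub/Sub, sketched below: events are published with an attribute describing the artefact type, and each downstream system subscribes with a filter (for example `attributes.artefact_type = "consolehost_history"`) so it receives only the events it wants. The project, topic and attribute names here are assumptions for illustration.

```python
# A minimal sketch of attribute-tagged publishing; names are illustrative.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("example-project", "atlas-events")

event = {"artefact": "consolehost_history", "host": "host-0042", "raw": "..."}

# Pub/Sub retains each message until every subscription has acknowledged it,
# which is what absorbs bursts of data without losing events.
future = publisher.publish(
    topic_path,
    data=json.dumps(event).encode("utf-8"),
    artefact_type="consolehost_history",  # attribute used by subscription filters
)
print(future.result())  # message ID once the publish is confirmed
```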

Figure 1: Our internal Atlas capability

The impact on our services

PwC often uses Tanium to perform Compromise Assessments for global clients, using our market-leading behavioural threat detection ruleset and analytics platform. An invaluable component of our methodology relies on the Atlas pipeline and Tanium’s Live Response collection capability to retrieve forensic artefacts at enterprise-wide scale. This ensures maximum visibility of live endpoint activity and gives us historical insight into previously undetected, and potentially still active, compromises.

Ensuring maximum visibility

A new hunting technique may require the collection of a new forensic artefact, and our threat hunting analysts can leverage the Atlas pipeline to have this data quickly at their fingertips.

For example, the PowerShell ‘ConsoleHost_history.txt’ file, written by the PSReadLine module, is a rich data source that records the commands entered within interactive PowerShell sessions. Forensic investigators often retrieve this file to gain visibility of PowerShell commands that may have been executed by an attacker. Commands executed within an interactive PowerShell session are generally not captured by endpoint monitoring solutions, whereas commands invoked through the Windows-native terminal are more reliably captured by EDR tools.

Retrieving this file enterprise-wide presents an opportunity to address that gap in visibility and collect extra command-line telemetry that may otherwise have been missed. The data contained in this forensic artefact is sent through the Atlas pipeline into our analytics platform, where it is reviewed alongside other telemetry.

On a recent Compromise Assessment, this artefact was collected and analysed from thousands of endpoints on our client’s network. As a result, we identified two separate endpoints on which similar, suspicious PowerShell commands had been executed; these were categorised as “download cradles”. A download cradle is a one-liner that downloads and executes a script in memory, avoiding writing files to disk and making subsequent investigation far more challenging. Further investigation revealed remote desktop sessions linking these endpoints, as well as other evidence that malicious activity had taken place. This activity was undetected by the client’s existing antivirus and EDR tooling, and was only identified by applying our additional threat hunting methodology.

Figure 2: PowerShell ‘download cradles’ observed on an engagement
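To make the hunt concrete, below is an illustrative Python sketch of how ConsoleHost history entries might be scanned for common download-cradle patterns, such as `IEX (New-Object Net.WebClient).DownloadString(...)`. The regexes cover a few well-known forms only and are far simpler than the behavioural ruleset we actually apply.

```python
# An illustrative download-cradle scanner; patterns are a simplified sample.
import re
from pathlib import Path

CRADLE_PATTERNS = [
    # In-memory execution of a downloaded script.
    re.compile(r"IEX\s*\(.*DownloadString", re.IGNORECASE),
    re.compile(r"Invoke-Expression.*(WebClient|Invoke-WebRequest|curl|wget)",
               re.IGNORECASE),
    # .NET WebClient / BITS style retrieval.
    re.compile(r"New-Object\s+(System\.)?Net\.WebClient", re.IGNORECASE),
    re.compile(r"Start-BitsTransfer", re.IGNORECASE),
]


def find_cradles(history_file: Path) -> list[str]:
    """Return the history lines that match any download-cradle pattern."""
    hits = []
    for line in history_file.read_text(errors="ignore").splitlines():
        if any(p.search(line) for p in CRADLE_PATTERNS):
            hits.append(line)
    return hits


if __name__ == "__main__":
    for hit in find_cradles(Path("ConsoleHost_history.txt")):
        print(hit)
```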

Historical insights

Collecting forensic artefacts with a long retention period (months or even years of historical data) allows threat hunters to identify threat activity that took place before an endpoint monitoring solution was even deployed into the environment. Leveraging the Atlas pipeline, we process such logs at scale in an effort to surface previously undiscovered compromises.

Web server access logs store a backlog of data on the HTTP requests made to websites and applications, and can be used to detect historical exploitation attempts against public-facing infrastructure. We use Tanium to identify the locations of web server logs across the environment, collect the files from those locations and send them through the Atlas pipeline. Advanced analytics are then performed on this data to flag anomalous HTTP requests and suspicious patterns that may be indicative of, for example, a file inclusion or SQL injection vulnerability.
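As a simplified taste of this kind of analytic, the sketch below scans access log lines for textbook file inclusion and SQL injection probes. The pattern set is illustrative only, a small fraction of what production analytics would cover, and the log filename is a placeholder.

```python
# A hedged sketch of access-log scanning; patterns and filename are examples.
import re
from pathlib import Path

SUSPICIOUS = {
    # Directory traversal, sensitive file reads, PHP stream wrappers.
    "file_inclusion": re.compile(
        r"(\.\./){2,}|/etc/passwd|php://(input|filter)", re.IGNORECASE),
    # Classic UNION-based, boolean-based and time-based injection probes.
    "sql_injection": re.compile(
        r"(union(\s|%20)+select|or(\s|%20)+1=1|sleep\(\d+\))", re.IGNORECASE),
}


def scan_access_log(log_file: Path):
    """Yield (category, line) pairs for requests matching a suspicious pattern."""
    for line in log_file.read_text(errors="ignore").splitlines():
        for category, pattern in SUSPICIOUS.items():
            if pattern.search(line):
                yield category, line


if __name__ == "__main__":
    for category, line in scan_access_log(Path("access.log")):
        print(f"[{category}] {line}")
```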

With the insights afforded to us by the Atlas pipeline and Tanium, we have identified a series of successful exploitations of clients’ public-facing infrastructure. In one case, we identified a successful PHP code injection (Figure 3) designed to insert a webshell into the client’s infrastructure; these requests were made in 2018 and were the first evidence of a sustained compromise. In another, this log collection enabled us to confirm the exfiltration of customer data via SQL injection (Figure 4).

Figure 3: PHP Code Injection
Figure 4: SQL Code Injection

Get in touch with us

We have been working closely with Tanium for seven years, using the power of near real-time visibility into endpoints to detect, contain and remediate targeted intrusions for our global client base. Our Compromise Assessments delivered with Tanium identify evidence of malicious activity within your IT estate, drawing on dedicated and experienced threat hunters with an in-depth understanding of how attackers compromise networks.

Are you concerned that you have been breached? Are you looking to proactively hunt for indicators of compromise and advanced threats? Would you like to gain confidence in your security infrastructure? Please get in touch.

Contact us
