Just two more weeks until the Magnet Virtual Summit 2025! If you’ve been procrastinating on registering, don’t miss out. It’s completely free! The conference runs February 10th through the 14th.
I’m excited to present two talks this year. The first one is titled “Unlocking DFIR: Free Resources for Efficient Triage and Acquisition.” In this talk, I’ll cover free triage acquisition solutions for Windows, Mac, and Linux.
The second talk is called “Zen & the Art of Digital Forensics: Enhancing Insight through Mindfulness.” In this talk, I’ll explore how applying Zen principles like mindfulness, non-attachment, and the ‘beginner’s mind’ can lead to improved investigations and mental well-being.
At this point, we’ve all heard the expression ‘There is no cloud; it’s just someone else’s computer.’ While there is some truth to that, there are fundamental differences in how digital forensics works when cloud resources are part of the investigation.
I’ve been doing DFIR for about 15 years now. In the early days, almost all investigations involved hands-on access to the data or devices being investigated. As I moved into Enterprise Incident Response, the devices I was investigating were more and more frequently in a remote location, be it another state or even another country. As the scope of my investigations grew, my techniques had to evolve and adapt.
Cloud Forensics is the next phase of that evolution. While the systems under investigation may still be in another state or country, extra factors come into play like multi-tenancy and shared responsibility models. Cloud Forensics Demystified does a solid job of shedding light on those nuances.
The book is divided into three parts.
Part 1: Cloud Fundamentals
Part 2: Forensic Readiness: Tools, Techniques, and Preparation for Cloud Forensics
Part 3: Cloud Forensic Analysis: Responding to an Incident in the Cloud
Part 1: Cloud Fundamentals
This section provides a baseline knowledge of the three major cloud providers: Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure. It breaks down the architectural components of each and how each platform handles virtual systems, networking, and storage.
Part 1 also includes a broad yet thorough introduction to the cyber and privacy legislation that comes into play for cloud investigations. This section is not only valuable to investigators. Whether you’re a lawyer providing legal counsel for an organization or responsible for an organization’s overall security at the CISO level, this material helps in understanding the challenges and responsibilities that come with hosting your data or systems in the cloud, and the legislation and regulations that follow those choices.
Part 2: Forensic Readiness: Tools, Techniques, and Preparation for Cloud Forensics
As with enterprise investigations, the hunt for incident indicators often begins with logging: telemetry and the correlation of different log sources. This section focuses on the log sources available in AWS, GCP, and Azure. It also provides a detailed list of log types that are enabled by default and those that require manual activation, so that you have access to the most relevant data when an incident occurs. The section also covers the providers’ offerings for log analysis in the cloud, including AWS CloudWatch, Microsoft Sentinel, and Google’s Cloud Security Command Center (Cloud SCC).
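To give a flavor of what that readiness check looks like in practice, here is a minimal sketch (my own, not from the book) that uses the AWS Tools for PowerShell module to confirm CloudTrail trails are actually logging; it assumes the AWS.Tools.CloudTrail module is installed and credentials are already configured.

# Minimal readiness check: list CloudTrail trails and confirm they are logging.
# Assumes the AWS.Tools.CloudTrail module and AWS credentials are already configured.
Import-Module AWS.Tools.CloudTrail

foreach ($trail in (Get-CTTrail)) {
    $status = Get-CTTrailStatus -Name $trail.Name
    [PSCustomObject]@{
        Trail       = $trail.Name
        MultiRegion = $trail.IsMultiRegionTrail
        IsLogging   = $status.IsLogging
    }
}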
Part 3: Cloud Forensic Analysis: Responding to an Incident in the Cloud
As an Incident Responder, this was the section I enjoyed the most. While the first two parts are foundational for understanding the architectures of networking and storage, Part 3 provides detailed information on how to acquire evidence for cloud investigations. It covers log analysis techniques as well as recommendations for host forensics and memory analysis tools. The book covers the use of commercial forensic suites, like Magnet Axiom, as well as open-source tools like CyLR and HAWK. Besides investigations of the three Cloud Service Providers (CSPs), there is also a section covering the cloud productivity services Microsoft 365 and Google Workspace, as well as a brief section on Kubernetes.
Summary
Whether you’re a gray-haired examiner like me or a neophyte in the world of digital forensics, chances are high that if you’re not running investigations in the cloud yet, you will be soon enough. Preparation is the first step in the Incident Response lifecycle. To properly prepare for incidents, you need to know both which sources will be most informative to your investigations and how to capture and process that evidence efficiently.
Cloud Forensics Demystified is a comprehensive guide covering cloud fundamentals, forensic readiness, and incident response. It provides valuable insights into cloud investigation techniques, log analysis, and evidence acquisition for the major cloud providers and productivity services. Experienced and novice digital forensics professionals alike will find it a useful resource for preparing for cloud investigations.
The latest update to CyberPipe (the code formerly known as CSIRT-Collect) has been revised to leverage the free triage collection tool MAGNET Response. As with previous versions, it also runs Encrypted Disk Detector, another free tool from MAGNET.
Script Functions:
Capture a memory image with MAGNET DumpIt for Windows (x32, x64, ARM64), or MAGNET RAM Capture on legacy systems (tool selection sketched after this list),
Create a Triage collection* with MAGNET Response,
Check for encrypted disks with Encrypted Disk Detector,
Recover the active BitLocker Recovery key,
Save all artifacts, output, and audit logs to the USB or source network drive.
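To illustrate the logic behind that first step, here is a simplified sketch of architecture-based tool selection. This is not the actual CyberPipe code, and the Tools folder layout and executable names are assumptions.

# Illustrative only: choose a memory capture tool based on CPU architecture.
# Executable names and paths are assumptions; match them to your Tools folder.
switch ($env:PROCESSOR_ARCHITECTURE) {
    'AMD64' { $memTool = '.\Tools\DumpIt_x64.exe' }
    'ARM64' { $memTool = '.\Tools\DumpIt_ARM64.exe' }
    'x86'   { $memTool = '.\Tools\DumpIt_x86.exe' }
    default { $memTool = '.\Tools\MagnetRAMCapture.exe' }   # fallback for legacy systems
}
Write-Host "Selected memory capture tool: $memTool"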
The setup is simple. Save the CyberPipe script to a USB drive. Next to the script is a Tools folder with the executables for MAGNET Response & EDD. Before running, customize the script to select a collection profile. Run the script from the USB drive and collect away. Move on to the next PC and run it again.
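As a hypothetical example of the kind of values you might customize before running (the variable names and profile values below are placeholders, not CyberPipe’s actual options; see the comments in the script itself for the real ones):

# Placeholder values for illustration only.
$collectionProfile = 'Triage'                        # which artifact set to collect
$caseReference     = Read-Host 'Enter case number'   # tag the output with a case identifier
$outputPath        = Join-Path $PSScriptRoot 'Collections'
Write-Host "Profile: $collectionProfile  Case: $caseReference  Output: $outputPath"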
Network Usage:
CyberPipe 5 can also write captures to a network repository. Just un-comment the # Network section and update the \\server\share line to reflect your environment.
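The pattern looks something like the sketch below (illustrative only; the script’s actual network section may differ), with \\server\share being the line you update:

# Illustrative pattern only; the real network section may differ.
$networkShare = '\\server\share'                     # update to reflect your environment
New-PSDrive -Name 'X' -PSProvider FileSystem -Root $networkShare | Out-Null
$outputPath = "X:\Collections\$env:COMPUTERNAME"
New-Item -ItemType Directory -Path $outputPath -Force | Out-Null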
In this configuration, it can be included in automation workflows, such as a collection triggered by an event logged in your EDR.
Prior Version (KAPE Support):
If you’ve used CyberPipe before and want to keep the previous method, where KAPE facilitates the collection with the MAGNET tools, or if you’ve made other KAPE modifications, use v4.01.
Download:
Download the latest release of CyberPipe on GitHub.
Enterprise customers running Microsoft Defender for Endpoint have a lot of capability at their fingertips. This includes the Live Response console, a limited command shell for interacting with any managed Defender assets that are online. Besides its native commands, you can also use the console to push scripts and executables to endpoints.
Note: there is a specific security setting in the Defender console (Live Response unsigned script execution, under Advanced features) that must be enabled if you want to allow unsigned scripts.
Microsoft has its own triage package capability, but you can also push your own tools like Magnet RESPONSE or KAPE. With a little bit of PowerShell mojo, you can run your favorite collection utilities, using the Defender Live Response console as your entry point into the remote asset.
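The general pattern looks something like the hypothetical wrapper script below, which you would add to your Live Response library; the collector path, output folder, and command-line switches are placeholders rather than the actual Magnet RESPONSE syntax.

# Hypothetical wrapper for the Live Response library. Typical console workflow:
#   putfile MagnetRESPONSE.exe       <- stage the collector on the endpoint
#   run Invoke-Collection.ps1        <- run this wrapper script
#   getfile C:\Collections\<output>  <- retrieve the results
# The collector path and switches below are placeholders; check the tool's documentation.
$collector = 'C:\ProgramData\Microsoft\Windows Defender Advanced Threat Protection\Downloads\MagnetRESPONSE.exe'
$outputDir = 'C:\Collections'
New-Item -ItemType Directory -Path $outputDir -Force | Out-Null
& $collector '<collector-specific switches here>'    # placeholder arguments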
The console enables you to pull back files from the remote endpoint, even when it has been isolated. One limitation of this function is that you can only retrieve files of 3GB or less.
For many triage collections this could be under the limit, but depending on the artifacts you’re collecting you might exceed that. So what do you do when you have an isolated endpoint but you need to pull back files over 3GB? That’s where Ginsu comes in.
Ginsu is a PowerShell script that you can upload to your Defender console along with the command-line version of 7zip. You configure the script with the directory containing the contents you want to transfer. The script acts as a wrapper for 7zip and creates a multipart archive, splitting the files into 3GB segments.
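The core of the idea is 7zip’s volume switch. Here is a minimal sketch of that pattern (not the actual Ginsu code), assuming the command-line 7za.exe sits next to the script and that the source directory is the folder you configured:

# Minimal sketch of the multipart-archive idea; not the actual Ginsu code.
$sevenZip  = Join-Path $PSScriptRoot '7za.exe'        # command-line 7zip, assumed alongside the script
$sourceDir = 'C:\Collections\Triage'                  # folder configured for transfer (placeholder)
$archive   = Join-Path $PSScriptRoot 'evidence.7z'

# -v3g splits the archive into 3GB volumes: evidence.7z.001, .002, ...
& $sevenZip a $archive "$sourceDir\*" -v3g

# To extract on your workstation: 7za x evidence.7z.001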
Once you pull the archives back to your workstation, you can use 7zip to extract the files back to their original structure.
In testing, the file transfer capabilities were a bit buggy, whether transferring 3GB Ginsu segments or other smaller files from the asset. I’m hoping this improves as the Defender console matures. If you’re able to test Ginsu in your environment, I’d love to hear how it performs.