Saturday, October 12, 2019

Use KAPE to collect data remotely and globally

     If you have been following along with KAPE, the amazing utility from Eric Zimmerman, then you are aware that it is a game changer for the forensics community.  KAPE makes it possible for a first responder to collect anything from a compromised machine and automatically process the collected data.  This allows the first responder to go from nothing to having a triage image and results that can quickly be analyzed.  KAPE accomplishes this by giving you complete control over the data to acquire and process, through a series of targets and modules that you can select, edit, and even create.  All of this functionality is available 100% for free.

     Well, if the benefits of using this utility for acquisition and processing are so apparent, can we then use KAPE to acquire data remotely from one machine, or from 100?  Allow me to tell you that the answer is an absolute YES!  As a matter of fact, you can use this utility to acquire data from a machine outside of your network, anywhere in the world.  Would you take advantage of that?  Of course you would!  I will show you how.

     Eric has even added functionality to send the acquired and processed data to an SFTP server.  This means that KAPE can be used to collect and process anything from a machine and subsequently send the results to an SFTP server that you control.  This receiving SFTP server can be set up locally in your organization or hosted publicly in the cloud.  How cool is that!!

     The purpose of this article is to show you how to use KAPE to collect data from one system, or 100 systems, and send the data from all of those systems to an SFTP server in the cloud.  This opens up the possibility of executing KAPE on a machine outside of your environment and still being able to send the data from that machine to the cloud server that you control.  This can be useful if you are in consulting and/or working with a client remotely.  Send your client one command and done.  Let’s get to it!

What we need:

     In order for us to be able to pull this off, and be able to acquire data from a machine and send it to an SFTP server, we are going to need four things:
1) KAPE
   KAPE can be downloaded from here
2) A powershell script
   This powershell script is a very simple script containing the list of commands required to download and run KAPE.
3) A web-server
   This web-server is going to host and serve KAPE.  This can be a local web-server or a web-server in the cloud.
4) An SFTP server
   This SFTP server will receive and store the data sent from KAPE.  This can be a local SFTP server or an SFTP server in the cloud.  It can be the same as the web-server.

Setting up the Environment:  

     OK, with the “what we need” out of the way, let us now talk about how to actually do this.  In this section I am going to go over each one of the elements of setting up this technique for data collection.

1) KAPE
     The number one requirement for this technique to work is KAPE.  Download KAPE and unzip it.  Play with it and be sure that you know how to use it.  This article assumes that you are familiar with KAPE and know how it works.  We will not be talking about how to run KAPE, as this is an article on how to use it remotely.  If you have created Targets and Modules for KAPE, add them.  If you have an executable that is required for one of your modules, then add it to the BIN directory.  Once your files are where you want them, ZIP up the whole KAPE directory.  We will push this ZIP file to the web server in the cloud a little later.
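
Since we will be working in PowerShell anyway, the zipping step itself can be scripted.  A minimal sketch follows; the directory and file names below are stand-ins, so point -Path at your real, customized KAPE folder instead.

```powershell
# Demonstration of the zipping step in PowerShell. The paths here are
# stand-ins; point -Path at your real, customized KAPE folder instead.
$tmp     = [System.IO.Path]::GetTempPath()
$kapeDir = Join-Path $tmp 'KAPE'

# Stand-in for your customized KAPE folder:
New-Item -ItemType Directory -Path $kapeDir -Force | Out-Null
Set-Content -Path (Join-Path $kapeDir 'readme.txt') -Value 'stand-in content'

# The actual zipping step:
Compress-Archive -Path $kapeDir -DestinationPath (Join-Path $tmp 'kape.zip') -Force
```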

2) A powershell script
     The next step of this technique is an important one.  It involves creating a powershell script that holds the commands needed to download and run KAPE on the remote machine.  Since the purpose of this article is getting KAPE to run remotely, we need a way to get KAPE onto the remote machine before it can run.  While speaking with my friend Mark Hallman about how to accomplish this, he gave me the idea of hosting KAPE on a web-server and simply having the remote machine download and run it.  That is exactly what this powershell script is going to do.  It will download KAPE, unzip it, run it, send the data to the SFTP server, and then clean up.  All of this can be accomplished with the following lines.

Invoke-WebRequest -Uri http://<web-server>/kape.zip -OutFile .\kape.zip
     This command downloads the previously zipped up file hosted on the web-server and writes it to the current directory.  The server address and file name shown here are placeholders; replace them with your own.

Expand-Archive -Path .\kape.zip
     This command expands the newly downloaded zip file (again, kape.zip is a placeholder name).

KAPE\KAPE\kape.exe --tsource C: --tdest C:\temp\kape\collect --tflush --target PowerShellConsole --scs <server-ip> --scp 22 --scu user --scpw 12345678 --vhdx host --mdest C:\temp\kape\process --mflush --zm true --module NetworkDetails --mef csv --gui
     This command runs KAPE.  This is a sample KAPE execution; KAPE will run your targets and modules of choice, depending on what it is that you intend to capture and process.  The acquired data is going to be sent to an SFTP server listening on port 22 at the placeholder address <server-ip>, with user “user” and password “12345678”.  The KAPE command, server location, port, user, and password need to be changed to match the server that you created.  This SFTP server can be the same as the web-server that serves the KAPE zip file downloaded by this powershell script.
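
If the long one-liner above is hard to read, the same sample invocation can be sketched with the switches held in an array, one per line, so each can be annotated.  The server address, user, and password values are placeholders, and the execution is guarded so nothing runs unless kape.exe is actually present.

```powershell
# The sample KAPE invocation with each switch annotated. The server
# address, user, and password values are placeholders to replace.
$kapeArgs = @(
    '--tsource', 'C:'                    # volume to collect from
    '--tdest',   'C:\temp\kape\collect'  # where collected files land
    '--tflush'                           # clear the destination first
    '--target',  'PowerShellConsole'     # target(s) to collect
    '--scs',     '<server-ip>'           # SFTP server address (placeholder)
    '--scp',     '22'                    # SFTP port
    '--scu',     'user'                  # SFTP user (placeholder)
    '--scpw',    '12345678'              # SFTP password (placeholder)
    '--vhdx',    'host'                  # package the collection as a VHDX
    '--mdest',   'C:\temp\kape\process'  # module output directory
    '--mflush'                           # clear the module destination first
    '--zm',      'true'                  # zip the module output
    '--module',  'NetworkDetails'        # module(s) to run
    '--mef',     'csv'                   # module output format
    '--gui'                              # as in the sample command above
)

# Guarded so nothing runs unless kape.exe is actually present:
if (Test-Path .\KAPE\KAPE\kape.exe) {
    & .\KAPE\KAPE\kape.exe @kapeArgs
}
```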
Remove-Item .\KAPE\ -Recurse -Confirm:$false -Force
     This command removes the KAPE directory that was unzipped, the same directory that contained your KAPE executable, your targets and modules, and your files in the BIN directory.

Remove-Item .\kape.zip
     This command removes the KAPE zip file (placeholder name again) that was downloaded by the Invoke-WebRequest at the beginning of the script.

     Folks, that is it.  That is the entirety of the powershell script.  Get these five commands together in a text file, save it with a .ps1 file extension, and your script is ready.  Alternatively, you can download a sample script from my github here.
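
For illustration, here is a sketch of what the assembled script might look like, with basic error handling added.  Every server name, file name, and credential below is a placeholder to replace with your own values; the sample script on github is the authoritative version.

```powershell
# Sketch of the assembled script: the five commands from this section in
# one file, with basic error handling. Every value below is a placeholder.
$ZipUrl     = 'http://<web-server>/kape.zip'   # where the KAPE zip is hosted
$SftpServer = '<sftp-server>'                  # server that receives the results
$SftpPort   = 22
$SftpUser   = 'user'
$SftpPass   = '12345678'

try {
    # 1. Download the hosted KAPE zip
    Invoke-WebRequest -Uri $ZipUrl -OutFile .\kape.zip
    # 2. Unzip it into the current directory
    Expand-Archive -Path .\kape.zip -DestinationPath .
    # 3. Run KAPE and send the results to the SFTP server
    & .\KAPE\KAPE\kape.exe --tsource C: --tdest C:\temp\kape\collect --tflush `
        --target PowerShellConsole --scs $SftpServer --scp $SftpPort `
        --scu $SftpUser --scpw $SftpPass --vhdx host `
        --mdest C:\temp\kape\process --mflush --zm true `
        --module NetworkDetails --mef csv
}
catch {
    Write-Warning "Collection failed: $_"
}
finally {
    # 4. and 5. Clean up the unzipped directory and the downloaded zip
    Remove-Item .\KAPE\ -Recurse -Confirm:$false -Force -ErrorAction SilentlyContinue
    Remove-Item .\kape.zip -ErrorAction SilentlyContinue
}
```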

3) A web-server
     The next step of this technique is to set up a web-server to host the file.  This web-server can be hosted locally or publicly in the cloud.  For the purposes of this article, and for ease of use, I decided to host an Ubuntu web-server in AWS.  I installed Apache on it and made sure that port 80 was publicly accessible.  I then uploaded the KAPE zip file and the X1027_Get_DoKape script to /var/www/html so that both could be publicly accessible.

4) An SFTP server
     The last step is to set up an SFTP server that can receive the data that KAPE will send to it.  This requires a server with ssh listening on the port of your choice, traditionally port 22, but it doesn’t have to be.  For the purposes of this article I went ahead and used the same server from step 3.  Now, a couple of things to keep in mind.  Since this is an ssh server in the cloud, I created a new user with a very strong password and changed the ssh port from 22 to another port.  If you are also going to use AWS, don’t forget to edit the sshd_config file so that it can accept password authentication.
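
For reference, the sshd_config changes I am describing look something like this (example values; pick your own port, and restart the ssh service afterwards, e.g. sudo systemctl restart sshd on Ubuntu):

```
# /etc/ssh/sshd_config (example values; pick your own port)
# Non-default ssh port:
Port 2222
# Needed for KAPE's password-based SFTP login:
PasswordAuthentication yes
```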

     That's it.  That is all the setup that is required.  We are now ready to test this.

The Test:

     Now that the environment has been set up, all you have to do is get the remote machine to run the powershell script.  If you want to accomplish this task against 100 machines, one way to do it is to push the script to the remote machines via Invoke-Command.  This requires powershell remoting to be enabled in your organization, and if it is, you are in luck.

     Prior to pushing the script to the remote machine, I like to test the Invoke-Command execution.  One way that you can do this is by issuing the below command.  This command connects to the remote machines and runs “hostname” to verify that the machines you want to talk to are in fact responding.

Invoke-Command -ComputerName (Get-Content .\computers.txt) -Credential carlos -ScriptBlock {hostname}

     As you can see, we were able to communicate with two systems via the invoke-command execution.  One was labeled FOR01 and the other was labeled FOR02.  These are exactly the two hostnames inside of the computers.txt file.

    We are now ready to push the script over to the machines.  This is an example of how to push the script using Invoke-Command.

Invoke-Command -ComputerName (Get-Content .\computers.txt) -Credential carlos -FilePath .\X1027_Get_DoKape_V0.1.ps1

     This command will push the powershell script named X1027_Get_DoKape_V0.1.ps1 to a list of hosts contained inside of “computers.txt” using credentials for user “carlos”.  I ran this command against two machines named FOR01 and FOR02.  Two minutes later the data was waiting for me on the SFTP server in AWS. 

     All that is left to do now is to download the data from the SFTP server to your examination machine and begin your analysis.
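
One way to do that download is with the OpenSSH scp client that ships with modern Windows.  This is only a sketch; the server name, port, user, and paths are placeholders, and the copy is guarded so it is only attempted if the destination folder exists.

```powershell
# Sketch of pulling the collected data down with the OpenSSH scp client.
# Server, port, user, and paths are placeholders; adjust them to match
# the SFTP server and account you set up earlier.
$server  = '<sftp-server>'      # placeholder: your SFTP server address
$port    = 2222                 # placeholder: the ssh port you chose
$destDir = 'C:\cases\collected' # placeholder: where the evidence should land

# Guarded so the copy is only attempted if the destination folder exists:
if (Test-Path $destDir) {
    scp -P $port "user@${server}:/home/user/*" $destDir
}
```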

     There is one more thing that I would like to talk about...  Do you remember that we mentioned that this technique could be used globally as well?  Since this technique is started by running a powershell script, we can host that script on any web-server, download it from anywhere, and it will do its job.  As a demonstration of this capability, I launched a Windows Server 2016 instance in AWS and connected to it via RDP.

     I then ran the below command on an Administrator prompt from the “Desktop” directory.  This command will go to our web-server, download the script, and start it.  The server address in the URL is a placeholder; replace it with your own.
PowerShell "Invoke-Expression (New-Object System.Net.WebClient).DownloadString('http://<web-server>/X1027_Get_DoKape_V0.1.ps1')"

     A few minutes later the data from the server was sitting in our “collection” server, available to be analyzed.

     It should be noted that we ran this command from a CMD prompt, not even a powershell prompt.  You can simply email this command to your client and it can be reused on as many systems as they like, anywhere in the world.  I hope that you like that!


     KAPE is an amazingly powerful utility that is fast becoming the de-facto standard for triage acquisition and processing.  I hope that you are able to use this technique during your acquisitions.  If this procedure helped your investigation, we would like to hear from you.  You can leave a comment or reach me on twitter: @carlos_cajigas


  1. Awesome demo Los! For little to nothing this gives the entire community of DFIR professionals the ability to provide services to anyone just about anywhere. Thank you for sharing.

  2. Good stuff Carlos. This will be a great utility for IR.

  3. Great Demo, thanks for sharing!