<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Chapin Bryce</title>
    <description>The latest articles on Forem by Chapin Bryce (@chapindb).</description>
    <link>https://forem.com/chapindb</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F98095%2F4752b798-3af4-4da4-a971-43823a3d6397.png</url>
      <title>Forem: Chapin Bryce</title>
      <link>https://forem.com/chapindb</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/chapindb"/>
    <language>en</language>
    <item>
      <title>1 minute Canaries</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Sun, 16 Oct 2022 13:54:53 +0000</pubDate>
      <link>https://forem.com/chapindb/1-minute-canaries-a15</link>
      <guid>https://forem.com/chapindb/1-minute-canaries-a15</guid>
      <description>&lt;p&gt;Visibility is &lt;em&gt;everything&lt;/em&gt; in cyber security. Let's increase the visibility of suspicious activity in your environment in 1 minute.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ready?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1&lt;/strong&gt;: Visit &lt;a href="https://canarytokens.org"&gt;https://canarytokens.org&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--aejUZ470--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1qsi09n6ebrkrfnx6i5o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--aejUZ470--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1qsi09n6ebrkrfnx6i5o.png" alt="CanaryTokens.org home page" width="880" height="555"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2&lt;/strong&gt;: Select the type of canary token that matches your system or your risk. For example, you may choose an Excel or Word document on a corporate device, or AWS keys or a MySQL dump on a developer or server. There are a lot of options here, freely available for your use.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wOPlwza7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ew3dv70y99iwgp5ckfsu.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wOPlwza7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ew3dv70y99iwgp5ckfsu.gif" alt="The many options for Canary Tokens" width="730" height="560"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3&lt;/strong&gt;: Enter the contact email address or a web hook URL (or both!) to notify when your canary is used.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---Kg7m0K5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9k9xspqxlzhmabuwknk2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---Kg7m0K5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9k9xspqxlzhmabuwknk2.png" alt="Configuring your token" width="880" height="577"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4&lt;/strong&gt;: Click "Create my Canarytoken" to generate the token, then place it wherever you like!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vurHTHJX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/arb3z880758ha7ba4go4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vurHTHJX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/arb3z880758ha7ba4go4.png" alt="The generated token" width="880" height="917"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The site provides some recommendations, though feel free to get creative - put a token in an email, in a file named &lt;code&gt;passwords.docx&lt;/code&gt;, on a file share, in your &lt;code&gt;~/.aws/credentials&lt;/code&gt; file, or if you're crazy enough, you can put them on your website.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight ini"&gt;&lt;code&gt;&lt;span class="nn"&gt;[default]&lt;/span&gt;
&lt;span class="py"&gt;aws_access_key_id&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;AKIAYVP4CIPPHKZTDHPV&lt;/span&gt;
&lt;span class="py"&gt;aws_secret_access_key&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;s5Qi2UmF8jZoES/9q7+/jN6c0uAieT7gZn5Vb9oW&lt;/span&gt;
&lt;span class="py"&gt;output&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;json&lt;/span&gt;
&lt;span class="py"&gt;region&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;us-east-2&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's all! Now you will get a heads up when someone is snooping around or accessing resources. Have fun with it, share your creative use cases, and pass along this tip to a friend.&lt;/p&gt;

</description>
      <category>security</category>
      <category>cybersecurity</category>
      <category>aws</category>
    </item>
    <item>
      <title>Two-minute InfoSec — Shell History Timestamps</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Thu, 05 Mar 2020 11:31:29 +0000</pubDate>
      <link>https://forem.com/chapindb/two-minute-infosec-shell-history-timestamps-jp3</link>
      <guid>https://forem.com/chapindb/two-minute-infosec-shell-history-timestamps-jp3</guid>
      <description>&lt;h3&gt;
  
  
  Two-minute InfoSec — Shell History Timestamps
&lt;/h3&gt;

&lt;h4&gt;
  
  
  A new series with the goal of sharing quick wins that can assist organizational security, forensic investigations, incident response, and more, all implementable within two minutes or less.
&lt;/h4&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--99gA3MvE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1024/0%2AJ20fz0IwdoSD3ayZ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--99gA3MvE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1024/0%2AJ20fz0IwdoSD3ayZ" alt="" width="880" height="586"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@kaitlynbaker?utm_source=medium&amp;amp;utm_medium=referral"&gt;Kaitlyn Baker&lt;/a&gt; on &lt;a href="https://unsplash.com?utm_source=medium&amp;amp;utm_medium=referral"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Today’s post focuses on a feature of nearly every shell: command history. This file is a rich source of evidence of prior user activity, especially on Linux/Unix/macOS systems. One major drawback is that, by default, it does not store timestamps, making analysis difficult and costing valuable investigative time.&lt;/p&gt;

&lt;p&gt;In this post we will cover how to quickly implement timestamps in some common shells including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Bash&lt;/li&gt;
&lt;li&gt;Zsh&lt;/li&gt;
&lt;li&gt;Fish&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Not all Linux/Unix/macOS platforms are made the same! These are general ways to accomplish this goal, but always test before putting things into production.&lt;/p&gt;

&lt;h3&gt;
  
  
  Bash
&lt;/h3&gt;

&lt;p&gt;To enable this for user accounts, modify &lt;code&gt;~/.bashrc&lt;/code&gt; or &lt;code&gt;~/.bash_profile&lt;/code&gt; and add the line below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export HISTTIMEFORMAT ="%F %T %z "
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This same line can be placed in &lt;code&gt;/etc/bashrc&lt;/code&gt; to apply across all user profiles.&lt;/p&gt;
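&lt;p&gt;Once this is set, bash also writes an epoch-second comment line above each command saved to &lt;code&gt;~/.bash_history&lt;/code&gt;, which the &lt;code&gt;history&lt;/code&gt; builtin renders using your format string. A timestamped history file looks like the below (the commands shown are just examples):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#1583407889
uname -a
#1583407902
last -F
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;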

&lt;p&gt;Source: &lt;a href="https://linux.die.net/man/1/bash"&gt;https://linux.die.net/man/1/bash&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Zsh
&lt;/h3&gt;

&lt;p&gt;Add the below line to &lt;code&gt;~/.zshrc&lt;/code&gt; for user accounts, or to &lt;code&gt;/etc/zshrc&lt;/code&gt; for a system-wide implementation.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;setopts EXTENDED_HISTORY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will record not only the timestamp of execution but also the duration of execution, a very handy data point in investigations! Some Zsh configurations enable this option by default, though it doesn’t hurt to check!&lt;/p&gt;

&lt;p&gt;Source: &lt;a href="http://zsh.sourceforge.net/Doc/Release/Options.html#Options"&gt;http://zsh.sourceforge.net/Doc/Release/Options.html#Options&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Fish
&lt;/h3&gt;

&lt;p&gt;Enabled by default! To confirm, check that your history file is located at:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;~/.local/share/fish/fish_history
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
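&lt;p&gt;Each entry in that file already carries a Unix timestamp alongside the command, for example (the command shown is just an example):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;- cmd: uname -a
  when: 1583407889
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;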



&lt;p&gt;Have another shell you use and prefer? Or maybe an alternative implementation on a specific OS? Comment and we can add it in to this post for ease of future reference!&lt;/p&gt;

</description>
      <category>infosec</category>
      <category>technology</category>
      <category>security</category>
      <category>dfir</category>
    </item>
    <item>
      <title>3-Step RDP Honeypot: Step 3 | Build the Bot</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Sat, 15 Feb 2020 14:26:44 +0000</pubDate>
      <link>https://forem.com/chapindb/3-step-rdp-honeypot-step-3-build-the-bot-1110</link>
      <guid>https://forem.com/chapindb/3-step-rdp-honeypot-step-3-build-the-bot-1110</guid>
      <description>&lt;p&gt;&lt;a href="https://medium.com/@chapindb/3-step-rdp-honeypot-step-3-build-the-bot-c6552fab1740?source=rss-3eefa9ecbc14------2"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--y4L09zlx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/600/1%2A41EPHXSqGYN8v5j5hAeYZw.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this mini-series, we have set up our honeypot and extracted valuable features from our PCAP data; now we operationalize this intel.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@chapindb/3-step-rdp-honeypot-step-3-build-the-bot-c6552fab1740?source=rss-3eefa9ecbc14------2"&gt;Continue reading on Medium »&lt;/a&gt;&lt;/p&gt;

</description>
      <category>security</category>
      <category>forensics</category>
      <category>cybersecurity</category>
    </item>
    <item>
      <title>3-Step RDP Honeypot: Step 2 | Operationalize PCAPs</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Sat, 15 Feb 2020 13:03:14 +0000</pubDate>
      <link>https://forem.com/chapindb/3-step-rdp-honeypot-step-2-operationalize-pcaps-1c5h</link>
      <guid>https://forem.com/chapindb/3-step-rdp-honeypot-step-2-operationalize-pcaps-1c5h</guid>
      <description>&lt;p&gt;&lt;a href="https://medium.com/pythonic-forensics/3-step-rdp-honeypot-part-2-operationalize-pcaps-2abb7ba2649a?source=rss-3eefa9ecbc14------2"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---0hrt3CA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1241/1%2AmqJBC9mjncnKLwxbbVTa1A.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With our RDP Honeypot PCAP data captured, let’s analyze it. We will leverage Moloch to assist us with extracting valuable PCAP features.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/pythonic-forensics/3-step-rdp-honeypot-part-2-operationalize-pcaps-2abb7ba2649a?source=rss-3eefa9ecbc14------2"&gt;Continue reading on Pythonic Forensics »&lt;/a&gt;&lt;/p&gt;

</description>
      <category>security</category>
      <category>forensics</category>
      <category>cybersecurity</category>
    </item>
    <item>
      <title>3-Step RDP Honeypot: Step 1 | Honeypot Setup</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Sat, 15 Feb 2020 13:02:31 +0000</pubDate>
      <link>https://forem.com/chapindb/3-step-rdp-honeypot-step-1-honeypot-setup-1bcc</link>
      <guid>https://forem.com/chapindb/3-step-rdp-honeypot-step-1-honeypot-setup-1bcc</guid>
      <description>&lt;p&gt;&lt;a href="https://medium.com/@chapindb/3-step-rdp-honeypot-part-1-honeypot-setup-b5f01af19f4d?source=rss-3eefa9ecbc14------2"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--tw_Md87T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1276/1%2AEVx4erBmFzY9ux3NvP-iyQ.jpeg" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 1 in our process is creating our honeypot service and starting to capture the request data. This brief post dives into building the most…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@chapindb/3-step-rdp-honeypot-part-1-honeypot-setup-b5f01af19f4d?source=rss-3eefa9ecbc14------2"&gt;Continue reading on Medium »&lt;/a&gt;&lt;/p&gt;

</description>
      <category>security</category>
      <category>forensics</category>
      <category>cybersecurity</category>
    </item>
    <item>
      <title>3-Step RDP Honeypot: Step 0 | Introduction</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Sat, 15 Feb 2020 13:01:56 +0000</pubDate>
      <link>https://forem.com/chapindb/3-step-rdp-honeypot-step-0-introduction-53kn</link>
      <guid>https://forem.com/chapindb/3-step-rdp-honeypot-step-0-introduction-53kn</guid>
      <description>&lt;p&gt;&lt;a href="https://medium.com/@chapindb/3-step-rdp-honeypot-part-0-introduction-7b572d563e83?source=rss-3eefa9ecbc14------2"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vuhpcbJo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1203/1%2A4dKCu0aaL2aUvmUhrBhXvQ.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Easily set up your own RDP Honeypot, capture bots scanning for vulnerable systems, and operationalize the data to help the InfoSec…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/@chapindb/3-step-rdp-honeypot-part-0-introduction-7b572d563e83?source=rss-3eefa9ecbc14------2"&gt;Continue reading on Medium »&lt;/a&gt;&lt;/p&gt;

</description>
      <category>security</category>
      <category>forensics</category>
      <category>cybersecurity</category>
    </item>
    <item>
      <title>Build your own RDP Honeypot</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Wed, 20 Nov 2019 01:46:08 +0000</pubDate>
      <link>https://forem.com/chapindb/build-your-own-rdp-honeypot-a5</link>
      <guid>https://forem.com/chapindb/build-your-own-rdp-honeypot-a5</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6APpK-eC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/972/1%2AKYlALfnVj5R13KQx68WhBA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6APpK-eC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/972/1%2AKYlALfnVj5R13KQx68WhBA.jpeg" alt="Photo by Matthew T Rader on Unsplash" width="880" height="655"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is a short post, largely inspired by &lt;a href="https://medium.com/u/7413f5a9ff32"&gt;alt3kx&lt;/a&gt; and their awesome post here:&lt;/p&gt;


&lt;div class="ltag__link"&gt;
  &lt;a href="https://medium.com/@alt3kx/build-an-easy-rdp-honeypot-with-raspberry-pi-3-and-observe-the-infamous-attacks-as-bluekeep-29a167f78cc1" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sIiQI_gQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/fit/c/96/96/1%2AzOkghDdYWzYCEzeZUTek1w.jpeg" alt="alt3kx"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://medium.com/@alt3kx/build-an-easy-rdp-honeypot-with-raspberry-pi-3-and-observe-the-infamous-attacks-as-bluekeep-29a167f78cc1" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;Build an easy RDP Honeypot with Raspberry PI 3 and observe the infamous attacks as (BlueKeep) CVE-2019–0708 | by alt3kx | Medium&lt;/h2&gt;
      &lt;h3&gt;alt3kx ・ &lt;time&gt;Jun 5, 2019&lt;/time&gt; ・ 
      &lt;div class="ltag__link__servicename"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hnDHPsJs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/medium-f709f79cf29704f9f4c2a83f950b2964e95007a3e311b77f686915c71574fef2.svg" alt="Medium Logo"&gt;
        Medium
      &lt;/div&gt;
    &lt;/h3&gt;
&lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


&lt;p&gt;Please go check it out!&lt;/p&gt;

&lt;p&gt;Where this post differs is implementing on a Debian cloud instance of your choice and adding in email reporting. We will run through all the steps just to have a cohesive post, though most of the steps do mirror those in the aforementioned post.&lt;/p&gt;

&lt;h2&gt;
  
  
  Requirements
&lt;/h2&gt;

&lt;p&gt;We will be setting up in a virtual/cloud-based environment. This has pros/cons (versus your own RaspberryPi or other hardware), though we can save that for the comments ;)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Access to a Windows 2008 R2 server that you don’t mind making insecure temporarily. One option is to spin up an EC2 instance in a new security group just for this sole use. You can also use Azure/other platforms supporting Windows 2008 R2 images.&lt;/li&gt;
&lt;li&gt;A Debian-based compute instance on your cloud hosting provider of choice. This will actually host our honeypot, so we will want to set it up separate from everything else. This guide won’t walk through a provider-specific setup, as there are too many to cover. You will also want to ensure you have X11 forwarding configured over SSH.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Windows 2008 Setup
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Spin up your Windows 2008 R2 instance.&lt;/li&gt;
&lt;li&gt;RDP into the Windows 2008 R2 instance and disable “Network Level Authentication” (&lt;a href="https://www.parallels.com/blogs/ras/disabling-network-level-authentication/#WS2008WS2008R2"&gt;this guide walks through the process&lt;/a&gt;).&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Debian/Honeypot Setup
&lt;/h2&gt;

&lt;p&gt;This setup has a few more stages:&lt;/p&gt;

&lt;h3&gt;
  
  
  Installing and configuring RDPY
&lt;/h3&gt;

&lt;p&gt;This honeypot will leverage the &lt;a href="https://github.com/citronneur/rdpy/"&gt;GitHub project RDPY&lt;/a&gt;, as it covers the tooling needed to set up an RDP listener on a Linux host. This can be installed using a Python 2.7 version of &lt;code&gt;pip&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip install rdpy
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;We will also need to install a few more tools using &lt;code&gt;apt install python-qt4 xauth screen freerdp2-x11&lt;/code&gt;. You may need to install &lt;code&gt;pip&lt;/code&gt; as well.&lt;/p&gt;

&lt;p&gt;Once these are set up (and X11 forwarding is configured for your SSH session), we will generate the fake RDP prompt that will show on connection.&lt;/p&gt;

&lt;p&gt;I recommend using &lt;code&gt;screen&lt;/code&gt; for this part, though you can also open a second SSH connection to the host.&lt;/p&gt;

&lt;p&gt;In one terminal, run the below command, where &lt;code&gt;1.1.1.1&lt;/code&gt; is the IP address of your Windows 2008 R2 host:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rdpy-rdpymitm.py -o ./ 1.1.1.1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In the second terminal, we will use our xfreerdp client to connect to our MiTM proxy, providing the username Administrator and forcing the client not to use Network Level Authentication:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;xfreerdp --no-nla -u Administrator 127.0.0.1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;blockquote&gt;
&lt;p&gt;In this case we do want to use the IP address &lt;code&gt;127.0.0.1&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;A window should pop up and show the &lt;code&gt;xfreerdp&lt;/code&gt; interface with a prompt to select a user account. There is no need to interact with the interface at this point and we can close out.&lt;/p&gt;

&lt;p&gt;Feel free to use &lt;code&gt;ctrl-c&lt;/code&gt; in the terminal with our MiTM script. You should see a new file in the same directory as the script, with the extension &lt;code&gt;rss&lt;/code&gt;. This file contains a playback (which, for bonus points, you can view with &lt;code&gt;rdpy-rssplayer.py&lt;/code&gt;) needed for the honeypot script.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;At this point we will want to use &lt;code&gt;screen&lt;/code&gt; or &lt;code&gt;nohup&lt;/code&gt; to start our honeypot and keep it running on disconnect (so we don’t have to keep our connection alive for the honeypot to remain up).&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We can safely destroy our Windows 2008 R2 instance now and save on the cloud hosting fees!&lt;/p&gt;

&lt;p&gt;Now we are ready to run the honeypot script. It is as simple as the below, where you provide the name of your RSS file as the only argument:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;rdpy-rdpyhoneypot.py \*.rss
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Congrats! The honeypot is live!&lt;/p&gt;
&lt;h2&gt;
  
  
  Getting intel out of your honeypot
&lt;/h2&gt;

&lt;p&gt;This next step is where we can automate a few tasks to send reports to our mailbox.&lt;/p&gt;

&lt;p&gt;To start, we will want a few tools to capture, process, and report on the activity. Let’s install them:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apt install tcpdump bro bro-aux zip
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;First, we will use &lt;code&gt;tcpdump&lt;/code&gt; to capture all traffic on TCP/3389 into a PCAP file:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;tcpdump tcp port 3389 -i eth0 -vvX -w ‘rdp.%FT%H-%M-%S.pcap’ -C 500 -G 86400 -W 10
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;blockquote&gt;
&lt;p&gt;Please sub in your own interface (i.e., eth0) and file naming convention. For full details on &lt;a href="https://explainshell.com/explain?cmd=tcpdump+tcp+port+3389+-i+eth0+-vvX+-w+%27rdp.%25FT%25H-%25M-%25S.pcap%27+-C+500+-G+86400+-W+10"&gt;this tcpdump command line, check ExplainShell&lt;/a&gt;. This capture should also run as an admin in a &lt;code&gt;screen&lt;/code&gt; or &lt;code&gt;nohup&lt;/code&gt; session to prevent it from closing on disconnect. Neither of these solutions will assist in the case of a reboot.&lt;/p&gt;
&lt;/blockquote&gt;
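&lt;p&gt;If you also want the capture to restart after a reboot, one lightweight option is an &lt;code&gt;@reboot&lt;/code&gt; cron entry for the root user (the output path below is just an example; note that literal &lt;code&gt;%&lt;/code&gt; characters must be escaped with a backslash in a crontab):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@reboot /usr/sbin/tcpdump tcp port 3389 -i eth0 -w '/var/log/rdp/rdp.\%FT\%H-\%M-\%S.pcap' -C 500 -G 86400 -W 10
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;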

&lt;p&gt;Now we can move on to report generation! For this, we will use bro to generate easy-to-consume logs from the PCAP files. We will then take these logs and create a summary report that we send via email. This is all possible in the below script (please modify as you see fit!).&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;You will need to sub in your own information for Mailgun if you want to use that API, otherwise feel free to plug in your own email sending routine.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Now that we have the script set up, we can create a cron job to run it daily and fill our inbox with a summary of what we caught in our honeypot!&lt;/p&gt;
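&lt;p&gt;Assuming the report script was saved to a path such as &lt;code&gt;/usr/local/bin/rdp_report.sh&lt;/code&gt; (a placeholder name), a daily 6 AM entry added via &lt;code&gt;crontab -e&lt;/code&gt; would look like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;0 6 * * * /usr/local/bin/rdp_report.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;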

&lt;p&gt;Please play around with this configuration and fine tune further to your liking. If possible, please share back to the community!&lt;/p&gt;




</description>
      <category>security</category>
      <category>dfir</category>
      <category>honeypot</category>
    </item>
    <item>
      <title>Useful Commands for Log Analysis: Part 2 — Sed</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Sat, 06 Oct 2018 17:38:37 +0000</pubDate>
      <link>https://forem.com/chapindb/useful-commands-for-log-analysis-part-2-sed-3o0a</link>
      <guid>https://forem.com/chapindb/useful-commands-for-log-analysis-part-2-sed-3o0a</guid>
      <description>&lt;h3&gt;
  
  
  Useful Commands for Log Analysis: Part 2 — Sed
&lt;/h3&gt;

&lt;p&gt;To follow up on the prior post &lt;a href="http://pythonicforensics.com/useful-cmds-1/" rel="noopener noreferrer"&gt;discussing bash tips &amp;amp; tricks&lt;/a&gt;, here are a few more commands that are useful in analyzing text files, specifically logs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F0%2AktI0q4oGqa3xj4DB" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F0%2AktI0q4oGqa3xj4DB"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@brunus?utm_source=medium&amp;amp;utm_medium=referral" rel="noopener noreferrer"&gt;Bruno Martins&lt;/a&gt; on &lt;a href="https://unsplash.com?utm_source=medium&amp;amp;utm_medium=referral" rel="noopener noreferrer"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Intro to using sed
&lt;/h3&gt;

&lt;p&gt;In this example, we will use sed to help normalize the data we are processing. This utility is a stream editor, useful in manipulating data. In our scenario below, we will use it to find and replace text strings.&lt;/p&gt;

&lt;p&gt;In our example below, we have a squid proxy access log and will work to extract a unique list of domains from the log file. Using a similar process as last time, we will iteratively build out our command to further refine our results. To start, let’s take a look at the format of the log:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ head -n 3 access.log 
10.53.56.81 - - [26/Nov/2017:14:20:26 +0000] "CONNECT aus5.mozilla.org:443 HTTP/1.1" 200 0 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0" TAG\_NONE:HIER\_DIRECT 
10.53.56.81 - - [26/Nov/2017:14:20:26 +0000] "GET https://aus5.mozilla.org/update/6/Firefox/47.0.2/20161031133903/WINNT\_x86-msvc-x64/en-US/release/Windows\_NT%2010.0.0.0%20(x64)/SSE3/default/default/update.xml HTTP/1.1" 200 946 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0" TCP\_MISS:HIER\_DIRECT 
10.53.56.81 - - [26/Nov/2017:14:20:27 +0000] "GET http://download.mozilla.org/?product=firefox-56.0-complete-bz2&amp;amp;os=win&amp;amp;lang=en-US HTTP/1.1" 302 585 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:47.0) Gecko/20100101 Firefox/47.0" TCP\_MISS:HIER\_DIRECT
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As seen above, we have the URL in the 7th column. Using awk, as we did in the &lt;a href="http://pythonicforensics.com/useful-cmds-1/" rel="noopener noreferrer"&gt;prior post&lt;/a&gt;, we can isolate this field as shown below by selecting the 7th space delimited column:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ head -n 3 access.log | awk '{ print $7 }' 
aus5.mozilla.org:443 https://aus5.mozilla.org/update/6/Firefox/47.0.2/20161031133903/WINNT\_x86-msvc-x64/en-US/release/Windows\_NT%2010.0.0.0%20(x64)/SSE3/default/default/update.xml 
[http://download.mozilla.org/?product=firefox-56.0-complete-bz2&amp;amp;os=win&amp;amp;lang=en-US](http://download.mozilla.org/?product=firefox-56.0-complete-bz2&amp;amp;os=win&amp;amp;lang=en-US)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These first three entries are all the same domain, though we need to clean up our URLs to extract just the domains. To start, let’s remove the query-string data. Query strings contain parameters passed between pages on a website and follow the &lt;code&gt;?&lt;/code&gt; character found in the URL. To remove this, let's use awk again, setting the delimiter to &lt;code&gt;?&lt;/code&gt; and displaying the field prior to the character:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ head -n 3 access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' 
aus5.mozilla.org:443 https://aus5.mozilla.org/update/6/Firefox/47.0.2/20161031133903/WINNT\_x86-msvc-x64/en-US/release/Windows\_NT%2010.0.0.0%20(x64)/SSE3/default/default/update.xml 
[http://download.mozilla.org/](http://download.mozilla.org/)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that we’ve addressed query strings, let’s remove the protocol specifier. This includes http:// and https://, though there can be others, such as ftp://. Since in this log we only see the HTTP and HTTPS protocols, we will manually specify those to be removed; while there are other methods to do this, let's use sed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ head -n 3 access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' | sed 's|http\(s\)\*://||' 
aus5.mozilla.org:443 aus5.mozilla.org/update/6/Firefox/47.0.2/20161031133903/WINNT\_x86-msvc-x64/en-US/release/Windows\_NT%2010.0.0.0%20(x64)/SSE3/default/default/update.xml 
download.mozilla.org/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the above, we have specified a sed substitution expression. The initial s in the sed command executes the substitute function. The syntax following this function is described below (and further in the man page):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;| - This character is the delimiter between the substitute function sections. It is generally a / character, but since we are including / as a character in our pattern, we have to use another character. It is common to instead use the | as an alternative.&lt;/li&gt;
&lt;li&gt;The first section is the pattern sed will look for. In our case, we are specifying that we want to look for anything that matches the regex http(s)*:// which will match both the http:// and https:// protocol specifiers.&lt;/li&gt;
&lt;li&gt;The second section is empty here because we are using sed to remove the string. In other situations, we could put a replacement string between the last two pipe characters.&lt;/li&gt;
&lt;/ul&gt;
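&lt;p&gt;To see the substitution in isolation, we can pipe a single URL through the same expression:&lt;/p&gt;

```shell
# Strip the leading protocol specifier (http:// or https://)
# using the same sed substitute expression discussed above
echo "https://aus5.mozilla.org/update.xml" | sed 's|http\(s\)*://||'
# prints: aus5.mozilla.org/update.xml
```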

&lt;p&gt;Our URLs still have page URL paths and port numbers, which we can remove using awk. We will use two awk statements, with different delimiters to remove these values:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ head -n 3 access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' | sed 's|http\(s\)\*://||' | awk -F '/' '{ print $1 }' | awk -F ':' '{ print $1}' 
aus5.mozilla.org 
aus5.mozilla.org 
download.mozilla.org
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Great! We are down to subdomains, which we can generate stats on using the usort alias &lt;a href="http://pythonicforensics.com/useful-cmds-1/" rel="noopener noreferrer"&gt;we created in the prior post&lt;/a&gt;. To demonstrate this further, let's replace the &lt;code&gt;head -n 3&lt;/code&gt; statement with &lt;code&gt;cat&lt;/code&gt; to review the full file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ cat access.log | awk '{ print $7 }' | awk -F '?' '{ print $1 }' | sed 's|http\(s\)\*://||' | awk -F '/' '{ print $1 }' | awk -F ':' '{ print $1}' | usort 
1 36cc206a.akstat.io 
1 a.scorecardresearch.com 
1 a.visualrevenue.com 
1 a1.vdna-assets.com 
1 ads.lfstmedia.com 
[...] 
146 cdn.images.express.co.uk 
149 px.moatads.com 
186 pagead2.googlesyndication.com 
205 dt.adsafeprotected.com 
380 www.google.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that we have this list, we can further process this data in a number of ways:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Remove subdomains to get a better count of the number of distinct domains&lt;/li&gt;
&lt;li&gt;Feed the domains into an API to gather intelligence (whois, ASN, GeoIP, reputation, etc.)&lt;/li&gt;
&lt;li&gt;Search for activity specific to one or more of the domains (e.g. what searches were made on google.com)&lt;/li&gt;
&lt;li&gt;Add back in the port numbers and/or protocol specifiers to look at the frequency of those data points in our access log&lt;/li&gt;
&lt;/ul&gt;
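&lt;p&gt;As a sketch of the first idea, one naive approach keeps only the last two dot-separated labels of each host name (note this breaks on multi-part TLDs such as co.uk, which would need a smarter approach):&lt;/p&gt;

```shell
# NF is the number of fields after splitting on '.', so $(NF-1) and $NF
# are the last two labels of the host name.
domain=$(echo 'pagead2.googlesyndication.com' | awk -F '.' '{ print $(NF-1)"."$NF }')
echo "$domain"   # googlesyndication.com
```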

&lt;p&gt;&lt;em&gt;Originally published at&lt;/em&gt; &lt;a href="https://pythonicforensics.com/useful-cmds-2/" rel="noopener noreferrer"&gt;&lt;em&gt;pythonicforensics.com&lt;/em&gt;&lt;/a&gt; &lt;em&gt;on October 6, 2018.&lt;/em&gt;&lt;/p&gt;




</description>
      <category>security</category>
      <category>bash</category>
      <category>logs</category>
      <category>linux</category>
    </item>
    <item>
      <title>Useful Commands for Log Analysis</title>
      <dc:creator>Chapin Bryce</dc:creator>
      <pubDate>Mon, 01 Oct 2018 01:54:07 +0000</pubDate>
      <link>https://forem.com/chapindb/useful-commands-for-log-analysis-36of</link>
      <guid>https://forem.com/chapindb/useful-commands-for-log-analysis-36of</guid>
      <description>&lt;p&gt;Over the past few months, I’ve been performing more and more analysis in Linux environments and had the opportunity to refine my go-to commands, picking up a few new (to me) tricks. In this post, I’ll share some of the techniques I like to use and encourage you to share other tips/tricks you’ve used to perform analysis on Linux systems, either as your analysis environment or as your target evidence (or both)!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BF5vRn4e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1024/0%2A2yo2yxGMeNb8JZKW" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BF5vRn4e--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/1024/0%2A2yo2yxGMeNb8JZKW" alt="" width="880" height="556"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@danielleone?utm_source=medium&amp;amp;utm_medium=referral"&gt;Daniel Leone&lt;/a&gt; on &lt;a href="https://unsplash.com?utm_source=medium&amp;amp;utm_medium=referral"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is written at the introductory level, to help those who may not have experienced performing analysis within a bash/sh/zsh (or other) command line environment before.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;Warning: this post does not contain Python&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Spoiler — I didn’t get as far this weekend on this post as I wanted, so I’ll keep this one short and put another one out soon with more tips &amp;amp; tricks.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  For those newer to the command line
&lt;/h3&gt;

&lt;p&gt;If you are newer to the bash command line, please use the manual (man) pages for documentation. It takes a little while to understand how they are written and where the detail you are looking for lives, so it is best to start using them early. Here's an example of using the &lt;code&gt;man&lt;/code&gt; command to learn more about &lt;code&gt;ls&lt;/code&gt;:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
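&lt;p&gt;Roughly, that lookup looks like the following; many commands also offer a &lt;code&gt;--help&lt;/code&gt; summary when you only need a quick option check:&lt;/p&gt;

```shell
# man ls            # full manual page; opens in a pager, press q to quit
# For a one-line usage summary without a pager (GNU coreutils):
usage=$(ls --help | head -n 1)
echo "$usage"
```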


&lt;p&gt;Common man pages are found on &lt;a href="https://linux.die.net/man/"&gt;linux.die.net&lt;/a&gt;, though they should be available on the same system as where the command is found, as shown above. There are also great resources such as &lt;a href="https://explainshell.com/"&gt;explainshell.com&lt;/a&gt; and &lt;a href="https://jvns.ca/"&gt;Julia Evans'&lt;/a&gt; reference illustrations &amp;amp; cheat sheets (such as &lt;a href="https://twitter.com/b0rk/status/802011306136109057"&gt;this one about the man command&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;For those experienced with the command line please share your favorite resources!&lt;/p&gt;

&lt;h3&gt;
  
  
  Working with logs
&lt;/h3&gt;

&lt;p&gt;Log data is not only a very common source of evidence on Linux platforms but is also easier to work with at the command line since it is, generally, semi-structured text data.&lt;/p&gt;

&lt;h3&gt;
  
  
  Identifying the most interesting content in unknown logs
&lt;/h3&gt;

&lt;p&gt;As part of the process, we in DFIR like to preview a log file and get a sense of what is useful versus what is noise. To assist with this data reduction, we can use a few tools and processes to cut down on review time.&lt;/p&gt;

&lt;p&gt;One trick I like to employ is the use of less in combination with grep -v. In the example below, we will be looking at a server's &lt;em&gt;auth.log&lt;/em&gt;, a common log file, and we are interested in seeing successful authentications. While some of us may know from experience to start looking for strings such as &lt;strong&gt;"Accepted publickey"&lt;/strong&gt;, we will walk through getting to that point using the &lt;code&gt;grep -v&lt;/code&gt; method. While grep is a great utility for searching datasets, here we want to find the inverse of our pattern and need to use the &lt;code&gt;-v&lt;/code&gt; parameter:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Uw2keJsr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/1024/0%2AXgGDjGem2ilydxaz.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Uw2keJsr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://cdn-images-1.medium.com/max/1024/0%2AXgGDjGem2ilydxaz.gif" alt="" width="880" height="352"&gt;&lt;/a&gt;&lt;/p&gt;
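&lt;p&gt;In spirit, the exclusion chain grows one pattern at a time, along these lines (the log lines here are made up, with documentation-range IPs):&lt;/p&gt;

```shell
# Build a tiny sample auth.log to work against.
printf '%s\n' \
  'Oct  1 10:01:00 host CRON[100]: pam_unix(cron:session): session opened for user root' \
  'Oct  1 10:02:11 host sshd[4321]: Invalid user admin from 203.0.113.7' \
  'Oct  1 10:03:45 host sshd[4330]: Accepted publickey for chapin from 198.51.100.2 port 51122 ssh2' \
  > auth.log
# Strip known-noisy messages until what remains is interesting;
# interactively you would pipe through less instead of capturing:
remaining=$(grep -v 'CRON' auth.log | grep -v 'Invalid user')
echo "$remaining"
```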

&lt;p&gt;As seen above, we keep adding new log message patterns to exclude from our output until we see something of interest (i.e. the &lt;code&gt;Accepted publickey&lt;/code&gt; statement). Now, we can instead grep for &lt;code&gt;Accepted publickey&lt;/code&gt; as shown below:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
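&lt;p&gt;A reconstruction of that search, run against a made-up &lt;em&gt;auth.log&lt;/em&gt; excerpt (real sshd messages follow the same shape):&lt;/p&gt;

```shell
# Sample log; the IPs are documentation-range placeholders.
printf '%s\n' \
  'Oct  1 10:02:11 host sshd[4321]: Invalid user admin from 203.0.113.7' \
  'Oct  1 10:03:45 host sshd[4330]: Accepted publickey for chapin from 198.51.100.2 port 51122 ssh2' \
  'Oct  1 10:04:01 host sshd[4330]: pam_unix(sshd:session): session opened for user chapin' \
  > auth.log
# Search for successful key-based logins only:
hits=$(grep 'Accepted publickey' auth.log)
echo "$hits"
```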


&lt;p&gt;A few notes on this method:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We can use OR statements (&lt;code&gt;|&lt;/code&gt;) to form one larger grep statement, though do what is most comfortable for you&lt;/li&gt;
&lt;li&gt;If the log dataset is small enough it may be best to scroll through the text file&lt;/li&gt;
&lt;li&gt;Conversely, we could start by searching for IP addresses, usernames, or timestamps, depending on how much of that information is already known (sometimes we aren't lucky enough to have any of those indicators up-front)&lt;/li&gt;
&lt;li&gt;After identifying what string is useful, go back and confirm you didn’t accidentally over-exclude content through the use of one or more of your patterns&lt;/li&gt;
&lt;li&gt;We can use &lt;code&gt;fgrep&lt;/code&gt; and &lt;code&gt;egrep&lt;/code&gt; interchangeably in place of &lt;code&gt;grep&lt;/code&gt;. These are variations on the standard grep interface, where &lt;code&gt;fgrep&lt;/code&gt; is essentially an alias for &lt;code&gt;grep -F&lt;/code&gt; and &lt;code&gt;egrep&lt;/code&gt; for &lt;code&gt;grep -E&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;fgrep&lt;/code&gt; runs much faster as it only searches for fixed strings. It is a good default, as a fair amount of the time we are searching for a string without any patterns.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;egrep&lt;/code&gt; allows for extended regular expressions and is a bit slower. It changes the behavior of the patterns, further detailed in &lt;code&gt;man re_format&lt;/code&gt; (BSD/macOS) or &lt;code&gt;man 7 regex&lt;/code&gt; (Linux)
&lt;/li&gt;
&lt;/ul&gt;
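&lt;p&gt;A small illustration of the fixed-string versus extended-regex behavior:&lt;/p&gt;

```shell
printf '%s\n' 'a.b' 'axb' > sample.txt
# -F (fgrep): the dot is a literal character, so only a.b matches
fixed=$(grep -cF 'a.b' sample.txt)
# -E (egrep): the dot matches any character, so both lines match
extended=$(grep -cE 'a.b' sample.txt)
echo "$fixed $extended"   # 1 2
```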

&lt;h3&gt;
  
  
  Pulling out useful statistics from log files
&lt;/h3&gt;

&lt;p&gt;Another useful technique is to extract information such as ‘how many IP addresses attempted authentication to the machine’ and related ‘what usernames were they using’. To do this, using the same log as before, we can leverage grep, less, and awk.&lt;/p&gt;

&lt;p&gt;Let’s use a pattern we discovered previously, “Invalid user”, to pull these types of answers. The below output shows the first 5 attempts, using head, where we see the username and IP address in the same message.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
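&lt;p&gt;A reconstruction of that step, using a made-up &lt;em&gt;auth.log&lt;/em&gt; in the same column layout as real sshd messages:&lt;/p&gt;

```shell
printf '%s\n' \
  'Oct  1 10:02:11 host sshd[4321]: Invalid user admin from 203.0.113.7' \
  'Oct  1 10:02:15 host sshd[4322]: Invalid user oracle from 203.0.113.7' \
  'Oct  1 10:02:19 host sshd[4323]: Invalid user test from 198.51.100.2' \
  > auth.log
# Show the first few failed-login attempts (fgrep in the original post):
grep -F 'Invalid user' auth.log | head -n 5
```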


&lt;p&gt;Since we only want to extract the IP address for the first part of the question, let's use the awk command. This command allows us to process text, in this case printing selected columns of data. By default, awk splits on whitespace, though we can change the delimiter if needed. While awk has many functions, we will use the print feature to select the column with the IP address. Since awk splits on whitespace, we will select the 10th column (column numbering starts at 1).&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
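&lt;p&gt;Using a made-up sample in the standard sshd message layout, the IP address lands in column 10:&lt;/p&gt;

```shell
printf '%s\n' \
  'Oct  1 10:02:11 host sshd[4321]: Invalid user admin from 203.0.113.7' \
  'Oct  1 10:02:15 host sshd[4322]: Invalid user oracle from 203.0.113.7' \
  > auth.log
# Whitespace-delimited, the IP is the 10th field of these messages.
ips=$(grep -F 'Invalid user' auth.log | awk '{ print $10 }')
echo "$ips"
```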


&lt;p&gt;Great — we now have a list of IP addresses. Let’s now generate some stats using sort and uniq. As the names suggest, we will generate a unique list of IP addresses and gather a count of how many times they appear in those messages:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
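&lt;p&gt;A self-contained sketch of the counting pipeline (the sample IPs are placeholders):&lt;/p&gt;

```shell
printf '%s\n' '203.0.113.7' '198.51.100.2' '203.0.113.7' > ips.txt
# sort groups duplicates together, uniq -c counts each group, and
# sort -rn ranks the counts from most to least frequent.
sort ips.txt | uniq -c | sort -rn
top=$(sort ips.txt | uniq -c | sort -rn | head -n 1)
echo "$top"
```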


&lt;p&gt;The new statement &lt;code&gt;sort | uniq -c | sort -rn&lt;/code&gt; is what generates our nicely formatted list. The &lt;code&gt;uniq&lt;/code&gt; command requires sorted input to properly deduplicate, &lt;code&gt;uniq -c&lt;/code&gt; provides a count in addition to a deduplicated list, and finally &lt;code&gt;sort -rn&lt;/code&gt; provides a numerically sorted (&lt;code&gt;-n&lt;/code&gt;) list in reversed order (&lt;code&gt;-r&lt;/code&gt;). Since this is a statement I use fairly often, I have made two aliases that I find useful:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
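&lt;p&gt;A reconstruction consistent with how &lt;code&gt;usort&lt;/code&gt; is used in this post is below; the second alias here is a hypothetical ascending variant, as only &lt;code&gt;usort&lt;/code&gt; appears by name later on:&lt;/p&gt;

```shell
# usort: rank unique lines by frequency, most common first.
alias usort='sort | uniq -c | sort -rn'
# Hypothetical ascending variant (name is a guess):
alias usortup='sort | uniq -c | sort -n'
# Note: non-interactive bash scripts need "shopt -s expand_aliases" for
# aliases to expand; in ~/.bashrc or an interactive shell they apply
# directly.
```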


&lt;p&gt;And now I can re-run the prior command using the alias:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
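&lt;p&gt;With the &lt;code&gt;usort&lt;/code&gt; alias defined in your shell, the earlier counting pipeline shortens considerably. The sketch below uses a shell function of the same name so it also behaves in non-interactive scripts:&lt;/p&gt;

```shell
# Function form of the alias; interactively the alias works the same way.
usort() { sort | uniq -c | sort -rn; }
printf '%s\n' '203.0.113.7' '198.51.100.2' '203.0.113.7' | usort
top=$(printf '%s\n' '203.0.113.7' '198.51.100.2' '203.0.113.7' | usort | head -n 1)
echo "$top"
```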


&lt;p&gt;A few notes on this method:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Using space delimiters is dangerous, especially in log files. Imagine, for example, if a username (somehow) contained a space character. We would no longer be able to use column 10 as our IP address column for that row and would need to employ a different technique&lt;/li&gt;
&lt;li&gt;The aliases provided only read from stdin. This works for my use case but is an important consideration. Worst-case scenario, we can always run &lt;code&gt;cat $file | usort&lt;/code&gt; to leverage the alias.&lt;/li&gt;
&lt;li&gt;Adding in the username, or any other field, would be as easy as specifying an additional column number in the awk statement. We would have to reconsider how we generate statistics though, as the &lt;code&gt;usort&lt;/code&gt; alias will read the whole line when providing the counts.&lt;/li&gt;
&lt;li&gt;We can use other tools, such as &lt;code&gt;cut&lt;/code&gt; to provide similar functionality to &lt;code&gt;awk&lt;/code&gt;. Find the ones you like and can remember and use those :)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;One last piece on useful statistics: we can quickly generate larger counts using the &lt;code&gt;wc&lt;/code&gt; utility. Leveraging the above command, we will use &lt;code&gt;wc&lt;/code&gt; to get a count of the number of lines containing "Invalid user":&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ fgrep Invalid\ user auth.log | wc -l 
3549
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;This utility allows us to count other values, such as characters and words, but in this case, we specified &lt;code&gt;-l&lt;/code&gt; to only get us the number of lines.&lt;/p&gt;
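&lt;p&gt;For instance, on a single sample message (made-up values):&lt;/p&gt;

```shell
line='Invalid user admin from 203.0.113.7'
words=$(printf '%s\n' "$line" | wc -w)   # -w counts words
chars=$(printf '%s\n' "$line" | wc -c)   # -c counts bytes, newline included
echo "$words"   # 5
```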

&lt;p&gt;Sorry for the abrupt and early stop, but I wanted to memorialize this before it became another multi-weekend project that took too long to release. I hope to continue putting out smaller posts like this, in the hope that they help someone looking to bring more bash/sh/zsh (or other shell) command line work into their casework.&lt;/p&gt;

&lt;p&gt;Next post ideas:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Working with JSON data at the command line&lt;/li&gt;
&lt;li&gt;Writing useful loops&lt;/li&gt;
&lt;li&gt;List of useful aliases and one-liners&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Thoughts on the above? Leave a comment below!&lt;/p&gt;

&lt;p&gt;Originally posted 2018-09-30&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Update 2019-01-18&lt;/strong&gt;: Since writing this post, I’ve come across a useful tool for prototyping longer or iterative bash statements. The &lt;a href="https://github.com/akavel/up"&gt;ultimate plumber&lt;/a&gt;, up, is a really great tool for testing new statements and saving the final iteration to a script for re-use.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Originally published at &lt;a href="https://medium.com/pythonic-forensics/useful-commands-for-log-analysis-89093d8c51a4" rel="noopener noreferrer"&gt;Pythonic Forensics&lt;/a&gt; on October 1, 2018.&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag__link"&gt;
  &lt;a href="https://medium.com/pythonic-forensics/useful-commands-for-log-analysis-89093d8c51a4" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nnOfDfEH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/fit/c/96/96/1%2Ah2DhYGKhXkcRjxTLyUct1A.png" alt="Chapin Bryce"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://medium.com/pythonic-forensics/useful-commands-for-log-analysis-89093d8c51a4" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;Useful Commands for Log Analysis. Over the past few months, I’ve been… | by Chapin Bryce | Pythonic Forensics&lt;/h2&gt;
      &lt;h3&gt;Chapin Bryce ・ &lt;time&gt;Jan 19, 2019&lt;/time&gt; ・ 
      &lt;div class="ltag__link__servicename"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hnDHPsJs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/medium-f709f79cf29704f9f4c2a83f950b2964e95007a3e311b77f686915c71574fef2.svg" alt="Medium Logo"&gt;
        Medium
      &lt;/div&gt;
    &lt;/h3&gt;
&lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;






</description>
      <category>logs</category>
      <category>forensics</category>
      <category>bash</category>
      <category>cybersecurity</category>
    </item>
  </channel>
</rss>
