Wednesday, May 13, 2015


I completely forgot to mention something in my last post. My friend Tom is doing a Year of Python series on his Ram Slack blog. He is posting a new Python 2 project each week for a whole year in his quest to learn. He's doing some very cool stuff and I recommend you check it out.

Tuesday, May 12, 2015

Trying to learn Python.....again

Every so often, the urge to learn Python reappears and I give it another shot. I work on it for a while and then something happens; I give up, get distracted or otherwise just don't get very far. I've tried various books, YouTube videos, etc., but just never seem to stick with it for long.

Despite my inability to code thus far, I do believe it is important for those of us doing any kind of DFIR to at least have a basic grasp of some language, be it Python, Perl, Ruby or what have you. Not only that, I just think it's cool and I want to learn. Having the ability to work up a script to accomplish some task would be great, even if it's not anything anyone else would ever want or need.

The problem with choosing Python is that you also have to decide which Python you want to learn. Python 2.x and Python 3.x are two different animals, at least to some degree. I initially decided on 2.x, since most Python users I know have gone that route and I know there are a lot of available resources for that version.

I created an account at CybraryIT and started taking the free Python course. I've been enjoying this course and the instructor seems knowledgeable. CybraryIT is a wonderful, free training resource and I'm taking another course there as well. They have a couple other courses in the pipeline I'm going to take when they're made available. But I digress.

I recently noticed a new book by Al Sweigart called Automate the Boring Stuff with Python at No Starch Press. I thought it sounded interesting and ordered it, failing to notice the focus was on Python 3. I started reading and really liked Sweigart's style and the material held my attention.

Then, last week I saw the Introduction to Python video series with Jessica McKellar on sale at O'Reilly, and its focus was on Python 3. Having watched other talks by McKellar on YouTube, I knew that I enjoy her presentation style and that she's someone who knows what she's talking about. I ordered the videos and started watching right away.

I feel like I've finally found some training materials that are really helping me. Maybe it's a combination of that and finally having the resolve to follow through. Whatever the case, I do believe I'm making progress.

So, while I still have interest in Python 2, I've decided to forge ahead with 3 since I've got such great materials to learn from. I may do actual reviews of these materials at some point in the future, but for now I'll only say I'm quite happy with them at this point and plan to continue.

If you decide you want to learn Python 2, I certainly recommend the course at CybraryIT. It's free, so what have you got to lose? If you want to learn Python 3, then I recommend the book and videos I mentioned above.

Friday, January 9, 2015

Nominations Open for 2014 Forensic 4cast Awards!

Hello all,

As the title of this post states, nominations are now open at the Forensic 4cast website for this year's awards. The awards ceremony has become a highly anticipated event each year at the SANS DFIR Summit in Austin, TX. I'm proud to say I've been nominated twice and won once, taking home the 2013 Digital Forensic Blog of the Year award.

I've already got some nominations in mind for this year and I plan to submit them soon. As far as blogs go, there are several that can be counted on year after year to provide first-rate content. You can always find excellent content on Corey Harrell's Journey Into Incident Response blog and Harlan Carvey's Windows Incident Response blog. I'm still making up my mind over those and a couple of others.

Some candidates I'm considering nominating for Digital Forensic Examiner of the Year include Ken Johnson, Eric Zimmerman and Frank McClain. Ken has done some amazing work on Windows 8 forensic artifacts, while Eric has done excellent work in the area of shellbags artifacts and even released a tool called Shellbags Explorer. Frank is this guy who is always there for people. If you're on any of the DFIR related email lists, you've no doubt seen him there. He is often the first to reply to a request for help and can always be counted on for good advice and suggestions.

Without a doubt, I will be nominating The Art of Memory Forensics for Digital Forensic Book of the Year. I don't remember the last time a forensic book generated as much excitement at the time of its release as this one did. It's a huge book with so much good information. I will be very surprised if this book doesn't take the award. Win or lose, I offer my congratulations to Michael Hale Ligh, Jamie Levy, Andrew Case and AAron Walters on the success of this book.

For the software tool category, I'm looking at such good candidates as Brian Moran's Live Response and Eric Zimmerman's Shellbags Explorer. I'm sure others will come to mind, but these two are definitely in the running.

I do very little in the area of mobile device forensics, so I really don't have any opinions on the Phone Forensics categories. Likewise, I haven't had the opportunity to try out any new hardware, so I have nothing to add there either.

I'm still making up my mind on other nominations. If I failed to mention you or your favorites above, that doesn't necessarily mean I'm not considering them as well, so please don't be offended. These are the ones that stand out in my mind at this moment. There are so many good people, blogs and tools out there that it's hard to remember each one as I write this.

I hope you'll take the time to submit your nominations. Thanks again to Lee Whitfield for putting the awards program together each year.

Friday, January 2, 2015

Happy New Year!

I have been absent from the blogging scene for a while now (again). To be honest, I haven't had a great deal worth writing about and didn't really have time anyway. I did want to mention a couple things, though.

I was pleasantly surprised to be nominated for election to the board of the Consortium of Digital Forensic Specialists and even more surprised to find out I got elected. I gave it considerable thought before accepting the nomination. I decided to go for it because I do care about the CDFS and the role it can play in our field. I'll have more to say about it as I get involved with the board. Thank you very much to all those who voted for me.

I was fortunate to attend both the Open Memory Forensics Workshop and the Open Source Digital Forensics conference back in November. As expected, both were very much worth attending. I plan to talk more about them in a (hopefully soon) future post, but I just wanted to say thanks to the Volatility crew and Basis Technology for such a great couple of days. Besides the great talks, I was happy to connect with friends I hadn't seen in a long time. I was also happy I got my copy of The Art of Memory Forensics signed by all four authors. I will be very surprised if this book doesn't win a 4cast award this year. I was also the lucky recipient of a $100 Amazon gift card at the OSDF conference!

My friend Carlos Cajigas has a new post up on his Mash that Key blog about using the built-in tools in Linux to view text-based logs. He goes through auth logs from his Linux server and shows how to use grep, cut, head and other commands to narrow the data down to what you really want to see. It's well worth a read if you find yourself parsing server logs to the point of it driving you nuts.
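Carlos covers the details on his blog, but the general pattern looks something like this sketch (the log entries and IP addresses below are made up for illustration):

```shell
# Build a small sample auth log (made-up entries, for illustration only)
cat > /tmp/sample_auth.log << 'EOF'
May 10 03:12:01 server sshd[1234]: Failed password for root from 203.0.113.5 port 52311 ssh2
May 10 03:12:05 server sshd[1234]: Failed password for root from 203.0.113.5 port 52312 ssh2
May 10 04:00:17 server sshd[2345]: Accepted password for carlos from 192.0.2.10 port 41001 ssh2
EOF

# Narrow down to just the failed logins
grep 'Failed password' /tmp/sample_auth.log

# Cut out only the source IP (field 11 when split on spaces) and count attempts
grep 'Failed password' /tmp/sample_auth.log | cut -d ' ' -f 11 | sort | uniq -c
```

Piping through head or less keeps the output manageable on a huge log. One caveat: real syslog pads single-digit days with an extra space, which shifts the field numbers cut sees, so check your own log format first.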

That's all I've got for right now. 2014 was a great year for DFIR and I look forward to seeing what this new year will bring.

Thursday, January 1, 2015

Book Review--Penetration Testing

Welcome to the long overdue review. I was contacted by the good people at No Starch Press early in 2014 and asked if I would like to review Penetration Testing by Georgia Weidman when it came out. I jumped at the chance, as I had no background in pen testing, but I've always found the subject interesting. I thought learning about attack techniques might help me be a better forensic investigator as well. I received the book soon after the initial contact, but, due to a number of things, failed to get this review done till now. My apologies to No Starch and Georgia Weidman for taking so long to get this posted.

This is a big book, with 20 chapters comprising a total of 476 pages, not including the index. Supplemental materials and a Linux virtual machine are available for download so the reader can work the examples in the book. Additionally, guidance is given on setting up your entire virtual lab, including Windows XP and Windows 7 targets, along with Android emulators. I loved how detailed the setup instructions were. There were quite a few files to download for the labs, but it was well worth my time and bandwidth to get them.

Along with the above, a torrent is available for the same version of Kali Linux used in the book. I was unable to get it working in VMware Workstation; it turns out it runs in VMware Player, but not necessarily in Workstation. I wound up building my own Kali virtual machine and used it through all the labs.

The book covers a little programming in some spots, so a programming primer was included. I am definitely NOT a programmer, so I found this primer to be very helpful.

Throughout the rest of the book, topics such as Metasploit, information gathering, finding vulnerabilities and even post-exploitation are covered. Web application testing, wireless attacks, exploit development and mobile device hacking are also covered in great detail.

After reading this book, I understand so much more about penetration testing than I did before. I learned a lot about how pen testers gather information and use it to their advantage through social engineering and other means. I also now have a much greater understanding of how attacks are carried out, and I believe that understanding will help me do my work as a forensic investigator even better.

Weidman does an outstanding job of covering a pretty big range of topics in this book. With the wide range of topics, I can see how it would be difficult to put it all in one book and wind up with something that works, but she managed to pull it off. I enjoy her writing style and loved the labs, too. I don't know how long it took her to put this book together, but it's obvious she spent a lot of time writing and creating the labs and supplementary materials.

If you want to learn about the many aspects of penetration testing, I highly recommend this book. It's everything including the kitchen sink, and after reading it you'll come away with a much better understanding of what pen testers do and how they do it.

Monday, September 8, 2014


This past weekend, I was fortunate enough to attend the inaugural ArchC0N in St. Louis, MO. It would be a massive understatement to say that I had a great time. I came away from the event feeling like my time had been well spent and that I had learned a lot.

The two-day event included optional training courses (for an extra fee) on Friday. The two available courses were Malware Analysis, taught by Tyler Hudak, and Network Analysis with the Bro Platform, taught by Liam Randall. I chose the malware course, although I would have loved to have taken the Bro class as well.

The malware class was a one-day version of what is normally a two-day course. Tyler started out the day teaching about basic static analysis and then moved into dynamic analysis after lunch. There was plenty of hands-on work to keep you interested and involved. Tyler provided an excellent course manual that included the material for the full two-day class. The class was very good and I felt like I came away better off for having attended.

The conference continued on Saturday with the full line-up of speakers. ArchC0N reminded me of another conference that started off with a bang, the WACCI conference a few years back. ArchC0N drew some of the biggest names in infosec, offered a three-track lineup of speakers, provided lunch both days and did all of this for an extremely low price. The keynote speakers were Richard Bejtlich, Bruce Schneier, Emily Brandwin and Charlie Miller. Sadly, I had to hit the road before Charlie Miller spoke, but I found the other three keynotes very good. Emily Brandwin was a very pleasant surprise and I enjoyed her talk, which had nothing to do with infosec.

The list of speakers for all the sessions was just as impressive. It was very difficult to pick which ones to attend, due to the high quality of all the sessions. You can view the speaker lineup on the ArchC0N site. I attended great talks by Kyle Wilhoit, Ken Johnson, Tyler Hudak, Scott Roberts and Kyle Maxwell. There were others I wanted to attend, but I could only manage to be in one place at a time.

I want to say kudos to Paul Jaramillo and Jason Barnes for putting together a fantastic first event, which I hope will be the first of many annual editions. I also want to thank the sponsors. You can't hold such a great conference without the backing of some great sponsors and I appreciate their support.

If you didn't make it to ArchC0N this year, I urge you to be there next year. Our part of the country needs more conferences and I'm thrilled Paul and Jason took the bull by the horns and got the job done.

Sunday, July 13, 2014

From China with Love? (Part 2)

In part 1 of this series, I detailed an intrusion to my SSH honeypot. If you didn't read part 1, you might want to for background info before reading this one.

Linux forensics/incident response is a new thing for me. I've never had occasion thus far to conduct a "real" investigation into a Linux machine. This "intrusion" into my honeypot inspired me to conduct my own attack and investigation so I could learn more about the subject. I'm a noob to this stuff, so if you see things I did wrong or could have done better, I'd be delighted to hear constructive criticism from you.

Given my lack of experience with Linux-based investigations, I've searched for information on the subject but haven't found a lot. I read Chris Pogue's excellent book, Unix and Linux Forensic Analysis DVD Toolkit, a few years ago and still refer to it from time to time. A note to Chris: I'd still love to see a second edition of the book! But I digress...

I began by creating a new Ubuntu Server 12.04 virtual machine. I installed openssh-server so I could connect to the VM from the host and set the VM's network adapter to Host Only. I also set a password for the root user, so I could log in immediately with root privileges, just as my honeypot attackers had done.

I installed INetSim on my host machine and added a copy of the file my attackers had downloaded to my honeypot to the "fake" files, so I would be able to download it in the VM. Prior to starting my attack, I configured INetSim to listen on the Host Only network and started the program. I also started tcpdump, using the -i option to listen on the host-only vmnet1 adapter and -w to write out a .pcap file. Finally, the time to attack had arrived.

With my virtual machine running and logged out, I opened a terminal on the host machine (Ubuntu 14.04 LTS) and typed the command:

ssh root@

I was greeted by the usual warning that the authenticity of the host couldn't be established, asking me to verify I really wanted to connect. I typed yes and the host was added to the .ssh/known_hosts file. After entering the password, I was in and ready to do my nefarious deeds.

I decided to type in all the same commands my honeypot attacker had, with a couple of exceptions. When I got to the point of downloading the file, I decided to just make up a fictional URL instead of using an IP address with port number, as the real attacker had done. I also decided to simply execute the file once downloaded instead of using the nohup command so I could be a little more sure I would see everything in my memory capture. I may do it again later with the nohup command and compare the results of memory analysis.

After going through all the commands the real attacker had, including downloading the .bash_root.tmp3 file to the /tmp directory and running it, I entered the ps x command, just to see what I could see before logging out. I noted the downloaded file was running in memory as expected. I then logged out and disconnected from the ssh session.
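For anyone following along at home, that ps x check boils down to something like this; the sleep below is a harmless stand-in for the attacker's binary:

```shell
# Launch a stand-in for the suspicious process (a harmless sleep)
sleep 300 &
BGPID=$!

# ps x lists this user's processes; the [s] bracket trick keeps
# the grep command itself out of the results
ps x | grep '[s]leep 300' | tee /tmp/ps_hits.txt

# Clean up the stand-in process
kill "$BGPID"
```

On a real compromise you'd scan the full ps x output by eye rather than grep for a name you already know, since attackers rarely label their binaries helpfully.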

My initial plan was to conduct the attack on a virtual machine, then pause it and collect the .vmem file for memory analysis, followed by grabbing the .vmdk file for disk analysis. However, that didn't seem very realistic, so I changed my plan.

For the memory collection, instead of using the .vmem file, I decided to use Hal Pomeranz' Linux Memory Grabber script (lmg). lmg makes use of Joe Sylve's excellent LiME, a loadable kernel module which allows the collection of RAM from a Linux or Android system. lmg also makes use of Volatility and dwarfdump to create a Linux profile of the system for use with Volatility. I have used LiME by itself before with good results, but I wanted to try lmg and see how it worked automating the collection of memory and the creation of the Volatility profile.

I followed the directions for setting up lmg on an 8 GB thumb drive, but had some problems getting lmg to set up properly. Hal was kind enough to help me with it and I was able to use it for this investigation. I won't go into how to set it up here; you can read the install directions on the lmg GitHub page. I added a static version of dwarfdump and a freshly downloaded copy of Volatility 2.3.1 to the thumb drive and tested it on a non-infected system with positive results.

I went back to the virtual machine, logged in as root and mounted the lmg thumb drive. The simple command ./lmg -y started the script and it executed with no problems. lmg captures RAM and saves it in a capture folder on the thumb drive. As noted previously, it also automatically creates a Volatility Linux profile and saves it to the thumb drive as well. The VM only had 1 GB of RAM, so this process didn't take long. Once it was complete, I shut the VM down.

I next went to the VM folder on the host machine and used dc3dd to make a copy of the .vmdk file. Kinda silly, I guess, but I wanted to do things semi-realistically. I could have booted the VM again to a live CD and imaged it that way, but decided I would save a little time.
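dc3dd is essentially dd with forensic extras (on-the-fly hashing and logging) bolted on. The idea, sketched here with plain dd and md5sum against a small stand-in file rather than a real .vmdk, is just a verified bit-for-bit copy; the dc3dd line in the comment is from memory, so check the man page:

```shell
# Create a small stand-in "evidence" file instead of a real .vmdk
dd if=/dev/urandom of=/tmp/evidence.img bs=1024 count=64 2>/dev/null

# Bit-for-bit copy; dc3dd would do roughly the same in one pass while
# hashing and logging, e.g.: dc3dd if=evidence.img of=copy.img hash=md5 log=copy.log
dd if=/tmp/evidence.img of=/tmp/evidence_copy.img bs=1024 2>/dev/null

# Verify the copy hashes identically to the original
md5sum /tmp/evidence.img /tmp/evidence_copy.img
```

Matching hashes on the source and the copy are what let you say the image is a faithful duplicate, which is the whole point of using a forensic imager over a plain file copy.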

So now I have a RAM capture, a disk image and a .pcap file. I'll be taking a look at the RAM capture first with the awesome Volatility and will post about that in part 3. I will also post the Linux profile to my Linux Volatility Profiles GitHub page soon.