Channel: Hacking Exposed Computer Forensics Blog
Viewing all 877 articles

Daily Blog #152: Forensic Lunch 11/22/13

Hello Reader,
            Another Forensic Lunch already? I know, right! Another week has gone by and we have a great Forensic Lunch for you, not that I wouldn't tell you it's great every week. This week Mari DeGrazia joined us to talk about her work building a Python parser for recovering deleted data from SQLite databases, and Eric Zimmerman came on to talk about passing the new X-Ways Xpert certification and the upcoming OSTriage v2, which will be available for non law enforcement use!

You can read Mari's blog here: http://az4n6.blogspot.com/ 
To read up more on OSTriage, read the Forensic Focus thread here: http://www.forensicfocus.com/Forums/viewtopic/p=6565347/


Want to be on the lunch? Just email me dcowen@g-cpartners.com and I'd love to have you on!

BTW If you've been wanting to listen to the lunch as a podcast you can now! Just subscribe to it here: http://www.learndfir.com/?feed=podcast


Daily Blog #153: Saturday Reading 11/23/13

Hello Reader,
           It's Saturday! I had a bit too much fun playing the Hearthstone beta last night, so I didn't post this the night before like I usually do. No reason not to share good links, though!

1. The Forensic Lunch went down yesterday! We had Mari DeGrazia on to talk about her research into SQLite deleted data recovery and Eric Zimmerman talking about being the first X-Ways Xpert and about OSTriage v2. Watch it here: http://hackingexposedcomputerforensicsblog.blogspot.com/2013/11/daily-blog-152-forensic-lunch-112213.html.

2. Yogesh Khatri has been putting up some good blog posts this week regarding changes in USB device forensics in Windows 8. He's done this in two posts: the first covers new registry entries, with timestamps, created on USB device removal, http://www.swiftforensics.com/2013/11/windows-8-new-registry-artifacts-part-1.html, very cool! The second covers which event log entries are not being created on USB device insertion and removal, http://www.swiftforensics.com/2013/11/event-log-entries-for-devices-in.html. This is great stuff and hopefully he'll keep going!

3. In an interesting civil case over on the CYB3RCRIM3 blog, an unhappy consumer sued Best Buy and represented himself, http://cyb3rcrim3.blogspot.com/2013/11/the-laptop-malware-and-consumer-sales.html. This case is interesting to me because the claims revolved around not just the typical warranty issues but also the malware/spyware found on his computer. Good reading for anyone buying computers and warranties from a retailer.

4. Over on Forensic Focus there is a new article up on new metadata found in OSX Mavericks; read it here: http://articles.forensicfocus.com/2013/11/13/os-x-mavericks-metadata/. The article goes into two new types of metadata found in OSX Mavericks: email attachments saved to disk and file tagging.

5. Harlan has a new post up on using the 'sniper forensics' methodology of examination to quickly find malware and reduce analysis time. He then goes into working with Volatility and the steps he takes when using it for memory analysis. A good read; you can see it here: http://windowsir.blogspot.com/2013/11/sniper-forensics-memory-analysis-and.html.

6. If you are doing forensics on OSX systems you're going to run into virtual machines, as most users run their Windows apps in Parallels or Fusion. This can be a pain, as you want a forensic image you can work with in most of your tools. This article on AppleExaminer goes through how to convert these virtual disks to raw/dd images using qemu: http://www.appleexaminer.com/MacsAndOS/Analysis/VirtDiskConv/VirtDiskConv.html.

7. Dealing with Dropbox on Windows XP and want to decrypt more of the databases? Magnet Forensics has updated their tool to work against any Dropbox database, and it's free! http://www.magnetforensics.com/decrypting-the-config-dbx-file/

8. Forensic Femmes has a good interview with Sk3tchmoose, aka Melissa Augustine, about her work in DFIR: http://christammiller.com/2013/11/19/forensic-femmes-4-melissa-augustine/

9. The Volatility guys put up some more training dates, http://volatility-labs.blogspot.com/2013/09/2014-malware-and-memory-forensics.html, this is a class I'd like to take in the future!

That's all for this week, lots of good stuff out there. Sunday Funday is coming up shortly after!

Daily Blog #154: Sunday Funday 11/24/13

Hello Reader,
        It's Sunday Funday time! Let's get into some more real world scenarios and combine some different types of analysis.

The Prize:

  • A $200 Amazon Gift Card

The Rules:
  1. You must post your answer before Monday 11/25/13 2AM CST (GMT -6)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
Your board of directors has received an email, sent from a Gmail address via the Thunderbird mailer at 9pm, containing insider information about the company and a demand for action, or the sender will go to the press. IT security has found the IP address of the company's firewall at one of the smaller branches in the email header and passed the data to you. The branch has only 8 employees, and normal office hours end at 5pm.

Please detail how you will:
1. Determine which system sent the email
2. Determine which user of the system sent the email

Good luck!

Daily Blog #155: Sunday Funday 11/24/13 Winner!

Hello Reader,
     Another Sunday Funday come and gone, and some good entries this week! I really liked this week's winner because he went into physical access controls (badge logs), network access controls (firewall logs), and host access controls (event logs and artifacts) to narrow down his suspects and fully examine the facts at hand. Great job Steve M!

The Rules:
  1. You must post your answer before Monday 11/25/13 2AM CST (GMT -6)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
Your board of directors has received an email, sent from a Gmail address via the Thunderbird mailer at 9pm, containing insider information about the company and a demand for action, or the sender will go to the press. IT security has found the IP address of the company's firewall at one of the smaller branches in the email header and passed the data to you. The branch has only 8 employees, and normal office hours end at 5pm.

Please detail how you will:
1. Determine which system sent the email
2. Determine which user of the system sent the email

The Winning Answer:
Steve M
Considering this case may involve litigation, I would stress collecting good case notes and following proven processes throughout the investigation.

To initially narrow the search of suspects, I would first check physical access badge logs to determine if anyone was in the office at 9pm.  If employees can access the Internet via this firewall when connected via VPN, I would check VPN access logs as well.

On the network side, I would begin by looking at the perimeter firewall traffic logs for mail connections to Google/gmail, specifically on ports 993/TCP (IMAP), 995/TCP (POP3), and 25/TCP or 587/TCP (SMTP) which would be used by Thunderbird.  Google may switch services between various IP ranges, but as of right now it appears most mail related addresses resolve within the 173.194.0.0/16 subnet.  It is probably a safe assumption to say Google won't move their mail services outside of this network/port range in the foreseeable future.  Once I have identified an internal IP address as communicating to Google over these ports during the window the email was received, I would look for DHCP logs or an asset inventory system to determine which internal system had the IP address at that time.
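The log-narrowing step described above can be sketched in a few lines. The log format, field names, and IP addresses below are made up for illustration, since every firewall exports logs differently:

```python
import csv
import io

# Hypothetical firewall log export (every vendor's format differs); the
# addresses and times below are invented test data.
SAMPLE_LOG = """timestamp,src_ip,dst_ip,dst_port
2013-11-24 20:58:11,10.1.5.23,173.194.68.108,587
2013-11-24 20:59:02,10.1.5.40,93.184.216.34,443
2013-11-24 21:01:45,10.1.5.23,173.194.68.109,993
"""

MAIL_PORTS = {"25", "587", "993", "995"}  # SMTP/submission, IMAPS, POP3S
GOOGLE_MAIL_NET = "173.194."              # 173.194.0.0/16, as noted above

def suspect_connections(log_text):
    """Return rows talking to Google's mail range on a mail port."""
    reader = csv.DictReader(io.StringIO(log_text))
    return [row for row in reader
            if row["dst_port"] in MAIL_PORTS
            and row["dst_ip"].startswith(GOOGLE_MAIL_NET)]

hits = suspect_connections(SAMPLE_LOG)
for row in hits:
    print(row["timestamp"], row["src_ip"], "->",
          row["dst_ip"], row["dst_port"])
```

With real logs you would then map the surviving internal IPs back to hosts via DHCP leases, as described above.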

Once I have identified a suspect internal system, I would proceed with host level forensics to find the user with that account's information in his/her Thunderbird profile (assuming Win7):

- First, I would take an image of the system for investigation and evidence purposes.  I would also duplicate it for a working copy.

- I would sweep the system for directories matching "C:\Users\\AppData\Roaming\Thunderbird\Profiles\", to see which Windows logon accounts use Thunderbird.

- At this point, I would perform a raw text search of the entire contents of this directory (including "ImapMail" and "Mail") for the specific source email address in question.  If hits were found, I would focus the remainder of my investigation on this profile.

- If no hits were found, I may have to extract the usernames from all Thunderbird profiles on this system.  Considering both usernames and passwords can be decrypted with this method, I would NOT do this on a large number of potentially unrelated profiles due to privacy concerns.  Instead I would revisit my case notes to see if I can pinpoint the suspected user via other means (event log records and other filesystem activity within the same time period for example).

- Once I've identified a profile with hits, I would extract the profile folder from the disk for further investigation.  Mozilla stores the usernames and passwords encrypted in the "moz_logins" table of the "signons.sqlite" sqlite db and the encryption keys in the "key3.db" file.

- I would use a tool such as "ThunderbirdPassDecryptor" to decrypt the username/password from this profile using these files.  At that point, the source email address (and password) should be known, and the owner of the Windows profile can be considered the source.
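As a rough illustration of the signons.sqlite triage step, the sketch below builds a stand-in database with a simplified version of the moz_logins table named above and enumerates its entries. The "encrypted" values are placeholders, not real NSS ciphertext; actual decryption requires key3.db and a tool like ThunderbirdPassDecryptor:

```python
import sqlite3

# Tiny stand-in for Thunderbird's signons.sqlite so this runs anywhere;
# the schema is simplified and the "encrypted" values are placeholders.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE moz_logins (
    id INTEGER PRIMARY KEY,
    hostname TEXT,
    encryptedUsername TEXT,
    encryptedPassword TEXT)""")
con.execute(
    "INSERT INTO moz_logins (hostname, encryptedUsername, encryptedPassword) "
    "VALUES (?, ?, ?)",
    ("imap://imap.gmail.com", "enc-user-blob", "enc-pass-blob"))

# Triage query: which mail hosts have stored credentials in this profile?
rows = con.execute(
    "SELECT hostname, encryptedUsername FROM moz_logins").fetchall()
for hostname, enc_user in rows:
    print(hostname, enc_user)
```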

Daily Blog #156: The search for the next great intern

Hello Reader,
         I'd like to take the time this post to inform you of two things:

1. We are accepting applications currently for a paid internship for the spring 2014 semester.
If you are willing to move to the DFW metro area, or are already here, and are looking for an internship, I'm looking for you! This is a paid research internship focused on file system forensics and other new types of analysis where you will be helping break new ground! Mail your resume to dcowen@g-cpartners.com with the subject 'Intern Spring 2014'.

2. We are preparing a game of sorts for what I like to call the 'next great intern' contest for fall 2014.

To make things more interesting, and to help us find candidates whose problem solving skills shine brighter than their resumes, we are preparing a contest. The contest will be a series of puzzles, each leading to the next, that you can solve at your own pace. The winner of the contest will be given a paid internship for the Fall 2014 semester!

So email me now and keep a look out for our first clue coming soon!

Daily Blog #157: Metadiver!

Hello Reader,
            Another tool from our lab has escaped into the light of day. David Dym, on his RedRock blog, has posted the first version of Metadiver, http://redrocktx.blogspot.com/2013/11/introducing-metadiver.html. Metadiver was born out of frustration with most forensic suites' inability to display all the relevant metadata embedded in many of the file formats available today.

What Metadiver does is call shell32 to query all the metadata fields the handler for that file type makes available, recursing through directories as it goes, and then writes it all out so you can quickly review the metadata you're interested in. You should give the tool a shot and see if you find some metadata your tools aren't showing you.

Daily Blog #158 Happy Thanksgiving!

Hello Reader,
          It's Thanksgiving in America and I had a great day of cooking and enjoying family time. My wife always tells me that I should break up the technical posts with a recipe now and then, so let's do that.

The HECF Blog Smoked Turkey

If you are my friend on Facebook you would know that on Father's Day I got a Weber Smokey Mountain smoker, and I've been reaching my full potential as a Texan ever since. I used it this morning to make a smoked turkey that was met with happy bellies.

I started my recipe by following the recipe listed here:
http://amazingribs.com/recipes/chicken_turkey_duck/ultimate_smoked_turkey.html

There are a lot of words there, and if you want to make the most amazing meal of your Thanksgiving you should read all of them. The summary, though, is as follows.

1. It takes multiple days to actually defrost a turkey, make sure you buy it 4 days before and let it rest in the fridge or cooler to fully defrost.
2. Many turkeys sold in stores today have already been put into some kind of brine; don't brine your turkey if the label says some version of 'contains up to x% water and spices'
3. The night before you are going to cook, get a rub (either store bought or follow a recipe on amazing ribs) and mix it with equal parts olive oil to make it a wet rub.
4. Cut off the excess fat around the neck and cavity as well as anything binding the legs together
5. Place the rub on the skin but also reach into the neck cavity and get it under the skin and on to the meat directly and then let it sit overnight
6. The morning of place onions/garlic/orange peels and thyme in the cavity
7. Take a little extra olive oil and salt and place it on the skin so it will crisp better
8. Make a drip pan to put under your turkey with onions, herbs, and chicken broth and the giblets
9. Heat up your smoker/grill to 325 and then place the bird on with a drip pan underneath to catch the drippings
10. Cook the bird until the internal temp hits 160
11. Remove the bird from the smoker and then let it rest 30mins
12. Take the drip pan and dump it into a blender
13. Mix the blended drippings, flour, and heavy cream with chicken broth and pepper, and stir to make a gravy

This is what the turkey will look like!


There you go!

Daily Blog #159: Making the forensic lunch

Hello Reader,
       This is normally where I would post a video and tell you about the guests we had on the Forensic Lunch this week, except there was no Forensic Lunch this week! Seeing as it is a holiday weekend in America it seemed unlikely to succeed, so we have punted until next week. Instead, I thought I would address a question I get asked a lot: what equipment are you using to make the Forensic Lunch?

We have 4 microphones, all from Sterling Audio, on mic stands that run into a Focusrite 18i20 for preamp and mixing. We then have an output running from the Focusrite into the line in on the PC in the room. For video we have an HP LifeCam on top of the TV to get a wide angle of the room.

From there it's all about the right combination of Google services with the right account to make things work. If you want to run a Hangout on Air event like we do, you need to do the following:

1. Link a YouTube account to a Google+ page
2. SMS verify your account; this will enable Hangouts on Air
3. As the same Google+ page, create an event for the time you want the event to occur; do not enable it as a hangout
4. On the day of your event, create a Hangout on Air as the Google+ page
5. Once it has started, enable the Q&A module, which will, if you did all the right things as the right account, find your Google+ event and link the Hangout on Air to it
6. Give the hangout link to the people you'd like to join
7. Use the Cameraman app to set your options for how the video will be displayed and recorded within the hangout
8. Click go on air when you are ready to broadcast
9. Take questions with the Q&A app and make sure to click which ones you are answering for those watching later
10. Take the YouTube link when you're done to share the video

So there you go! The next hard part is finding people to come on the air. Speaking of which, I'm looking for guests for next week's Forensic Lunch, so please email me, dcowen@g-cpartners.com, to get on.

Daily Blog #160: Saturday Reading 11/30/13

Hello Reader,
        It's Saturday! Time for more links to make you think as we go through what I've been reading this week.

1. The Handler Diaries is a blog I've been following recently; this isn't a new post, but I don't think I've linked it before. It covers the discovery phase of an incident, which can be the most crucial phase: http://blog.handlerdiaries.com/?p=128.

2. Our DFIR friends in NOLA have put out a master's class on registry forensics over on Hacker Academy. It only costs $399 and it's gotten high marks from Ken Pryor. I plan to give it a go later this month: https://hackeracademy.com/masterclass/registry-forensics.

3. There is a good article up on Forensic Focus on what we can determine from usage of the Windows 8 File History feature: http://articles.forensicfocus.com/2013/11/24/2869/.

4. Lee Whitfield over on the Forensic 4:cast has announced the beginnings of the 2014 Forensic 4:Cast awards and is looking for comments on how to improve them. If you have ideas please let him know! http://forensic4cast.com/2013/11/4cast-awards-2014/

5. Hexacorn has a new blog up on how to set up your work environment to get things done faster, http://www.hexacorn.com/blog/2013/11/25/doing-things-faster/. It's not directly forensics related, but following his tips could help you find your zen-like analysis state.

6. The AccessData Users Conference has opened up a CFP for the first time, https://www.ad-users.com/call-for-speakers. It's a fun conference, held a week apart from CEIC and also in Las Vegas, so if you speak at both you could stay in Vegas even longer!

7. SANS has been accredited to offer a Masters program in conjunction with their training, http://www.sans.edu/accreditation. That's pretty neat and a nice option for those of you that are looking to get both technical training and higher education credentials.


That's all for this week, things get slower around Thanksgiving in the states. I hope you had a great holiday with your families as I did with mine. Please make time tomorrow for another Sunday Funday!

Daily Blog #161: Sunday Funday 12/1/13

Hello Reader,
        It's Sunday Funday time! Let's get into some more real world scenarios and combine some different types of analysis.

The Prize:


  • A $200 Amazon Gift Card

The Rules:
  1. You must post your answer before Monday 12/2/13 2AM CST (GMT -6)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
You are involved in a case that involves emails containing confidential information being sent to outside parties. You've been given an image of one of the outside parties' computers and the subject and date of the email that was sent. The image was created two weeks after the email was sent. You've located the message on the image, but the suspect has denied accessing any attachments.

Please detail how on a Windows 7 system running Outlook 2007 you can determine:
1. What attachments were accessed in the last two weeks
2. When attachments were accessed
3. How many times attachments were accessed

Good luck!

Daily Blog #162: Sunday Funday 12/1/13 Winner!

Hello Reader,
                     Another challenge behind us, and this week saw fewer responses than normal; just one was submitted in time, in fact. I'm not sure if you all were in a turkey coma or if the challenge itself stumped you. This week's answer is good, as it covers the standard analysis one would expect, but it is missing data from the USN Journal and the behavioral patterns of Outlook 2007. That sounds like a good series of posts to work on this week. Congratulations to Simon Mccabe for winning this week's contest!

The Challenge:
You are involved in a case that involves emails containing confidential information being sent to outside parties. You've been given an image of one of the outside parties' computers and the subject and date of the email that was sent. The image was created two weeks after the email was sent. You've located the message on the image, but the suspect has denied accessing any attachments.

Please detail how on a Windows 7 system running Outlook 2007 you can determine:
1. What attachments were accessed in the last two weeks
2. When attachments were accessed
3. How many times attachments were accessed

The Winning Answer:

Simon Mccabe
Once the image has been acquired and processed in EnCase, I would navigate to C:\Users\ and export his/her NTUser.dat file. This NTUser.dat file would be the most recent NTUser.dat file for the suspect, without going into restore points. In order to find out what attachments were accessed in the last two weeks, I would open the user's NTUser.dat file in AccessData's Registry Viewer and navigate to:

"HKEY_CURRENT_USER\Software\Microsoft\Office\12.0\Outlook\Security"

Outlook 2007 (sometimes) creates a temp folder for attachments that were opened directly from an email. The registry location listed above would tell you the temp folder, which would be in the user's "AppData\Local\Microsoft\Windows\Temporary Internet Files\Content.Outlook\\" directory. Any files held within the temp folder would have timestamps showing the creation time, so if the creation time was in the last two weeks, this would be evidence.

For further evidence, I would then use Harlan Carvey's RegRipper to rip the user's NTUser.dat file to a txt file. I would open the NTuserRIP.txt file and search for 'recentdocs'. This could show the last write time and information about which files were last opened, with the most recent appearing at the top.

I would then green-plate the entire folder structure in EnCase and sort by file extension. I would be looking for lnk files. More specifically, I would then look for lnk files which point to any of the attachments. Lnk files show the MAC (Modified, Accessed, Created) times of the files, so this may provide important evidence about when files were accessed.
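The lnk timestamps mentioned above can be pulled directly from the shell link header, which stores the target's creation, access, and write FILETIMEs at fixed offsets (0x1C, 0x24, and 0x2C per Microsoft's MS-SHLLNK specification). A minimal sketch against a synthetic header, not a real lnk file:

```python
import struct
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)  # FILETIME epoch

def filetime_to_dt(ft):
    """Convert a 64-bit FILETIME (100ns ticks since 1601) to a datetime."""
    return EPOCH + timedelta(microseconds=ft // 10)

def lnk_target_times(header):
    """Creation, access, and write times from a shell link header."""
    created, accessed, written = struct.unpack_from("<3Q", header, 0x1C)
    return tuple(filetime_to_dt(v) for v in (created, accessed, written))

# Build a synthetic 76-byte header with all three times set to
# 2013-12-01 21:00 UTC; this is invented test data.
target = datetime(2013, 12, 1, 21, tzinfo=timezone.utc)
delta = target - EPOCH
ft = (delta.days * 86400 + delta.seconds) * 10**7
header = bytearray(76)
struct.pack_into("<3Q", header, 0x1C, ft, ft, ft)

created, accessed, written = lnk_target_times(bytes(header))
print(created.isoformat())  # 2013-12-01T21:00:00+00:00
```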

To find out how many times attachments were accessed, I would look for the 'userassist' records in the NTuserRIP.txt file I had made. I should be able to see the count in brackets, for example: (3). This would suggest that the suspect had opened the attachment three times, along with when it was last opened.
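For the UserAssist step, recall that the value names are ROT13-encoded paths and that, on Windows 7, the run count is commonly documented as a 32-bit little-endian integer at offset 4 of the 72-byte value data. A sketch against made-up data:

```python
import codecs
import struct

# UserAssist value names are ROT13-encoded paths; the path and bytes
# below are invented test data, not pulled from a real hive.
encoded_name = r"P:\Hfref\Fhfcrpg\Qrfxgbc\GbcFrperg.qbp"
raw_value = bytearray(72)
struct.pack_into("<I", raw_value, 4, 3)  # pretend run count of 3

decoded = codecs.decode(encoded_name, "rot13")
run_count = struct.unpack_from("<I", raw_value, 4)[0]
print(decoded)    # C:\Users\Suspect\Desktop\TopSecret.doc
print(run_count)  # 3
```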

Daily Blog #163: Solving Sunday Funday 12/1/13 part 1

Hello Reader,
           This week's Sunday Funday seems a good candidate for further explanation given the small number of responses received. So this week let's go through the artifacts I would look at (and have looked at in real cases) to solve this challenge. As a reminder, here are the questions asked:

Please detail how on a Windows 7 system running Outlook 2007 you can determine:
1. What attachments were accessed in the last two weeks
2. When attachments were accessed
3. How many times attachments were accessed

So let's start with Question 1: how do you determine what attachments were accessed in the last two weeks on Windows 7 using Outlook 2007? In prior versions of Outlook and Windows this was a pretty easy task. As the winner pointed out, you can go to the temporary directory where Outlook extracts attachments to see what exists. Standard protocol prior to this combination of OS and Outlook versions would then allow you to see what number is appended to the end of the file name.

For example if an attachment was named 'TopSecret.doc' and accessed three times over a period of time you would see the following attachments extracted in the OLK directory for older versions of Outlook/Windows:
TopSecret.doc
TopSecret(2).doc
TopSecret(3).doc

You would then look at the Filename timestamps in the MFT to determine when the attachment was accessed, as Outlook will reset the StdInfo timestamps to the time the email was sent. Outlook would rarely clean up after itself in prior versions, leaving great evidence for quite some time.

This protocol is not as useful as it used to be for the following reasons:

1. Outlook 2007 will delete an attachment when it is closed from the Content.Outlook directory. If Outlook closes while the attachment is still open then the attachment will remain there.
2. Windows 7 will defrag once a week, meaning older attachments' deleted entries will be purged from the MFT
3. Outlook 2007 will extract two copies of the attachment, up to three on preview and open, meaning you can't rely only on the number after the filename.

So we are left looking for alternate sources; the one we rely on most now for these accesses is the USN Journal. The USN Journal tracks all file creations, renames, opens, closes, changes, and deletions. As such, we filter the USN for the MFT reference number of the Content.Outlook folder to find all the attachments opened within it, timestamped to their opening. You might be tempted at this point to take the time between the open and close of a file as the total time it was open, but this can be a red herring, as our testing has shown that file handles are closed while a process is backgrounded and then reopened later when activity resumes.
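The filtering step above can be sketched like this; assume the records have already been parsed out by your tool of choice, and note that the field names and MFT reference number here are made up for illustration:

```python
# Keep only USN records whose parent MFT reference matches the
# Content.Outlook subfolder identified in the MFT. The reference number
# and record dicts below are hypothetical test data.
CONTENT_OUTLOOK_REF = 41372

records = [
    {"ts": "2013-12-02 20:14:01", "name": "cv_cowen_2013.docx",
     "parent_ref": 41372, "reason": "file_created"},
    {"ts": "2013-12-02 20:14:35", "name": "cv_cowen_2013.docx",
     "parent_ref": 41372, "reason": "file_deleted file_closed"},
    {"ts": "2013-12-02 20:15:02", "name": "unrelated.tmp",
     "parent_ref": 512, "reason": "file_created"},
]

attachment_activity = [r for r in records
                       if r["parent_ref"] == CONTENT_OUTLOOK_REF]
for r in attachment_activity:
    print(r["ts"], r["name"], r["reason"])
```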

Tomorrow I'll go into detail on the artifacts left in the USN.

Daily Blog #164: Solving Sunday Funday 12/1/13 Part 2

Hello Reader,
        Yesterday we left off with a general description of what changed between Windows XP/Vista and 7, as well as versions of Outlook prior to 2007, and the impact on the forensic analysis of attachment access. Today let's go deeper into the artifacts that are created when attachments are accessed on a Windows 7 system running Outlook 2007.

In this post we will go over what's left in the $MFT, USN Journal, and $LogFile when an attachment is accessed and closed before Outlook closes. Tomorrow we will focus on what the artifacts look like when the attachment is left open after Outlook closes, and then finish up this series with the conclusions we can draw to answer Sunday's challenge.

In my test image, which I will provide for you to download tomorrow, I have a Windows 7 system that I have run a couple of tests on. I installed Outlook 2007, Word 2007, and Adobe Reader on this system. I then created a new email account with Yahoo!, sensorsuspect@yahoo.com, and emailed it two different attachments: a pdf version of my CV and a docx version of my CV. I sent two different file types as they create different artifacts on normal access.

When first thinking about this issue you might think that the MRUs for the pdf and docx extensions would get updated with these files. I searched the NTUSER.DAT registry for the user minutes after these accesses, and within the allocated registry there is no entry made for the opening of these pdf and docx files. Within the unallocated space in the registry, using yaru from TZWorks, I was able to find the following key:
"C:\Users\Suspect\AppData\Local\Microsoft\Windows\Temporary Internet Files\Content.Outlook\AGI0N64C\cv_cowen_2013.docx"
It was left behind by the Microsoft Office Word previewer when I previewed the message a second time before opening the attachment directly within Microsoft Word.

So no LNK, no Jumplist, no MRU recent documents key gets created by this access. Within the $MFT I can see only one of the temporary files that was deleted from my first viewing, the other having already been purged out.

This leaves us with the $LogFile and USN Journal. The $LogFile is helpful here, but as the Content.Outlook directory is on the system drive by default, it won't last long. So let's look at what the USN Journal knows about my attachment access.


In the 34 seconds that passed between when I opened and closed both attachments, we got 18 records from the USN Journal. We can see when each attachment was first opened by looking at the file_created flag, and we'll have two file_created events for each file, as Outlook 2007 by default extracts two copies of the file on access. I don't know why this is yet, and I have seen occasions when it has been only one.

We can see when I closed the file here with the file_deleted, file_closed event. Outlook deletes the file when I close it, and this indicator can be a false positive for file deletion. It does represent, though, when the open and active file handle was closed against the file.

So why does our second file have a (2) after it, for example cv_cowen_2013.pdf and cv_cowen_2013 (2).pdf? The reason is that Outlook never overwrites an existing file; if older copies of the attachment still exist within the 'Content.Outlook' directory, the next number in sequence is used for the newly extracted file.
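That numbering behavior is easy to mimic; the helper below is purely illustrative of the 'never overwrite, take the next free counter' logic described above:

```python
import os

# Illustrative helper mimicking Outlook's extraction naming: never
# overwrite an existing file, append the next free " (n)" counter.
def next_extraction_name(folder_contents, filename):
    if filename not in folder_contents:
        return filename
    stem, ext = os.path.splitext(filename)
    n = 2
    while f"{stem} ({n}){ext}" in folder_contents:
        n += 1
    return f"{stem} ({n}){ext}"

existing = {"cv_cowen_2013.pdf", "cv_cowen_2013 (2).pdf"}
print(next_extraction_name(existing, "cv_cowen_2013.pdf"))
# -> cv_cowen_2013 (3).pdf
```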

Interested yet? The USN Journal is an amazing artifact for us in our analysis of Windows Vista and 7 systems, if you are not looking at it, you should be!

Tomorrow let's see what it looks like when the attachment remains, and talk about the possible conclusions we can draw.

Daily Blog #165: Solving Sunday Funday 12/1/13 Part 3

Hello Reader,
           In the past two posts we've discussed how analysis of attachment access has changed and what artifacts are left behind in the USN Journal when an attachment is closed while Outlook is still running on Windows 7. Today let's talk about another scenario: what happens when an attachment is still open when Outlook closes, the system reboots, etc. In short, any condition that leaves Outlook unable to clean up the opened attachment.

We talked in the last post about how Outlook will delete files after they have been closed, but not about what happens when Outlook is no longer running when they are closed. In that case the second temporary extracted file will be deleted while the primary will remain. In this post we will show what this looks like.

First let's look at the MFT. Here is what the TzWorks Gena tool sees within the MFT as recoverable files, using ntfswalk:

There are the two attachments that remain, which I sent myself, previewed, and then opened. You can see here that the two files are still active and will remain so until such time as the Content.Outlook folder is emptied, typically by a disk cleanup process looking for temporary files to remove. Of the two secondary copies made (the files that end with (2)), only the pdf version is still recoverable as a deleted record within the MFT.

Within the USN Journal though we have a different story as seen below:

Click on the image to see it full size and view the records.

From accessing two attachments twice each, first as a preview within the message and then by opening them in the associated external program, we have 35 USN entries!

The first creations have to do with my preview of the data; after the preview is completed, the data is deleted! We can see that this file was deleted and then re-extracted between the preview and the open with the external application by comparing the first creation in the USN Journal with the creation date within the MFT of the file still remaining.

Here is the first creation in the USN relating to the preview:

Here is the second creation in the USN relating to the access with an external associated program:

Here is the MFT Filename attribute dates for the file that remains from the second access:

As you can see, the MFT entry relates to the second creation, meaning that Outlook 2007 cleans up after every preview. Luckily for us the preview still calls out to an external program, the previewer for that file type, so we will still get a USN entry for those accesses as well.

Next week I will show what happens when I open the attachment again, forcing the counter in the filename to increase, and what other recoverable artifacts there might be.

I'm uploading this test image now for your own verification and I'll update this blog when the link is ready.

Update: Here is the test image for download!


Daily Blog #166: Forensic Lunch 12/6/13

Hello Reader,
          We had a great forensic lunch this week! This week's guests, in alphabetical order:

Robert Haist, talking about his research with page_brute in recovering command execution and other fun things from the pagefile, read his blog about it here: http://blog.roberthaist.com/2013/12/restoring-windows-cmd-sessions-from-pagefile-sys-2/

Amber Schroader, talking about Device Seizure 6.5 and a great discussion on what happens behind the scenes in your mobile forensics tools as well as the future of cloud phone data acquisition. You can find out more about Device Seizure here: http://www.paraben.com/device-seizure.html

Joakim Schicht, discussing his tools and research, including how he approaches these projects and develops them. You can find his google code repository here: http://code.google.com/p/mft2csv/ with all the tools mentioned today and more!



Daily Blog #167: Saturday Reading 12/7/13

Hello Reader,
            It's Saturday! It's really cold here, so I would advise that you make a fresh pot of coffee and get a warm blanket; it's time for links to make you think!

1. We had a great forensic lunch this week! Robert Haist, Amber Schroader and Joakim Schicht joined us. Robert talked about his ongoing research into recovering cmd.exe sessions with page_brute.py analysis of the pagefile; Amber talked about Device Seizure 6.5 and the challenges of mobile forensics; and Joakim talked to us about the development of setmace and his development story. You can watch it here: https://www.youtube.com/watch?v=S5xP4ALhqSU

2. Speaking of Robert Haist's research, you can read his blog to follow the work he talked about on this week's forensic lunch: http://blog.roberthaist.com/2013/12/restoring-windows-cmd-sessions-from-pagefile-sys-2/

3. Are you utilizing shellbags in your forensic analysis? You should be! You should also read this very well written post by Dan Pullega on his extensive testing of shellbags. If you need to explain how and why timestamps get set on shellbags, you need to read this: http://www.4n6k.com/2013/12/shellbags-forensics-addressing.html

4. Harlan has a new blog up this week with his own news and links, http://windowsir.blogspot.com/2013/12/links-and-news.html. Since I focus mainly on deadbox forensics you should take the time to check out his view of the world.

5. Corey Harrell has a new post up this week that I found fascinating. He covers a new artifact that records which programs were executed in the last day on a Windows 7 system through the 'recentfilecache.bcf' file; give it a read: http://journeyintoir.blogspot.com/2013/12/revealing-recentfilecachebcf-file.html

6. If you liked Corey's research and are looking into Windows 8 systems, then you need to read Yogesh Khatri's blog post on the new Windows 8 artifact 'amcache': http://www.swiftforensics.com/2013/12/amcachehve-in-windows-8-goldmine-for.html This amazing artifact even records the SHA1 hashes of executables run on the local system!

7. Speaking of shellbag forensics, Chad Waibel finished his research project at Champlain College on shellbags; read it here: http://chadwaibelforensics.blogspot.com/2013/12/final-summary.html Chad focused on a different side of shellbags: in what instances they are created and where. Reading this and Dan's post should really get you up to speed!

8. Jake Williams has a new blog up on memory images, http://malwarejake.blogspot.com/2013/12/memory-image-file-formats.html, and all the different formats we deal with today.

9. Let's end this Saturday with something fun: SANS has a memory challenge up for you to try out at https://www.surveymonkey.com/s/JQ9QFHP. The winner will get a free simulcast viewing of a training class at DFIRCON; no matter which SANS class you pick to attend remotely, that's a very big prize.

Tomorrow is Sunday Funday, so get ready!

Daily Blog #168: Sunday Funday 12/8/13

Hello Reader,
        It's Sunday Funday time! I hope you've been working your mental muscles and you're ready to go, because I have a challenge for you this week. If you watched the forensic lunch this week you got to hear a series of topics, and one of them might help you! Watch the forensic lunch here: https://www.youtube.com/watch?v=S5xP4ALhqSU

The Prize:

  • A $200 USD Amazon Giftcard that will be emailed to you

The Rules:
  1. You must post your answer before Monday 12/9/13 2AM CST (GMT -5)
  2. The most complete answer wins
  3. You are allowed to edit your answer after posting
  4. If two answers are too similar for one to win, the one with the earlier posting time wins
  5. Be specific and be thoughtful 
  6. Anonymous entries are allowed, please email them to dcowen@g-cpartners.com
  7. In order for an anonymous winner to receive a prize they must give their name to me, but I will not release it in a blog post

The Challenge:
You have a Windows 2008 system with two partitions: one system and one data partition for file storage and sharing. You recovered an application compatibility cache entry showing that setmace.exe ran, but you don't know what was changed. You need to answer the following questions:

1. How can you detect timestamp manipulation via setmace on the system disk
2. How can you detect timestamp manipulation via setmace on the data disk
3. How can you recover what files setmace was pointed at
4. How can you recover what commands were executed



Daily Blog #169: Sunday Funday 12/8/13 Winner!

Hello Reader,
            Another Sunday Funday has come and gone, with an interesting change this week that I'm curious to get your feedback on. For the first time I themed this week's Sunday Funday around topics discussed in this week's Forensic Lunch. I was hoping this would give many of you a leg up in getting started, so please let me know in the comments what you thought: is this something you'd like to see done again?

With that said, here is this week's winning answer!

The Challenge:
You have a Windows 2008 system with two partitions: one system and one data partition for file storage and sharing. You recovered an application compatibility cache entry showing that setmace.exe ran, but you don't know what was changed. You need to answer the following questions:

1. How can you detect timestamp manipulation via setmace on the system disk
2. How can you detect timestamp manipulation via setmace on the data disk
3. How can you recover what files setmace was pointed at
4. How can you recover what commands were executed

The Winning Answer:
 Anonymous

I tried to answer them in order, but the answers quickly got mixed together as I thought it would be better to explain my process as a whole.

1. How can you detect timestamp manipulation via setmace on the system disk
2. How can you detect timestamp manipulation via setmace on the data disk
3. How can you recover what files setmace was pointed at
4. How can you recover what commands were executed

According to your blogpost #130 on Detecting Fraud, "setmace cannot access the physical disk of any system volume, but it can access the physical disk of non system volumes" on Windows Vista/7/8. I would imagine that this is true for Windows 2008 as well, as it's based on Windows NT 6.x. As a result there wouldn't be any timestamp manipulation via setmace on the system disk.

I would first examine userassist and prefetch to determine if and when setmace has been run.

I would run a keyword search for setmace in an attempt to determine any potential artefacts in slack space. I would examine the pagefile/hiberfil and (hopefully) RAM dump using the processes shown in "Extracting Windows Command Line Details from Physical Memory" and "Restoring Windows CMD sessions from pagefile.sys". This may provide me with clues as to which files were modified. 

I would then create a timeline of activity and look for the low hanging fruit: files with created times when the computer was off, prior to the OS install, or after seizure. This may allow me to determine if setmace has been run on the data disk (as there would be a reference to the drive letter in the command) and may tell me the files that the program was run across.

I am also able to examine the shell artefacts in jumplists/lnk files/shellbags and compare their values with the files on the disk. Any deviations will raise flags as to the accuracy of the timestamps. I would then compare volume shadow copies of the files that have been flagged. I am also able to look for anomalies regarding file access prior to the file being created on the system.
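One concrete way to implement the timestamp comparison described in the answer is to flag any MFT record whose $STANDARD_INFORMATION creation time predates its $FILE_NAME creation time, since most user-mode timestomping tools only rewrite the $SI timestamps. This is a sketch only: the record fields are invented (your MFT parser's output will differ), and the heuristic is not conclusive, since setmace writing to the raw disk of a non-system volume can rewrite $FN as well:

```python
from datetime import datetime, timedelta

def si_fn_anomalies(records, tolerance=timedelta(seconds=1)):
    """Flag paths where the $SI creation time is earlier than the
    $FN creation time -- a common (not conclusive) sign of backdating."""
    return [r["path"] for r in records
            if r["si_created"] + tolerance < r["fn_created"]]

records = [
    {"path": r"D:\share\doc.xls",            # $SI backdated to 2010
     "si_created": datetime(2010, 1, 1),
     "fn_created": datetime(2013, 12, 8)},
    {"path": r"D:\share\ok.txt",             # timestamps agree
     "si_created": datetime(2013, 12, 8),
     "fn_created": datetime(2013, 12, 8)},
]
print(si_fn_anomalies(records))  # only doc.xls is flagged
```

The small tolerance avoids flagging the normal sub-second skew between the two attributes on legitimate files.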


Let's get back to USN Journal analysis tomorrow!

Daily Blog #170: Solving Sunday Funday 12/1/13 Part 4 USN Analysis of non Outlook attachments

Hello Reader,
      We've talked quite a bit by this point about using the USN Journal to analyze attachment access from Outlook 2007 on Windows 7. Today I wanted to extend the discussion to what the USN Journal contains when a non-Outlook 2007 attachment is accessed. The USN Journal seems simple when you first review it, but the underlying meanings of its reason codes are not as simple as they appear.

I created a file in c:\users\suspect\documents called 'test.txt' and populated it with one line of information. This is what the USN Journal shows for that initial creation and save:


File Creation:
This is where it gets interesting. Notice the first line says the reason code is File_Create, and that is in fact what happened at this point. However, as line 2 with USN number 2363040 shows, we have another File_Create reason code here paired with a close. This is tricky, though, as the file is not being created again at this point; it's just the handle from the initial creation being closed. Based on our testing we can see we have three more File_Create reason codes listed at 23693488, 23693568, and 23693648.

File Deletion:
What is even stranger than the File_Create is the File_Delete we see in USN 23693120: the file was not deleted. The file handle that was used to create and then open the file was. This is important to understand and can greatly affect your conclusions when you are looking at USN data, so always check your assumptions carefully as to when a file is actually being created and deleted against other sources such as the $logfile, $mft and timestamps from shell items.

File Modification:
You can see when I placed data into the notepad document: USNs 23693568 and 23693658 show the Data_Extend reason code. Data_Extend will be seen whenever additional data is written to a file.

File Close:
Within this USN log you would think I closed this file three times within the same minute. The first close, seen on USN 23693040, refers to when the file was first created. The second close, seen on USN 23683120, is when the file handle from the creation was closed. The third close, on USN 23694560, is from when I saved the file after changing its contents.

At this point I have created a text file, opened it in notepad and added a line of text to it. I left it open at this point for 4 hours and then went back and added one more line of data as can be seen below:


As you would expect, and as you saw in the prior section, I have a Data_Extend again, but this time we also have a Data_Overwrite. If the prior contents of a file are being changed you will see a Data_Overwrite. I then saved my changes, resulting in the close that you see in USN 23975120, and closed the application.
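The reason codes above are actually bit flags combined into one mask per record, which is why File_Create and Close can appear on the same line. Here is a small decoder using the USN_REASON_* values from the Windows SDK, limited to just the flags discussed in this post:

```python
# Subset of the USN_REASON_* flags from winioctl.h.
USN_REASONS = {
    0x00000001: "DATA_OVERWRITE",
    0x00000002: "DATA_EXTEND",
    0x00000100: "FILE_CREATE",
    0x00000200: "FILE_DELETE",
    0x80000000: "CLOSE",
}

def decode_reason(mask):
    """Expand a USN reason bitmask into the flag names it contains."""
    return [name for bit, name in sorted(USN_REASONS.items())
            if mask & bit]

# FILE_CREATE combined with CLOSE: the handle from the create was
# closed, not a second creation of the file.
print(decode_reason(0x80000100))  # ['FILE_CREATE', 'CLOSE']
```

Most USN parsers do this expansion for you, but knowing the mask is cumulative per record is what keeps you from reading one entry as five separate events.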

Lastly I waited 13 minutes and then deleted the file from the file system using a shift delete within explorer to skip the recycle bin as seen below:


In this case I actually did delete the file, but there is nothing special about this reason code that separates it from the same code we saw when I first created the file. Here again you as an examiner need to look into what the USN reason code is actually referring to.

Conclusions:

The USN Journal shows, without additional analysis, that a file called test.txt was accessed by me during this time frame. We can also state that a file's contents were modified when Data_Extend and Data_Overwrite appear, as these only seem to be logged on file content changes. These facts are something you can rely on and testify to. What the USN Journal needs additional analysis to verify is:
  • The total time a file was in use, as the file opens/closes recorded do not correspond to the total time a file was actually open
  • When a file was created for the first time, as the USN records file handle activity with File_Create reason codes as well
  • When a file was deleted from the file system, as file handle closes also generate these reason codes
We can reach these additional analysis points by comparing these files to their timestamps and actions from other artifacts on the system. Tomorrow let's do just that and show how to determine the actual creation and deletion times.
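That cross-checking can be as simple as treating the earliest File_Create entry for a file as a candidate creation time and then confirming it against an independent source, such as the LNK target timestamps or the $MFT. A sketch with invented row and variable names to show the idea:

```python
from datetime import datetime

def candidate_creation(usn_rows, filename):
    """Earliest FILE_CREATE entry for a file; a candidate only, since
    handle activity also produces FILE_CREATE reason codes."""
    times = [r["timestamp"] for r in usn_rows
             if r["filename"] == filename and "FILE_CREATE" in r["reasons"]]
    return min(times) if times else None

usn_rows = [
    {"filename": "test.txt", "reasons": ["FILE_CREATE"],
     "timestamp": datetime(2013, 12, 10, 16, 17, 51)},
    {"filename": "test.txt", "reasons": ["FILE_CREATE", "CLOSE"],
     "timestamp": datetime(2013, 12, 10, 16, 17, 52)},
]
lnk_created = datetime(2013, 12, 10, 16, 17, 51)  # e.g. from the LNK file
print(candidate_creation(usn_rows, "test.txt") == lnk_created)  # True
```

When the candidate agrees with a second artifact that records the target's creation time, you can present it as the actual creation time with far more confidence.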

Daily Blog #171: Solving Sunday Funday 12/1/13 Part 5 Using standard analysis with USN Journals

Hello Reader,
     In yesterday's blog we went through the USN entries for the creation, modification and deletion of a single text file located within the user's Documents directory on a Windows 7 system. Our conclusions showed we couldn't rely on the USN Journal to show the creation or deletion times of the file without additional analysis. In this post I'm going to run through the locations within the other standard artifacts where you can obtain this information.

Please Note: This post is not for the analysis of those temporary files accessed within Outlook, we will have a separate post about that tomorrow.

So let's run down our other standard artifact locations and see what they reveal.

NTUSER Registry

The NTUSER registry has a couple of dates for us showing when the file was opened:
OpenSavePidlMRU\*
LastWrite Time: Tue Dec 10 16:17:51 2013
Note: All value names are listed in MRUListEx order.

  Users\test.txt

RecentDocs
**All values printed in MRUList\MRUListEx order.
Software\Microsoft\Windows\CurrentVersion\Explorer\RecentDocs
LastWrite Time Tue Dec 10 16:17:51 2013 (UTC)
  22 = test.txt

Software\Microsoft\Windows\CurrentVersion\Explorer\RecentDocs\.txt
LastWrite Time Tue Dec 10 16:17:51 2013 (UTC)
MRUListEx = 1,0
  1 = test.txt
This is information you should expect to see populated and contains the last time the file was opened.

Jumplists

The Jumplist for notepad does contain a MRU date and time for this file when it was last opened:
 MRU/MFU    stream#    MRU date      MRU-UTC
 1          3          12/10/2013    16:17:51.083
This lines up with our known last-open time from the NTUSER registry. What is interesting with this jumplist entry is that there were no target file dates captured, meaning that the creation, modification and access times of the file were not contained within the jumplist for notepad.

LNK Files

The LNK file in this case gives us the first sign of the creation and modification times of the file.

So if we rely on the creation timestamp here, we see that it lines up with our first File_Create reason code from the USN Journal, letting us know that was indeed the creation time of the file.

$MFT

Since this file was deleted before defrag ran again (scheduled by default to run every Wednesday at 2am on Windows 7), we can still find its information within the MFT.

In this case the standard information modification time and the MFT modification time are the same, so this does not reflect the true deletion time of 8:47PM that we know from the USN Journal and my testing timeline.

So at this point we've confirmed the creation time of the file and the last time it was opened through an application. Tomorrow let's go through the $logfile and talk about absolute deletion dates.
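Pulling the times recovered above into one sorted view makes the corroboration obvious. This throwaway snippet just sorts labeled observations by time; the values mirror the ones reported in this post:

```python
from datetime import datetime

# Times recovered above (UTC), one entry per artifact examined.
observations = [
    ("NTUSER OpenSavePidlMRU last write", datetime(2013, 12, 10, 16, 17, 51)),
    ("NTUSER RecentDocs last write",      datetime(2013, 12, 10, 16, 17, 51)),
    ("Notepad jumplist MRU entry",        datetime(2013, 12, 10, 16, 17, 51)),
]

# Sort every artifact observation by time so agreements
# (and any outliers) stand out at a glance.
for label, ts in sorted(observations, key=lambda o: o[1]):
    print(ts.isoformat(), label)
```

When three independent artifacts land on the same second like this, the last-open time is about as solid as deadbox analysis gets.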