Monday, October 20, 2014

Back to basics - Time revisited

At first glance it is strange that PowerShell returns a date with the wrong year for this value ( 0x451a1eb0d869cc01 ).

PS> get-date 129594868675516997
Saturday, September 3, 0411 1:27:47 AM

Try more than one value and see if there is a pattern to the miscalculation.

PS> get-date 128989761240000000
Friday, October 2, 0409 4:55:24 PM

There is a pattern: the year is off by exactly 1600 ( 0411 vs. 2011, 0409 vs. 2009 ). Get-Date treats a bare number as .NET ticks, which are 100-nanosecond intervals counted from January 1, 0001, while a FILETIME counts the same intervals from January 1, 1601.

Always use a tool to validate your findings and keep you on the right track.  Here, I'm using FTK Imager to decode the date/time of a little-endian value to make sure I get the same results by hand and to identify any mistakes I might make in the manual conversion.  This way, I can also double-check whether PowerShell interprets the values correctly.

The same value returns the correct date and time if used in this format:

PS> [datetime]::fromfiletime("129594868675516997")

Friday, September 2, 2011 8:27:47 PM

The UTC conversion also returns the correct date and time.  Notice that the time of day matches what get-date produced above, so that seems to be what get-date is attempting to do.

PS> [datetime]::fromfiletimeUTC("129594868675516997")

Saturday, September 3, 2011 1:27:47 AM

Do not convert the hex value in Excel and work with the rounded scientific notation; the rounding error shifts the result.

PS> [datetime]::fromfiletime("1.29595E+17")

Saturday, September 3, 2011 12:06:40 AM
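A quick sketch ( in Python, used here for illustration only ) shows why: a 64-bit float carries only about 15-16 significant digits, so a full FILETIME value cannot survive a trip through scientific notation.

```python
exact = 129594868675516997                        # the FILETIME value from above
assert int(float(exact)) == 129594868675516992    # low digits already lost in a float
rounded = int(1.29595e17)                         # the Excel-style rounded value
print(rounded - exact)                            # 131324483003 ticks, about 3.6 hours off
```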

If you know the epoch of a timestamp, then you can easily have PowerShell give you the correct time by adding the stored offset to the epoch's origin datetime.


Friday, May 18, 2012 3:24:18 AM
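A minimal sketch of this origin-plus-offset idea, done in Python for illustration ( the Unix timestamp below was chosen to be consistent with the output shown, assuming UTC ):

```python
from datetime import datetime, timedelta, timezone

# The Unix epoch origin; add the stored seconds to it to get the timestamp.
unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(unix_epoch + timedelta(seconds=1337311458))   # 2012-05-18 03:24:18+00:00
```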

Microsoft counts 100-nanosecond intervals, so the time value needs to be divided by 1e7 to get seconds since the FILETIME epoch: 129594868675516997 / 1e7 = 12959486867.5516997


Friday, September 2, 2011 8:27:47 PM
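The same arithmetic can be sketched in Python ( illustration only ), starting from the FILETIME epoch of January 1, 1601 UTC:

```python
from datetime import datetime, timedelta, timezone

filetime = 129594868675516997     # 100-nanosecond intervals since 1601-01-01 UTC
seconds = filetime // 10**7       # 12959486867, as computed above
utc = datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=seconds)
print(utc)                        # 2011-09-03 01:27:47+00:00, matching FromFileTimeUtc
```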

Applying this to Mozilla Firefox, we can examine the places.sqlite database, where the moz_annos table records downloads.  We can see values under the content column like:


Based on the given file size ( in bytes ), we can correlate an exfiltrated file even if its name was changed.  To find the time when the exfiltration completed, divide the stored value by 1000 ( Firefox tracks it in milliseconds ) and run PowerShell with the Unix epoch date:


Sunday, October 19, 2014 9:58:39 PM

This value can be verified by Decode.
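The Firefox conversion can be sketched the same way in Python ( the millisecond value is hypothetical, chosen to be consistent with the local-time output shown, assuming US Central daylight time ):

```python
from datetime import datetime, timedelta, timezone

ms = 1413773919000     # hypothetical moz_annos-style value, in milliseconds
utc = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=ms / 1000)
print(utc)             # 2014-10-20 02:58:39+00:00 ( 9:58:39 PM CDT on Oct 19 )
```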

Thus, testing, verification, and validation should be part of every analysis, especially before a new tool or a tool update is put into use.  Risk management is as important a part of forensic analysis as technical knowledge.

Sunday, October 19, 2014

Back to basics - Drive transfer rate

Maybe it is not relevant to most investigators, but knowing your devices and your hardware can help in estimating how long an acquisition or the indexing of evidence might take.  Measuring the performance of your storage devices is just as important as analyzing a case for relevant evidence.  You have to be detail-oriented and have the drive to understand technology in order to move toward becoming an expert.  The first step of education is to ask questions and find the best answers possible, not just to "google" for answers others found.

In this case, we examine our storage device's transfer rate in a USB 2.0 and in a USB 3.0 port.  I'm lucky enough to have both of these ports on my laptop to test, but if you ignore the port speed, you will never know why you sometimes get better performance than at other times.

USB 1.x supports rates of 1.5 Mbit/s (Low-Bandwidth) and 12 Mbit/s (Full-Bandwidth).

USB 2.0 supports a higher maximum signaling rate of 480 Mbit/s, but is limited to an effective throughput of about 280 Mbit/s.  The port is usually black, but the USB symbol might be the best way to distinguish the port types.  In the image below, I have a USB 3.0 port on the left side of the laptop but only a USB 2.0 port on the right side.  Thus, plugging a device into one port vs. the other will make a huge performance difference.

USB 3.0 ( SuperSpeed mode ) provides a usable data rate of up to 4 Gbit/s.  A USB 3.0 port is usually colored blue and is backward compatible with USB 2.0.  In the image below, you can see that it does not matter which port you use on this side of the laptop, since both of the ports are USB 3.0.

You can see in Windows which port the device is plugged into.

So, what are the effective transfer rates on actual devices, not just in theory?  There are many ways to test performance, and most of them will not produce very accurate results, but they will give a good indication of device transfer rates to calculate with.  In many cases, an approximation of the data transfer rate is good enough to prepare a quote for clients.
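As a quick sketch of that kind of estimate ( Python for illustration; the size and rate are example numbers, so plug in your own measurements ):

```python
drive_bytes = 2_000_363_420_160     # a 2 TB drive
rate_bytes_per_s = 100 * 10**6      # an assumed 100 MB/s sequential rate
hours = drive_bytes / rate_bytes_per_s / 3600
print(round(hours, 1))              # 5.6 hours for a straight sequential pass
```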

One way is to use the Windows System Assessment Tool ( winsat ) utility.  Since we are talking about sequential writes of the data, we can test the sequential write rate of the E:\ drive, in my case, like this.

winsat disk -seq -write -count 6 -v -drive E

Sequential reads are just as easy to test.

winsat disk -seq -read -count 6 -v -drive E

Another way would be to use the SQLIO Disk Subsystem Benchmark Tool.

You can create a script to test the performance of the drive with many different configurations in order to find the optimal settings.

I have the following in my batch file:

"C:\Program Files (x86)\SQLIO\sqlio" -kW -s10 -frandom -o8 -dE -b8 -LS -Fparam.txt 
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -dE -o8 -b8 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -b8 -dE -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -dE -o8 -b8 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b512 -LS -Fparam.txt

The param.txt file contains nothing but a single line that tells sqlio where to create the test file, in this case on the E: drive, since that is the drive I'd like to test.

e:\ 2 0x0 100

The testfile.dat was created with dcfldd like this:

C:\>dcfldd-1.3.4.x86win32\dcfldd.exe pattern=61 bs=8388608 count=1 of=testfile.dat

The results can be then added to a spreadsheet to chart the data for easier analysis.

USB 3.0 performance.

USB 2.0 performance.

The best performance results are highlighted in red.  We can see that USB 3.0 has far fewer latency issues than USB 2.0, so we should definitely use USB 3.0 whenever we can.

So, no matter how obvious the outcome is or how much you know about technology, you should always find a way to test your devices and keep performance data available, charting your results to reveal patterns that might not emerge from looking at the raw data itself.  This is the process of determining an answer by empirical data analysis.  You can never get close to scientific thinking until you realize the power of testing and measuring.  This way, you will always be confident in your conclusions, since they rest on data points you have created, documented, and analyzed.

Let me know if you know any better ways to reliably test storage device performance.


A. System Environment

> Command Line 'winsat  disk -seq -write -count 6 -v -drive E'
> DWM running... leaving it on
> System processor power policy saved and set to 'max performance'
> Running: Feature Enumeration ''
> Gathering System Information
> Operating System                        : 6.3 Build-9600
> Processor                               : Intel(R) Core(TM) i7-4702HQ CPU @ 2.
> TSC Frequency                           : 0
> Number of Processors                    : 1
> Number of Cores                         : 4
> Number of CPUs                          : 8
> Number of Cores per Processor           : 4
> Number of CPUs Per Core                 : 2
> Cores have logical CPUs                 : YES
> L1 Cache and line Size                  : 32768  64
> L2 Cache and line Size                  : 262144  64
> L3 Cache and line Size                  : 6291456  64
> Total physical mem available to the OS  : 15.9 GB (17,078,214,656 bytes)
> Adapter Description                     : Intel(R) HD Graphics 4600
> Adapter Manufacturer                    : Intel Corporation
> Adapter Driver Provider                 : Intel Corporation
> Adapter Driver Version                  :
> Adapter Driver Date (yy/mm/dd)          : 2013\10\31
> Has DX9 or better                       : Yes
> Has Pixel shader 2.0 or better          : Yes
> Has LDDM Driver                         : Yes
> Dedicated (local) video memory          : 0MB
> System memory dedicated as video memory : 0MB
> System memory shared as video memory    : 1792MB
> Primary Monitor Size                    : 1600 X 900  (1440000 total pixels)
> WinSAT is Official                       : Yes
Mode Flags = 0x02000001
Disk Number = 2
Iterations = 6
IO Count = 1000
Sequential IO Size = 65536

Random IO Size = 16384

B. Drive tested

C:\>wmic diskdrive get name, size, model
Model                           Name                Size
WD My Passport 0748 USB Device  \\.\PHYSICALDRIVE2  2000363420160

C. User Manual and downloads

Tuesday, October 14, 2014

Advanced topics - Search by FileTime

This is a work in progress, but the main idea is that we should be able to find FileTime ranges in the $MFT, in FAT directory entries, and in many SQLite databases or log files by directly searching for the stored timestamp.

We can use PowerShell to give us the range of FileTime values for a particular date range, which allows us to search the evidence for artifacts that we might not even know about yet but that store a timestamp in this structure.

PS C:\> (Get-Date -Date "2014-10-14T00:00:00").ToFileTime()
PS C:\> (Get-Date -Date "2014-10-14T23:59:59").ToFileTime()
PS C:\> [convert]::tostring((Get-Date -Date "2014-10-14T23:59:59").ToFileTime(),16)
PS C:\> [convert]::tostring((Get-Date -Date "2014-10-14T00:00:00").ToFileTime(),16)

So, now that we know the timestamp range, we can convert the timestamps to little-endian byte order, if needed, and locate values matching the range.
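One way to sketch the range-and-pattern generation is in Python ( illustration only; the endpoints here are taken as UTC, while Get-Date above uses local time, so your exact values will differ by your offset ):

```python
import struct
from datetime import datetime, timezone

EPOCH_1601 = datetime(1601, 1, 1, tzinfo=timezone.utc)

def to_filetime(dt):
    # 100-nanosecond intervals since January 1, 1601 (UTC)
    return int((dt - EPOCH_1601).total_seconds() * 10**7)

start = to_filetime(datetime(2014, 10, 14, tzinfo=timezone.utc))
end = to_filetime(datetime(2014, 10, 14, 23, 59, 59, tzinfo=timezone.utc))
print(hex(start), hex(end))
# The little-endian byte sequence to search for on disk:
print(struct.pack("<Q", start).hex())
```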

The image below is just a sample of a simple regular expression based search for a pattern matching a time range.

Also, you can get the date and time back by entering the FileTime value:

PS > Get-Date 129442497539436142
PS > [datetime]::FromFileTime("129442497539436142")

What did I do? - Pattern is the key

Here, you need to decide whether this case contains anything that might be relevant to illegal animal trade.

Our hypothetical law states that it is illegal to own and store any image of animals with feathers or fur.

You are given the suspect's computer, and you see the following in the C:\temp\images folder.  The scope of the investigation restricts you to this single folder, so you need to write your report based on the data in that folder.

You can ask for any other information you'll need to find the answer.

Remember, computers store data in visible, deleted, or hidden form; the data can be clear text, encoded, or encrypted, and generated by the operating system, applications, or users.  So, as you analyze the image above, write your conclusions with these data characteristics in mind.


What did I do? - Google search

Sometimes you might find it valuable to validate your findings and establish a base for your opinion.  There are many analyses of user actions that were never validated against the actual user actions, so here we go.  I will give you specific scenarios with screenshots of relevant evidence data here on this blog, but you will also be able to watch a video of the actual actions I performed that generated the relevant data.  Depending on the activity type, I can even provide the relevant evidence-specific artifact if needed.  All times displayed in the video are Central Time with the actual daylight saving offset applied.

So, what was I doing last night?  Create your theory and the sequence of events that I performed.  Then, look at the video to find out whether your analysis was correct.  Pay attention to the sequence of events and the timeline of the actions performed.  Don't forget to predict how many times I visited each website and which links I clicked.  Look at the URL bar to find patterns for each action and compare them to the report to spot any discrepancy.


File to be analyzed if you want to confirm these findings by manual analysis or use
C:\Users\<UID>\AppData\Local\Google\Chrome\User Data\Default\History

System setup:
Windows 8.1 Pro
Intel Core i7
64 bit OS

Google Chrome - Version 35.0.1916.153 m
ChromeHistoryView v1.17 -

Let me know how accurate you were in your prediction!!!

Saturday, October 11, 2014

Back to Basics - Security by Monitoring

Security vs. convenience, privacy vs. security, freedom vs. control?  Sometimes we have a hard time deciding what is better for us and what makes sense.  For those in the cybersecurity field, convenience is insecurity, and your privacy is protected by monitoring your activities to identify normal patterns in order to alert on abnormal signs.  You cannot have it both ways; you do need to give up some control ( not freedom ) in order for someone else to help provide you with the desired level of security.  The fundamental premise of security is monitoring.  Think about your kids: you cannot protect them unless you know where they are and what their plans are, so that you can be preemptive instead of reactive.  Without this kind of access to their lives, you could not provide preventative services; you would always be reacting to events and would be too late to protect anyone.  It is not about losing freedom, but about receiving protective services so you can be productive and focus on your assigned tasks instead of losing productivity to your own lack of skills to protect yourself.  Keeping up with the skills required to provide meaningful security is a full-time job, so you have to outsource that skill to someone who is qualified for it.  It is like mowing your own lawn because you do not want to give up control of your grass.  It is convenient, cheaper, and more efficient to let professionals handle trivial tasks.  Have you ever tried to do something yourself to save money, and it ended up costing you more time and money than if you had hired someone else to do the job?  I think everyone has.

So, think about cybersecurity and monitoring not as a loss of freedom but as a service that allows you to focus on what you are good at, and give access only to those who have a vested interest in protecting you, not in profiting from it.

Many times, people are afraid of government agencies and ignore businesses.  Agencies like the NSA have a vested interest in enforcing laws and protecting citizens, not in snooping or profiting from collected information.  Collection is part of providing security in a legally controlled manner where no one person has authority over all the data and its usage.  Businesses, on the other hand, have a vested interest in continually monitoring, in real time, as many individuals as possible in order to serve advertisements or directed sales pitches.  They thrive on knowing you better than you know yourself, regardless of law or regulation; if they can profit from it, they will sell this information to anyone willing to pay for it.  There is no right or wrong here, since we mostly use services provided for free and thus willingly give up the privacy of our information.  Take this blog: by the end of this post, I will get advertisements based on the words I use and the websites I mention.  When I click save, the words will be indexed and associated with my ID, and a profile will be built about me and marketed to anyone interested in reaching customers like me.

So, while the NSA might collect data on my international calls, that data might be used to generate a basic profile about me, and if I break the law, that information can be pulled and analyzed to find out what made me change or act in a certain way.  For-profit organizations are like the Wild West; they hire the best of the best to figure out how to make me buy things I don't need.  They are interested in all my button clicks, and even in the clicks I considered but decided against.  All of this happens in real time and is marketed for profit.  We never even bother to read the policies of the websites we sign up for and use.  We never question what businesses do with the information we share, where that information is stored, or even who owns the data we publish on the web.

My point is that cybersecurity is about monitoring to protect you, and that is what agencies do for you, so you can focus on creating your wealth in whatever business you are in.  Businesses are the entities we should be more concerned about, and we should limit what they do with the information we provide.  After all, the Internet was created for information sharing, and even though I'm being analyzed as I write this, I have not given up my freedom to talk about what I feel strongly about.  I'm being analyzed so that we can reach many people in a secure and responsible way.

Technology poses challenges to those who provide these services, since data grows exponentially and it is harder to distinguish approved traffic from malicious traffic.  Monitoring activities allows intelligent systems to identify normal traffic and learn consistent behaviors.  I like to fill up my car at the same gas station, to a value divisible by 10 plus $0.01.  Security is about establishing consistency, so if I see a charge on my credit card for $50.01 at a gas station, I can tell it is normal, but a charge of $50.54 is not.  That is a pattern that can be coded into a system, or an intelligent system can learn it and alert on out-of-pattern charges.  I might make a mistake and fill up my car to $50.75, but that is just a false positive I can handle, even if I get alerted for that charge.

Security is consistency!

Learn about the types of monitoring software can perform, and think about the patterns that might help professionals in this field do their jobs effectively.  If you think about consistency rather than surveillance, you might come to appreciate monitoring and the purpose of cybersecurity.

Back to Basics - Little Endian in PowerShell

Reverse little endian value for 64 bit FileTime entries and show the actual time value.

The basic concept of the code below is the rule of binary ANDing.  Any number logically ANDed with 255 ( 0xFF ) results in the number itself, and any number logically ANDed with 0 results in zero.

Let's see how that works with an example:
The number is: 01101011 01010110
0x00FF:        00000000 11111111
The result:    00000000 01010110

Thus, you can see that using a binary operation, we can separate one value from a sequence of binary values.  This also works in decimal: 23 AND 255 = 23.  So, if we have a longer Base-16 value like the little-endian 64-bit FileTime, we can isolate its lowest byte by logically ANDing it with 0x00000000000000FF, or just 0xFF, which leaves the rightmost 8 bits.  We then shift the value to the right by 8 bits, so the next byte of the original string becomes the rightmost one.  We simply repeat the ANDing and shifting, adding each extracted byte at its appropriate Base-256 place value to the total result.


function revEndian{
    param($a)

    $result=0
    #Binary AND to identify the lowest byte value
    $temp=$a -band 0xFF
    #Shift the binary string to the right by 8 bits to drop the lowest byte value
    $a=$a -shr 8

    #Keep identifying and shifting the bytes to the right while adding the proper Base-256 place value
    for($i=7; $i -ge 1; $i--){
        $result=$result+$temp * [math]::pow(256,$i)
        $temp=$a -band 0xFF
        $a=$a -shr 8
    }
    #Binary OR to add the last byte to the final value
    $result=$result -bor $temp
    return $result
}

#Call the function with a specified Little Endian FileTime value
#( sample value reconstructed from the output shown below )
$value=revEndian 0xE87B9127C826CF01
write-host "The value in Base-16 is:",("{0:x}" -f [convert]::touint64($value))
write-host "... and the date value of it is",([datetime]::FromFileTime($value))

Run the above script and you should see the following.

PS C:\> .\Convert.ps1
The value in Base-16 is: 1cf26c827917be8
... and the date value of it is 2/10/2014 7:25:31 PM
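As a cross-check ( in Python for illustration; the sample bytes are reconstructed from the output above ), int.from_bytes performs the same Base-256 accumulation:

```python
raw = bytes.fromhex("e87b9127c826cf01")    # little-endian FileTime bytes as seen in a hex editor
value = int.from_bytes(raw, byteorder="little")
print(hex(value))                           # 0x1cf26c827917be8
```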