Friday, November 21, 2014

Back to basics - Develop Forensic Analyst Mindset

This is a must-watch video and a must-play game for getting started on developing the investigative mindset that is essential in incident response and cybersecurity investigations.

You cannot just read about cybersecurity; you need to start developing skills, but you will see that even basic skills can be challenging once you start using them in real environments.

This video will also show you that basic encoding can be used by actual applications to store passwords.  It will also show you how Base64 works and how important log analysis is in this field.

http://youtu.be/9sGhmYlBrXU
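The Base64 idea from the video can be reproduced in a few lines of Python ( a sketch with a made-up sample password, not one from the video ), showing that Base64 is a reversible encoding, not encryption:

```python
import base64

# Base64 is an encoding, not encryption: anyone can reverse it.
secret = "P@ssw0rd!"
encoded = base64.b64encode(secret.encode("ascii")).decode("ascii")
decoded = base64.b64decode(encoded).decode("ascii")

print(encoded)  # UEBzc3cwcmQh
print(decoded)  # P@ssw0rd!
```

If you spot a string like this in a log or a configuration file, decoding it costs an attacker nothing, which is exactly the point the video makes.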


Tuesday, October 28, 2014

Back to basics - Convert ICS to HTML and CSV

The discrete nature of calendar entries makes it very difficult to see the overall picture or, in investigations, a pattern of events.  We need to be able to see the events in chronological order in a single document that we can use as a report, or chart the values for easy understanding of events by non-technical professionals.

Mozilla Thunderbird is one of the most useful and versatile applications when it comes to Internet communication.  In this blog, I will explore the capability of this tool to convert .ics files, the only format that Google Calendar exports.

I also created a video to accompany this blog post: http://youtu.be/WbBRhP6VXbs

So, in order to follow this process, you need to download and install Thunderbird, https://www.mozilla.org/en-US/thunderbird/download.

Login to your Google Calendar and create a new calendar.

Add new schedules to the new calendar and export the calendar as an .ics file.  Notice in the exported .ics file below that the date and time stamps are not very user friendly to read, so they might need to be manually converted to make sense to non-technical professionals.  On the other hand, the HTML and CSV exports below show the date and time stamps in a user-friendly format that is easy to report and chart for easy interpretation, without any manual conversion or risk of human error.
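As an illustration of that manual conversion, an .ics timestamp such as DTSTART:20141028T153000Z ( a hypothetical entry; your exported values will differ ) can be decoded with a few lines of Python:

```python
from datetime import datetime

# An .ics DTSTART value in UTC ("Z" suffix), e.g. from a Google Calendar export.
raw = "20141028T153000Z"

# Parse the compact iCalendar form and re-render it for a report.
stamp = datetime.strptime(raw, "%Y%m%dT%H%M%SZ")
print(stamp.strftime("%A, %B %d, %Y %I:%M %p"))  # Tuesday, October 28, 2014 03:30 PM
```

Doing this by hand for every entry invites human error, which is why letting Thunderbird export readable HTML and CSV is preferable.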


Import the .ics file into Thunderbird's Lightning add-on, which adds the calendar feature to Thunderbird.

Export the calendar as .ics, .html, or .csv format.


The HTML document can be directly used as a report, but the CSV format gives more flexibility to analyze the data or create a chart to show clear patterns of events.
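Once the data is in CSV, events can be sorted chronologically or charted with a few lines of code.  Here is a sketch in Python; the column names and date formats are assumptions, so check them against your own export:

```python
import csv
import io
from datetime import datetime

# Sample rows standing in for an exported calendar CSV;
# the column names here are assumptions, check your own file.
data = io.StringIO(
    "Title,Start Date,Start Time\n"
    "Meeting B,10/28/2014,2:00 PM\n"
    "Meeting A,10/27/2014,9:30 AM\n"
)

rows = list(csv.DictReader(data))
# Build a real datetime per row so events sort chronologically, not textually.
rows.sort(key=lambda r: datetime.strptime(
    r["Start Date"] + " " + r["Start Time"], "%m/%d/%Y %I:%M %p"))

for r in rows:
    print(r["Start Date"], r["Start Time"], r["Title"])
```

The key point is sorting on a parsed datetime rather than on the text of the date column, which would order "10/9" after "10/10".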



Thus, digital forensics is about pattern recognition, but in some cases a pattern cannot emerge from data in its native format.  So, we need to focus on a tool's capability to import certain file types and to export the data into different formats that can aid our analysis and help identify patterns to solve cases.

Monday, October 27, 2014

Back to basics - SQL and XSS

This post is accompanied by a video explaining this process and what you can do about it.

http://youtu.be/-W3efiMT8H0

Sample web page to test JavaScript in the browser.  Save the following code in a text file, name it test.html, and open it in your browser to see what it does.

<HTML>
<HEAD>
              <script> window.open('http://zoltandfw.blogspot.com/','_blank')</script>
              <script> alert(document.cookie)</script>
              <script> alert("Your account has been compromised, please call (111)222-3333 to report!!!")               </script>
</HEAD>
<BODY>
              Just a test for JavaScripts
</BODY>
</HTML>

Sample log file entries showing what information might be collected in log files to investigate after the fact or to monitor for real-time response.

141027  7:39:45  122 Connect root@localhost on 
 122 Init DB badbank
 122 Query SELECT userid, accountnumber FROM badbank_accounts WHERE username='zoltan' AND password='9f1c050c2b226c2154d17a3ff9a602f6'
 122 Quit
141027  7:41:55  123 Connect root@localhost on 
 123 Init DB badbank
 123 Query SELECT userid, accountnumber FROM badbank_accounts WHERE username='zoltan' -- ' AND password='d41d8cd98f00b204e9800998ecf8427e'
 123 Quit
141027  8:00:30  124 Connect root@localhost on 
 124 Init DB badbank
 124 Quit
 125 Connect root@localhost on 
 125 Init DB badbank
 125 Quit
141027  8:42:47  126 Connect ODBC@localhost as  on 
 126 Query select @@version_comment limit 1
141027  8:42:55  126 Query show databases
141027  8:43:26  126 Query SELECT DATABASE()
 126 Init DB Access denied for user ''@'localhost' to database 'badbank'
141027  8:43:41  126 Quit

...

141027  9:04:20  130 Query select * from badbank_transactions
141027  9:05:22  213 Connect root@localhost on 
 213 Init DB badbank
 213 Query SELECT balance FROM badbank_accounts WHERE userid=61
 213 Quit
141027  9:05:37  214 Connect root@localhost on 
 214 Init DB badbank
 214 Query SELECT balance FROM badbank_accounts WHERE userid=61
 214 Query SELECT userid FROM badbank_accounts WHERE username='victim1'
 214 Query UPDATE badbank_accounts SET balance=balance-1 WHERE userid=61
 214 Query UPDATE badbank_accounts SET balance=balance+1 WHERE userid=60
 214 Query INSERT INTO badbank_transactions (userid,time,withdrawn,transactor,transfernote) VALUES (61,NOW(),1,60,'<script> alert(document.cookie)</script>')
 214 Query INSERT INTO badbank_transactions (userid,time,deposited,transactor,transfernote) VALUES (60,NOW(),1,61,'<script> alert(document.cookie)</script>')
 214 Quit
141027  9:05:41  215 Connect root@localhost on 
 215 Init DB badbank
 215 Quit
 216 Connect root@localhost on 
 216 Init DB badbank
 216 Quit
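The attack is visible right in the log above: session 123 comments out the password check with ' -- , and its password hash d41d8cd98f00b204e9800998ecf8427e is the MD5 of an empty string, while session 214 stores <script> tags in the transfer notes.  A sketch in Python of scanning such a log for those two patterns:

```python
import re

# Patterns that flag likely SQL injection or stored XSS in query logs.
SUSPICIOUS = [
    re.compile(r"'\s*--"),          # quote followed by a SQL comment
    re.compile(r"<script", re.I),   # script tag stored in a field
]

def flag_lines(log_text):
    """Return (line_number, line) pairs that match a suspicious pattern."""
    hits = []
    for num, line in enumerate(log_text.splitlines(), 1):
        if any(p.search(line) for p in SUSPICIOUS):
            hits.append((num, line.strip()))
    return hits

log = """\
 122 Query SELECT userid FROM badbank_accounts WHERE username='zoltan' AND password='9f1c...'
 123 Query SELECT userid FROM badbank_accounts WHERE username='zoltan' -- ' AND password='d41d...'
 214 Query INSERT INTO badbank_transactions (transfernote) VALUES ('<script> alert(document.cookie)</script>')
"""
for num, line in flag_lines(log):
    print(num, line)
```

This is only a pattern sketch, not a detection product; real logs would need tuning to avoid false positives on legitimate comments and markup.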

Sunday, October 26, 2014

Back to Basics - Information Assurance - Robots.txt


In some cases, you might need to, so to speak, crawl a web site to gather keywords or email addresses. Web sites can use robots.txt files to prevent simple automated crawling of the entire website or part of it. The robots.txt file gives instructions to web robots, using the Robots Exclusion Protocol, about what is not allowed on the web site. So, if a website contains a robots.txt like:

User-Agent: * 
Disallow: / 

This robots.txt will disallow all robots from visiting all pages on the web site. So, if a robot tries to visit the page http://www.domain.topdomain/examplepage.html, the robots.txt in the root of the website, http://www.domain.topdomain/robots.txt, will not permit the robot to access it. The robots.txt file can be ignored by many web crawlers, so it should not be used as a security measure to hide information. We should also be able to ignore such a simple measure to investigate or to test web site security. I have mentioned in previous posts the tool called wget, which is very useful for downloading a web page, a website, or malware from the command line. This simple tool can also be configured to ignore the robots.txt file, but by default it respects it, so you need to specifically tell the tool to ignore its directions.

wget -e robots=off --wait 1 -m http://domain.topdomain 
FINISHED --2014-10-26 11:12:36-- 
Downloaded: 35 files, 22M in 19s (1.16 MB/s) 

Omitting the robots=off option, on the other hand, results in the following.

wget -m http://domain.topdomain 
FINISHED --2014-10-26 11:56:53-- 
Downloaded: 1 files, 5.5K in 0s (184 MB/s)

It is clear to see in this example that we would have missed 34 files by not being familiar with this simple file and its purpose.
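The effect of the User-Agent: * and Disallow: / pair can also be checked programmatically with Python's urllib.robotparser ( using the placeholder domain from above, parsed locally with no network access ):

```python
from urllib.robotparser import RobotFileParser

# The two-line robots.txt from above, parsed locally (no network needed).
rules = ["User-Agent: *", "Disallow: /"]

rp = RobotFileParser()
rp.parse(rules)

# Every path is off limits to every robot under these rules.
print(rp.can_fetch("wget/1.11", "http://www.domain.topdomain/examplepage.html"))  # False
print(rp.can_fetch("Googlebot", "http://www.domain.topdomain/"))                  # False
```

A well-behaved crawler runs exactly this kind of check before every request; wget -e robots=off simply skips it.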

Using "User-Agent: *" is a great option to block robots of unknown names, unless the robots use other methods to get to the website contents. Let's try and see what will happen if we use wget without robots=off.

 
As you can see, the User-Agent is set to Wget/1.11 ( the default Wget/version string ), so a robots.txt with the contents listed below would catch this utility and prevent it from getting the website contents.

Note: The three orange-highlighted packets are the 3-way handshake, so the request for the resources with the User-agent setting is the first packet following the three-way handshake.  That might be a good pattern for alarm settings.

wget also has an option to change the user-agent default string to anything the user wants to use.

wget --user-agent=ZOLTAN -m http://domain.topdomain 



As you can see in the packet capture, the user-agent was overwritten as the option promised, but the website still only allowed a single file download due to the User-agent: * rule that caught the unknown string.  So, robots.txt can help protect a website to a certain extent, but the -e robots=off option did get the whole website content even though the packet contained an unmodified User-agent setting.

robots.txt can have specific contents to keep unsafe robots away from a web site or to provide basic protection from these "pests":   ( This list is not exhaustive, but it can be a good source to learn about malicious crawlers and a good resource for further reading on each one of these software tools. )

User-agent: Aqua_Products
Disallow: /

User-agent: asterias
Disallow: /

User-agent: b2w/0.1
Disallow: /

User-agent: BackDoorBot/1.0
Disallow: /

User-agent: Black Hole
Disallow: /

User-agent: BlowFish/1.0
Disallow: /

User-agent: Bookmark search tool
Disallow: /

User-agent: BotALot
Disallow: /

User-agent: BuiltBotTough
Disallow: /

User-agent: Bullseye/1.0
Disallow: /

User-agent: BunnySlippers
Disallow: /

User-agent: Cegbfeieh
Disallow: /

User-agent: CheeseBot
Disallow: /

User-agent: CherryPicker
Disallow: /

User-agent: CherryPicker /1.0
Disallow: /

User-agent: CherryPickerElite/1.0
Disallow: /

User-agent: CherryPickerSE/1.0
Disallow: /

User-agent: CopyRightCheck
Disallow: /

User-agent: cosmos
Disallow: /

User-agent: Crescent
Disallow: /

User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow: /

User-agent: DittoSpyder
Disallow: /

User-agent: EmailCollector
Disallow: /

User-agent: EmailSiphon
Disallow: /

User-agent: EmailWolf
Disallow: /

User-agent: EroCrawler
Disallow: /

User-agent: ExtractorPro
Disallow: /

User-agent: FairAd Client
Disallow: /

User-agent: Flaming AttackBot
Disallow: /

User-agent: Foobot
Disallow: /

User-agent: Gaisbot
Disallow: /

User-agent: GetRight/4.2
Disallow: /

User-agent: grub
Disallow: /

User-agent: grub-client
Disallow: /

User-agent: Harvest/1.5
Disallow: /

User-agent: hloader
Disallow: /

User-agent: httplib
Disallow: /

User-agent: humanlinks
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: ia_archiver/1.6
Disallow: /

User-agent: InfoNaviRobot
Disallow: /

User-agent: Iron33/1.0.2
Disallow: /

User-agent: JennyBot
Disallow: /

User-agent: Kenjin Spider
Disallow: /

User-agent: Keyword Density/0.9
Disallow: /

User-agent: larbin
Disallow: /

User-agent: LexiBot
Disallow: /

User-agent: libWeb/clsHTTP
Disallow: /

User-agent: LinkextractorPro
Disallow: /

User-agent: LinkScan/8.1a Unix
Disallow: /

User-agent: LinkWalker
Disallow: /

User-agent: LNSpiderguy
Disallow: /

User-agent: lwp-trivial
Disallow: /

User-agent: lwp-trivial/1.34
Disallow: /

User-agent: Mata Hari
Disallow: /

User-agent: Microsoft URL Control
Disallow: /

User-agent: Microsoft URL Control - 5.01.4511
Disallow: /

User-agent: Microsoft URL Control - 6.00.8169
Disallow: /

User-agent: MIIxpc
Disallow: /

User-agent: MIIxpc/4.2
Disallow: /

User-agent: Mister PiX
Disallow: /

User-agent: moget
Disallow: /

User-agent: moget/2.1
Disallow: /

User-agent: mozilla/4
Disallow: /

User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 2000)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 95)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 98)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows ME)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows NT)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows XP)
Disallow: /

User-agent: mozilla/5
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: NetAnts
Disallow: /

User-agent: NetMechanic
Disallow: /

User-agent: NICErsPRO
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Openbot
Disallow: /

User-agent: Openfind
Disallow: /

User-agent: Openfind data gathere
Disallow: /

User-agent: Oracle Ultra Search
Disallow: /

User-agent: PerMan
Disallow: /

User-agent: ProPowerBot/2.14
Disallow: /

User-agent: ProWebWalker
Disallow: /

User-agent: psbot
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: QueryN Metasearch
Disallow: /

User-agent: Radiation Retriever 1.1
Disallow: /

User-agent: RepoMonkey
Disallow: /

User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow: /

User-agent: RMA
Disallow: /

User-agent: searchpreview
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: SpankBot
Disallow: /

User-agent: spanner
Disallow: /

User-agent: suzuran
Disallow: /

User-agent: Szukacz/1.4
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: Telesoft
Disallow: /

User-agent: The Intraformant
Disallow: /

User-agent: TheNomad
Disallow: /

User-agent: TightTwatBot
Disallow: /

User-agent: Titan
Disallow: /

User-agent: toCrawl/UrlDispatcher
Disallow: /

User-agent: True_Robot
Disallow: /

User-agent: True_Robot/1.0
Disallow: /

User-agent: turingos
Disallow: /

User-agent: URL Control
Disallow: /

User-agent: URL_Spider_Pro
Disallow: /

User-agent: URLy Warning
Disallow: /

User-agent: VCI
Disallow: /

User-agent: VCI WebViewer VCI WebViewer Win32
Disallow: /

User-agent: Web Image Collector
Disallow: /

User-agent: WebAuto
Disallow: /

User-agent: WebBandit
Disallow: /

User-agent: WebBandit/3.50
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: WebEnhancer
Disallow: /

User-agent: WebmasterWorldForumBot
Disallow: /

User-agent: WebSauger
Disallow: /

User-agent: Website Quester
Disallow: /

User-agent: Webster Pro
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebZip
Disallow: /

User-agent: WebZip/4.0
Disallow: /

User-agent: Wget
Disallow: /

User-agent: Wget/1.5.3
Disallow: /

User-agent: Wget/1.6
Disallow: /

User-agent: WWW-Collector-E
Disallow: /

User-agent: Xenu's
Disallow: /

User-agent: Xenu's Link Sleuth 1.1c
Disallow: /

User-agent: Zeus
Disallow: /

User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /

User-agent: Zeus Link Scout
Disallow: /

Saturday, October 25, 2014

Back to Basics - Intellectual Property

This post is about practicing critical thinking when it comes to intellectual property cases and tracking down old or previous websites using copyrighted material. We can also use this technique to locate images where only a portion of the image is used or relevant to the case.


Monday, October 20, 2014

Back to basics - Time revisited

It is very strange that PowerShell returns a date where the year is wrong ( 0x451a1eb0d869cc01 ).

PS> get-date 129594868675516997
Saturday, September 3, 0411 1:27:47 AM

Try more than just one value and see if there is a pattern of miscalculation.

PS> get-date 128989761240000000
Friday, October 2, 0409 4:55:24 PM

Always use a tool to validate your findings and keep you on the right track. Here, I'm using FTK Imager to decode the date/time of a little endian value to make sure I get the same results by hand and to identify any mistakes I might make with the manual conversion.  This way, I can also double check if PowerShell interprets the values correctly.





The same value returns the correct date and time if used in this format

PS>[datetime]::fromfiletime("129594868675516997")

Friday, September 2, 2011 8:27:47 PM

The UTC conversion also returns the correct date and time.  It seems that is also what get-date is trying to do.

PS> [datetime]::fromfiletimeUTC("129594868675516997")

Saturday, September 3, 2011 1:27:47 AM


Converting the hex values in Excel and working with the rounded scientific notation should be avoided due to the rounding error.

PS> [datetime]::fromfiletime("1.29595E+17")

Saturday, September 3, 2011 12:06:40 AM

If you know the epoch of a time format, then you can easily adjust PowerShell to give you the correct time from that epoch by adding the origin to the datetime.

[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1970').addseconds("1337329458"))


Friday, May 18, 2012 3:24:18 AM

Microsoft counts in 100-nanosecond intervals, so the time value needs to be divided by 1e7 to get seconds from the epoch.  129594868675516997/1e7 = 12959486867.55169

[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1601').addseconds(12959486867.55169))


Friday, September 2, 2011 8:27:47 PM
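The same conversion can be scripted in Python, which makes the 100-nanosecond arithmetic explicit; a sketch assuming the value is a standard Windows FILETIME ( 100-ns ticks since 1601-01-01 UTC ):

```python
from datetime import datetime, timedelta, timezone

WINDOWS_EPOCH = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_utc(filetime):
    """Convert a 64-bit Windows FILETIME (100-ns ticks) to a UTC datetime."""
    # Integer-divide by 10 to go from 100-ns ticks to microseconds.
    return WINDOWS_EPOCH + timedelta(microseconds=filetime // 10)

print(filetime_to_utc(129594868675516997))  # 2011-09-03 01:27:47.551699+00:00
```

Note this matches the fromfiletimeUTC result above, which is a useful cross-check against both the manual conversion and the tool output.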

Thus, when analyzing Mozilla Firefox, we can examine the places.sqlite database for downloaded files in the moz_annos table.  We can see values under the content column like:
(C:\Users\<UID>\AppData\Roaming\Mozilla\Firefox\Profiles\<random>.default-<random>)

{"state":1,"endTime":1413773919879,"fileSize":4210920}

Based on the given file size ( in bytes ), we can correlate an exfiltrated file even if its name was changed.  In order to find the time when the exfiltration was completed ( Firefox tracks it in milliseconds, so divide the value by 1000 ), we can run PowerShell with the Unix epoch date:

[timezone]::CurrentTimeZone.ToLocalTime(([datetime]'1/1/1970').addseconds("1413773919.879"))


Sunday, October 19, 2014 9:58:39 PM
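The same millisecond value can be converted in Python as well; a sketch that prints the UTC result, since the local result depends on your time zone:

```python
from datetime import datetime, timedelta, timezone

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

# Firefox stores endTime in milliseconds since the Unix epoch.
end_time_ms = 1413773919879
stamp = UNIX_EPOCH + timedelta(milliseconds=end_time_ms)
print(stamp)  # 2014-10-20 02:58:39.879000+00:00
```

Adding a timedelta in integer milliseconds avoids the floating-point rounding issues warned about above with Excel's scientific notation.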

This value can be verified with DCode
( http://www.digital-detective.net/digital-forensic-software/free-tools/ )



Thus, testing, verification, and validation should be part of every analysis, especially before a new tool or a tool update is implemented.  Risk management is as important a part of forensic analysis as technical knowledge.

Sunday, October 19, 2014

Back to basics - Drive transfer rate

Maybe it is not relevant to most investigators, but knowing your devices and your hardware can help in determining how long the acquisition or indexing of evidence might take.  Measuring the performance of your storage devices is just as important as analyzing a case for relevant evidence.  You have to be detail-oriented and have the drive to understand technology in order to move toward becoming an expert.  The first step of education is to ask questions and find the best answers possible, not by "googling" for answers others found.

In this case, we examine our storage device transfer rate in USB 2.0 and USB 3.0 ports.  I'm lucky enough to have both of these ports on my laptop to test, but if you ignore the port speed, you will never know why you sometimes get better performance.

USB 1.x supports rates of 1.5 Mbit/s (Low-Bandwidth) and 12 Mbit/s (Full-Bandwidth).

USB 2.0 supports a higher maximum signaling rate but is limited to an effective throughput of 280 Mbit/s.  The port is usually black, but the USB symbol might be the best way to distinguish the port types. In the image below, I have a USB 3.0 port on the left side and only a USB 2.0 port on the right side.  Thus, plugging a device into one port vs. the other will make a huge performance difference.





USB 3.0 ( SuperSpeed mode ) provides a usable data rate of up to 4 Gbit/s. A USB 3.0 port is usually colored blue and is backwards compatible with USB 2.0.  In the image below, you can see that it does not matter which port you use on this side of the laptop, since both of the ports are USB 3.0.





You can see in Windows which port the device is plugged into.


So, what are the effective transfer rates on actual devices, not just in theory?  There are many ways to test performance, and most of them will not produce very accurate results, but they will give a good indication of device transfer rates to calculate with.  In many cases, an approximation of the data transfer rate is good enough to prepare a quote for clients.
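The measured rate translates directly into an acquisition-time estimate for such a quote.  A sketch in Python; the MB/s figures are illustrative placeholders, not measurements:

```python
def acquisition_hours(drive_bytes, rate_mb_per_s):
    """Estimate sequential imaging time from drive size and measured rate."""
    seconds = drive_bytes / (rate_mb_per_s * 1024 * 1024)
    return seconds / 3600

# A 2 TB drive (the size wmic reports in the appendix) at an assumed
# 35 MB/s on USB 2.0 versus an assumed 110 MB/s on USB 3.0.
size = 2000363420160
print(round(acquisition_hours(size, 35), 1))   # hours on USB 2.0
print(round(acquisition_hours(size, 110), 1))  # hours on USB 3.0
```

Even rough inputs make the port choice obvious: the same drive images several times faster on USB 3.0.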

One way is to use the Windows System Assessment Tool ( winsat )  utility to do this test.  Since we are talking about sequential writes of the data, we can test the sequential write rate of E:\ drive, in my case, like this.

winsat disk -seq -write -count 6 -v -drive E

Sequential reads are just as easy to test.


winsat disk -seq -read -count 6 -v -drive E

Another way would be to use SQLIO Disk Subsystem Benchmark Tool.

You can create a script to test the performance of the drive with many different configurations in order to find the optimal settings.

I have the following in my batch file:

"C:\Program Files (x86)\SQLIO\sqlio" -kW -s10 -frandom -o8 -dE -b8 -LS -Fparam.txt 
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -frandom -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -dE -o8 -b8 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kW -s360 -fsequential -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -b8 -dE -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -frandom -o8 -dE -b512 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -dE -o8 -b8 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b64 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b128 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b256 -LS -Fparam.txt
timeout /T 10
"C:\Program Files (x86)\SQLIO\sqlio" -kR -s360 -fsequential -o8 -dE -b512 -LS -Fparam.txt

The param.txt file does not contain anything but a single line showing where to write the test file, in this case the E: drive, since that is the drive I'd like to test.

e:\testfile.data 2 0x0 100

The testfile.data was created with dcfldd like this:

C:\>dcfldd-1.3.4.x86win32\dcfldd.exe pattern=61 of=testfile.data bs=8388608 count=1
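The dcfldd command fills an 8 MiB file with the byte 0x61 ( ASCII "a" ).  If dcfldd is not at hand, the same test file can be created with a few lines of Python:

```python
# Create an 8 MiB test file filled with 0x61 ("a"), matching
# dcfldd pattern=61 bs=8388608 count=1.
with open("testfile.data", "wb") as fh:
    fh.write(b"\x61" * 8388608)
```

A known, uniform pattern also makes it easy to verify afterwards that the benchmark file was written intact.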

The results can be then added to a spreadsheet to chart the data for easier analysis.


USB 3.0 performance.

USB 2.0 performance.

The best performance results are highlighted in red, but we can see that USB 3.0 has far fewer latency issues than USB 2.0, so we should definitely use USB 3.0 whenever we can.

So, no matter how obvious the outcome seems or how much you know about technology, you should always aim to test your devices and keep performance data available, charting your results to reveal patterns that might not emerge from the raw data itself.  This is the process of determining an answer by empirical data analysis.  You can never get closer to scientific thinking unless you realize the power of testing and measuring.  This way, you will always be confident in your conclusions, since these are data points you have created, documented, and analyzed.

Let me know if you know any better ways to reliably test storage device performance.




Appendices

A. System Environment

> Command Line 'winsat  disk -seq -write -count 6 -v -drive E'
> DWM running... leaving it on
> System processor power policy saved and set to 'max performance'
> Running: Feature Enumeration ''
> Gathering System Information
> Operating System                        : 6.3 Build-9600
> Processor                               : Intel(R) Core(TM) i7-4702HQ CPU @ 2.
20GHz
> TSC Frequency                           : 0
> Number of Processors                    : 1
> Number of Cores                         : 4
> Number of CPUs                          : 8
> Number of Cores per Processor           : 4
> Number of CPUs Per Core                 : 2
> Cores have logical CPUs                 : YES
> L1 Cache and line Size                  : 32768  64
> L2 Cache and line Size                  : 262144  64
> L3 Cache and line Size                  : 6291456  64
> Total physical mem available to the OS  : 15.9 GB (17,078,214,656 bytes)
> Adapter Description                     : Intel(R) HD Graphics 4600
> Adapter Manufacturer                    : Intel Corporation
> Adapter Driver Provider                 : Intel Corporation
> Adapter Driver Version                  : 10.18.10.3345
> Adapter Driver Date (yy/mm/dd)          : 2013\10\31
> Has DX9 or better                       : Yes
> Has Pixel shader 2.0 or better          : Yes
> Has LDDM Driver                         : Yes
> Dedicated (local) video memory          : 0MB
> System memory dedicated as video memory : 0MB
> System memory shared as video memory    : 1792MB
> Primary Monitor Size                    : 1600 X 900  (1440000 total pixels)
> WinSAT is Official                       : Yes
Mode Flags = 0x02000001
Disk Number = 2
Iterations = 6
IO Count = 1000
Sequential IO Size = 65536

Random IO Size = 16384

B. Drive tested

C:\>wmic diskdrive get name, size, model
Model                           Name                Size
WD My Passport 0748 USB Device  \\.\PHYSICALDRIVE2  2000363420160

C. User Manual and downloads