TShark is essentially the command-line version of Wireshark. It has the same basic capabilities, with the added flexibility of the command line to process its output and pipe it to other applications. Furthermore, TShark is ideal for large PCAP files, which Wireshark may have difficulty digesting since it must load the entire contents of the file before any filtering can be applied. As such, TShark is my main tool for analyzing PCAPs. One of my main uses is to extract specific information from network traffic for investigation. Below are some of the commands that I have found myself reusing over and over again.


When analyzing PCAPs, I'm mostly concerned with locating anomalies based on intelligence about current threat actors. I'm especially interested in analyzing covert channels for specific indicators that are usually indicative of malicious traffic. These often include typos in popular URLs, odd-looking domain names, emails with suspicious attachments, or certificates with random-looking fields. To extract this data, I use the following TShark commands:

Extracting URLs from HTTP Requests

tshark -r lab1.http.pcap -T fields -e http.request.full_uri -R 'http' | sort | uniq > lab1.httpreq.txt

In the example above (and the ones below), I'm reading network traffic from an offline PCAP file, which is why the -r lab1.http.pcap parameter is present. The -T fields option tells TShark to output only the values of the fields given with -e; for each field you want in the output, add an -e <field> option. The -R <filter> option applies a filter to the traffic; in this case it keeps only HTTP traffic. Finally, sort and uniq are standard Linux utilities that sort the output of a program and remove duplicate entries.
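
Two side notes on the pipeline above. First, on newer TShark releases -R alone is rejected: it requires two-pass mode (-2 -R <filter>), and the single-pass display-filter flag is -Y <filter>, so the modern equivalent of the command above would be tshark -r lab1.http.pcap -T fields -e http.request.full_uri -Y 'http' | sort | uniq. Second, the dedup stage is ordinary shell and can be tried on its own with stand-in data (the URLs below are made up, standing in for tshark output):

```shell
# sort groups identical lines together; uniq then collapses adjacent
# duplicates, leaving one entry per distinct URL.
printf 'http://a.example/\nhttp://b.example/\nhttp://a.example/\n' | sort | uniq
```

The same effect can be had with sort -u, which sorts and deduplicates in one step.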

Extracting Filenames from FTP uploads

tshark -r lab1.ftp.pcap -T fields -e ftp.request.arg -R 'ftp.request.command=="STOR"' | sort | uniq > lab1.ftpfiles.txt

Extracting URLs from DNS Requests

tshark -r lab1-dns.pcap -T fields -e dns.qry.name -R 'dns' | sort | uniq > lab1.dns.txt

Extracting Recipients’ Email Addresses of Inbound Emails

tshark -r lab1.smtp.pcap -T fields -e smtp.req.parameter -R 'smtp.req.command == "RCPT"' | sort | uniq > lab1.mailto.txt

Extracting Senders’ Email Addresses of Inbound Emails

tshark -r lab1.smtp.pcap -T fields -e smtp.req.parameter -R 'smtp.req.command == "MAIL"' | sort | uniq > lab1.mailfrom.txt

Extracting Subjects of Inbound Emails

tshark -r lab1.imap.pcap -T fields -e imf.subject -R 'imf' | sort | uniq > lab1.subjects.txt

Extracting Source URLs of X509 Certificates

tshark -r lab1.ssl.pcap -T fields -e x509sat.printableString -R 'ssl.handshake.certificate' | sort | uniq > lab1.x509.url.txt

Extracting Information from X509 Certificates

tshark -r lab1.ssl.pcap -R "ssl.handshake.certificate" -T fields -E header=y -E separator=/t -E occurrence=a -E aggregator=\| -e x509sat.CountryName -e x509sat.IA5String -e x509sat.printableString -e x509sat.teletexString -e x509sat.UTF8String -e x509sat.universalString | sort | uniq > lab1.x509.txt
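
The -E options in the command above control the table layout: header=y prints the field names as a first row, separator=/t separates fields with tabs, occurrence=a emits every occurrence of each field, and aggregator=\| joins multiple values of one field with a pipe character. A pipe-aggregated column can be split back out with awk for further processing; the certificate strings below are made-up stand-ins for real tshark output:

```shell
# Input: two tab-separated fields, each aggregated with '|'.
# The awk program splits the second field on '|' and prints one value per line.
printf 'US|CA\texample.com|www.example.com\n' |
awk -F'\t' '{ n = split($2, names, "[|]"); for (i = 1; i <= n; i++) print names[i] }'
```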


Since these commands proved useful on many occasions, I wrote a simple Bash script called Netminecraft to automate their usage.


To split a larger PCAP file into protocol-specific PCAPs, use: -s <protocol> -r <pcap_file>

Note that the script simply passes the contents of <protocol> to
TShark. As such, you can specify any Wireshark filter to extract
even more specific information, for example: -s http.request.full_uri -r 20150417.pcap

However, avoid filters containing spaces, as the current version of
the script does not handle them. The results will be saved in the
current directory as <protocol>.pcap, for example http.pcap.
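
Internally, the -s mode presumably amounts to a one-liner like the following. This is a sketch of the idea, not Netminecraft's actual code, and it assumes an older TShark where -R works as a single-pass read filter:

```shell
# Hypothetical sketch of the split mode: keep only packets matching the given
# filter and write them to <protocol>.pcap in the current directory.
# Assumes tshark is on PATH; the function name is made up for illustration.
split_pcap() {
  proto="$1"
  pcap="$2"
  tshark -r "$pcap" -R "$proto" -w "${proto}.pcap"
}
```

For example, split_pcap http 20150417.pcap would produce http.pcap.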

To mine for data relating to a specific protocol, use: -p (dns|http|ssl|mail|ftp) -r <pcap_file> -w <output_file>

The output file will contain text data that has been sorted and
deduplicated using 'uniq -i', i.e. duplicates are matched while
ignoring the case of the items.
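
The case-insensitive dedup can be seen with stand-in data (the domains below are made up). Note that sort needs -f so that case variants of the same value end up adjacent before uniq -i compares them:

```shell
# 'Example.COM' and 'example.com' collapse to a single entry, leaving
# one line per distinct domain regardless of case.
printf 'Example.COM\nexample.com\nother.net\n' | sort -f | uniq -i
```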

Example: -p dns -r dns.pcap -w dns.queries.txt

The example above will extract the domain names of all the DNS queries found
in the file dns.pcap and will output the list to dns.queries.txt.


Learning to use TShark has many advantages in efficiency, security, and flexibility. It allows you to script the extraction of data and its storage into databases, from which further analysis for anomalies can quickly be done. In this short post, we have listed some examples of how TShark can be used, but it barely scratches the surface.


TShark Manual, The Wireshark Network Analyzer, accessed on 2015-02-24