The list of compatible adapters is often very short and confusing, because support depends on the operating system and often breaks when updating Kali.
In particular, nothing is said about Mac OS X compatibility.
Here's the good news: I tested the TP-LINK WN821N USB adapter and it seems to work properly in Kali Linux 2016.x installed in VMware Fusion (VMware Fusion 8.5.3 on Mac OS X El Capitan 10.11.6). The TP-LINK WN821N is Atheros based, supports the 802.11n standard at 300 Mbps, and is quite cheap. You can buy it on Amazon HERE. Stay away from the TP-LINK Archer T4UH (AC1200), which is Realtek based and whose Linux drivers are badly outdated (it is, however, a good adapter under Windows, with over 200 Mbps throughput).
These are the very basic steps:
- Once you have booted the system and logged in as root, connect the USB wireless adapter. VMware will ask whether to connect it to Linux or to the Mac. Select Linux.
- You should now see the adapter in the interface list.
- You can then start airmon-ng. The command shows the interface created for monitoring.
- The final steps are to run airodump-ng to extract the MAC addresses and aireplay-ng to launch a deauthentication attack. For the full tutorial see HERE.
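The steps above can be sketched as follows (the interface names wlan0/wlan0mon and the MAC addresses are placeholders; take the real ones from the tools' output):

```shell
airmon-ng                      # list detected wireless interfaces
airmon-ng start wlan0          # put the adapter in monitor mode (creates wlan0mon)
airodump-ng wlan0mon           # capture traffic and list BSSID/station MAC addresses
# deauthentication attack: -a is the AP MAC, -c the client MAC
aireplay-ng --deauth 10 -a AA:BB:CC:DD:EE:FF -c 11:22:33:44:55:66 wlan0mon
```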
If you are running a recent version of VMware Player, VMware Workstation, or VMware Fusion along with a recent Linux kernel (>=4.4), then the recommended way to install VMware tools is to use the package for Open VM Tools (OVT) provided by your distribution. For Kali Linux:
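On Kali, installing OVT should amount to something like this (package name as provided by the Kali repositories; reboot afterwards):

```shell
apt-get update
apt-get install open-vm-tools-desktop
reboot
```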
That’s it, you are all set. If you do not like it, you can always go with the old way.
If you are running Kali Linux 2016.x, which is a rolling release, you might need to update the apt source list in order to update the system.
If you see something like the following:
it means your sources.list file needs to be fixed.
In this case you can use this one-liner from the root prompt:
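A possible one-liner (this repository line is the one documented for kali-rolling; double-check it against the official Kali docs before overwriting your file):

```shell
echo "deb http://http.kali.org/kali kali-rolling main non-free contrib" > /etc/apt/sources.list
```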
Now you can update your system:
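For example:

```shell
apt-get update && apt-get dist-upgrade
```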
If you are a *nix geek like me you can’t help but love the command prompt.
One of the best tools to improve the plain old terminal is a utility called tmux. You can install it through Homebrew.
Now, there are many commands to remember to play nicely with the terminal, and sometimes a little reminder might be useful; that’s why cheat sheets exist.
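Assuming Homebrew is already set up, that is just:

```shell
brew install tmux
tmux -V   # print the version to verify the installation
```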
Here is mine, enjoy.
You can use the cURL library and the curl command to design your own Requests and explore the Responses. There are many possible uses, e.g., API debugging, web hacking, and pen testing.
curl is a tool to transfer data from or to a server, using one of the supported protocols (e.g., FTP, GOPHER, HTTP, HTTPS, IMAP, LDAP, POP3, RTMP, SCP, SFTP, SMTP, TELNET). The command is designed to work without user interaction.
curl offers a busload of useful tricks like proxy support, user authentication, FTP upload, HTTP post, SSL connections, cookies, file transfer resume, Metalink, and more. As you will see below, the number of features will make your head spin!
curl is a truly powerful command; however, this comes at the cost of complexity. Here I will show some real-world use cases.
The URL syntax is protocol-dependent. If you specify a URL without a protocol:// prefix, curl will attempt to guess what protocol you might want. It will default to HTTP but try other protocols based on often-used host name prefixes. For example, for host names starting with “ftp.”, curl will assume you want to speak FTP.
You can specify multiple URLs or parts of URLs by writing part sets within braces as in:
or you can get sequences of alphanumeric series by using [ ] as in:
Nested sequences are not supported, but you can use several ones next to each other:
You can specify any amount of URLs on the command line. They will be fetched in a sequential manner in the specified order.
You can specify a step counter for the ranges to get every Nth number or letter:
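Putting the globbing rules above together (the host names are placeholders):

```shell
# multiple URLs via brace sets
curl "http://site.{one,two,three}.com"
# numeric and alphabetic sequences with [ ]
curl "ftp://ftp.example.com/file[1-100].txt"
curl "ftp://ftp.example.com/file[001-100].txt"   # with leading zeros
curl "http://example.com/section[a-z].html"
# nesting is not supported, but several sequences can sit next to each other
curl "http://example.com/archive[1996-1999]/vol[1-4]/part{a,b,c}.html"
# step counter: every 10th number, every 2nd letter
curl "http://example.com/file[1-100:10].txt"
curl "http://example.com/section[a-z:2].html"
```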
In order to analyze in depth what we send and receive, we might save everything to a file. This is as easy as:
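One way to do that with curl itself is the --trace-ascii option, which logs a full trace of the request and response to a file (example.com is a placeholder):

```shell
curl --trace-ascii debug.txt http://example.com/
```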
Save To Disk
If you want save the Response to disk you can use option
-o <file>. If you are using
 to fetch multiple documents, you can use ‘
#‘ followed by a number in the specifier. That variable will be replaced with the current string for the URL being fetched. Remember to protect the URL from shell by adding quotes if you receive the error message
internal error: invalid pattern type (0). Examples:
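For instance (host names are placeholders):

```shell
curl "http://{one,two}.example.com" -o "file_#1.txt"
curl "http://example.com/file[1-5].txt" -o "download_#1.txt"
```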
-O writes the output to a local file named like the remote file we get (only the file part of the remote file is used, the path is cut off). The remote file name to use for saving is extracted from the given URL, nothing else. Consequently, the file will be saved in the current working directory. If you want the file saved in a different directory, make sure you change the current working directory before you invoke curl:
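For example (example.com is a placeholder):

```shell
curl -O http://example.com/download/index.html    # saved as ./index.html
cd /tmp
curl -O http://example.com/download/index.html    # saved as /tmp/index.html
```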
Only the file part of the remote file is used, the path is cut off; thus the file will be saved under its remote name in the current working directory.
Set HTTP Request Method
curl’s default HTTP method, GET, can be changed to any method you like using the -X <command> option. The usual suspects POST, PUT, DELETE, and even custom methods, can be specified:
Normally you don’t need this option. All sorts of GET, HEAD, POST and PUT requests are rather invoked by using dedicated command line options.
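For example (httpbin.org is a public request-echo service, handy for this kind of experiment):

```shell
curl -X DELETE http://httpbin.org/delete
curl -X PUT -d "name=value" http://httpbin.org/put
curl -X PROPFIND http://example.com/   # an arbitrary custom method
```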
Forms are the general way a web site can present an HTML page with fields for the user to enter data in, and then press some kind of ‘submit’ button to get that data sent to the server. The server then typically uses the posted data to decide how to act, for instance using the entered words to search in a database, adding the info to a bug tracking system, displaying the entered address on a map, or using the info as a login prompt to verify that the user is allowed to see what it is about to see.
Using the -d option we can specify URL-encoded field names and values:
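For example, posting two fields (the field names and the httpbin.org echo service are just for illustration):

```shell
curl -d "name=Daniel&tool=curl" http://httpbin.org/post
```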
A very common way for HTML-based applications to pass state information between pages is to add hidden fields to the forms. Hidden fields are already filled in, they aren’t displayed to the user, and they get passed along just like all the other fields. To curl there is no difference at all; you just need to add them on the command line.
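A sketch, where session_id plays the role of a hypothetical hidden field:

```shell
curl -d "search=kali&session_id=abc123" http://httpbin.org/post
```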
Set Request Headers
Request headers allow clients to provide servers with meta information about things such as authorization, capabilities, and body content type. OAuth2 uses an Authorization header to pass access tokens, for example. Custom headers are set in curl using the -H, --header option:
Note that if you add a custom header that has the same name as one of the internal ones curl would use, your externally set header will be used instead of the internal one. You should not replace internally set headers without knowing perfectly well what you’re doing. Remove an internal header by giving a replacement without content on the right side of the colon, as in -H "Host:". If you send a custom header with no value, the header must be terminated with a semicolon, such as -H "X-Custom-Header;", to send “X-Custom-Header:”.
curl will make sure that each header you add/replace is sent with the proper end-of-line marker; you should thus not add that as part of the header content: do not add newlines or carriage returns, as they will only mess things up for you.
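Some examples of the cases above (the token and header values are placeholders):

```shell
curl -H "Authorization: Bearer MY_TOKEN" http://httpbin.org/headers   # custom header
curl -H "User-Agent: my-client/1.0" http://httpbin.org/headers        # replace an internal header
curl -H "Accept:" http://httpbin.org/headers                          # remove an internal header
curl -H "X-Custom-Header;" http://httpbin.org/headers                 # header with no value
```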
An HTTP request may include a referer field (yes, it is misspelled), which can be used to tell from which URL the client got to this particular resource. Some programs/scripts check the referer field of requests to verify that the request wasn’t arriving from an external site or an unknown page. While this is a stupid way to check something so easily forged, many scripts still do it.
This can also be set with the -H, --header flag of course. When used with -L, --location you can append ";auto" to the --referer URL to make curl automatically set the previous URL when it follows a Location: header. The ";auto" string can be used alone, even if you don’t set an initial --referer.
To specify the User-Agent string to send to the HTTP server you can use the --user-agent flag. To encode blanks in the string, surround it with single quote marks. This can also be set with the -H, --header option of course. Many applications use this information to decide how to display pages. At times, you will see that getting a page with curl will not return the same page that you see when getting it with your browser. Then you know it is time to set the User-Agent field to fool the server into thinking you’re one of those browsers:
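For example, with an (old) Internet Explorer User-Agent string:

```shell
curl --user-agent 'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1)' http://httpbin.org/user-agent
```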
The way the web browsers do “client side state control” is by using cookies. Cookies are just names with associated contents. The cookies are sent to the client by the server. The server tells the client for what path and host name it wants the cookie sent back, and it also sends an expiration date and a few more properties.
When a client communicates with a server with a name and path as previously specified in a received cookie, the client sends back the cookies and their contents to the server, unless of course they are expired.
Many applications and servers use this method to connect a series of requests into a single logical session. To be able to use curl in such occasions, we must be able to record and send back cookies the way the web application expects them. The same way browsers deal with them.
The -b, --cookie option lets you specify the cookies to send; it is supposedly the data previously received from the server in a "Set-Cookie:" line. The data should be in the format "NAME1=VALUE1; NAME2=VALUE2". If no = symbol is used in the line, it is treated as a filename to read previously stored cookie lines from, which should be used in this session if they match. Using this method also activates the “cookie parser”, which will make curl record incoming cookies too, which may be handy if you’re using this in combination with the
-L, --location option. The file format of the file to read cookies from should be plain HTTP headers or the Netscape/Mozilla cookie file format. NOTE that the file specified with -b, --cookie is only used as input. No cookies will be stored in the file. To store cookies, use the -c, --cookie-jar option, or you could even save the HTTP headers to a file using -D, --dump-header.
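A typical round trip (httpbin.org and the cookie name are for illustration):

```shell
curl -c cookies.txt "http://httpbin.org/cookies/set?session=abc123"  # store received cookies
curl -b cookies.txt http://httpbin.org/cookies                       # send them back
curl -b "session=abc123" http://httpbin.org/cookies                  # or pass cookies inline
```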
Work In Progress…
OK, there are many more options, but I will stop here for now. I will add more in the future, so if you have any requests (like using more real URLs) just leave a comment.
After the PRISM scandal you may feel the need to secure your connection and protect your privacy. It is then a good idea to tunnel web traffic through a secure encrypted connection. This allows your traffic to traverse a local network without being visible to snoopers, even when visiting unencrypted web sites.
What you need:
- a modern browser like Firefox, Chrome or Safari (they support SOCKS4 protocol)
- ssh client (already installed in Mac OS X)
- a shell account (with ssh access)
To start the local proxy type:
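A sketch of the command, with PORT, user, and host to be filled in:

```shell
ssh -D PORT user@host
```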
PORT is a local port between 1024 and 65535 (they do not require super user privileges),
user is the username at the remote machine, and
host is the identifier of the remote host.
That’s not enough; you need to configure the system to use the proxy. Go to
System Preferences > Network > [select active interface] > Advanced... > Proxies and check SOCKS Proxy.
Then modify SOCKS Proxy Server info to use the
PORT you chose before.
That’s it! From now on all connections on the active network interface will be tunneled through the proxy. 😎
NOTE1: you may want to use the browser proxy settings instead of the system-wide proxy settings, so you can tunnel only part of the traffic (e.g., the most sensitive one). This is easy, but the procedure slightly changes between different browsers.
NOTE2: you may need to add the option
-p HOST_PORT if the remote host doesn’t use the standard ssh port 22, e.g.:
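For example, with placeholder port numbers:

```shell
ssh -D 8080 -p 2222 user@host
```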
I have made a short benchmark comparison of Parallels vs Fusion 5.0 HERE. VMware Fusion has reached version 5.0.3, but it looks like VMware has become lazy and those updates are not worth mentioning. However, Apple recently released Mavericks, with an updated graphics stack that has slightly better graphics performance:
| Primary hard disk | 7.7 | 7.7 |
Moreover, VMs seem to boot much faster under Mac OS X Mavericks.
N.B. you need to update VMware Fusion to version 5.0.3 in order to have the best experience in Mavericks (or install VMware Fusion 6).
In *NIX systems, files and folders whose names begin with a dot (e.g., .name) are not visible in the Finder (also known as the file browser). Since Mac OS X is a certified UNIX, that is the case there too. If you use the terminal you can use the command:
However, most people will use the regular Finder. To enable viewing of hidden files in the Finder use this command:
and then restart the Finder with the following command:
To revert the changes use the same command, but replace TRUE with FALSE.
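The whole procedure can be sketched as:

```shell
ls -a                                                    # terminal: -a also lists dotfiles
defaults write com.apple.finder AppleShowAllFiles TRUE   # make the Finder show hidden files
killall Finder                                           # restart the Finder to apply
defaults write com.apple.finder AppleShowAllFiles FALSE  # revert
killall Finder
```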
If you are a student, you can save some bucks if you join Amazon Student; the beautiful thing is that it is absolutely free!!!
That’s not all: you can earn $5 every time someone joins thanks to you!
Why don’t you take the time to offer me a free beer by clicking the following link and registering? Thank you 😎
If you are looking to configure Vim, you will find the default configuration file in:
Copy and rename it into your home directory:
However, it is bare minimal, so it is better if you personalize it a bit. One very simple example is the following:
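A sketch of the whole thing (the system vimrc path may differ between Mac OS X versions, and the settings shown are just common starter options):

```shell
cp /usr/share/vim/vimrc ~/.vimrc
cat >> ~/.vimrc <<'EOF'
syntax on          " enable syntax highlighting
set number         " show line numbers
set ruler          " show cursor position
set hlsearch       " highlight search matches
EOF
```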
OK, actually there are many good reasons to wait a bit before sending a SIGKILL to processes (like giving them time to write things to disk or finish an upload to iCloud); however, the “slow” shutdown can be annoying, so try this:
This sets the shutdown timeout to 2 seconds instead of the default value (20).
If you run a Linux guest VM, every time you update the kernel you need to reinstall VMware Tools for optimal performance.
After selecting Virtual Machine > Install VMware Tools you need to untar the archive and then run a script that asks you many questions, etc.
This can be very tedious, so here is a little script that minimizes typing:
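A sketch of such a script (the mount point, tarball name pattern, and the Desktop symlink are assumptions; adjust them to your system and run as root):

```shell
#!/bin/sh
# Reinstall VMware Tools with default answers after a kernel update.
set -e
mkdir -p /mnt/cdrom
mount /dev/cdrom /mnt/cdrom                       # the Tools ISO attached by Fusion
tar -xzf /mnt/cdrom/VMwareTools-*.tar.gz -C /tmp  # untar the installer
/tmp/vmware-tools-distrib/vmware-install.pl -d    # -d accepts the default answers
umount /mnt/cdrom
ln -sfn /mnt/hgfs "$HOME/Desktop/Shared"          # expose host shared folders on the Desktop
```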
VMware now recommends using the open-vm-tools-desktop package provided by the Linux distribution of your choice.
NOTE: the -d option implies default answers to the install script (most of the time they are OK)
NOTE2: the script creates a directory on the Desktop with all the directories shared by the host system with the VM
NOTE3: this script has been tested only on Ubuntu 12.04 LTS
NOTE4: this script installs the native VMware Tools; if you want you can install the open tools instead, but you can’t install both at the same time!
This is a legendary tool developed by the author of The Web Application Hacker’s Handbook: Finding and Exploiting Security Flaws (2nd edition).
Unfortunately there is no native Mac OS X version, but Corsaire packed one. Since they updated the site, the link to download it provided in the aforementioned book is broken, so I will provide a new working one for the recently released version 1.6 of Burp Suite Free.
All rights reserved to Corsaire and PortSwigger.
See also here for a [much less powerful] alternative.
Quicklook is a beautiful and powerful feature of Mac OS X (take a look here to see what it is capable of), but it is somewhat weird that it doesn’t allow text selection.
To enable text selection in Quicklook:
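The trick is a hidden preference (run in Terminal, then restart the Finder):

```shell
defaults write com.apple.finder QLEnableTextSelection -bool TRUE
killall Finder
```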
NOTE: this trick stopped working with Mac OS X 10.11 and later.
There are Macs without disk drives. You can access CD/DVDs from another Mac over the network thanks to the Remote Disc feature.
This option allows you to always see remote drives within Finder:
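These are the commonly reported commands for enabling Remote Disc browsing in the Finder (verify them on your OS X version before relying on them):

```shell
defaults write com.apple.NetworkBrowser EnableODiskBrowsing -bool true
defaults write com.apple.NetworkBrowser ODSSupported -bool true
killall Finder
```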
Most browsers let you search with the default search engine from the address bar, but what if you want a different engine? You can switch in the search box on the right: choose an engine and then start the search.
So you can try this:
- click the little arrow in the search engine box to open the drop down menu.
- select Manage Search Engines…
- select an engine and then click on Edit Keyword…
- add the keyword you prefer and then click OK
From now on, if you want to make a search with a specific engine from the URL bar, construct your query like:
wiki power law distribution
answer AAPL growth
twitter Barack Obama
NOTE: if your keyword doesn’t work and your query is answered by the default search engine, try to change it! I have noticed that if you use the name of the service (e.g., Wikipedia, Yahoo) most of the time the keyword doesn’t work. 😦
If you are a pro, you know, you use the keyboard to do most of the work, right?
Then why click to open the mail client or a new compose window when you can do it in fewer than 10 characters? Just type mailto: in the address bar of your browser and let the magic happen! 😎
Yep, the Mac world is all about Drag & Drop, but if you feel nostalgic for the PC world you can move files the old way; just do this:
- select the file(s) and hit Command + C
- move to another location and hit Command + Option + V
Paros is a web proxy that allows you to intercept and modify all HTTP and HTTPS data between server and client, including cookies and form fields.
I will not explain how and why you use it, but if you are really interested, please take a look at the book: The Web Application Hacker’s Handbook: Discovering and Exploiting Security Flaws.
Unfortunately there is no native Mac OS X version, but Corsaire packed one. Since they updated the site, the link to download it provided in the aforementioned book is broken, so I will provide a new working one. All rights reserved to Corsaire and the Paros team.
Paros is no longer developed (it also requires JDK 6, which is no longer supported).
An alternative is a fork of it, Zed Attack Proxy, maintained by OWASP:
Zed Attack Proxy (ZAP)