This is an overview of using an SDR (Software Defined Radio) receiver system to capture 802.11b/g/n “traffic” – signals in the 2412–2484 MHz frequency range.
The system comprises a CAPTURE NODE, PRE-PROCESSING and a TASK-SPECIFIC FILTER (here, 802.11b/g/n).
Making things interesting, the full raw capture is stored as a stream file for future processing and analysis (like FFT) AND channel-specific files are stored and combined, allowing the system to create a station-specific capture file containing the data sent/received by a single WLAN station.
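For reference, the channel splitter's mapping from 802.11 channel number to center frequency in the 2.4 GHz band is simple arithmetic – channels 1–13 sit on a 5 MHz grid, channel 14 is the odd one out. A minimal sketch (function name is mine, not from the post):

```python
def channel_center_mhz(ch):
    """Center frequency in MHz for a 2.4 GHz 802.11 channel."""
    if ch == 14:
        return 2484  # channel 14 (Japan) sits off the 5 MHz grid
    if 1 <= ch <= 13:
        return 2407 + 5 * ch  # e.g. channel 1 -> 2412 MHz
    raise ValueError("channel out of range")
```

This is why the capture range above runs 2412–2484 MHz: those are the centers of channels 1 and 14.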
Because the system allows listening over a wide bandwidth, the full 802.11b/g/n 2.4 GHz frequency range can be analyzed simultaneously. As the file(s) created by the data filter & splitter are pure data-stream files, they can be sent on for further tasking – such as detailed data analysis, or even assembly/reconstruction and then dissemination.
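The FFT analysis mentioned above, run against the stored raw stream, might look something like this minimal sketch – it assumes the raw capture is a file of complex64 IQ samples (the post doesn't specify the sample format, so that's my assumption):

```python
import numpy as np

def spectrum_db(iq, fft_size=1024):
    """Average power spectrum (dB) over fft_size-sample blocks, no overlap."""
    n_blocks = len(iq) // fft_size
    blocks = iq[:n_blocks * fft_size].reshape(n_blocks, fft_size)
    win = np.hanning(fft_size)  # taper each block to reduce spectral leakage
    spec = np.fft.fftshift(np.abs(np.fft.fft(blocks * win, axis=1)) ** 2, axes=1)
    return 10 * np.log10(spec.mean(axis=0) + 1e-12)

# iq = np.fromfile("capture.raw", dtype=np.complex64)  # hypothetical raw stream file
# power = spectrum_db(iq)  # 1024 bins spanning the captured bandwidth, DC at center
```

Each bin then maps linearly onto the captured bandwidth, so energy per 802.11 channel drops straight out of the result.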
Implementing further data analysis for intelligence purposes – let alone code breaking at the level of current public or military ciphers – is not feasible to accomplish INSIDE the system described here. That needs dedicated tools and parallel-tasking capabilities. This is just a collection engine. Part of one. But for illustration purposes, I did test something around it as well.
The FEEDER side
Ok – because I can’t help it – I decided to give an overview of most of the rest as well. Breaking other people’s WLAN secrets is illegal, so I wouldn’t go that far (in case anyone was going to molest such nice diagrams here for their own sick use).
A few additional components: the FEEDER organizes data to be analyzed further by the TASK MANAGER. It sets the initial working queue, discards unneeded data (files) and inserts metadata defining the origin of the data (such as time, channel, pre-filtering done, etc.).
The TASK MANAGER itself is actually just a simple set of lists of header information and settings to run the system. Built on top of fancy MySQL.
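To illustrate how little there is to it, here is a sketch of that job list – I use sqlite3 as a stand-in for the MySQL instance, and the table layout is my guess at what "header information and settings" means:

```python
import sqlite3

def init_task_db(conn):
    # One row per queued capture chunk; 'state' drives the work flow.
    conn.execute("""CREATE TABLE IF NOT EXISTS tasks (
        id INTEGER PRIMARY KEY,
        capture_file TEXT NOT NULL,
        channel INTEGER,
        state TEXT DEFAULT 'queued')""")
    conn.commit()

def enqueue(conn, capture_file, channel):
    conn.execute("INSERT INTO tasks (capture_file, channel) VALUES (?, ?)",
                 (capture_file, channel))
    conn.commit()

def next_task(conn):
    # Hand out the oldest queued task and mark it running.
    row = conn.execute(
        "SELECT id, capture_file FROM tasks WHERE state='queued' ORDER BY id LIMIT 1"
    ).fetchone()
    if row:
        conn.execute("UPDATE tasks SET state='running' WHERE id=?", (row[0],))
        conn.commit()
    return row
```

That's the whole "manager": a FIFO with a state column.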
The HIERARCHY ORGANIZER strips files into desired chunks based on their content. This is actually where most of the magic happens. I made the system very quickly tell apart the “garbage” within the data (such as unorganizable frames, too-fragmented data, etc.) from the real data I am looking for. How? Well, first I manually took a few tens of pieces of such data to be analyzed, and then I formed a model (yes, Matlab is nice too!) that picks out SIMILAR or NEAR-looking data every time. It works to some extent.
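The post doesn't say what the Matlab model actually computes, but the "SIMILAR or NEAR looking" idea can be sketched with a crude stand-in: compare byte histograms of a chunk against the hand-picked reference pieces and keep anything within a distance threshold. The feature, the metric and the threshold below are all my assumptions:

```python
import numpy as np

def byte_histogram(chunk):
    # Normalized 256-bin byte histogram as a crude content fingerprint.
    h = np.bincount(np.frombuffer(chunk, dtype=np.uint8), minlength=256)
    return h / max(len(chunk), 1)

def looks_similar(chunk, references, threshold=0.05):
    """True if chunk is 'near looking' to any hand-picked reference chunk,
    by Euclidean distance between byte histograms (threshold is a guess)."""
    v = byte_histogram(chunk)
    return any(np.linalg.norm(v - byte_histogram(r)) < threshold
               for r in references)
```

A real classifier would use richer features, but the shape of the trick – bootstrap from a few dozen hand-labeled pieces, then match everything resembling them – is the same.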
TASK PREPARATION and JOB/QUEUEING are just staging gates, doing simple things like exporting a prepared chunk of data in a suitable form to be inserted into the job queue.
The TASK ENGINE does all the big work – I tested with John (the Ripper), and John worked through WEP and WPA pretty nicely.
The system allows “HPC” or parallel tasking (not tested, not executed, but I built a PoC). There are lots of things still missing, like reconstructing data into other forms, making the HPC part efficient, etc.
It took some 3 minutes (in my test environment, that is) for intercepted data to flow through the system and be processed by the TASK ENGINE (John).
Interestingly, I would also have been able to simultaneously listen to ALL the public radio stations around.
Holy #NCOW (Network Centric Operations Warfare)!!! My WLAN “listening post”, which is not highly capable in direction finding, was crafted in – say – 4 hrs.