Passive Radar Target Classification

Rushil Goomer
Nov 5, 2020


Can you distinguish between humans and animals in radar tracks?

Overview:

Radars exploit a phenomenon known as the Doppler effect: a moving target shifts the frequency of the electromagnetic wave reflected from it. Pulse-Doppler radar combines pulsed transmission with this effect and is widely used in meteorology, where it measures wind speed from the velocity of rain and other precipitation. Doppler radar is also used heavily in healthcare, for example for fall detection in nursing applications. In this project, we aim to distinguish human movement from animal movement based on intercepted pulse-Doppler radar signals. Traditional signal processing techniques can distinguish some objects; however, separating non-rigid targets such as humans and animals in radar tracks is difficult.
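To make the underlying physics concrete (this is a textbook illustration, not part of our pipeline), the two-way Doppler shift for a target moving radially at speed v toward a radar with carrier frequency f_c is f_d = 2·v·f_c / c; the factor of two arises because the wave travels to the target and back:

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift(v_mps: float, carrier_hz: float) -> float:
    """Two-way Doppler shift (Hz) for radial speed v_mps and carrier carrier_hz."""
    return 2.0 * v_mps * carrier_hz / C

# A walking human (~1.5 m/s) seen by a hypothetical 10 GHz (X-band) radar:
shift = doppler_shift(1.5, 10e9)
print(round(shift))  # 100 Hz
```

A shift of only ~100 Hz against a multi-GHz carrier is why the micro-Doppler structure of a track, rather than the raw shift alone, carries the information needed to tell targets apart.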

Our Goal:

The goal is to investigate novel, automated solutions that classify radar-tracked objects as humans or animals with high certainty and precision. We frame this as a binary classification task and solve it using neural networks.

Why is this a difficult task?

Separating radar-tracked objects is generally done with traditional signal processing. Although radars are highly accurate at detection and tracking, they have limited classification capability: a current radar cannot tell what the target is. For our research, we focused on Ground Surveillance Radar (GSR) classification between humans and animals, one of the most challenging tasks. At present, an operator must review footage from multiple cameras and manually decide whether a track is a human or an animal. Manual classification is tedious and limited to the cameras' visibility, which is very inefficient. We aim to assist classification when the camera systems are not reliable, which is far more time-efficient and can be extended to other applications as well. Hence, distinguishing between humans and animals based on their radar signature is quite challenging!

Our Solution:

Classifying radar signals is not a new problem; much work has already been done in this area. Similar existing solutions are mainly based on analyzing the radar signal directly, and there have been trials of machine learning algorithms such as random forest classifiers. In our proposed methodology, however, we use neural networks (convolutional or recurrent) for the classification task, feeding them Doppler characteristics obtained after applying signal processing techniques. Existing solutions used only a few Doppler characteristics and did not show how each affected the classification algorithm. In our work, we examine the effects of all of these characteristics, such as Doppler bursts, I/Q sweep bursts, SNR type, and terrain type. Using this many parameters makes our model more robust to outliers and anomalous data points.
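As a sketch of the kind of signal processing step meant here (the window length, hop, and function name are illustrative placeholders, not our exact pipeline), a slow-time I/Q burst can be turned into a micro-Doppler spectrogram with a short-time FFT before being fed to a network:

```python
import numpy as np

def iq_spectrogram(iq: np.ndarray, win: int = 32, hop: int = 16) -> np.ndarray:
    """Magnitude spectrogram (dB) of a complex I/Q slow-time signal.

    Illustrative short-time FFT; window and hop sizes are hypothetical.
    Returns an array of shape (n_frames, win), Doppler bins fftshifted
    so zero Doppler sits in the middle column.
    """
    frames = [iq[s:s + win] * np.hanning(win)
              for s in range(0, len(iq) - win + 1, hop)]
    stft = np.fft.fftshift(np.fft.fft(np.asarray(frames), axis=1), axes=1)
    return 20.0 * np.log10(np.abs(stft) + 1e-12)

# Synthetic burst: a single Doppler tone at normalized frequency 0.2.
n = 256
iq = np.exp(2j * np.pi * 0.2 * np.arange(n))
spec = iq_spectrogram(iq)
print(spec.shape)  # (15, 32): frames x Doppler bins
```

For a real target the tone is not constant: limb and gait motion modulate it over time, and it is this time-frequency texture that a CNN can learn to discriminate.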

About the Model:

We followed the Keras multiple-inputs, mixed-data approach to handle the dimensionality mismatch between the 3D and 1D inputs.
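A minimal sketch of that mixed-data pattern (the layer sizes and input shapes below are placeholders, not our actual architecture): a CNN branch consumes the image-like Doppler map, a dense branch consumes the 1D scalar features, and the two are concatenated ahead of a sigmoid output for the human-vs-animal decision.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical shapes: a 32x32 single-channel Doppler map and 6 scalar features.
map_in = keras.Input(shape=(32, 32, 1), name="doppler_map")
feat_in = keras.Input(shape=(6,), name="scalar_features")

# CNN branch for the map input.
x = layers.Conv2D(16, 3, activation="relu")(map_in)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)

# Dense branch for the 1D features.
y = layers.Dense(16, activation="relu")(feat_in)

# Merge the branches and classify.
z = layers.concatenate([x, y])
z = layers.Dense(32, activation="relu")(z)
out = layers.Dense(1, activation="sigmoid", name="is_human")(z)

model = keras.Model(inputs=[map_in, feat_in], outputs=out)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[keras.metrics.AUC()])

# Dummy forward pass to confirm the wiring: two samples in, two scores out.
pred = model.predict([np.zeros((2, 32, 32, 1)), np.zeros((2, 6))], verbose=0)
print(pred.shape)  # (2, 1)
```

The functional API is what makes this work: each branch is built independently, and `keras.Model` accepts a list of inputs so both tensors arrive in one training call.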

A sequential (dense) model is combined with a CNN model, as shown in the figure above, using the best-performing hyper-parameters; the resulting AUC-ROC curve is shown below.

Result:

The results obtained surpass traditional signal processing algorithms for identifying non-rigid targets. The AUC-ROC score was used as the evaluation criterion and, as shown, reaches 0.99 on both the training and validation sets.
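For readers unfamiliar with the metric, AUC-ROC is the probability that a randomly chosen positive example is scored above a randomly chosen negative one. A small self-contained version via the rank-based (Mann-Whitney) formulation, independent of any ML framework and applied here to toy data rather than our results:

```python
import numpy as np

def auc_roc(labels: np.ndarray, scores: np.ndarray) -> float:
    """AUC via the Mann-Whitney U statistic: P(score of a random positive
    exceeds score of a random negative), counting ties as one half."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return float(greater + 0.5 * ties) / (len(pos) * len(neg))

labels = np.array([1, 1, 0, 0, 1, 0])
scores = np.array([0.9, 0.8, 0.3, 0.4, 0.7, 0.2])
print(auc_roc(labels, scores))  # 1.0: every positive outranks every negative
```

An AUC of 0.99 therefore means the model ranks almost every human track above almost every animal track, regardless of where the decision threshold is placed.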

Remarks:

This project was extremely insightful for me as well as my teammate Shanmukha (https://www.linkedin.com/in/shanmukha-yenneti/); we would like to thank Bennett University CSE dept for making us adept in Data Science and Machine Learning during our undergraduate studies. The knowledge acquired here helped us a lot in completing this project. Special thanks to our mentor Dr. Vipul Mishra (https://www.linkedin.com/in/vipul-kumar-mishra-7bb43953/), for his continuous guidance and support.
