Miniature radar sensors typically excel at capturing radial movements, while thermal sensors readily capture lateral movements; this complementary nature motivates fusing measurements from both sensors to enhance the accuracy of hand-gesture recognition. This paper presents fusion techniques that combine signals from a commercially available miniature radar with those from an infrared thermal sensor to improve hand-gesture recognition performance. To achieve this, two parallel deep-learning networks first establish the detection probabilities of each gesture class separately from the radar and thermal measurements. The detection probabilities are then fused using several methods: (i) a weighted average of scores with optimized weights, (ii) logistic regression, (iii) a multilayer perceptron, and (iv) a random forest. The performance of these methods is analyzed and compared; the best fusion technique achieves a very high classification accuracy, above 99%, in recognizing 14 different hand-gesture types. This result is significantly higher than the performance of either sensor alone.
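The first fusion method above, a weighted average of the two sensors' per-class detection probabilities, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weight value, the three-class example, and the probability vectors are assumptions chosen for clarity (the paper uses 14 gesture classes and optimizes the weights).

```python
def fuse_weighted_average(p_radar, p_thermal, w_radar=0.5):
    """Fuse two per-class probability vectors with a convex weight.

    w_radar is the weight on the radar branch; the thermal branch
    receives 1 - w_radar. In the paper this weight is optimized;
    here it is a fixed illustrative value.
    """
    assert len(p_radar) == len(p_thermal)
    w_thermal = 1.0 - w_radar
    fused = [w_radar * pr + w_thermal * pt
             for pr, pt in zip(p_radar, p_thermal)]
    total = sum(fused)  # renormalize so fused scores sum to 1
    return [f / total for f in fused]

def predict(fused):
    """Return the index of the most probable gesture class."""
    return max(range(len(fused)), key=fused.__getitem__)

# Toy example with 3 hypothetical gesture classes.
p_radar = [0.7, 0.2, 0.1]    # radar branch favours class 0
p_thermal = [0.3, 0.6, 0.1]  # thermal branch favours class 1
fused = fuse_weighted_average(p_radar, p_thermal, w_radar=0.6)
print(predict(fused))  # -> 0
```

The same fused-probability interface extends naturally to the trainable fusers (logistic regression, multilayer perceptron, random forest): each takes the concatenated per-class probabilities from both branches as input features and outputs a final class decision.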