Deep Neural Networks (DNNs) trained on large datasets have been shown to capture high-quality features describing image data. Numerous studies have proposed ways to transfer DNN structures trained on large datasets to classification tasks represented by relatively small datasets. Owing to the limitations of these proposals, it is still not well understood how to adapt a pre-trained model to a new task effectively. Typically, the transfer process combines fine-tuning with the training of adaptation layers; both, however, are susceptible to data shortage and high computational cost. This work proposes an improvement to the well-known AlexNet feature extraction technique. The proposed approach applies a Recursive Neural Network (RNN) structure to features extracted by a deep Convolutional Neural Network (CNN) pre-trained on a large dataset. Object recognition experiments on the Washington RGB-D image dataset show that the proposed method combines structural simplicity with higher recognition accuracy at a lower computational cost than other relevant methods. The new approach requires no training at the feature extraction phase and can be performed very efficiently; the output features are compact and highly discriminative, and can be used with a simple classifier in object recognition settings.
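The training-free extraction step the abstract describes — fixed random recursive layers that pool a pre-trained CNN's feature map into one compact vector — can be sketched as follows. This is a minimal illustration only: the map shape, the 2x2 merge rule, the `tanh` nonlinearity, and the parameter names `n_rnn` and `seed` are assumptions, not the paper's exact configuration.

```python
import numpy as np

def random_rnn_features(cnn_map, n_rnn=4, seed=0):
    """Pool a CNN feature map of shape (d, r, r), with r a power of two,
    into a d-dimensional vector using fixed random recursive weights.
    Concatenating several such poolings gives the final descriptor.
    No weights are trained, matching the abstract's claim of a
    training-free feature extraction phase (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    d, r, _ = cnn_map.shape
    outputs = []
    for _ in range(n_rnn):
        # one random weight matrix per recursive net, reused at every level
        W = rng.standard_normal((d, 4 * d)) / np.sqrt(4 * d)
        x = cnn_map
        while x.shape[1] > 1:
            r = x.shape[1]
            # gather each 2x2 block of neighbouring feature vectors (4d values)
            blocks = x.reshape(d, r // 2, 2, r // 2, 2)
            blocks = blocks.transpose(1, 3, 0, 2, 4).reshape(r // 2, r // 2, 4 * d)
            # merge the four children into one parent vector, recursively
            x = np.tanh(blocks @ W.T).transpose(2, 0, 1)  # (d, r/2, r/2)
        outputs.append(x.reshape(d))
    return np.concatenate(outputs)  # shape (n_rnn * d,)

# Usage: features from a dummy 8-channel 4x4 map, fed to any simple classifier.
feat = random_rnn_features(np.random.default_rng(1).standard_normal((8, 4, 4)), n_rnn=2)
```

Because the recursive weights are random and fixed, the cost of extraction is a handful of matrix products per image, which is what makes the descriptor cheap to compute.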