RMIT University

An Improved Initialization Method for Fast Learning in Long Short-Term Memory-Based Markovian Spectrum Prediction

journal contribution
posted on 2024-11-02, 18:37 authored by Niranjana Radhakrishnan, Kandeepan Sithamparanathan
The opportunistic sharing of frequency bands supported by the Dynamic Spectrum Access (DSA) paradigm addresses the spectrum scarcity issue in wireless communications. To this end, deep learning models such as Long Short-Term Memory (LSTM) have become a popular choice for spectrum prediction in cognitive radio applications. However, the computational cost of training such models can be very high, and delays in performing spectrum prediction (even on the order of milliseconds) can reduce spectrum utilization efficiency. Here, we propose a novel method to initialize an LSTM based on prior (statistical) knowledge of the input data, substantially reducing the training time and hence minimizing the delay in spectrum prediction. This article proposes the 'Kandeepan-Niranjana (K-N) initialization', a novel initialization methodology for an LSTM-based system model. We consider the well-known Markov-model-based spectrum utilization data with prior knowledge of the model parameters, such as the transition probabilities, to explain our method. Our results show that initialization with the proposed parameters significantly improves the training convergence of the LSTM-based model for spectrum prediction. We also observe fast training convergence when the proposed method is applied to a real spectrum dataset.
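As a hedged illustration of the general idea (not the paper's K-N formula, which is defined in the article itself), the sketch below generates two-state Markovian channel-occupancy data and seeds an LSTM's forget-gate biases from the chain's transition probabilities instead of leaving them at their random defaults. The transition matrix, hidden size, and the log-odds seeding rule are all assumptions introduced here purely for illustration.

```python
import numpy as np
import torch
import torch.nn as nn

# Two-state Markov chain for channel occupancy (0 = idle, 1 = busy).
# P[i, j] = Pr(next state = j | current state = i); values are illustrative only.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

def simulate_occupancy(P, n_steps, rng):
    """Generate a Markovian spectrum-occupancy sequence from transition matrix P."""
    states = np.empty(n_steps, dtype=np.int64)
    states[0] = rng.integers(2)
    for t in range(1, n_steps):
        states[t] = rng.choice(2, p=P[states[t - 1]])
    return states

rng = np.random.default_rng(0)
seq = simulate_occupancy(P, 10_000, rng)

# One-layer LSTM that predicts the next occupancy state from the current one.
lstm = nn.LSTM(input_size=1, hidden_size=16, batch_first=True)
head = nn.Linear(16, 1)

# Hypothetical prior-informed initialization (NOT the K-N rule from the paper):
# bias the forget gate with the log-odds of the busy state persisting, so the
# cell's initial dynamics roughly match the persistence of the Markov chain.
with torch.no_grad():
    stay_logit = float(np.log(P[1, 1] / P[1, 0]))  # persistence of the busy state
    H = lstm.hidden_size
    # PyTorch packs gate biases in the order [input, forget, cell, output].
    lstm.bias_ih_l0[H:2 * H].fill_(stay_logit)   # forget-gate bias
    lstm.bias_hh_l0[H:2 * H].zero_()

# Forward pass over the simulated history; logits[t] scores the state at t + 1.
x = torch.tensor(seq[:-1], dtype=torch.float32).view(1, -1, 1)
out, _ = lstm(x)
logits = head(out)
```

Training this predictor with a standard loss (for example, binary cross-entropy against seq[1:]) would then start from gates whose persistence roughly reflects the chain's statistics, which is the intuition behind prior-informed initialization; the actual K-N initialization should be taken from the article.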

History

Journal

IEEE Transactions on Cognitive Communications and Networking

Volume

7

Issue

3

Start page

729

End page

738

Total pages

10

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Place published

United States

Language

English

Copyright

© 2020 IEEE.

Former Identifier

2006110976

Esploro creation date

2023-01-30
