TY - JOUR
T1 - A Quaternion Gated Recurrent Unit Neural Network for Sensor Fusion
AU - Onyekpe, Uche
AU - Palade, Vasile
AU - Kanarachos, Stratis
AU - Christopoulos, Stavros Richard G.
N1 - Publisher Copyright:
© 2021, MDPI AG. All rights reserved.
PY - 2021/3/1
Y1 - 2021/3/1
N2 - Recurrent Neural Networks (RNNs) are known for their ability to learn relationships within temporal sequences. Gated Recurrent Unit (GRU) networks have found use in challenging time-dependent applications such as Natural Language Processing (NLP), financial analysis and sensor fusion due to their capability to cope with the vanishing gradient problem. GRUs are also known to be more computationally efficient than their variant, the Long Short-Term Memory neural network (LSTM), due to their less complex structure and, as such, are more suitable for applications requiring more efficient management of computational resources. Many such applications require a stronger mapping of their features to further enhance the prediction accuracy. A novel Quaternion Gated Recurrent Unit (QGRU) is proposed in this paper, which leverages the internal and external dependencies within the quaternion algebra to map correlations within and across multidimensional features. The QGRU can be used to efficiently capture the inter- and intra-dependencies within multidimensional features, unlike the GRU, which only captures the dependencies within the sequence. Furthermore, the performance of the proposed method is evaluated on a sensor fusion problem involving navigation in Global Navigation Satellite System (GNSS)-deprived environments as well as a human activity recognition problem. The results obtained show that the QGRU produces competitive results with almost 3.7 times fewer parameters compared to the GRU. The QGRU code is available at https://github.com/onyekpeu/Quarternion-Gated-Recurrent-Unit.
AB - Recurrent Neural Networks (RNNs) are known for their ability to learn relationships within temporal sequences. Gated Recurrent Unit (GRU) networks have found use in challenging time-dependent applications such as Natural Language Processing (NLP), financial analysis and sensor fusion due to their capability to cope with the vanishing gradient problem. GRUs are also known to be more computationally efficient than their variant, the Long Short-Term Memory neural network (LSTM), due to their less complex structure and, as such, are more suitable for applications requiring more efficient management of computational resources. Many such applications require a stronger mapping of their features to further enhance the prediction accuracy. A novel Quaternion Gated Recurrent Unit (QGRU) is proposed in this paper, which leverages the internal and external dependencies within the quaternion algebra to map correlations within and across multidimensional features. The QGRU can be used to efficiently capture the inter- and intra-dependencies within multidimensional features, unlike the GRU, which only captures the dependencies within the sequence. Furthermore, the performance of the proposed method is evaluated on a sensor fusion problem involving navigation in Global Navigation Satellite System (GNSS)-deprived environments as well as a human activity recognition problem. The results obtained show that the QGRU produces competitive results with almost 3.7 times fewer parameters compared to the GRU. The QGRU code is available at https://github.com/onyekpeu/Quarternion-Gated-Recurrent-Unit.
KW - Autonomous vehicle navigation
KW - Gated recurrent unit
KW - GPS outage
KW - Human activity recognition
KW - Inertial navigation
KW - INS
KW - Neural networks
KW - Quaternion gated recurrent unit
KW - Quaternion neural network
UR - http://www.scopus.com/inward/record.url?scp=85102975375&partnerID=8YFLogxK
U2 - 10.3390/info12030117
DO - 10.3390/info12030117
M3 - Article
AN - SCOPUS:85102975375
VL - 12
JO - Information (Switzerland)
JF - Information (Switzerland)
SN - 2078-2489
IS - 3
M1 - 117
ER -