Gated Recurrent Unit
The Gated Recurrent Unit (GRU) is a variant of the recurrent neural network (RNN) proposed by Cho et al. in 2014 and analyzed in "Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling". The GRU is designed to mitigate the vanishing gradient problem that traditional RNNs encounter when processing long sequences. It controls the flow of information through two gates, an update gate and a reset gate, which allow it to better capture long-term dependencies in time series.
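The gating mechanism described above can be sketched as a single GRU step in NumPy. This is a minimal illustration, not a production implementation; the dimensions, parameter names, and initialization scheme are assumptions chosen for clarity:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU time step: the gates decide how much of the previous
    hidden state to keep versus overwrite with the candidate state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x_t + Uz @ h_prev + bz)              # update gate
    r = sigmoid(Wr @ x_t + Ur @ h_prev + br)              # reset gate
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev) + bh)  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde               # interpolation

# Toy dimensions and random parameters (assumptions for illustration).
rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 3
shapes = [(hidden_dim, input_dim), (hidden_dim, hidden_dim), (hidden_dim,)] * 3
params = [rng.standard_normal(s) * 0.1 for s in shapes]

# Run a length-5 random sequence through the cell.
h = np.zeros(hidden_dim)
for x_t in rng.standard_normal((5, input_dim)):
    h = gru_step(x_t, h, params)
print(h.shape)  # (3,)
```

Because the new state is a gate-weighted interpolation between the previous state and the candidate, the update gate can stay near 0 for many steps and carry information across long spans with little decay, which is how the GRU eases the vanishing gradient problem. Note that some formulations swap the roles of z and (1 - z) in the final interpolation; both conventions appear in the literature.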