The Gated Recurrent Unit (GRU) is similar to the LSTM, but instead of input, forget, and output gates, it has only an update gate and a reset gate. Together, these two gates let the model decide how much past information to carry forward and how much to discard. LSTMs and GRUs behave similarly in practice and are often used interchangeably.
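To make the two gates concrete, here is a minimal sketch of a single GRU step in NumPy. The parameter names (`W_z`, `U_z`, and so on) and the `gru_step` function are illustrative assumptions, not part of any particular library; the update follows the standard formulation where the new hidden state interpolates between the previous state and a candidate state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, params):
    """One GRU time step for input x_t and previous hidden state h_prev."""
    W_z, U_z, b_z = params["z"]  # update-gate parameters (illustrative names)
    W_r, U_r, b_r = params["r"]  # reset-gate parameters
    W_h, U_h, b_h = params["h"]  # candidate-state parameters

    # Update gate: how much of the new candidate state to let in.
    z = sigmoid(W_z @ x_t + U_z @ h_prev + b_z)
    # Reset gate: how much past information the candidate gets to see.
    r = sigmoid(W_r @ x_t + U_r @ h_prev + b_r)
    # Candidate hidden state, computed on the reset-scaled history.
    h_tilde = np.tanh(W_h @ x_t + U_h @ (r * h_prev) + b_h)
    # Interpolate between the old state (kept) and the candidate (passed on).
    return (1.0 - z) * h_prev + z * h_tilde

# Toy usage: random weights, a 5-step input sequence of 4 features,
# and a hidden state of size 8.
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
params = {g: (rng.normal(size=(n_hid, n_in)) * 0.1,
              rng.normal(size=(n_hid, n_hid)) * 0.1,
              np.zeros(n_hid)) for g in ("z", "r", "h")}
h = np.zeros(n_hid)
for x_t in rng.normal(size=(5, n_in)):
    h = gru_step(x_t, h, params)
```

Note how the final line captures the GRU's core idea: when the update gate `z` is near 0, the old state passes through almost unchanged, which is what helps gradients flow across long sequences.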
See Simeon Kostadinov's detailed explanation of GRUs on Towards Data Science.