

Deep Dive into Gated Recurrent Units (GRU): Understanding the Math behind RNNs
Gated Recurrent Unit (GRU) is a simplified version of Long Short-Term Memory (LSTM). Let’s see how it works in this article.
shivamshinde92722
Jan 14, 2023 · 4 min read


From Vanilla RNNs to LSTMs: A Practical Guide to Long Short-Term Memory
Long short-term memory (LSTM) networks have become a go-to tool for tasks like machine translation, language modeling, and speech recognition.
shivamshinde92722
Jan 6, 2023 · 3 min read