Recurrent Neural Networks, Long Short-Term Memory networks, and Gated Recurrent Units all share one trait: they are unidirectional. That means context is built by considering only the past. However, some use cases benefit from having context of both the past and the future to estimate the present. For instance, machine translation is one NLP use case that heavily benefits from context built from both the past and the future. Networks designed this way are called Bidirectional.
While we are discussing use cases, you might want to pause and think of one case where you would never use a Bidirectional Neural Network, and share it in the comments.
How a BiRNN works (or essentially how any Bidirectional network works) is simple: two separate RNN layers process the sequence from opposite directions, and their outputs are then merged (typically by concatenation) to form the final output, as sketched below.
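To make this concrete, here is a minimal sketch in PyTorch. The layer sizes and the toy input are illustrative choices of mine, not taken from the text; setting `bidirectional=True` is what creates the two directional layers internally and concatenates their hidden states in the output.

```python
import torch
import torch.nn as nn

# Toy dimensions (illustrative only)
input_size, hidden_size, seq_len, batch = 8, 16, 5, 2

# A single bidirectional RNN layer: PyTorch runs one RNN left-to-right
# and another right-to-left, then concatenates their outputs.
birnn = nn.RNN(input_size, hidden_size, batch_first=True, bidirectional=True)

x = torch.randn(batch, seq_len, input_size)   # (batch, time, features)
output, h_n = birnn(x)

# output: (batch, seq_len, 2 * hidden_size) -- forward and backward
# hidden states concatenated at every time step.
print(output.shape)   # torch.Size([2, 5, 32])

# h_n: (num_directions, batch, hidden_size) -- the final hidden state of
# each direction (the forward RNN's last step, the backward RNN's first).
print(h_n.shape)      # torch.Size([2, 2, 16])
```

Under the hood this is equivalent to running two separate RNN layers, feeding one of them the reversed sequence, and concatenating the two outputs along the feature dimension.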
BiRNNs and other bidirectional networks are computationally expensive: they run two recurrent layers over every sequence, so they are harder to train, require more memory, and take considerably more time to train. However, they usually outperform unidirectional networks in terms of accuracy and can also handle variable-length sequences better.