Hybrid decoding combines different iterative decoding algorithms with the aim of improving error performance or reducing decoding complexity. This can be done, for example, by using a fixed blend of different algorithms in every iteration (time-invariant hybrid: H-TI), or by switching between different algorithms throughout the iteration process (switch-type hybrid: H-ST). In this work, we study H-TI and H-ST algorithms both asymptotically, using density evolution, and at finite block lengths, using simulations, and show that these algorithms perform considerably better than their constituent algorithms. We also investigate the convergence properties of H-TI and H-ST algorithms, under both perfect and imperfect knowledge of the channel, and show that, compared to H-ST algorithms such as Gallager's algorithm B, H-TI algorithms are far less sensitive to channel conditions and can therefore be more attractive in practice.
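As a rough illustration of the density-evolution analysis mentioned above (a sketch, not the paper's actual setup), consider hard-decision message passing on a binary symmetric channel. For a regular LDPC ensemble with variable degree `dv` and check degree `dc`, the standard Gallager-B recursion tracks the message error probability, and a time-invariant hybrid update can be modeled as a convex combination of the constituent updates. The code ensembles, flipping thresholds `b`, and mixing weights below are illustrative assumptions.

```python
from math import comb

def de_step(p, p0, dv, dc, b):
    """One density-evolution iteration of Gallager-B-type decoding on a BSC.

    p  : current message error probability
    p0 : channel crossover probability
    b  : number of disagreeing check messages needed to flip the channel bit
    """
    # Error probability of a check-to-variable message
    q = (1 - (1 - 2 * p) ** (dc - 1)) / 2

    # S(x) = P[at least b of the dv-1 incoming messages have marginal x]
    def s(x):
        return sum(comb(dv - 1, t) * x ** t * (1 - x) ** (dv - 1 - t)
                   for t in range(b, dv))

    # Wrong channel bit kept (fewer than b correct messages disagree),
    # plus correct channel bit flipped (at least b wrong messages disagree)
    return p0 * (1 - s(1 - q)) + (1 - p0) * s(q)

def hti_step(p, p0, dv, dc, weights):
    """Time-invariant hybrid: a convex combination of constituent updates.

    weights maps each flipping threshold b to its mixing fraction alpha_b
    (an illustrative model of an H-TI blend of hard-decision algorithms).
    """
    return sum(a * de_step(p, p0, dv, dc, b) for b, a in weights.items())

def converges(p0, update, iters=2000, tol=1e-10):
    """True if the error probability is driven to zero at crossover p0."""
    p = p0
    for _ in range(iters):
        p = update(p, p0)
        if p < tol:
            return True
    return False

def threshold(update, lo=0.0, hi=0.5, steps=40):
    """Bisection search for the largest convergent crossover probability."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if converges(mid, update):
            lo = mid
        else:
            hi = mid
    return lo

if __name__ == "__main__":
    # Gallager A on the (3,6) ensemble: known threshold near 0.0394
    thr_a = threshold(lambda p, p0: de_step(p, p0, 3, 6, 2))
    # An illustrative 50/50 H-TI blend of thresholds b=2 and b=3 on (4,8)
    thr_mix = threshold(lambda p, p0: hti_step(p, p0, 4, 8, {2: 0.5, 3: 0.5}))
    print(f"(3,6) Gallager A threshold ~ {thr_a:.4f}")
    print(f"(4,8) H-TI blend threshold ~ {thr_mix:.4f}")
```

A switch-type (H-ST) scheme would instead change the update rule as a function of the iteration number, which is exactly how Gallager's algorithm B varies its flipping threshold over iterations.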