In this paper, we consider a class of models that describe parallel observations of a single source by many noisy sensors, lossy quantization at each sensor, and finally information fusion of the quantized data. These models elucidate phenomena in biophysics and neural information processing, as well as in detection networks and modern communications. Mutual information serves as the analytical measure of information exchange. We characterize the optimal information fusion rule as the one that maximizes the entropy of the corresponding output distribution. For discrete input distributions, this problem reduces to a generalized Knapsack problem, which is hard to solve in general. We propose a heuristic that greedily minimizes the loss of entropy in each step and show that, for binary information fusion, it attains the true optimum on dyadic distributions. The problem of finding optimal quantization rules is an essential part of the model and is treated analogously. For input distributions with a density, optimality is achieved by choosing appropriate quantization thresholds. Finally, applying the data processing inequality yields an upper bound on the mutual information of arbitrary stochastic pooling channels. This bound provides interesting insight into the resilience of parallel noisy information processing in biological systems.
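To illustrate the greedy idea, the following minimal Python sketch (illustrative only, not necessarily the authors' exact algorithm; the names `greedy_binary_fusion` and `binary_entropy` are our own) performs binary fusion of a discrete distribution by packing probability masses into two output cells so that each cell's total stays as close as possible to 1/2, i.e. so that the entropy of the binary output is as large as possible. For dyadic distributions, where every probability is a power of 1/2, the split is exact and the fused output reaches the 1-bit maximum.

```python
from math import log2

def greedy_binary_fusion(probs):
    """Split a discrete distribution into two fusion cells by first-fit
    descending: fill cell 0 up to probability 1/2, overflow to cell 1.
    Keeping both cell masses near 1/2 maximizes the output entropy."""
    cells = ([], [])
    mass = [0.0, 0.0]
    for p in sorted(probs, reverse=True):
        i = 0 if mass[0] + p <= 0.5 else 1
        cells[i].append(p)
        mass[i] += p
    return cells, mass

def binary_entropy(w):
    """Entropy in bits of the binary distribution (w, 1 - w)."""
    return 0.0 if w in (0.0, 1.0) else -w * log2(w) - (1 - w) * log2(1 - w)

# Dyadic input: the greedy split is exact, output entropy hits 1 bit.
_, (w0, w1) = greedy_binary_fusion([1/2, 1/4, 1/8, 1/16, 1/16])
print(w0, w1, binary_entropy(w0))   # 0.5 0.5 1.0

# Non-dyadic input: greedy remains a heuristic, entropy can fall below 1 bit.
_, (w0, w1) = greedy_binary_fusion([0.4, 0.3, 0.3])
print(w0, w1, binary_entropy(w0))   # 0.4 0.6 ~0.971
```

The second example shows why the general discrete case remains a Knapsack-type search: when the masses cannot be balanced exactly, some loss of output entropy is unavoidable, and the greedy split is only a heuristic.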
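For reference, the bound in the last step rests on the standard data-processing argument: writing \(X\) for the source, \(Q\) for the vector of quantized sensor outputs, and \(Y\) for the fused output (symbols chosen here for illustration), any pooling channel of the kind above forms a Markov chain, so
\[
X \;\to\; Q \;\to\; Y \quad\Longrightarrow\quad I(X;Y) \;\le\; \min\{\,I(X;Q),\, I(Q;Y)\,\} \;\le\; H(Y).
\]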