This paper proposes a generalization-and-reduction learning method based on an Arbiter: a multi-layer incremental induction algorithm that is coupled with an existing non-incremental induction algorithm so that it can learn incrementally from noisy data. The method targets learning problems in which 1) the data set is very large, 2) data quality is low, and 3) after partitioning, each subset still satisfies the requirements of data completeness and information integrity. These conditions guarantee that partitioning the data does not degrade learning reliability. The method comprises three processing steps: applying the Arbiter to the data set, rule generalization, and reduction of the data subsets. Experimental results show that, by applying the Arbiter to learn from partitioned data in a large-scale data set, we can sustain the accuracy of a single classifier trained on the entire data set and improve on some existing non-incremental learning algorithms.
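The arbiter scheme over partitioned data can be sketched roughly as follows. This is a toy illustration, not the paper's actual algorithm: the decision-stump base learner, the function names, and the disagreement-based arbitration rule are all assumptions, standing in for whatever non-incremental learner and arbitration policy the method actually uses.

```python
# Toy sketch of arbiter-style learning from partitioned data (illustrative
# only): base classifiers are trained on disjoint subsets, an arbiter is
# trained on the examples the base classifiers disagree on, and the arbiter
# breaks ties at prediction time.

def train_stump(data):
    """Hypothetical non-incremental base learner: the best single-threshold
    rule on 1-D examples given as (x, label) pairs with labels in {0, 1}."""
    best = None
    for t in sorted({x for x, _ in data}):
        for lo, hi in ((0, 1), (1, 0)):
            acc = sum((hi if x >= t else lo) == y for x, y in data)
            if best is None or acc > best[0]:
                best = (acc, t, lo, hi)
    _, t, lo, hi = best
    return lambda x: hi if x >= t else lo

def train_arbiter(partitions):
    """Train one base classifier per partition, plus an arbiter on the
    disagreement set; return the combined predictor."""
    bases = [train_stump(p) for p in partitions]
    # Arbiter training set: examples on which the base classifiers disagree.
    disagreement = [(x, y)
                    for part in partitions for x, y in part
                    if len({c(x) for c in bases}) > 1]
    arbiter = train_stump(disagreement) if disagreement else None

    def predict(x):
        votes = {c(x) for c in bases}
        if len(votes) == 1:   # unanimous: no arbitration needed
            return votes.pop()
        return arbiter(x)     # disagreement: the arbiter decides
    return predict
```

For example, with partitions `[(1,0),(2,0),(6,1),(7,1)]` and `[(3,0),(4,0),(5,1),(8,1)]`, the two stumps disagree only at `x = 5`, so the arbiter is trained on that disagreement and resolves it at prediction time.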