This communication introduces a new bound on the probability of error of margin multi-category classifiers. We consider classifiers based on classes of vector-valued functions with one component function per category. The gamma-dimensions [3] of the classes of component functions are assumed to grow no faster than polynomially in gamma^{-1}. We adopt a standard approach which starts with a bound on the risk in terms of a Rademacher complexity [4]. In [5], this Rademacher complexity is upper bounded by the sum of those of the component function classes, which yields a bound at least linear in the number C of categories. In [1, 2], the Rademacher complexity is bounded by a function of the metric entropy using the chaining method [6], so as to obtain a sublinear dependency on C. The quality of the final result then depends on the generalized Sauer-Shelah lemma used. We establish that dimension-free lemmas (those yielding metric entropy bounds independent of the sample size) do not improve the final convergence rate. Consequently, we choose the lemma most favorable with respect to C. In this way, we obtain a confidence interval growing as the square root of C, with a convergence rate similar to those in [1, 2]. This behaviour holds irrespective of the degree of the polynomial.
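For concreteness, the polynomial-growth hypothesis and the announced dependency on C can be written schematically as follows. This is a sketch only: the notation d_{F_k}(gamma) for the gamma-dimension of the k-th component class, the constants K and p, and the rate symbol epsilon(m) are introduced here for illustration and are not drawn from [1-6].

% Schematic form of the hypothesis (notation illustrative, not from the references):
% d_{\mathcal{F}_k}(\gamma) denotes the gamma-dimension of the k-th component class.
\[
  \exists K > 0,\ \exists p > 0 \ : \quad \forall \gamma \in (0, 1],\quad
  d_{\mathcal{F}_k}(\gamma) \;\le\; K\,\gamma^{-p}, \qquad 1 \le k \le C.
\]
% Announced dependency on C: the confidence-interval term grows as sqrt(C)
% times a sample-size-dependent rate epsilon(m) (symbol introduced here for
% illustration), and this holds irrespective of the degree p.
\[
  \text{confidence interval} \;=\; O\!\bigl(\sqrt{C}\,\epsilon(m)\bigr).
\]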