In functional classification problems, the data available for learning are characterized by functions rather than vectors of attributes. Consequently, multivariate classifiers need to be adapted, and new types of classifiers designed, to take into account the special characteristics of these types of data. In this work, an empirical evaluation of different classification methods is carried out on a variety of functional classification problems from different areas of application. The classifiers considered include nearest centroids with functional means as class prototypes and functional distances, standard multivariate classifiers used in combination with a variable selection method, classifiers based on the notion of functional depth, a functional version of k-nearest neighbors (k-NN), and random forest. From the results of this comparative study, one concludes that random forest is among the best off-the-shelf classifiers not only for multivariate but also for functional classification problems. The variable selection method used in combination with a quadratic discriminant achieves fairly good overall accuracy using only a small set of impact points. This dimensionality reduction leads to improvements in both efficiency and interpretability. Finally, a functional version of k-NN that uses the alpha-Mahalanobis distance exhibits consistently good predictive performance in all the problems considered. This robustness makes k-NN a good benchmark for functional classification.
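To make the simplest of these methods concrete, the following is a minimal sketch of a nearest-centroid classifier for functional data, in which each class prototype is the pointwise functional mean of the training curves and distances are functional L2 distances between curves sampled on a common grid. All names here (`FunctionalNearestCentroid`, `functional_l2_distance`) are illustrative, not the implementation evaluated in this work, and a uniform sampling grid is assumed for the quadrature.

```python
import numpy as np

def functional_l2_distance(x, y, grid):
    # Approximate the L2 distance between two curves sampled on a
    # common uniform grid (Riemann-sum quadrature; illustrative only).
    step = grid[1] - grid[0]
    return np.sqrt(np.sum((x - y) ** 2) * step)

class FunctionalNearestCentroid:
    """Nearest-centroid classifier for discretized curves: each class
    prototype is the pointwise functional mean of its training curves."""

    def fit(self, X, y, grid):
        # X: array of shape (n_curves, n_grid_points); y: class labels.
        self.grid = grid
        self.classes_ = np.unique(y)
        self.centroids_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, X):
        # Assign each curve to the class with the nearest functional mean.
        dists = np.array(
            [[functional_l2_distance(x, c, self.grid)
              for c in self.centroids_] for x in X]
        )
        return self.classes_[dists.argmin(axis=1)]
```

A functional k-NN variant would follow the same pattern, replacing the centroid comparison with a vote among the k training curves closest in the chosen functional distance.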