Bayes classifiers for functional data pose a challenge because probability density
functions do not exist for functional data. As a consequence, the classical Bayes
classifier using density quotients needs to be modified. We propose to use density ratios
of projections on a sequence of eigenfunctions that are common to the groups to be
classified. The density ratios can then be factored into density ratios of individual
functional principal components, whence the classification problem reduces to a sequence
of nonparametric one-dimensional density estimates. This extends to functional data some of
the very earliest nonparametric Bayes classifiers, which were based on simple density
ratios in the one-dimensional case. The factorization of the density quotients avoids the
curse of dimensionality that would otherwise severely affect Bayes classifiers for
functional data. We demonstrate that in the case of Gaussian
functional data, the proposed functional Bayes classifier reduces to a functional version
of the classical quadratic discriminant. A study of the asymptotic behavior of the proposed
classifiers shows that, under certain conditions, the misclassification rate converges to
zero in the large-sample limit, a phenomenon that has been referred to as
"perfect classification". The proposed classifiers also perform favorably in finite-sample
applications, as we demonstrate in comparisons with other functional classifiers in
simulations and various data applications, including wine spectral data, functional
magnetic resonance imaging (fMRI) data for attention deficit hyperactivity disorder (ADHD)
patients, and yeast gene expression data.
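
To make the factorization concrete, the following is a sketch in illustrative notation that the abstract itself does not fix: for a new curve $X$, write $\xi_j$ for its projection on the $j$-th common eigenfunction, let $f_{kj}$ denote the density of $\xi_j$ in group $k \in \{0,1\}$ with prior probability $\pi_k$, and truncate at $J$ components. Assuming the projections can be treated as independent across components, as the factorization described above presupposes, the density-ratio classifier takes the form
\[
Q_J(X) \;=\; \log\frac{\pi_1}{\pi_0} \;+\; \sum_{j=1}^{J}\log\frac{f_{1j}(\xi_j)}{f_{0j}(\xi_j)},
\qquad \text{assign } X \text{ to group } 1 \text{ if } Q_J(X) > 0,
\]
where each one-dimensional density $f_{kj}$ can be estimated nonparametrically, for example by kernel density estimation. If the projections are moreover Gaussian with group means $\mu_{kj}$ and variances $\lambda_{kj}$, each summand becomes
\[
\log\frac{f_{1j}(\xi_j)}{f_{0j}(\xi_j)}
\;=\; \tfrac{1}{2}\log\frac{\lambda_{0j}}{\lambda_{1j}}
\;-\;\frac{(\xi_j-\mu_{1j})^2}{2\lambda_{1j}}
\;+\;\frac{(\xi_j-\mu_{0j})^2}{2\lambda_{0j}},
\]
which is the quadratic-discriminant form referred to above. The precise conditions under which the factorization and the Gaussian reduction hold are those stated in the paper itself.

A minimal computational sketch along the same lines is given below, assuming that the functional principal component scores of the training and test curves have already been computed (the abstract does not specify how the common eigenfunctions are estimated), that the priors are estimated by the group proportions, and that the one-dimensional densities are estimated by Gaussian kernel density estimates; all function names are hypothetical.

    import numpy as np
    from scipy.stats import gaussian_kde

    def fit_density_ratio_classifier(scores0, scores1):
        # scores0, scores1: arrays of shape (n_k, J) holding the first J
        # functional principal component scores of the training curves in
        # groups 0 and 1 (assumed to be computed beforehand).
        n0, n1 = len(scores0), len(scores1)
        prior_log_ratio = np.log(n1 / n0)  # log(pi_1 / pi_0) from group proportions
        kdes0 = [gaussian_kde(scores0[:, j]) for j in range(scores0.shape[1])]
        kdes1 = [gaussian_kde(scores1[:, j]) for j in range(scores1.shape[1])]
        return prior_log_ratio, kdes0, kdes1

    def classify(new_scores, prior_log_ratio, kdes0, kdes1, eps=1e-300):
        # new_scores: array of shape (m, J); returns 0/1 group labels.
        q = np.full(len(new_scores), prior_log_ratio)
        for j, (k0, k1) in enumerate(zip(kdes0, kdes1)):
            # add the estimated one-dimensional log density ratio for component j
            q += np.log(k1(new_scores[:, j]) + eps) - np.log(k0(new_scores[:, j]) + eps)
        return (q > 0).astype(int)

In practice the truncation level $J$ and the kernel bandwidths would be chosen data-adaptively, for instance by cross-validation.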