Adam Cannon
(Department of Computer Science, Columbia University)
VC-type generalization error bounds for data-dependent classifiers
Abstract
We extend ideas from the VC theory for statistical learning to data-dependent
classes of sets or classifiers, and we provide an example of such a
data-dependent class of classifiers. Using the theoretical foundations
presented for general data-dependent classes, we develop a structural risk
minimization principle for our example similar to that of Vapnik and
Chervonenkis, and we discuss possible advantages over conventional
(non-data-dependent) classes.
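
For orientation, a standard (non-data-dependent) VC-type bound of the kind the abstract alludes to can be sketched as follows; the exact constants vary by formulation, and the paper's data-dependent version differs. With probability at least $1-\delta$ over an i.i.d. sample of size $n$, uniformly over all classifiers $f$ in a class of VC dimension $d$,

\[
R(f) \;\le\; \hat{R}_n(f) \;+\; \sqrt{\frac{8}{n}\left(d\ln\frac{2en}{d} + \ln\frac{4}{\delta}\right)},
\]

where $R(f)$ is the true risk and $\hat{R}_n(f)$ the empirical risk. Structural risk minimization then selects, across a nested hierarchy of classes, the classifier minimizing the sum of the empirical risk and this complexity term.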