Abstract:  
We introduce a reduction-based model for analyzing supervised learning
tasks. We use this model to devise a new reduction
from multi-class cost-sensitive classification to binary
classification with the following
guarantee: If the learned binary classifier has error rate at most ε
then the cost-sensitive classifier has cost at most 2ε times the
expected sum of costs of all possible labels.
Since cost-sensitive classification can embed any bounded-loss,
finite-choice supervised learning task, this result shows that any
such task can be solved using a binary classification oracle.
Finally, we present experimental results showing
that our new reduction outperforms existing
algorithms for multi-class cost-sensitive learning.