We study the problem of performing cautious inferences for an ordinal classification (a.k.a. ordinal regression) task, that is, when the possible classes are totally ordered. By cautious inference we mean that we may produce partial predictions when the available information is insufficient to provide reliable precise ones. We do so by estimating probabilistic bounds instead of precise probabilities. These bounds induce a convex set of possible probabilistic models, from which we perform inferences. As estimates and predictions for such sets of models are usually computationally harder to obtain than for precise ones, we study the extension of two binary decomposition strategies that remain easy to obtain and computationally efficient to manipulate when shifting from precise to bounded estimates. We demonstrate the potential usefulness of such a cautious attitude in tests performed on benchmark data sets.
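To illustrate the kind of inference the abstract describes, here is a minimal Python sketch (not taken from the paper): assuming a cumulative, Frank-and-Hall style binary decomposition, it derives possibly conservative bounds on each class probability from lower/upper bounds on P(Y <= k), and keeps every class not ruled out by interval dominance, so the prediction stays partial when the bounds overlap too much. The function names, example numbers, and the choice of interval dominance as the decision rule are illustrative assumptions, not necessarily the paper's exact construction.

from typing import List, Tuple


def class_probability_bounds(
    cum_lower: List[float], cum_upper: List[float]
) -> List[Tuple[float, float]]:
    """Bounds on P(Y = k) from bounds on F(k) = P(Y <= k).

    cum_lower/cum_upper have length m-1 for m classes; F(m) = 1 by definition.
    Since P(Y = k) = F(k) - F(k-1), a (conservative) lower bound is
    max(0, F_lower(k) - F_upper(k-1)) and an upper bound is
    min(1, F_upper(k) - F_lower(k-1)).
    """
    lo = [0.0] + list(cum_lower) + [1.0]   # F_lower(0) = 0, F_lower(m) = 1
    up = [0.0] + list(cum_upper) + [1.0]   # F_upper(0) = 0, F_upper(m) = 1
    m = len(cum_lower) + 1
    return [
        (max(0.0, lo[k] - up[k - 1]), min(1.0, up[k] - lo[k - 1]))
        for k in range(1, m + 1)
    ]


def cautious_prediction(bounds: List[Tuple[float, float]]) -> List[int]:
    """Indices (0-based) of classes kept under interval dominance:
    class j is rejected only if some class i has a lower bound
    strictly above j's upper bound."""
    kept = []
    for j, (_, up_j) in enumerate(bounds):
        dominated = any(lo_i > up_j for i, (lo_i, _) in enumerate(bounds) if i != j)
        if not dominated:
            kept.append(j)
    return kept


if __name__ == "__main__":
    # Three ordered classes; hypothetical bounds on P(Y <= 1) and P(Y <= 2).
    cum_lower = [0.05, 0.3]
    cum_upper = [0.15, 0.7]
    bounds = class_probability_bounds(cum_lower, cum_upper)
    print(bounds)                       # per-class (lower, upper) probabilities
    print(cautious_prediction(bounds))  # [1, 2]: a partial, set-valued prediction

In this toy run the lowest class is dominated and ruled out, but the two higher classes cannot be separated, so the cautious prediction is the set of both remaining classes rather than a single (possibly unreliable) one.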
from HAL : Dernières publications http://ift.tt/1sSmqBv