|
Multi-dimensional classification (MDC) refers to learning an association between individual inputs and multiple discrete output variables. Different output dimensions can take values from different ranges and are usually heterogeneous, corresponding to different semantics. Thus MDC is more general and more challenging than multi-label classification (MLC). One of the keys to its successful learning lies in how to take good advantage of the explicit and/or implicit relationships both between and within output dimensions. To this end, one effective strategy is to first transform the output space and then learn in the transformed space. In this paper, we first propose a new transformation approach with the following favorable characteristics: i) subsequent learning in the transformed space is relatively easier; ii) it reflects explicit within-dimension relationships; iii) it keeps the size of the output space invariant; iv) it overcomes the drawbacks of existing transformation approaches for MDC; v) it is decomposable over the output dimensions of MDC. Next, to effectively model the dependencies in the transformed problem, we present a novel Mahalanobis distance metric learning method that admits a closed-form solution. Interestingly, the method itself can be of independent interest. Finally, we conduct extensive experiments whose results show that our approach, combining the above two procedures, can beat the state-of-the-art MDC methods in most cases in terms of classification performance, while on MLC data sets our distance metric learning method by itself obtains classification performance competitive with counterparts designed specifically for MLC.
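To make the notion of a learned Mahalanobis metric concrete, here is a minimal sketch, assuming NumPy; the function name `mahalanobis_dist` and the factor `L` (so that M = Lᵀ L is positive semi-definite) are illustrative choices, not the paper's actual method or solver:

```python
import numpy as np

def mahalanobis_dist(x, y, L):
    """Mahalanobis distance d_M(x, y) = sqrt((x - y)^T M (x - y)),
    with M = L^T L guaranteed positive semi-definite by construction."""
    diff = L @ (x - y)          # map the difference through the learned factor L
    return float(np.sqrt(diff @ diff))

# Toy usage: with L = identity, the metric reduces to Euclidean distance;
# a learned L would instead be the output of a metric learning procedure.
x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 1.0, 2.0])
print(mahalanobis_dist(x, y, np.eye(3)))
```

Factorizing M as Lᵀ L is a common way to keep the learned matrix valid as a metric while allowing unconstrained optimization over L.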
|
Keywords: computer application technology, multi-dimensional classification, problem transformation, distance metric learning, closed-form solution.
|