
# MEMPM Matlab Toolbox

## Introduction

This is a MATLAB toolbox demonstrating two classification models: the Minimum Error Minimax Probability Machine (MEMPM) and the Biased Minimax Probability Machine (BMPM). Both are derived from the Minimax Probability Machine (MPM). For easy comparison, the MEMPM and the BMPM are implemented with the same data format as the MPM. The advantages of MEMPM and BMPM are:

• MEMPM is a worst-case, distribution-free Bayes optimal classifier. It converges to the true Bayes optimal classifier when a Gaussian distribution is assumed.
• MEMPM provides a lower bound on the probability of correctly classifying future data drawn from a class with a given mean and covariance matrix. This bound is superior to the bound of MPM.
• BMPM can be used for biased classification. It achieves the bias by maximizing the worst-case probability of correct classification on one class while keeping the worst-case probability of correct classification on the other class at an acceptable level.
• BMPM provides the corresponding lower bound for a classification accuracy indicator.
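The lower bounds above are of the Chebyshev type used throughout the MPM literature: for any distribution with mean x̄ and covariance Σ, P(wᵀx ≥ b) ≥ κ²/(1+κ²), where κ = (wᵀx̄ − b)/√(wᵀΣw). A minimal sketch of this bound in Python (illustrative only; the toolbox itself is MATLAB, and the function name here is ours):

```python
import numpy as np

def worst_case_accuracy(w, b, mean, cov):
    """Lower bound on P(w'x >= b) over all distributions with the given
    mean and covariance (the Chebyshev-type bound used by MPM-style models)."""
    kappa = (w @ mean - b) / np.sqrt(w @ cov @ w)
    kappa = max(kappa, 0.0)  # bound degenerates to 0 if the class mean is misclassified
    return kappa**2 / (1.0 + kappa**2)
```

For example, a class with mean (1, 0) and identity covariance, separated by the hyperplane x₁ = 0, has κ = 1 and hence a worst-case accuracy of at least 50%, no matter what the true distribution is.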

MEMPM is solved by the sequential BMPM method, i.e., a line search combined with BMPM. The line search is performed with the Quadratic Interpolation (QI) method, and BMPM is implemented by solving a concave-convex fractional programming problem in which every local maximum is a global maximum. The linear BMPM is implemented with the Rosen gradient projection method; the kernelized BMPM is implemented with the parametric method, which requires solving a second-order cone programming problem. We adopt the least-squares method and build our code on the MPM source code provided by Gert R. G. Lanckriet, whose distribution we gratefully acknowledge.
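The QI line search mentioned above fits a parabola through three trial points, jumps to its vertex, and repeats until the step shrinks. A hedged Python sketch of this idea (the toolbox's MATLAB implementation may differ in details such as bracketing and stopping rules):

```python
def qi_maximize(f, x1, x2, x3, tol=1e-6, max_iter=50):
    """Maximize a 1-D unimodal function by quadratic interpolation:
    fit a parabola through three points and move to its vertex."""
    pts = sorted([x1, x2, x3])
    for _ in range(max_iter):
        a, b, c = pts
        fa, fb, fc = f(a), f(b), f(c)
        # Vertex of the parabola through (a, fa), (b, fb), (c, fc)
        num = (b - a)**2 * (fb - fc) - (b - c)**2 * (fb - fa)
        den = (b - a) * (fb - fc) - (b - c) * (fb - fa)
        if abs(den) < 1e-12:  # points are collinear; parabola is degenerate
            break
        x = b - 0.5 * num / den
        if abs(x - b) < tol:
            break
        # Keep the three best points as the next triple
        pts = sorted(sorted([a, b, c, x], key=f, reverse=True)[:3])
    return max(pts, key=f)
```

In the sequential BMPM setting, `f` would be the one-dimensional objective evaluated by a BMPM solve at each trial point; here it stands in as a generic scalar function.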

We also implement the robust MPM with unequal parameters (nu) by solving a fractional programming problem with the parametric method.
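The parametric method for fractional programming referred to here is commonly known as Dinkelbach's scheme: to maximize a ratio f(x)/g(x) with g > 0, repeatedly maximize f(x) − λ·g(x) and update λ to the current ratio until the subproblem's optimum reaches zero. A generic Python sketch (not the toolbox's actual solver; the toy ratio and the grid-search subproblem are our own illustration):

```python
import numpy as np

def dinkelbach(f, g, solve_subproblem, lam0=0.0, tol=1e-8, max_iter=100):
    """Parametric (Dinkelbach) method for max f(x)/g(x), assuming g > 0:
    iterate x_k = argmax_x f(x) - lam_k*g(x), then set lam_{k+1} = f(x_k)/g(x_k)."""
    lam = lam0
    x = solve_subproblem(lam)
    for _ in range(max_iter):
        if abs(f(x) - lam * g(x)) < tol:  # optimal ratio reached: F*(lam) = 0
            break
        lam = f(x) / g(x)
        x = solve_subproblem(lam)
    return x, lam

# Toy example: maximize (2x + 1) / (x^2 + 1) on [0, 3] with a grid-search subproblem.
f = lambda x: 2 * x + 1
g = lambda x: x**2 + 1
grid = np.linspace(0.0, 3.0, 3001)
solve = lambda lam: max(grid, key=lambda x: f(x) - lam * g(x))
x_opt, ratio = dinkelbach(f, g, solve)
```

In the toolbox's setting the subproblem would be a convex program (e.g., a second-order cone problem for the kernelized BMPM) rather than a grid search; the iteration on λ is the same.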

The toolbox is provided free for non-commercial use.

If you use this toolbox for research purposes, please cite it as:

Haiqin Yang, Kaizhu Huang, Irwin King, Michael R. Lyu and Laiwan Chan. Matlab Toolbox for Minimum Error Minimax Probability Machine (MEMPM-1.0), http://www.cse.cuhk.edu.hk/~miplab/mempm_toolbox/index.htm, 2004.

or

Haiqin Yang, Kaizhu Huang, Irwin King, Michael R. Lyu and Laiwan Chan. Matlab Toolbox for Biased Minimax Probability Machine (BMPM-1.0), http://www.cse.cuhk.edu.hk/~miplab/mempm_toolbox/index.htm, 2004.

We would be grateful if you reported any bugs you find or sent suggestions for improving the toolbox. You may email hqyang@cse.cuhk.edu.hk and kzhuang@cse.cuhk.edu.hk.

## References

1. Kaizhu Huang, Haiqin Yang, Irwin King, Michael R. Lyu, and Laiwan Chan, "Minimum Error Minimax Probability Machine," Journal of Machine Learning Research, vol. 5, pp. 1253-1286, 2004.
2. Kaizhu Huang, Haiqin Yang, Irwin King, and Michael R. Lyu, "Maximizing Sensitivity in Medical Diagnosis Using Biased Minimax Probability Machine," IEEE Transactions on Biomedical Engineering, vol. 53, iss. 5, pp. 821-831, 2006.
3. Kaizhu Huang, Haiqin Yang, Irwin King, and Michael R. Lyu, "Imbalanced Learning With Biased Minimax Probability Machine," IEEE Transactions on Systems, Man, and Cybernetics, Part B, vol. 36, iss. 4, pp. 913-923, 2006.
4. Kaizhu Huang, Haiqin Yang, Irwin King, and Michael R. Lyu, "Learning Classifiers from Imbalanced Data Based on Biased Minimax Probability Machine," in Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004), Washington, D.C., 2004, pp. 558-563.