Article title

Maximum Separation Partial Least Squares (MSPLS): a new method for classification in microarray experiment

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
The purpose of this paper is to propose a new classification method. Our MSPLS method is derived from the classic Partial Least Squares (PLS) algorithm, into which we incorporate the Maximum Separation Criterion. This approach yields weight vectors for which the dispersion between the classes is maximal and the dispersion within the classes is minimal. To compare the performance of the classifier we used two types of datasets: biological and simulated. Error rates and confidence intervals were estimated by the jackknife method.
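The paper's exact MSPLS algorithm is not reproduced in this record. As an illustration of the idea the abstract describes, the sketch below finds a weight vector maximizing a between-class versus within-class scatter ratio (a Fisher-style separation criterion, which we assume approximates the Maximum Separation Criterion) and estimates the error rate by the jackknife (leave-one-out) method. All function names and the synthetic data are our own, not the paper's.

```python
import numpy as np

def max_separation_weights(X, y):
    """Weight vector maximizing between-class over within-class dispersion.
    A Fisher-style scatter-ratio criterion; assumed here as a stand-in for
    the paper's Maximum Separation Criterion."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # The leading eigenvector of pinv(Sw) @ Sb maximizes the Rayleigh
    # quotient w'Sb w / w'Sw w.
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / np.linalg.norm(w)

def jackknife_error(X, y):
    """Leave-one-out (jackknife) misclassification rate of a
    nearest-class-mean classifier on the projected scores."""
    n = len(y)
    errors = 0
    for i in range(n):
        mask = np.arange(n) != i          # hold out sample i
        w = max_separation_weights(X[mask], y[mask])
        scores = X[mask] @ w
        centers = {c: scores[y[mask] == c].mean()
                   for c in np.unique(y[mask])}
        s = X[i] @ w
        pred = min(centers, key=lambda c: abs(s - centers[c]))
        errors += pred != y[i]
    return errors / n

# Simulated two-class data: class 0 is shifted along the first coordinate.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 5)) + [3, 0, 0, 0, 0],
               rng.normal(0, 1, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
err = jackknife_error(X, y)
```

With well-separated simulated classes as above, the jackknife error rate is close to zero; the recovered weight vector is dominated by the coordinate that actually separates the classes.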
Year
Volume
Pages
187–195
Physical description
Bibliography: 16 items; tables.
Authors
  • University of Silesia, Institute of Mathematics, ul. Bankowa 14, 40-007 Katowice
author
Bibliography
  • [1] BASTIEN P., VINZI V. E., TENENHAUS M., PLS generalized linear regression, Computational Statistics & Data Analysis, 48 (2005), pp. 17-46.
  • [2] BROBERG P., Statistical methods for ranking differentially expressed genes, Genome Biology, 4 (2003), R41.
  • [3] DUDOIT S., YANG Y. H., CALLOW M. J., SPEED T. P., Statistical methods for identifying differentially expressed genes in replicated cDNA microarray experiments, Technical Report 578, Department of Statistics, UC Berkeley, CA, 2000.
  • [4] EFRON B., Estimating the error rate of a prediction rule: improvement on cross-validation, Journal of the American Statistical Association, Vol. 78, No. 382 (1983).
  • [5] FUKUNAGA K., Introduction to Statistical Pattern Recognition, Academic Press Professional, New York, 1990.
  • [6] GARTHWAITE P. H., An interpretation of Partial Least Squares, Journal of the American Statistical Association, 89, 425 (March 1994), pp. 122.
  • [7] HÖSKULDSSON A., PLS regression methods, Journal of Chemometrics, vol. 2 (1988), pp. 211-228.
  • [8] HÖSKULDSSON A., Variable and subset selection in PLS regression, Chemometrics and Intelligent Laboratory Systems, 55 (2001), pp. 23-38.
  • [9] NGUYEN D. V., ROCKE D. M., On Partial Least Squares dimension reduction for microarray-based classification: a simulation study, Computational Statistics and Data Analysis, 46 (2004), pp. 407-425.
  • [10] WOLD H., Soft modeling by latent variables: the Non-Linear Iterative Partial Least Squares (NIPALS) approach, in: Perspectives in Probability and Statistics. Papers in Honour of M. S. Bartlett, London, 1975, pp. 117-142.
  • [11] WOLD S., MARTENS H., WOLD H., The multivariate calibration problem in chemistry solved by the PLS method, in: Proc. Conf. Matrix Pencils (A. Ruhe and B. Kågström, eds.), March 1982, Lecture Notes in Mathematics, Springer Verlag, Heidelberg, pp. 286-293.
  • [12] WOLD S., RUHE A., WOLD H., The collinearity problem in linear regression. The partial least squares (PLS) approach to generalized inverses, SIAM J. Sci. Stat. Comput., Vol. 5, No. 3 (September 1984).
  • [13] WOLD S., SJÖSTRÖM M., ERIKSSON L., PLS-regression: a basic tool of chemometrics, Chemometrics and Intelligent Laboratory Systems, 58 (2001), pp. 109-130.
  • [14] VAPNIK V. N., Statistical Learning Theory, Wiley, New York, 1998.
  • [15] VAPNIK V. N., The Nature of Statistical Learning Theory, 2nd ed., Springer, 2000.
  • [16] http://www.broad.mit.edu
Document type
YADDA identifier
bwmeta1.element.baztech-article-PWA4-0007-0019