Article title

Pattern recovery and signal denoising by SLOPE when the design matrix is orthogonal

Identifiers
Title variants
Publication languages
EN
Abstracts
EN
The Sorted ℓ1 Penalized Estimator (SLOPE) is a relatively new convex regularization method for fitting high-dimensional regression models. SLOPE reduces the model dimension by shrinking some estimates of the regression coefficients completely to zero or by equating the absolute values of some nonzero estimates. This makes it possible to identify situations where some of the true regression coefficients are equal. In this article we introduce the SLOPE pattern, i.e., the set of relations between the true regression coefficients that can be identified by SLOPE. We also present new results on the strong consistency of SLOPE estimators and on the strong consistency of pattern recovery by SLOPE when the design matrix is orthogonal, and illustrate the advantages of SLOPE clustering in the context of high-frequency signal denoising.
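The clustering behaviour described in the abstract can be illustrated with the proximal operator of the sorted ℓ1 norm: with an orthogonal design, the SLOPE estimator reduces to this prox applied to Xᵀy (cf. [2], [9]). Below is a minimal sketch, not the paper's own code; the function name and the stack-based pool-adjacent-violators implementation are illustrative assumptions.

```python
import numpy as np

def prox_sorted_l1(y, lam):
    """Prox of the sorted ℓ1 norm (SLOPE penalty).

    Solves argmin_x 0.5*||x - y||^2 + sum_i lam[i] * |x|_(i),
    where |x|_(1) >= ... >= |x|_(p) and lam is nonincreasing, >= 0.
    With an orthogonal design X, the SLOPE estimator equals this
    prox evaluated at X^T y.
    """
    y = np.asarray(y, dtype=float)
    sign = np.sign(y)
    order = np.argsort(-np.abs(y))            # sort |y| in descending order
    z = np.abs(y)[order] - np.asarray(lam, dtype=float)

    # Pool-adjacent-violators: project z onto nonincreasing sequences.
    # Each stack entry is (block mean, block length); merging equalizes
    # entries, which is exactly SLOPE's coefficient clustering.
    blocks = []
    for v in z:
        mean, length = v, 1
        while blocks and blocks[-1][0] <= mean:
            m, l = blocks.pop()
            mean = (mean * length + m * l) / (length + l)
            length += l
        blocks.append((mean, length))

    # Clip negative blocks at zero (sparsity), undo the sort, restore signs.
    x_sorted = np.concatenate([np.full(l, max(m, 0.0)) for m, l in blocks])
    x = np.empty_like(x_sorted)
    x[order] = x_sorted
    return sign * x

# Nearby magnitudes get pooled into one cluster:
print(prox_sorted_l1([1.0, 1.1], [0.3, 0.1]))   # → [0.85 0.85]
```

Note how the two coefficients with close absolute values are shrunk to a common magnitude, the identification of equal true coefficients that the article studies.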
Keywords
Year
Pages
283–302
Physical description
Bibliography: 27 items, charts.
Authors
  • Faculty of Pure and Applied Mathematics, Wrocław University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland
  • Laboratoire de Mathématiques LAREMA, Université d’Angers, 2 Boulevard Lavoisier, 49045 Angers Cedex 01, France
  • Faculty of Mathematics and Information Science, Warsaw University of Technology, Koszykowa 75, 00-662 Warszawa, Poland
  • Laboratoire de Mathématiques LAREMA, Université d’Angers, 2 Boulevard Lavoisier, 49045 Angers Cedex 01, France
  • Faculty of Pure and Applied Mathematics, Wrocław University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wrocław, Poland
Bibliography
  • [1] J.-P. Aubin, Mathematical Methods of Game and Economic Theory, North-Holland, 1980.
  • [2] M. Bogdan, E. van den Berg, C. Sabatti, W. Su and E. J. Candès, SLOPE - Adaptive variable selection via convex optimization, Ann. Appl. Statist. 9 (2015), 1103-1140.
  • [3] M. Bogdan, E. van den Berg, W. Su and E. J. Candès, Statistical estimation and testing via the sorted ℓ1 norm, arXiv:1310.1969 (2013).
  • [4] M. Bogdan, X. Dupuis, P. Graczyk, B. Kołodziejek, T. Skalski, P. Tardivel and M. Wilczyński, Pattern recovery by SLOPE, arXiv:2203.12086 (2022).
  • [5] H. D. Bondell and B. J. Reich, Simultaneous regression shrinkage, variable selection, and supervised clustering of predictors with OSCAR, Biometrics 64 (2008), 115-123.
  • [6] H. D. Bondell and B. J. Reich, Simultaneous factor selection and collapsing levels in ANOVA, Biometrics 65 (2009), 169-177.
  • [7] S. Sh. Chen and D. L. Donoho, Basis pursuit, in: Proc. 1994 28th Asilomar Conference on Signals, Systems and Computers, IEEE, 1994, 41-44.
  • [8] S. Sh. Chen, D. L. Donoho and M. A. Saunders, Atomic decomposition by basis pursuit, SIAM J. Sci. Comput. 20 (1998), 33-61.
  • [9] X. Dupuis and P. Tardivel, Proximal operator for the sorted ℓ1 norm: Application to testing procedures based on SLOPE, hal-03177108v2 (2021).
  • [10] K. Ewald and U. Schneider, Uniformly valid confidence sets based on the Lasso, Electron. J. Statist. 12 (2018), 1358-1387.
  • [11] M. A. T. Figueiredo and R. Nowak, Ordered weighted ℓ1 regularized regression with strongly correlated covariates: Theoretical aspects, in: Proc. 19th Int. Conf. on Artificial Intelligence and Statistics, Proc. Mach. Learning Res. 51, 2016, 930-938.
  • [12] J. Gertheiss and G. Tutz, Sparse modeling of categorial explanatory variables, Ann. Appl. Statist. 4 (2010), 2150-2180.
  • [13] P. Kremer, D. Brzyski, M. Bogdan and S. Paterlini, Sparse index clones via the sorted ℓ1-norm, Quant. Finance 22 (2022), 349-366.
  • [14] A. Maj-Kańska, P. Pokarowski and A. Prochenka, Delete or merge regressors for linear model selection, Electron. J. Statist. 9 (2015), 1749-1778.
  • [15] K. Minami, Degrees of freedom in submodular regularization: a computational perspective of Stein’s unbiased risk estimate, J. Multivariate Anal. 175 (2020), art. 104546, 22 pp.
  • [16] R. Negrinho and A. F. T. Martins, Orbit regularization, in: Advances in Neural Information Processing Systems 27, 2014, 9 pp.
  • [17] Sz. Nowakowski, P. Pokarowski and W. Rejchel, Group Lasso merger for sparse prediction with high-dimensional categorical data, arXiv:2112.11114 (2021).
  • [18] M.-R. Oelker, J. Gertheiss and G. Tutz, Regularization and model selection with categorical predictors and effect modifiers in generalized linear models, Statist. Model. 14 (2014), 157-177.
  • [19] K. R. Rao, N. Ahmed and M. A. Narasimhan, Orthogonal transforms for digital signal processing, in: Proc. 18th Midwest Symposium on Circuits and Systems, 1975, 1-6.
  • [20] U. Schneider and P. Tardivel, The geometry of uniqueness, sparsity and clustering in penalized estimation, arXiv:2004.09106 (2020).
  • [21] P. Tardivel, R. Servien and D. Concordet, Simple expression of the LASSO and SLOPE estimators in low-dimension, Statistics 54 (2020), 340-352.
  • [22] P. Tardivel, T. Skalski, P. Graczyk and U. Schneider, The geometry of model recovery by penalized and thresholded estimators, hal-03262087 (2021).
  • [23] B. G. Stokell, R. D. Shah and R. J. Tibshirani, Modelling high-dimensional categorical data using nonconvex fusion penalties, arXiv:2002.12606 (2021).
  • [24] R. Tibshirani, Regression shrinkage and selection via the lasso, J. R. Statist. Soc. Ser. B 58 (1996), 267-288.
  • [25] X. Zeng and M. A. T. Figueiredo, Decreasing weighted sorted ℓ1 regularization, IEEE Signal Process. Lett. 21 (2014), 1240-1244.
  • [26] P. Zhao and B. Yu, On model selection consistency of Lasso, J. Mach. Learn. Res. 7 (2006), 2541-2563.
  • [27] H. Zou, The adaptive lasso and its oracle properties, J. Amer. Statist. Assoc. 101 (2006), 1418-1429.
Notes
Record developed with funds from MEiN, agreement no. SONP/SP/546092/2022, under the programme "Social Responsibility of Science" – module: Popularization of science and promotion of sport (2022–2023).
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-8d161517-3a9a-4724-9432-4920e991e889