Results found: 3
Search results
Searched in keywords: unconstrained optimization
EN
In this article, inspired by the projection technique of Solodov and Svaiter, we exploit the simple structure, low memory requirement, and good convergence properties of the mixed conjugate gradient method of Stanimirović et al. [New hybrid conjugate gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods, J. Optim. Theory Appl. 178 (2018), no. 3, 860–884] for unconstrained optimization problems to solve convex constrained monotone nonlinear equations. The proposed method does not require Jacobian information. Under monotonicity and Lipschitz continuity assumptions, the global convergence properties of the proposed method are established. Computational experiments indicate that the proposed method is computationally efficient. Furthermore, the proposed method is applied to solve ℓ1-norm regularized problems to decode sparse signals and images in compressive sensing.
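
To make the projection step referenced above concrete, the following is a minimal Python sketch of one derivative-free Solodov-Svaiter-style projection iteration for a monotone equation F(x) = 0. The function name projection_step, the parameter values, and the backtracking rule are illustrative assumptions; in the proposed method the search direction d would come from the hybrid conjugate gradient scheme of Stanimirović et al., which is not reproduced here.

import numpy as np

def projection_step(F, x, d, sigma=1e-4, beta=0.5, max_backtracks=50):
    # One derivative-free projection iteration for a monotone equation F(x) = 0.
    # Backtracking line search: find t with -F(x + t*d)^T d >= sigma * t * ||d||^2.
    t = 1.0
    for _ in range(max_backtracks):
        z = x + t * d
        Fz = F(z)
        if -Fz @ d >= sigma * t * (d @ d):
            break
        t *= beta
    else:
        return x  # line search failed; keep the current iterate
    # Project x onto the hyperplane H = { y : F(z)^T (y - z) = 0 },
    # which separates x from the solution set when F is monotone.
    return x - (Fz @ (x - z)) / (Fz @ Fz) * Fz

A simple monotone test map such as F = lambda x: x + np.sin(x), with d = -F(x), is enough to exercise the step; note that no Jacobian is evaluated anywhere, which is the point of the derivative-free approach.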
EN
The purpose of this paper is to present a new conjugate gradient method for solving unconstrained nonlinear optimization problems, based on Perry's idea. An accelerated adaptive algorithm is proposed in which the search direction satisfies the sufficient descent condition. Global convergence is analyzed using spectral analysis. Numerical results are reported for a set of standard test problems and show that the proposed method performs better than CG-DESCENT, mBFGS, and SPDOC.
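
The sufficient descent condition mentioned above requires g_k^T d_k <= -c * ||g_k||^2 for some c > 0. The sketch below is a generic nonlinear conjugate gradient skeleton that merely safeguards this condition by restarting with steepest descent whenever it fails; it uses the Polak-Ribiere+ update and SciPy's Wolfe line search as stand-ins, not the Perry-type parameter or the accelerated adaptive step of the paper, so it only illustrates the general mechanism.

import numpy as np
from scipy.optimize import line_search

def cg_with_descent_safeguard(f, grad, x0, c=1e-4, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # Wolfe line search (stand-in)
        if alpha is None:
            alpha = 1e-4                       # crude fallback if the search fails
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)  # Polak-Ribiere+ (stand-in)
        d = -g_new + beta * d
        if g_new @ d > -c * (g_new @ g_new):   # sufficient descent violated
            d = -g_new                          # restart with steepest descent
        x, g = x_new, g_new
    return x

Restarting with -g_new trivially restores the condition (for c <= 1), which is the usual cheap safeguard when a conjugate gradient update cannot guarantee descent by itself.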
3
A New Non-monotone Line Search Algorithm for Nonlinear Programming
EN
We study the application of a non-monotone line search technique in the conjugate gradient method. At present, most studies of conjugate gradient methods use Wolfe's monotone line search: by constructing the Zoutendijk condition, convergence can be established by contradiction. Here we study the global convergence of conjugate gradient methods with an Armijo-type line search, and the proof does not rely on the approach mentioned above.
PL
A study of the application of a non-monotone line search in the conjugate gradient method is presented. At present the Wolfe method is most commonly used, but our investigation showed that better results are obtained with globally convergent variants of the conjugate gradient method.
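
Since both abstracts of this entry contrast Wolfe's monotone line search with an Armijo-type non-monotone one, a minimal Python sketch of a non-monotone Armijo backtracking rule (in the Grippo-Lampariello-Lucidi style, an assumption, since the exact acceptance rule is not given in the abstract) may help: the step is accepted against the maximum of the last few objective values rather than against f(x) alone.

import numpy as np

def nonmonotone_armijo(f, x, d, g, f_history, delta=1e-4, beta=0.5, max_backtracks=50):
    # Assumes d is a descent direction (g^T d < 0) and f_history holds the
    # last M objective values kept by the caller.
    f_ref = max(f_history)   # reference value over the memory window
    slope = g @ d            # directional derivative, assumed negative
    t = 1.0
    for _ in range(max_backtracks):
        if f(x + t * d) <= f_ref + delta * t * slope:
            return t
        t *= beta
    return t

The caller appends f(x + t*d) to f_history after each accepted step and keeps only the most recent M values; with M = 1 the rule reduces to the ordinary monotone Armijo condition.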