Efficient global optimization based on dynamic canonical descent
Stochastic methods have gained popularity in global optimization because most of them do not require the cost function to be differentiable, are able to escape local optima, and may even converge faster than gradient-based methods on some problems. This paper briefly reviews the advantages and limitations of classical stochastic optimization algorithms such as genetic algorithms (GA) and simulated annealing (SA), and then proposes a faster derivative-free, deterministic (non-stochastic) global optimization algorithm that retains their advantages while avoiding their drawbacks.
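The stochastic behavior the abstract contrasts against can be illustrated with a minimal simulated annealing sketch. This is a generic textbook SA loop, not the paper's proposed dynamic canonical descent algorithm (whose details are not given here); the function names, the multimodal test function, and all parameter values (`step`, `t0`, `cooling`, `iters`) are illustrative assumptions. Note that SA only evaluates the cost function (derivative-free) and occasionally accepts uphill moves, which is how it avoids being trapped by local optima.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.99, iters=2000, seed=0):
    """Minimal 1-D simulated annealing sketch (illustrative, not the paper's method).

    Derivative-free: only f(x) is evaluated, never f'(x). Uphill moves are
    accepted with probability exp(-delta/t), letting the search escape
    local minima while the temperature t is still high.
    """
    rng = random.Random(seed)          # fixed seed: the "stochastic" part made repeatable
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)   # random neighbor
        fc = f(cand)
        # Always accept downhill moves; accept uphill moves with prob exp(-delta/t).
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                   # geometric cooling schedule
    return best_x, best_f

# Multimodal cost function (assumed example): global minimum f(0) = 0,
# with a shallow local minimum near x ~ 4.95 that traps pure descent.
cost = lambda v: v * v + 10.0 * (1.0 - math.cos(v))
x, fx = simulated_annealing(cost, x0=8.0)
```

Started at `x0 = 8.0`, a greedy derivative-free descent would stall in the local basin near `x ~ 4.95` (cost about 32), whereas the annealing loop typically crosses the small barrier early, while the temperature is high, and settles near the global minimum at 0.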
Bibliography: 10 items.