RECORD DETAIL



UPA Perpustakaan Universitas Jember

Extragradient Method in Optimization: Convergence and Complexity

Abstract

We consider the extragradient method for minimizing the sum of two functions, the first smooth and the second convex. Under the Kurdyka–Łojasiewicz assumption, we prove that the sequence produced by the extragradient method converges to a critical point of the problem and has finite length. The analysis is extended to the case when both functions are convex; in this case we provide a sublinear convergence rate, as for gradient-based methods. Furthermore, we show that the recent small-prox complexity result can be applied to this method. Considering the extragradient method also provides an occasion to describe an exact line search scheme for proximal decomposition methods. We detail the implementation of this scheme for the one-norm regularized least squares problem and present numerical results suggesting that combining nonaccelerated methods with exact line search can be a competitive choice.
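To make the abstract's concrete example tangible, below is a minimal sketch of the composite extragradient iteration applied to one-norm regularized least squares, assuming f(x) = 0.5*||Ax - b||^2 and g(x) = lam*||x||_1. A constant step size 1/L stands in for the paper's exact line search; function names and parameters are illustrative, not the authors' implementation.

    import numpy as np

    def soft_threshold(v, tau):
        # Proximal map of tau * ||.||_1 (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def extragradient_lasso(A, b, lam, n_iter=500):
        # Sketch of a composite extragradient iteration for
        # min_x 0.5*||Ax - b||^2 + lam*||x||_1.
        # Constant step 1/L replaces the paper's exact line search.
        L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
        s = 1.0 / L
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad_x = A.T @ (A @ x - b)                    # gradient at the current point
            y = soft_threshold(x - s * grad_x, s * lam)   # leading (forward-backward) step
            grad_y = A.T @ (A @ y - b)                    # gradient re-evaluated at y
            x = soft_threshold(x - s * grad_y, s * lam)   # extragradient correction step
        return x

The second proximal step starts again from the base point x but uses the gradient taken at the leading point y; this re-evaluation is what distinguishes the extragradient scheme from plain proximal gradient descent.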

Availability
EB00000003452K (Available)
Detail Information

Series Title: -
Call Number: -
Publisher: -
Collation: -
Language: -
ISBN/ISSN: -
Classification: NONE
Content Type: E-Journal
Media Type: -
Carrier Type: -
Edition: -
Specific Detail Info: -
Statement of Responsibility: -
