Strong Convergence Theorem by An Extragradient Method For Fixed Point Problems and Variational Inequality Problems
1. INTRODUCTION
Let H be a real Hilbert space with inner product ⟨·, ·⟩ and norm ‖·‖, respectively.
Let C be a nonempty closed convex subset of H and let PC : H → C be the metric
projection of H onto C.
⟨Au, v − u⟩ ≥ 0 ∀v ∈ C,
where λ ∈ (0, 1/k). He showed that the sequences {x_n} and {x̄_n} generated by (1.2) converge to the same point z ∈ Ω.
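Although the paper contains no code, the extragradient scheme (1.2) is easy to sketch numerically. The toy problem below is an illustrative assumption, not from the paper: A is a 90-degree rotation on C = the closed unit ball of R², which is monotone and 1-Lipschitz but not strongly monotone, exactly the setting the predictor-corrector structure is designed for (a plain projected-gradient step would circle the origin rather than converge).

```python
import numpy as np

def proj_ball(x, r=1.0):
    """Metric projection P_C onto the closed ball of radius r."""
    nx = np.linalg.norm(x)
    return x if nx <= r else (r / nx) * x

# A(x) = Mx with a 90-degree rotation matrix: monotone and 1-Lipschitz (k = 1),
# but not strongly monotone -- the case Korpelevich's method is built for.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
A = lambda x: M @ x

lam = 0.5                       # lambda chosen in (0, 1/k)
x = np.array([1.0, 0.0])
for _ in range(200):
    x_bar = proj_ball(x - lam * A(x))      # predictor: x̄_n = P_C(x_n − λ A x_n)
    x = proj_ball(x - lam * A(x_bar))      # corrector: x_{n+1} = P_C(x_n − λ A x̄_n)

print(np.linalg.norm(x))  # a value near 0: here the unique solution is x* = 0
```

For this operator one can check that ⟨A0, v − 0⟩ = 0 for all v ∈ C, so the origin solves the variational inequality, and the iterates above contract toward it.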
Further motivated by the idea of Korpelevich's extragradient method, Nadezhkina and Takahashi [10] introduced an iterative process for finding a common element
of the set of fixed points of a nonexpansive mapping and the set of solutions of
a variational inequality problem. They proved the following weak convergence
theorem for two sequences generated by this process.
Theorem 1.1 [10, Theorem 3.1]. Let C be a nonempty closed convex subset of a real Hilbert space H. Let A : C → H be a monotone, k-Lipschitz continuous mapping and let S : C → C be a nonexpansive mapping such that F(S) ∩ Ω ≠ ∅. Let {x_n}, {y_n} be the sequences generated by x_0 = x ∈ C and

y_n = P_C(x_n − λ_n A x_n),
x_{n+1} = α_n x_n + (1 − α_n) S P_C(x_n − λ_n A y_n)

for every n ≥ 0, where {λ_n} ⊂ [a, b] for some a, b ∈ (0, 1/k) and {α_n} ⊂ [c, d] for some c, d ∈ (0, 1). Then the sequences {x_n}, {y_n} converge weakly to the same point z ∈ F(S) ∩ Ω, where

z = lim_{n→∞} P_{F(S)∩Ω} x_n.
2. PRELIMINARIES
Let H be a real Hilbert space with inner product ⟨·, ·⟩ and norm ‖·‖, respectively.
Let C be a nonempty closed convex subset of H. We write x_n ⇀ x to indicate that the sequence {x_n} converges weakly to x and x_n → x to indicate that {x_n} converges strongly to x. For every point x ∈ H, there exists a unique nearest point in C, denoted by P_C x, such that ‖x − P_C x‖ ≤ ‖x − y‖ ∀y ∈ C. P_C is called the metric projection of H onto C. It is known that P_C is a nonexpansive mapping of H onto C. It is also known that P_C is characterized by the following properties (see [7] for more details): P_C x ∈ C and, for all x ∈ H, y ∈ C,

(2.1)    ⟨x − P_C x, P_C x − y⟩ ≥ 0,
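The characterization (2.1) can be checked numerically for any concrete C whose projection has a closed form. The choice below (C = the closed unit ball of R³) is an assumption made purely for illustration:

```python
import numpy as np

def proj_ball(x, r=1.0):
    # P_C for C the closed ball of radius r: scale x back to the sphere
    # whenever it lies outside C, leave it unchanged otherwise.
    nx = np.linalg.norm(x)
    return x if nx <= r else (r / nx) * x

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=3) * 3.0            # arbitrary x in H = R^3
    y = proj_ball(rng.normal(size=3) * 3.0) # arbitrary y in C
    px = proj_ball(x)
    # characterization (2.1): <x - P_C x, P_C x - y> >= 0
    assert np.dot(x - px, px - y) >= -1e-9

print("(2.1) holds on 1000 random pairs")
```

Geometrically, (2.1) says the residual x − P_C x makes a non-obtuse angle with every direction P_C x − y pointing from the projection back into C.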
Lemma 2.1 [6, Lemma 2.1]. Let {s_n} be a sequence of nonnegative numbers satisfying the condition s_{n+1} ≤ (1 − α_n)s_n + α_n β_n ∀n ≥ 0, where {α_n} and {β_n} are sequences of real numbers such that
(i) {α_n} ⊂ [0, 1] and Σ_{n=0}^∞ α_n = ∞, or equivalently, Π_{n=0}^∞ (1 − α_n) := lim_{n→∞} Π_{k=0}^n (1 − α_k) = 0;
(ii) lim sup_{n→∞} β_n ≤ 0, or
(ii') Σ_{n=0}^∞ α_n β_n is convergent.
Then lim_{n→∞} s_n = 0.
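A quick numerical illustration of Lemma 2.1, using the worst case where the recursion holds with equality. The particular sequences α_n = 1/(n+2) and β_n = 1/(n+1) are assumed here purely for demonstration; the first satisfies condition (i) (its series diverges) and the second satisfies condition (ii) (it tends to 0):

```python
# Worst case of Lemma 2.1: s_{n+1} = (1 - a_n) s_n + a_n * b_n.
# a_n = 1/(n+2): sum a_n diverges, so condition (i) holds.
# b_n = 1/(n+1): b_n -> 0, so limsup b_n <= 0 and condition (ii) holds.
s = 5.0  # arbitrary nonnegative starting value
for n in range(200000):
    a = 1.0 / (n + 2)
    b = 1.0 / (n + 1)
    s = (1.0 - a) * s + a * b

print(s)  # a small positive value: s_n -> 0, as the lemma predicts
```

Note the decay is slow (roughly log n / n for these choices), which is why the loop runs many iterations before s is visibly small.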
Now we can state and prove the main result in this paper.
Then the sequences {x_n}, {y_n} converge strongly to the same point P_{F(S)∩Ω}(x_0) provided

lim_{n→∞} ‖x_n − x_{n+1}‖ = 0.
‖t_n − u‖² ≤ ‖x_n − u‖² − ‖x_n − y_n‖² − ‖y_n − t_n‖² + 2λ_n k‖x_n − y_n‖‖t_n − y_n‖
         ≤ ‖x_n − u‖² − ‖x_n − y_n‖² − ‖y_n − t_n‖² + λ_n²k²‖x_n − y_n‖² + ‖y_n − t_n‖²
(3.1)
         ≤ ‖x_n − u‖² + (λ_n²k² − 1)‖x_n − y_n‖²
         ≤ ‖x_n − u‖²,
which implies that (3.2) holds for n = 0. Suppose that (3.2) holds for some n ≥ 1. Then we have ‖x_n − u‖ ≤ ‖x_0 − u‖. This together with (3.1) implies that ‖x_{n+1} − u‖ ≤ ‖x_0 − u‖. This shows that (3.2) holds for n + 1. Therefore (3.2) holds for all n ≥ 0; i.e., {x_n} is bounded. So it follows from (3.1) that ‖t_n − u‖ ≤ ‖x_0 − u‖ ∀n ≥ 0, i.e., {t_n} is also bounded.
and
Without loss of generality, we may further assume that {x_{n_i}} converges weakly to ũ for some ũ ∈ H. Hence (3.7) reduces to
⟨x_n − λ_n A y_n − t_n, t_n − v⟩ ≥ 0

and hence

⟨v − t_n, (t_n − x_n)/λ_n + A y_n⟩ ≥ 0.
4. APPLICATIONS
Remark 4.1. See Yamada [9] and Xu and Kim [6] for the case when A : H →
H is a strongly monotone and Lipschitz continuous mapping on a real Hilbert space
H and S : H → H is a nonexpansive mapping.
Then the sequence {x_n} converges strongly to P_{A^{-1}0 ∩ B^{-1}0}(x_0) provided
REFERENCES
2. G. M. Korpelevich, The extragradient method for finding saddle points and other
problems, Matecon, 12 (1976), 747-756.
3. F. Liu and M. Z. Nashed, Regularization of nonlinear ill-posed variational inequalities
and convergence rates, Set-Valued Anal., 6 (1998), 313-344.
4. K. Goebel and W. A. Kirk, Topics in Metric Fixed Point Theory, Cambridge University Press, Cambridge, England, 1990.
5. R. T. Rockafellar, On the maximality of sums of nonlinear monotone operators, Trans.
Amer. Math. Soc., 149 (1970), 75-88.
6. H. K. Xu and T. H. Kim, Convergence of hybrid steepest-descent methods for vari-
ational inequalities, J. Optim. Theory Appl., 119 (2003), 185-201.
7. W. Takahashi, Nonlinear Functional Analysis, Yokohama Publishers, Yokohama,
Japan, 2000.
8. W. Takahashi and M. Toyoda, Weak convergence theorems for nonexpansive mappings and monotone mappings, J. Optim. Theory Appl., 118 (2003), 417-428.
9. I. Yamada, The hybrid steepest-descent method for the variational inequality problem
over the intersection of fixed-point sets of nonexpansive mappings, in Inherently Par-
allel Algorithms in Feasibility and Optimization and Their Applications (D. Butnariu,
Y. Censor and S. Reich Eds.), Kluwer Academic Publishers, Dordrecht, Holland,
2001.
10. N. Nadezhkina and W. Takahashi, Weak convergence theorem by an extragradient
method for nonexpansive mappings and monotone mappings, J. Optim. Theory Appl.,
128 (2006), 191-201.
Lu-Chuan Zeng
Department of Mathematics,
Shanghai Normal University,
Shanghai 200234, China.
E-mail: [email protected]
Jen-Chih Yao∗
Department of Applied Mathematics,
National Sun Yat-sen University,
Kaohsiung 804, Taiwan.
E-mail: [email protected]