Optimization Eruditorum

Electronic ISSN: 3008-1521

DOI: 10.69829/oper

A new efficient alternative extension of the Hager-Zhang conjugate gradient method for vector optimization

Optimization Eruditorum, Volume 3, Issue 1, April 2026, Pages 34–53

XIAOQING OU

College of Management, Chongqing College of Humanities, Science & Technology, Chongqing 401524, China

YUNYE WAN

Beibei Power Supply Branch of State Grid Chongqing Electric Power Company, Chongqing 400700, China

ZHAO-HAN LIU

School of Mathematics and Statistics, Southwest University, Chongqing 400715, China

HUILIN HAN

School of Mathematics and Statistics, Southwest University, Chongqing 400715, China

WEIGUANG PENG

School of Mathematics and Statistics, Southwest University, Chongqing 400715, China

JIAWEI CHEN

School of Mathematics and Statistics, Southwest University, Chongqing 400715, China


Abstract

In this paper, a new efficient alternative extension of the Hager-Zhang conjugate gradient method is proposed for solving vector optimization problems. The method ensures sufficient descent without relying on any line search or convexity assumptions. We establish the convergence of the proposed method, combined with the Wolfe line search, under mild assumptions. Finally, numerical experiments show that the proposed method is more efficient than existing related vector Hager-Zhang conjugate gradient methods.
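For context (this formula is standard background, not taken from the present paper), the classical scalar Hager-Zhang conjugate gradient method generates search directions via

```latex
\begin{align*}
d_{k+1} &= -g_{k+1} + \beta_k^{HZ} d_k, \qquad y_k = g_{k+1} - g_k,\\
\beta_k^{HZ} &= \frac{1}{d_k^{\top} y_k}
  \left( y_k - 2 d_k \frac{\lVert y_k \rVert^2}{d_k^{\top} y_k} \right)^{\!\top} g_{k+1},
\end{align*}
```

where $g_k$ denotes the gradient at the $k$-th iterate. The vector extension studied here generalizes this update rule to the setting where the objective is vector-valued and descent is measured with respect to a partial order.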


Cite this Article as

Xiaoqing Ou, Yunye Wan, Zhao-Han Liu, Huilin Han, Weiguang Peng, Jiawei Chen, A new efficient alternative extension of the Hager-Zhang conjugate gradient method for vector optimization, Optimization Eruditorum, 3(1), 34–53, 2026.