
 
 
A Brief Survey of Nonlinear Conjugate Gradient Methods for Vector Optimization
HE Qing-Rui, CHEN Chun-Rong*
College of Mathematics and Statistics, Chongqing University, Chongqing 401331
*Corresponding author
Funding: Fundamental Research Funds for the Central Universities (No.106112017CDJZRPY0020)
Opened online: 26 December 2022
Citation: HE Qing-Rui, CHEN Chun-Rong. A Brief Survey of Nonlinear Conjugate Gradient Methods for Vector Optimization [OL]. [26 December 2022]. http://en.paper.edu.cn/en_releasepaper/content/4758680
 
 
Conjugate gradient methods are important first-order algorithms, characterized by low memory requirements and strong convergence properties. They were first proposed for solving symmetric positive-definite linear systems and were later developed into a major class of approaches for solving nonlinear unconstrained minimization problems. In recent years, conjugate gradient methods have also been applied to vector optimization problems. In this paper, we mainly survey the research status and convergence results of nonlinear conjugate gradient methods for vector optimization, and give an example to illustrate their practicality.
Keywords: Conjugate gradient methods; Vector optimization; Unconstrained optimization; Global convergence; Wolfe conditions
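
To make the class of methods concrete, the following is a minimal Python sketch of the classical Fletcher-Reeves nonlinear conjugate gradient method for a scalar-valued objective, using an Armijo backtracking line search. The test function, parameter values, and safeguards are illustrative assumptions; this is not the vector-optimization variant surveyed in the paper.

import numpy as np

def fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=2000):
    # Classical Fletcher-Reeves nonlinear CG with Armijo backtracking (sketch).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with the steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:                   # safeguard: restart if d is not a descent direction
            d = -g
        t, c, rho = 1.0, 1e-4, 0.5          # Armijo backtracking line search along d
        while f(x + t * d) > f(x) + c * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves conjugacy parameter
        d = -g_new + beta * d               # new search direction
        x, g = x_new, g_new
    return x

# Example: minimize the two-dimensional Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(fletcher_reeves(f, grad, x0=[-1.2, 1.0]))

The vector-optimization methods surveyed in the paper typically keep this direction-update structure, replacing the negative gradient by a steepest-descent-type direction for the vector-valued objective and governing the step size by vector-valued Wolfe conditions.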
 
 
 
