Guizhou University, School of Mathematics and Statistics
Optimization Methods and Applications
Contents
1. Biography
2. Academic and Professional Service
3. Awards
4. Grants and Projects
5. Software
6. Publications
7. Teaching
8. Contact Me
Biography
Liu Zexian, born in 1984 in Zhaoping, Guangxi, received his Ph.D. in Applied Mathematics from Xidian University in 2018 (advisor: Prof. Liu Hongwei) and completed a postdoctoral fellowship at the Academy of Mathematics and Systems Science, Chinese Academy of Sciences, in 2020 (co-advisor: Prof. Dai Yu-Hong). He is currently an associate professor and master's supervisor at the School of Mathematics and Statistics, Guizhou University. His research focuses on optimization methods and their applications, with distinctive results on gradient methods with approximately optimal stepsizes, subspace minimization conjugate gradient methods, and limited-memory conjugate gradient methods. He currently supervises seven master's students and welcomes applications from undergraduates interested in optimization methods and applications. Contact: [email protected]
Academic and Professional Service
1. Council member, Algorithms, Software and Applications Branch of the Operations Research Society of China
2. Reviewer for Mathematical Reviews (AMS)
3. Referee for SCI journals including J. Global Optim., Optim. Methods Softw., Numer. Algorithms, Appl. Numer. Math., J. Comput. Appl. Math., Numer. Funct. Anal. Optim., and Optim. Lett.
4. Referee for Chinese core journals including SCIENTIA SINICA Mathematica, Mathematica Numerica Sinica, and Operations Research Transactions
Awards
1. First Prize, 4th Shaanxi Province Graduate Innovation Achievement Award (2018)
2. First Prize for Academic Innovation, School of Mathematics and Statistics, Xidian University (2016-2017)
3. First Prize for Academic Papers, Mathematics Teaching Committee of the Guangxi Higher Education Society (2014)
4. Outstanding Doctoral Dissertation, Xidian University (2021)
5. "Outstanding Communist Party Member", School of Mathematics and Statistics, Guizhou University (2021)
6. Rated "Excellent" in the annual assessment, School of Mathematics and Statistics, Guizhou University (2022)
Grants and Projects
1. Research on novel first-order algorithms, their convergence rates, and applications, National Natural Science Foundation of China (NSFC), 2023.1-2026.12, Principal Investigator, ongoing
2. Artificial intelligence methods for mixed-integer programming (11991021), NSFC Major Project, 2020.1-2024.12, key participant, ongoing
3. Research on gradient methods with approximately optimal stepsizes and limited-memory conjugate gradient methods for large-scale optimization, NSFC Youth Project, 2020.1-2021.12, Principal Investigator, completed
4. Research on gradient-type algorithms for nonconvex optimization problems (Qiankehe Basic Research-ZK[2022] General 084), Guizhou Natural Science Foundation, 2022.3-2024.3, Principal Investigator, ongoing
5. Research on several algorithms for unconstrained optimization problems, China Postdoctoral Science Foundation General Project, 2019.11-2021.1, Principal Investigator, completed
6. Research on gradient and conjugate gradient methods based on the Barzilai-Borwein idea and their applications, Guangxi Natural Science Foundation, 2018.11-2021.12, Principal Investigator, completed
Software
1. SMCG_BB, a subspace minimization conjugate gradient solver (Hongwei Liu, Zexian Liu. An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization): Code, Numerical results.
2. GM_AOS(cone), a gradient method solver (Zexian Liu, Hongwei Liu. Gradient method with approximately optimal stepsize based on conic model): Code, Numerical results.
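Both packages build on stepsizes of Barzilai-Borwein (BB) type. As a rough sketch of the underlying idea only, and not the published solvers (all function and parameter names below are illustrative), a minimal gradient method with the BB1 stepsize can be written as:

```python
def bb_gradient_descent(grad, x0, n_iter=500, alpha0=1e-3, tol=1e-10):
    """Gradient descent with the Barzilai-Borwein (BB1) stepsize
    alpha_k = (s^T s) / (s^T y), where s = x_k - x_{k-1} and
    y = g_k - g_{k-1}; the stepsize captures inverse-curvature information."""
    x = list(x0)
    g = grad(x)
    alpha = alpha0  # small initial stepsize; the BB rule takes over after one step
    for _ in range(n_iter):
        if max(abs(gi) for gi in g) < tol:  # stop when the gradient vanishes
            break
        x_new = [xi - alpha * gi for xi, gi in zip(x, g)]
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        sty = sum(si * yi for si, yi in zip(s, y))
        # BB1 stepsize; fall back to alpha0 if the curvature estimate degenerates
        alpha = sum(si * si for si in s) / sty if sty > 1e-16 else alpha0
        x, g = x_new, g_new
    return x

# Minimize f(x) = 0.5*x1^2 + 5*x2^2 - x1 - x2, whose minimizer is (1, 0.1)
x_star = bb_gradient_descent(lambda v: [v[0] - 1.0, 10.0 * v[1] - 1.0],
                             [0.0, 0.0])
```

Although the BB iteration is nonmonotone, it is known to converge for strictly convex quadratics; the published SMCG_BB and GM_AOS codes combine such stepsize ideas with subspace minimization and approximately optimal stepsize rules, respectively.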
Publications
27. Song Taiyong, Liu Zexian*. An efficient inertial subspace minimization CG algorithm with convergence rate analysis for constrained nonlinear monotone equations. Accepted by Journal of Computational and Applied Mathematics, 2024. (SCI)
26. Liu Hongwei, Wang Ting, Liu Zexian. A nonmonotone accelerated proximal gradient method with variable stepsize strategy for nonsmooth and nonconvex minimization problems. Journal of Global Optimization, 2024. https://doi.org/10.1007/s10898-024-01366-4 (SCI)
25. Ni Yan, Liu Zexian*. A new Dai-Liao conjugate gradient method based on approximately optimal stepsize for unconstrained optimization. Accepted by Numerical Functional Analysis and Optimization, 2024. (SCI)
24. Liu Zexian, Ni Yan, Liu Hongwei, Sun Wumei. A new subspace minimization conjugate gradient method for unconstrained minimization. Journal of Optimization Theory and Applications, 2023. https://link.springer.com/article/10.1007/s10957-023-02325-x (SCI)
23. Liu Hongwei, Sun Wumei, Liu Zexian. A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization. Numerical Algorithms, 2023. https://doi.org/10.1007/s11075-023-01559-0 (SCI)
22. Liu Zexian, Liu Hongwei, Wang Ting. New gradient methods with adaptive stepsizes by approximate models. Optimization, 2023. https://doi.org/10.1080/02331934.2023.2234925 (SCI)
21. Liu Hongwei, Wang Ting, Liu Zexian. Convergence rate of inertial forward-backward algorithms based on the local error bound condition. IMA Journal of Numerical Analysis, 2023. https://doi.org/10.1093/imanum/drad031 (SCI)
20. Liu Hongwei, Wang Ting, Liu Zexian. Some modified fast iterative shrinkage thresholding algorithms with a new adaptive non-monotone stepsize strategy for nonsmooth and convex minimization problems. Computational Optimization and Applications, 2022. https://doi.org/10.1007/s10589-022-00396-6 (SCI)
19. Liu Zexian, Chu Wangli, Liu Hongwei. An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization. RAIRO Operations Research, 2022, 56: 2403-2424. (SCI)
18. Sun Wumei, Liu Hongwei, Liu Zexian. Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems. Numerical Algorithms, 2022. https://doi.org/10.1007/s11075-022-01319-6 (SCI)
17. Sun Wumei, Liu Hongwei, Liu Zexian. A class of accelerated subspace minimization conjugate gradient methods. Journal of Optimization Theory and Applications, 2021, 190: 811-840. (SCI)
16. Zhao Ting, Liu Hongwei, Liu Zexian. New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numerical Algorithms, 2021, 87(4): 1501-1534. (SCI)
15. Liu Zexian, Liu Hongwei, Dai Yu-Hong*. An improved Dai-Kou conjugate gradient algorithm for unconstrained optimization. Computational Optimization and Applications, 2020, 75(1): 145-167. (SCI)
14. Liu Zexian, Liu Hongwei*. An efficient gradient method with approximately optimal stepsize based on tensor model for unconstrained optimization. Journal of Optimization Theory and Applications, 2019, 181(2): 608-633. (SCI)
13. Liu Hongwei, Liu Zexian*. An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. Journal of Optimization Theory and Applications, 2019, 180(3): 879-906. (SCI)
12. Liu Zexian, Liu Hongwei. Several efficient gradient methods with approximate optimal stepsizes for large-scale unconstrained optimization. Journal of Computational and Applied Mathematics, 2018, 328: 400-413. (SCI)
11. Liu Zexian, Liu Hongwei. An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numerical Algorithms, 2018, 78(1): 21-39. (SCI, Q2)
10. Li Ming, Liu Hongwei, Liu Zexian*. A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numerical Algorithms, 2018, 79(1): 195-219. (SCI)
9. Liu Zexian, Liu Hongwei, Dong Xiaoliang. An efficient gradient method with approximate optimal stepsize for the strictly convex quadratic minimization problem. Optimization, 2018, 67(3): 427-440. (SCI)
8. Liu Zexian*, Liu Hongwei, Wang Xiping. Accelerated augmented Lagrangian method for total variation minimization. Computational and Applied Mathematics, 2019, 38(2). https://doi.org/10.1007/s40314-019-0787-7 (SCI)
7. Liu Hongwei, Liu Zexian*, Dong Xiaoliang. A new adaptive Barzilai and Borwein method for unconstrained optimization. Optimization Letters, 2018, 12(4): 845-873. (SCI)
6. Li Yufei, Liu Zexian*, Liu Hongwei. A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Computational and Applied Mathematics, 2019, 38(1). https://link.springer.com/article/10.1007/s40314-019-0779-7 (SCI)
5. Wang Ting, Liu Zexian*, Liu Hongwei. A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. International Journal of Computer Mathematics, 2019, 96(10): 1924-1942. (SCI)
4. Dong Xiaoliang, Liu Zexian, Liu Hongwei, Li Xiangli. An efficient adaptive three-term extension of the Hestenes-Stiefel conjugate gradient method. Optimization Methods and Software, 2018, 34(2): 1-14.
3. Zhang Keke, Liu Hongwei, Liu Zexian*. A new adaptive subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Journal of Computational Mathematics, 2021, 39(2): 159-177. (SCI)
2. Liu Zexian. A modified line-search Filter-SQP algorithm. Journal of Systems Science and Mathematical Sciences, 2014, 34(1): 53-63. (Chinese core journal)
1. Liu Zexian, Liu Hongwei, He Chuanmei. A sparse reconstruction algorithm based on a new Hessian approximation matrix. Mathematics in Practice and Theory, 2019, (13): 167-178. (Chinese core journal)
Teaching
Teaches courses including Numerical Analysis, Operations Research, Discrete Mathematics, Probability and Mathematical Statistics, Fundamentals of Information Theory, Mathematical Experiments, and Optimization Methods. Principal investigator of a completed Guangxi teaching reform project:
1. Research and practice on experimental teaching for the Mathematics and Applied Mathematics major at local undergraduate institutions (2014JGB234), 2014 Guangxi Higher Education Teaching Reform Project, 2014.6-2016.4, Principal Investigator