Search Resource List
bayesian-hill-climbin
- FORTRAN code for minimizing a function whose evaluation is expensive. At each iteration, a Bayesian posterior mean for the surface shape, conditional on the points already sampled, is constructed and its minimum is found.
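Not the FORTRAN code itself; a minimal Python sketch of the same idea, assuming a Gaussian-process surrogate with an RBF kernel. The objective, kernel length scale, and candidate grid are illustrative.

```python
import numpy as np

def rbf_kernel(a, b, length=0.3):
    """Squared-exponential kernel between two sets of 1-D points."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def posterior_mean(x_query, x_seen, y_seen, noise=1e-6):
    """GP posterior mean at x_query, conditional on the points sampled so far."""
    K = rbf_kernel(x_seen, x_seen) + noise * np.eye(len(x_seen))
    k = rbf_kernel(x_query, x_seen)
    return k @ np.linalg.solve(K, y_seen)

def expensive_f(x):                         # stand-in for the costly objective
    return (x - 0.7) ** 2 + 0.1 * np.sin(8 * x)

grid = np.linspace(0.0, 1.0, 201)           # candidate minimizers
x_seen = np.array([0.1, 0.5, 0.9])          # initial design
y_seen = expensive_f(x_seen)

for _ in range(10):                         # each iteration: fit, minimize the mean, evaluate
    mu = posterior_mean(grid, x_seen, y_seen)
    x_next = grid[np.argmin(mu)]
    x_seen = np.append(x_seen, x_next)
    y_seen = np.append(y_seen, expensive_f(x_next))

print("best point:", x_seen[np.argmin(y_seen)])
```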
An-Improved-Learning-Algorithm-Based-on-BFGS-Meth
- This paper suggests that a simple modification to the initial search direction can also substantially improve the training efficiency of almost all major optimization methods. It was discovered that if the initial search
BFGS
- Like steepest descent methods, quasi-Newton methods only require the gradient of the objective at each iteration. By measuring the change in the gradient, they build a model of the objective that is good enough to produce superlinear convergence. These methods greatly outperform steepest descent, especially on difficult problems. Moreover, because quasi-Newton methods do not need second-derivative information, they are sometimes more effective than Newton's method. Today, optimization software includes many quasi-Newton algorithms for solving unconstrained, constrained, and large-scale optimization problems.
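A minimal Python sketch of the quasi-Newton idea described above, assuming the standard BFGS inverse-Hessian update and a simple backtracking line search; the Rosenbrock test objective and constants are illustrative.

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Quasi-Newton minimization: only gradients are needed; curvature is
    inferred from gradient changes via the BFGS inverse-Hessian update."""
    n = len(x0)
    H = np.eye(n)                        # initial inverse-Hessian approximation
    x, g = np.asarray(x0, dtype=float), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                       # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):   # backtracking (Armijo) line search
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-12:                # skip the update if curvature is not positive
            rho = 1.0 / (y @ s)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# illustrative use on the Rosenbrock function
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                           200 * (x[1] - x[0] ** 2)])
print(bfgs_minimize(f, grad, np.array([-1.2, 1.0])))
```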
最优化 (Optimization)
- Two programming assignments required for the optimization course at China University of Petroleum (East China): the multiplier method, and the Armijo line search + BFGS algorithm.
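Not the coursework code itself; a minimal sketch of the multiplier (augmented Lagrangian) method for a single equality constraint, with BFGS handling the unconstrained subproblems via scipy.optimize.minimize. The objective, constraint, and penalty schedule are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: minimize f(x) subject to h(x) = 0.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
h = lambda x: x[0] + x[1] - 1.0

def augmented_lagrangian(x, lam, mu):
    """L_A(x; lam, mu) = f(x) + lam*h(x) + (mu/2)*h(x)^2."""
    return f(x) + lam * h(x) + 0.5 * mu * h(x) ** 2

x, lam, mu = np.zeros(2), 0.0, 10.0
for _ in range(20):
    # inner unconstrained solve (BFGS), then outer multiplier update
    res = minimize(augmented_lagrangian, x, args=(lam, mu), method="BFGS")
    x = res.x
    lam = lam + mu * h(x)              # first-order multiplier update
    if abs(h(x)) < 1e-8:
        break
    mu *= 2.0                          # tighten the penalty while still infeasible

print(x, lam)                          # expected: x ~ [0, 1], lam ~ 2
```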
782290
- C++ implementation of the DFP & BFGS methods for unconstrained optimization.
749753
- C++ implementation of the DFP & BFGS methods for unconstrained optimization.
Deep-ADMM-Net-master
- The net is defined over a data-flow graph derived from the iterative procedures of the Alternating Direction Method of Multipliers (ADMM) algorithm for optimizing a CS-based MRI model. In the training phase, all pa
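Not the Deep-ADMM-Net graph itself; a minimal numpy sketch of the classical ADMM iteration that such a network unrolls, here applied to a lasso-style sparse recovery problem, min 0.5*||Ax - b||^2 + lam*||z||_1 subject to x = z. The problem sizes, lam, and rho are illustrative.

```python
import numpy as np

def soft_threshold(v, k):
    """Proximal operator of k*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iter=200):
    """ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z."""
    m, n = A.shape
    z = np.zeros(n)
    u = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))    # factor once, reuse each iteration
    for _ in range(n_iter):
        # x-update: quadratic subproblem solved with the cached factorization
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: proximal (soft-threshold) step
        z = soft_threshold(x + u, lam / rho)
        # dual update
        u = u + x - z
    return z

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 1.0
b = A @ x_true
print(np.nonzero(np.abs(admm_lasso(A, b)) > 1e-3)[0])   # estimated support
```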
niuduanlafuxun12
- Matlab power-flow calculation for an 11-bus power system; generates plots of the relevant parameters.
Newton
- Solves unconstrained optimization problems; covers Newton-type methods, including the basic Newton method and quasi-Newton methods such as BFGS and DFP.
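A minimal sketch of the basic (damped) Newton iteration for unconstrained minimization, which, unlike the quasi-Newton variants above, uses the exact Hessian; the test function and damping factor are illustrative.

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50, alpha=1.0):
    """Basic Newton iteration: x <- x - alpha * H(x)^-1 grad(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - alpha * np.linalg.solve(hess(x), g)   # Newton step
    return x

# illustrative use: f(x, y) = x^4 + y^2, minimum at the origin
grad = lambda v: np.array([4 * v[0] ** 3, 2 * v[1]])
hess = lambda v: np.array([[12 * v[0] ** 2, 0.0], [0.0, 2.0]])
print(newton_minimize(grad, hess, [1.0, 1.0]))
```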
L-BFGS-B-C-master
- A gradient-based iterative optimization algorithm, widely used in deep learning and neural networks, and very easy to use.
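The same bound-constrained, limited-memory algorithm is also exposed through SciPy's wrapper; a minimal usage sketch with an illustrative objective, gradient, and box bounds:

```python
import numpy as np
from scipy.optimize import minimize

# Quadratic centred at (3, -1); the bounds force the solution onto the box boundary.
f = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
g = lambda x: np.array([2 * (x[0] - 3.0), 2 * (x[1] + 1.0)])

res = minimize(f, x0=np.zeros(2), jac=g,
               method="L-BFGS-B", bounds=[(0.0, 2.0), (0.0, 2.0)])
print(res.x)        # expected: [2., 0.], the closest feasible point
```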
L-BFGS
- A limited-memory algorithm for large-scale problems. The idea is to decompose the iteratively updated quasi-Newton matrix and partially cancel terms so that the amount of computation is reduced.
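A minimal sketch of the limited-memory idea described above: instead of storing the full inverse-Hessian approximation, only the last m (s, y) correction pairs are kept, and the product with the gradient is rebuilt via the standard two-loop recursion. The history and test data are illustrative.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Two-loop recursion: compute the search direction -H*g from the stored
    (s, y) pairs without ever forming the inverse-Hessian approximation H."""
    q = g.copy()
    alphas = []
    for s, y in reversed(list(zip(s_list, y_list))):      # first loop: newest to oldest
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((rho, a))
    if s_list:                                            # initial scaling H0 = gamma * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), (rho, a) in zip(zip(s_list, y_list), reversed(alphas)):  # second loop: oldest to newest
        b = rho * (y @ q)
        q += (a - b) * s
    return -q

# illustrative use: one correction pair from the quadratic f(x) = 0.5 * x @ D @ x
D = np.diag([1.0, 10.0])
x0, x1 = np.array([1.0, 1.0]), np.array([0.9, 0.5])
s_list = [x1 - x0]
y_list = [D @ x1 - D @ x0]               # gradient difference for this quadratic
print(lbfgs_direction(D @ x1, s_list, y_list))   # quasi-Newton direction built from one pair
```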