Installing nloptr and AnnoProbe on Linux

To use geoChina, I needed to install the AnnoProbe package.

The installation failed with an error: the nloptr package was missing.

Installing nloptr also kept failing, so I installed the NLopt library itself in the Linux environment (the nloptr R package builds against it).

Then I reinstalled the package, and it succeeded.
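For reference, the steps above boil down to roughly the following (a minimal sketch: the system package name libnlopt-dev, the jmzeng1314/AnnoProbe GitHub repository, and the GSE accession in the comment are assumptions; adjust them for your distribution and data).

## 1. Install the NLopt development library at the system level first,
##    e.g. on Debian/Ubuntu (in a shell, not in R):
##        sudo apt-get install libnlopt-dev
##    (on CentOS/RHEL the package is typically called nlopt-devel)

## 2. Then install the R packages:
install.packages("nloptr")                        # R interface to NLopt
install.packages("remotes")
remotes::install_github("jmzeng1314/AnnoProbe")   # or install.packages("AnnoProbe") if your CRAN mirror has it

## 3. Quick check that geoChina() is now available:
library(AnnoProbe)
# gset <- geoChina("GSE1009")   # replace with the GEO accession you actually need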

NLopt contains many optimization algorithms. Each algorithm in NLopt is identified by a named constant, which is passed to the NLopt routines.

These constants are mostly of the form NLOPT_{G,L}{N,D}_xxxx, where G/L denotes global/local optimization and N/D denotes derivative-free/gradient-based algorithms, respectively.

For example, the NLOPT_LN_COBYLA constant refers to the COBYLA algorithm (described below), which is a local (L) derivative-free (N) optimization algorithm.
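In the R interface (the nloptr package), these constants are passed as strings via the opts argument. A minimal sketch, assuming a toy quadratic objective and illustrative tolerances, minimized with NLOPT_LN_COBYLA:

library(nloptr)

# Toy objective: f(x) = (x1 - 1)^2 + (x2 + 2)^2, minimum at (1, -2)
eval_f <- function(x) (x[1] - 1)^2 + (x[2] + 2)^2

res <- nloptr(
  x0     = c(0, 0),                              # starting point
  eval_f = eval_f,
  opts   = list(algorithm = "NLOPT_LN_COBYLA",   # L = local, N = derivative-free
                xtol_rel  = 1e-8,
                maxeval   = 1000)
)

res$solution    # should approach c(1, -2)
res$objective   # should approach 0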

The "Comparing algorithms" section of the NLopt documentation explains how to compare optimization algorithms.

Below I first list which global optimization algorithms and which local search algorithms NLopt provides. (After the lists, I reflect on how to choose an optimization algorithm, and then think about how to use it.)

All of the global-optimization algorithms currently require you to specify bound constraints on all the optimization parameters.

Of these algorithms, only ISRES, AGS, and ORIG_DIRECT support nonlinear inequality constraints, and only ISRES supports nonlinear equality constraints. However, any of them can be combined with the augmented Lagrangian method to handle nonlinearly constrained problems.

After the global search is done, it is best to use the global result as the starting point for a subsequent local search: these global algorithms spend most of their effort exploring the parameter space rather than pinning down the exact location of the local optimum (see the sketch after the list of global algorithms below).

1 DIRECT and DIRECT-L (Most of the DIRECT variants only handle bound constraints, and in fact require finite bound constraints (they are not applicable to unconstrained problems). They do not handle arbitrary nonlinear constraints. However, the ORIG versions by Gablonsky et al. include some support for arbitrary nonlinear inequality constraints.)

2 Controlled Random Search (CRS) with local mutation (Only bound-constrained problems are supported by this algorithm.)

3 MLSL (Multi-Level Single-Linkage) (Only bound-constrained problems are supported by this algorithm.)

4 StoGO (Only bound-constrained problems are supported by this algorithm.)

5 AGS (AGS can handle arbitrary objectives and nonlinear inequality constraints. Also bound constraints are required for this method. )

6 ISRES (Improved Stochastic Ranking Evolution Strategy) (This method supports arbitrary nonlinear inequality and equality constraints in addition to the bound constraints)

7 ESCH (evolutionary algorithm) (The method supports bound constraints only (no nonlinear constraints))
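Here is a hedged sketch of that global-then-local pattern with nloptr, using the Rastrigin test function and made-up bounds and evaluation budgets: DIRECT-L explores the bounded search space, and BOBYQA then polishes its result.

library(nloptr)

# Rastrigin function: many local minima, global minimum f(0, 0) = 0
rastrigin <- function(x) 20 + sum(x^2) - 10 * sum(cos(2 * pi * x))

lb <- c(-5.12, -5.12)
ub <- c( 5.12,  5.12)   # finite bounds are mandatory for the global algorithms

# Step 1: global search with DIRECT-L (deterministic, bound-constrained only)
glob <- nloptr(x0 = c(2, 2), eval_f = rastrigin, lb = lb, ub = ub,
               opts = list(algorithm = "NLOPT_GN_DIRECT_L",
                           maxeval = 2000, xtol_rel = 1e-4))

# Step 2: local polish with BOBYQA, starting from the global result
loc <- nloptr(x0 = glob$solution, eval_f = rastrigin, lb = lb, ub = ub,
              opts = list(algorithm = "NLOPT_LN_BOBYQA",
                          maxeval = 2000, xtol_rel = 1e-10))

loc$solution    # should be very close to c(0, 0)
loc$objective   # ~ 0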

Next are the local derivative-free optimization algorithms. Of these, only COBYLA currently supports arbitrary nonlinear inequality and equality constraints; the rest support bound-constrained or unconstrained problems only. (However, any of them can be applied to nonlinearly constrained problems by combining them with the augmented Lagrangian method below.) A COBYLA sketch follows the list.

1 COBYLA (Constrained Optimization BY Linear Approximations) (The underlying COBYLA code only supports inequality constraints. Equality constraints are automatically transformed into pairs of inequality constraints, which in the case of this algorithm seems not to cause problems.)

2 BOBYQA (BOBYQA performs derivative-free bound-constrained optimization using an iteratively constructed quadratic approximation for the objective function.)

3 NEWUOA + bound constraints (permits efficient handling of bound constraints. This algorithm is largely superseded by BOBYQA (above))

4 PRAXIS (PRincipal AXIS)

5 Nelder-Mead Simplex

6 Sbplx (based on Subplex)
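As an illustration of the COBYLA case, a minimal nloptr sketch with a made-up objective and a single nonlinear inequality constraint (nloptr expects constraints in the form g(x) <= 0):

library(nloptr)

eval_f      <- function(x) x[1]^2 + x[2]^2    # objective
eval_g_ineq <- function(x) 1 - x[1] - x[2]    # enforces x1 + x2 >= 1, written as g(x) <= 0

res <- nloptr(x0 = c(2, 0),
              eval_f = eval_f,
              eval_g_ineq = eval_g_ineq,
              opts = list(algorithm = "NLOPT_LN_COBYLA",
                          xtol_rel = 1e-10, maxeval = 2000))

res$solution   # should approach c(0.5, 0.5), where the constraint becomes active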

Finally, the local gradient-based optimization algorithms. Of these, only MMA and SLSQP support arbitrary nonlinear inequality constraints, and only SLSQP supports nonlinear equality constraints; the rest support bound-constrained or unconstrained problems only. (However, any of them can be applied to nonlinearly constrained problems by combining them with the augmented Lagrangian method below.) An SLSQP sketch follows the list.

1 MMA (Method of Moving Asymptotes) and CCSA

2 SLSQP (this is a sequential quadratic programming (SQP) algorithm for nonlinearly constrained gradient-based optimization (supporting both inequality and equality constraints))

3 Low-storage BFGS

4 Preconditioned truncated Newton

5 Shifted limited-memory variable-metric
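A minimal SLSQP sketch with nloptr, assuming a toy problem with one equality constraint; gradient-based (D) algorithms additionally require the gradient of the objective and the Jacobian of any constraints:

library(nloptr)

eval_f        <- function(x) sum(x^2)          # objective
eval_grad_f   <- function(x) 2 * x             # its gradient
eval_g_eq     <- function(x) x[1] + x[2] - 1   # equality constraint: x1 + x2 = 1
eval_jac_g_eq <- function(x) rbind(c(1, 1))    # Jacobian of the constraint (one row)

res <- nloptr(x0 = c(2, 0),
              eval_f = eval_f, eval_grad_f = eval_grad_f,
              eval_g_eq = eval_g_eq, eval_jac_g_eq = eval_jac_g_eq,
              opts = list(algorithm = "NLOPT_LD_SLSQP",
                          xtol_rel = 1e-10, maxeval = 1000))

res$solution   # should approach c(0.5, 0.5)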

There is one algorithm in NLopt that fits into all of the above categories, depending on what subsidiary optimization algorithm is specified: the augmented Lagrangian method. This method combines the objective function and the nonlinear inequality/equality constraints (if any) into a single function.
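A sketch of how the augmented Lagrangian wrapper is used from nloptr, assuming a toy constrained problem: the outer NLOPT_LN_AUGLAG algorithm folds the constraint into the objective and delegates the actual minimization to a subsidiary local algorithm chosen via local_opts.

library(nloptr)

eval_f      <- function(x) x[1]^2 + x[2]^2    # objective
eval_g_ineq <- function(x) 1 - x[1] - x[2]    # nonlinear inequality constraint, g(x) <= 0

# The subsidiary optimizer; BOBYQA alone cannot handle the nonlinear constraint,
# but wrapped inside AUGLAG it can.
local_opts <- list(algorithm = "NLOPT_LN_BOBYQA", xtol_rel = 1e-8)

res <- nloptr(x0 = c(2, 0),
              eval_f = eval_f,
              eval_g_ineq = eval_g_ineq,
              opts = list(algorithm  = "NLOPT_LN_AUGLAG",   # derivative-free outer wrapper
                          xtol_rel   = 1e-8,
                          maxeval    = 5000,
                          local_opts = local_opts))

res$solution   # should again approach c(0.5, 0.5)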