0 Column Introduction
🔥 A must-have for course projects, graduation projects, and innovation competitions! 🔥 This column covers hands-on trajectory optimization with more advanced motion planning algorithms, including curve generation, collision detection, safe corridors, optimization modeling (QP, SQP, NMPC, iLQR, etc.), and trajectory optimization (gradient-based methods, curve-based methods, etc.). Each algorithm comes with a code implementation to deepen understanding.
🚀 For details, see the column: 运动规划实战进阶:轨迹优化篇
1 The L-BFGS Optimizer
1.1 The Quasi-Newton Framework
In the post 数值优化 | 详解拟牛顿法之SR1、DFP、BFGS、L-BFGS(附案例分析与Python实现), we introduced the quasi-Newton framework, which helps avoid the high computational cost and poor numerical stability of forming the inverse Hessian $\boldsymbol{H}_k\left( \boldsymbol{x} \right)$ in the classical Newton method.
The L-BFGS algorithm is a quasi-Newton method for large-scale unconstrained optimization. It is a limited-memory variant of BFGS and is well suited to high-dimensional problems (e.g., machine learning and trajectory optimization). Specifically, L-BFGS keeps only the intermediate results of the most recent $m$ iterations to approximate the inverse Hessian $\boldsymbol{H}_k\left( \boldsymbol{x} \right)$:
$$\begin{aligned}\boldsymbol{H}_{k+1}&=\left( \boldsymbol{V}_{k}^{T}\cdots \boldsymbol{V}_{k-m+1}^{T} \right) \boldsymbol{H}_0\left( \boldsymbol{V}_{k-m+1}\cdots \boldsymbol{V}_k \right) \\ &+\left( \boldsymbol{V}_{k}^{T}\cdots \boldsymbol{V}_{k-m+2}^{T} \right) \rho _{k-m+1}\boldsymbol{s}_{k-m+1}\boldsymbol{s}_{k-m+1}^{T}\left( \boldsymbol{V}_{k-m+2}\cdots \boldsymbol{V}_k \right) \\ &+\cdots \\ &+\rho _k\boldsymbol{s}_k\boldsymbol{s}_{k}^{T}\end{aligned}$$
Hence L-BFGS never has to store the full $\boldsymbol{H}_k$; it only keeps the vector sequences $\left\{ \boldsymbol{s}_i \right\} _{i=k-m}^{k}$ and $\left\{ \boldsymbol{y}_i \right\} _{i=k-m}^{k}$, which occupy $2mn$ storage, so the space complexity drops by an order of magnitude to $O(n)$. For a more detailed derivation, see 数值优化 | 详解拟牛顿法之SR1、DFP、BFGS、L-BFGS(附案例分析与Python实现).
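To make the limited-memory idea concrete, the snippet below is a minimal sketch written for this post (not taken from LBFGS-Lite): the classical two-loop recursion, which applies the implicit $\boldsymbol{H}_{k+1}$ above to the current gradient using only the stored pairs $\{\boldsymbol{s}_i,\boldsymbol{y}_i\}$, without ever forming the matrix.

```cpp
// Minimal sketch of the L-BFGS two-loop recursion: given the m most recent
// pairs {s_i, y_i}, compute the search direction d = -H_k * grad.
#include <Eigen/Dense>
#include <deque>
#include <vector>

Eigen::VectorXd lbfgsDirection(const Eigen::VectorXd& grad,
                               const std::deque<Eigen::VectorXd>& s,   // s_i = x_{i+1} - x_i (oldest at front)
                               const std::deque<Eigen::VectorXd>& y)   // y_i = g_{i+1} - g_i (oldest at front)
{
  const int m = static_cast<int>(s.size());
  Eigen::VectorXd q = grad;
  std::vector<double> alpha(m), rho(m);

  // first loop: newest pair to oldest
  for (int i = m - 1; i >= 0; --i)
  {
    rho[i] = 1.0 / y[i].dot(s[i]);
    alpha[i] = rho[i] * s[i].dot(q);
    q -= alpha[i] * y[i];
  }

  // initial approximation H_0 = gamma * I (standard scaling heuristic)
  const double gamma = (m > 0) ? s[m - 1].dot(y[m - 1]) / y[m - 1].dot(y[m - 1]) : 1.0;
  Eigen::VectorXd r = gamma * q;

  // second loop: oldest pair to newest
  for (int i = 0; i < m; ++i)
  {
    const double beta = rho[i] * y[i].dot(r);
    r += (alpha[i] - beta) * s[i];
  }
  return -r;  // descent direction d_k = -H_k * grad
}
```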
1.2 The LBFGS-Lite Library
LBFGS-Lite, the L-BFGS library open-sourced by FAST-LAB at Zhejiang University, provides a lightweight L-BFGS implementation that lets users define their own objective function and gradient. It is designed for efficiency, ease of use, and embedded deployment; implemented in C++, it suits real-time optimization problems in robotics, autonomous driving, and related fields.
Below is a brief walkthrough of how to use LBFGS-Lite:
- Define the objective function and its gradient
For example, suppose the optimization objective is the Rosenbrock function
$$f(\boldsymbol{x})=(1-x_1)^2+100(x_2-x_1^2)^2$$
Then the gradient is
$$\begin{cases} \dfrac{\partial f\left( \boldsymbol{x} \right)}{\partial x_1}=-2\left( 1-x_1+200x_1\left( x_2-x_{1}^{2} \right) \right)\\ \dfrac{\partial f\left( \boldsymbol{x} \right)}{\partial x_2}=200\left( x_2-x_{1}^{2} \right)\\ \end{cases}$$
The corresponding C++ code (which evaluates a chained sum of such terms over pairs of variables) is:
```cpp
static double costFunction(void* instance, const Eigen::VectorXd& x, Eigen::VectorXd& g)
{
    const int n = x.size();
    double fx = 0.0;
    for (int i = 0; i < n; i += 2)
    {
        const double t1 = 1.0 - x(i);
        const double t2 = 10.0 * (x(i + 1) - x(i) * x(i));
        g(i + 1) = 20.0 * t2;
        g(i) = -2.0 * (x(i) * g(i + 1) + t1);
        fx += t1 * t1 + t2 * t2;
    }
    return fx;
}
```
- Set the initial guess and run the optimization
```cpp
/* Set the initial guess */
for (int i = 0; i < N; i += 2)
{
    x(i) = -1.2;
    x(i + 1) = 1.0;
}

/* Set the minimization parameters */
lbfgs::lbfgs_parameter_t params;
params.g_epsilon = 1.0e-8;
params.past = 3;
params.delta = 1.0e-8;

/* Start minimization */
int ret = lbfgs::lbfgs_optimize(x, finalCost, costFunction, nullptr, monitorProgress, this, params);
```
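For reference, a minimal self-contained driver around the snippets above could look like the sketch below. It assumes LBFGS-Lite is available as a single header named `lbfgs.hpp` and that the `costFunction` above is defined in the same file; the step-bound callback, progress monitor, and instance pointer are all set to `nullptr` here, so the example only inspects the return code, the final cost, and the solution vector.

```cpp
// Hypothetical standalone driver (assumptions: header name "lbfgs.hpp",
// costFunction from the snippet above is visible in this translation unit).
#include <Eigen/Dense>
#include <iostream>
#include "lbfgs.hpp"

int main()
{
    const int N = 10;  // problem dimension (even, since terms are evaluated pairwise)
    Eigen::VectorXd x(N);
    for (int i = 0; i < N; i += 2)
    {
        x(i) = -1.2;
        x(i + 1) = 1.0;
    }

    lbfgs::lbfgs_parameter_t params;
    params.g_epsilon = 1.0e-8;
    params.past = 3;
    params.delta = 1.0e-8;

    double finalCost = 0.0;
    // No step-bound callback, no progress monitor, no instance pointer.
    int ret = lbfgs::lbfgs_optimize(x, finalCost, costFunction, nullptr, nullptr, nullptr, params);

    std::cout << "ret = " << ret << ", f(x*) = " << finalCost
              << ", x* = " << x.transpose() << std::endl;
    return 0;
}
```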
2 Trajectory Optimization Based on L-BFGS
To perform trajectory optimization with LBFGS-Lite, we first need to define an unconstrained optimization problem: the optimization variables are a 2D path sequence $\mathcal{X} =\left\{ \boldsymbol{x}_i=\left( x_i,y_i \right) \mid i\in \left[ 1,N \right] \right\}$, over which we design the objective function
$$f\left( \mathcal{X} \right) =w_oP_{\mathrm{obs}}\left( \mathcal{X} \right) +w_{\kappa}P_{\mathrm{cur}}\left( \mathcal{X} \right) +w_sP_{\mathrm{smo}}\left( \mathcal{X} \right)$$
where:
- Obstacle term
$$P_{\mathrm{obs}}\left( \mathcal{X} \right) =\sum_{\boldsymbol{x}_i\in \mathcal{X}}{\sigma _o\left( \left\| \boldsymbol{x}_i-\boldsymbol{o}_{\min} \right\| _2-d_{\max} \right)}$$
- Curvature term
$$P_{\mathrm{cur}}\left( \mathcal{X} \right) =\sum_{\boldsymbol{x}_i\in \mathcal{X}}{\sigma _{\kappa}\left( \frac{\varDelta \phi _i}{\left\| \varDelta \boldsymbol{x}_i \right\| _2}-\kappa _{\max} \right)}$$
  where $\varDelta \boldsymbol{x}_i$ denotes the displacement between consecutive path points and $\varDelta \phi _i$ the turning angle between consecutive segments.
- Smoothness term
$$P_{\mathrm{smo}}\left( \mathcal{X} \right) =\sum_{\boldsymbol{x}_i\in \mathcal{X}}{\left\| \varDelta \boldsymbol{x}_i-\varDelta \boldsymbol{x}_{i+1} \right\| _{2}^{2}}$$
Because LBFGS-Lite requires the user to supply the objective gradient explicitly, we also need to derive:
- Gradient of the obstacle term
$$\frac{\partial P_{\mathrm{obs}}\left( \boldsymbol{x}_i \right)}{\partial \boldsymbol{x}_i}=2\left( \left\| \boldsymbol{x}_i-\boldsymbol{o}_{\min} \right\| _2-d_{\max} \right) \frac{\boldsymbol{x}_i-\boldsymbol{o}_{\min}}{\left\| \boldsymbol{x}_i-\boldsymbol{o}_{\min} \right\| _2}$$
- Gradient of the curvature term
$$\frac{\partial \kappa _i}{\partial \boldsymbol{x}_i}=\frac{1}{\left\| \varDelta \boldsymbol{x}_i \right\| _2}\frac{-1}{\sqrt{1-\cos ^2\varDelta \phi _i}}\left( -\boldsymbol{p}_1-\boldsymbol{p}_2 \right) -\frac{\varDelta \phi _i\varDelta \boldsymbol{x}_i}{\left\| \varDelta \boldsymbol{x}_i \right\| _{2}^{3}}$$
  where $\boldsymbol{p}_1$ and $\boldsymbol{p}_2$ arise from differentiating $\cos \varDelta \phi _i$ with respect to the path points; their full expressions are derived in the post referenced below.
- Gradient of the smoothness term
$$\frac{\partial P_{\mathrm{smo}}\left( \boldsymbol{x}_i \right)}{\partial \boldsymbol{x}_i}=2\left( \boldsymbol{x}_{i-2}-4\boldsymbol{x}_{i-1}+6\boldsymbol{x}_i-4\boldsymbol{x}_{i+1}+\boldsymbol{x}_{i+2} \right)$$
The precise definitions of these objective terms and the full gradient derivations are identical to those in 轨迹优化 | 基于ESDF的共轭梯度优化算法(附ROS C++/Python仿真); interested readers can refer to that post.
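To make the smoothness term concrete, here is a hypothetical, minimal reconstruction of what a helper like the project's `_calSmoothTerm` (invoked in the cost function in Section 3, but not shown in this post) could look like. The function name `calcSmoothTermSketch`, the free-function signature, and the weight value `w = 1.0` are illustrative assumptions; the real project code also takes the optimizer instance and its configured weight $w_s$.

```cpp
// Hypothetical sketch of the smoothness term: cost P_smo and its gradient
// with respect to the interior (optimized) path points. Assumes the path has
// at least 5 columns and that the first two and last two points are fixed,
// matching the setup used in Section 3.
#include <Eigen/Dense>

Eigen::Matrix2Xd calcSmoothTermSketch(const Eigen::Matrix2Xd& path, double& cost)
{
  const int n = static_cast<int>(path.cols());
  const double w = 1.0;  // smoothness weight w_s (illustrative value)
  Eigen::Matrix2Xd grad = Eigen::Matrix2Xd::Zero(2, n - 4);

  // cost: sum of squared second differences ||x_{i-1} - 2 x_i + x_{i+1}||^2
  cost = 0.0;
  for (int i = 1; i < n - 1; ++i)
  {
    const Eigen::Vector2d d = path.col(i - 1) - 2.0 * path.col(i) + path.col(i + 1);
    cost += w * d.squaredNorm();
  }

  // gradient w.r.t. each optimized point x_i (i = 2, ..., n - 3):
  // 2 (x_{i-2} - 4 x_{i-1} + 6 x_i - 4 x_{i+1} + x_{i+2})
  for (int i = 2; i < n - 2; ++i)
  {
    grad.col(i - 2) = 2.0 * w *
                      (path.col(i - 2) - 4.0 * path.col(i - 1) + 6.0 * path.col(i) -
                       4.0 * path.col(i + 1) + path.col(i + 2));
  }
  return grad;
}
```

Since $\varDelta \boldsymbol{x}_i-\varDelta \boldsymbol{x}_{i+1}$ is (up to an index shift) the second difference $\boldsymbol{x}_{i-1}-2\boldsymbol{x}_i+\boldsymbol{x}_{i+1}$, the sketch accumulates squared second differences, and its gradient matches the formula given above.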
3 ROS C++ Simulation
The core code for computing the cost function and its gradient is shown below:
```cpp
double LBFGSOptimizer::costFunction(void* ptr, const Eigen::VectorXd& x, Eigen::VectorXd& g)
{
  auto instance = reinterpret_cast<LBFGSOptimizer*>(ptr);
  const auto& origin_path = instance->path_opt_;
  int points_num = static_cast<int>(origin_path.size());
  int var_num = points_num - 4;
  double cost = 0.0;
  Eigen::Matrix2Xd grad;
  grad.resize(2, var_num);
  grad.setZero();

  // update opt-path during optimization
  Eigen::Matrix2Xd path_opt;
  path_opt.resize(2, points_num);
  path_opt(0, 0) = origin_path[0].x();
  path_opt(1, 0) = origin_path[0].y();
  path_opt(0, 1) = origin_path[1].x();
  path_opt(1, 1) = origin_path[1].y();
  path_opt.block(0, 2, 1, var_num) = x.head(var_num).transpose();
  path_opt.block(1, 2, 1, var_num) = x.tail(var_num).transpose();
  path_opt(0, points_num - 2) = origin_path[points_num - 2].x();
  path_opt(1, points_num - 2) = origin_path[points_num - 2].y();
  path_opt(0, points_num - 1) = origin_path[points_num - 1].x();
  path_opt(1, points_num - 1) = origin_path[points_num - 1].y();

  // obstacle term
  double obstacle_cost = 0.0;
  const auto& obstacle_grad = _calObstacleTerm(instance, path_opt, obstacle_cost);
  grad += obstacle_grad;
  cost += obstacle_cost;

  // smooth term
  double smooth_cost = 0.0;
  const auto& smooth_grad = _calSmoothTerm(instance, path_opt, smooth_cost);
  grad += smooth_grad;
  cost += smooth_cost;

  // curvature term
  double curvature_cost = 0.0;
  const auto& curvature_grad = _calCurvatureTerm(instance, path_opt, curvature_cost);
  grad += curvature_grad;
  cost += curvature_cost;

  g.setZero();
  g.head(var_num) = grad.row(0).transpose();
  g.tail(var_num) = grad.row(1).transpose();
  return cost;
}
```
The core code of the optimization procedure is shown below:
```cpp
bool LBFGSOptimizer::optimize(const Points3d& waypoints)
{
  // first two and last two points are constant (to keep start and end direction)
  int var_num = static_cast<int>(waypoints.size()) - 4;
  Eigen::VectorXd opt_var(2 * var_num);
  for (int i = 2; i < static_cast<int>(waypoints.size()) - 2; ++i)
  {
    opt_var(i - 2) = waypoints[i].x();
    opt_var(i - 2 + var_num) = waypoints[i].y();
  }
  path_opt_ = waypoints;

  // optimizer parameters
  lbfgs::lbfgs_parameter_t lbfgs_params;
  lbfgs_params.mem_size = 256;
  lbfgs_params.past = 3;
  lbfgs_params.min_step = 1.0e-32;
  lbfgs_params.g_epsilon = 0.0;
  lbfgs_params.delta = 0.01;

  // optimization
  double min_cost = 0.0;
  int ret =
      lbfgs::lbfgs_optimize(opt_var, min_cost, &LBFGSOptimizer::costFunction, nullptr, nullptr, this, lbfgs_params);

  // parse solution
  if (ret >= 0)
  {
    for (int i = 0; i < var_num; ++i)
    {
      path_opt_[i + 2].setX(opt_var(i));
      path_opt_[i + 2].setY(opt_var(i + var_num));
    }
  }
  else
  {
    R_WARN << "LBFGS Optimization failed: " << lbfgs::lbfgs_strerror(ret);
    return false;
  }
  return true;
}
```
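When implementing cost terms and their gradients, indexing or sign mistakes are easy to make, and the L-BFGS line search fails in confusing ways when the gradient is inconsistent with the cost. The snippet below is an optional sanity check, not part of the original project: `checkGradient` (a name chosen here for illustration) compares the analytic gradient produced by an evaluate-style callback, such as the static `LBFGSOptimizer::costFunction` above, against a central finite-difference estimate.

```cpp
// Hypothetical gradient consistency check (not part of the project code).
#include <Eigen/Dense>
#include <algorithm>
#include <cmath>
#include <functional>

// Callback with the same shape as the evaluate function used above.
using EvalFn = std::function<double(void*, const Eigen::VectorXd&, Eigen::VectorXd&)>;

// Returns the largest absolute difference between the analytic gradient and a
// central finite-difference estimate; a small value indicates consistency.
double checkGradient(const EvalFn& f, void* instance, const Eigen::VectorXd& x, double eps = 1e-6)
{
  Eigen::VectorXd g_analytic = Eigen::VectorXd::Zero(x.size());
  Eigen::VectorXd g_dummy = Eigen::VectorXd::Zero(x.size());
  f(instance, x, g_analytic);

  double max_err = 0.0;
  for (int i = 0; i < static_cast<int>(x.size()); ++i)
  {
    Eigen::VectorXd xp = x, xm = x;
    xp(i) += eps;
    xm(i) -= eps;
    const double fd = (f(instance, xp, g_dummy) - f(instance, xm, g_dummy)) / (2.0 * eps);
    max_err = std::max(max_err, std::abs(fd - g_analytic(i)));
  }
  return max_err;
}
```

During debugging one could call, for example, `checkGradient(&LBFGSOptimizer::costFunction, this, opt_var)` right before `lbfgs_optimize`; a result on the order of $10^{-4}$ or smaller usually indicates the analytic gradient is consistent with the cost.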
The result of the trajectory optimization is shown below: the orange curve is the optimized path and the blue curve is the original path.
For the complete project code, please contact the blogger via the profile card below.
🔥 More featured columns:
- 《ROS从入门到精通》
- 《Pytorch深度学习实战》
- 《机器学习强基计划》
- 《运动规划实战精讲》
- …