[SVM Prediction] Data regression prediction with an SVM improved by the coyote optimization algorithm, with MATLAB code
1 Introduction
We propose a stock-price prediction method based on the coyote optimization algorithm (COA) and the support vector machine (SVM). To address the difficulty of selecting the parameters of an SVM prediction model, COA is used to optimize the SVM's penalty factor and kernel parameter, yielding a COA-SVM stock-price prediction model.
A support vector machine takes samples with known class labels as training data, learns the spatial clustering structure of each class, and then classifies test samples; misclassified samples can be corrected through this verification step. Taking physical-examination data as the case study, we first apply factor analysis to reduce the dimensionality of the data, combining all indicators into a few composite indices. To reduce the error caused by the differing measurement scales of the indicators, the data are normalized in MATLAB and then grouped by cluster analysis. Finally, a least-squares support vector machine (LSSVM) classifier is used for verification, from which the classification accuracy is computed, confirming the accuracy and reasonableness of the data classification.
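As a minimal sketch of the parameter-optimization step described above (assuming MATLAB's Statistics and Machine Learning Toolbox; `fitrsvm`, the function name `coa_svm_fitness`, and the variable names are this sketch's assumptions, not the blogger's code), the fitness that COA minimizes can be the cross-validated regression loss of an RBF-kernel SVM whose penalty factor and kernel scale are the decision variables:

```matlab
% Sketch only: cross-validated fitness for the (C, kernel-scale) pair
% that COA searches over. Xtr/ytr are the normalized training data.
function err = coa_svm_fitness(p, Xtr, ytr)
    mdl = fitrsvm(Xtr, ytr, ...
        'KernelFunction', 'rbf', ...
        'BoxConstraint',  p(1), ...   % penalty factor C
        'KernelScale',    p(2), ...   % RBF kernel parameter
        'KFold', 5);                  % 5-fold cross-validation
    err = kfoldLoss(mdl);             % mean squared error over the folds
end
```

COA then returns the `p` that minimizes `err`, and the final model is retrained on the full training set with those two parameters.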


2 Code Excerpt
function [fbst, xbst, performance] = hho(objective, d, lmt, n, T, S)
%Harris hawks optimization algorithm
% inputs:
%   objective - function handle, the objective function
%   d         - scalar, dimension of the optimization problem
%   lmt       - d-by-2 matrix, lower and upper bounds of the decision variables
%   n         - scalar, swarm size
%   T         - scalar, maximum number of iterations
%   S         - scalar, number of independent runs
% date:   2021-05-09
% author: elkman, github.com/ElkmanY/
%% Levy flight (Mantegna's algorithm)
beta = 1.5;
sigma = ( gamma(1+beta)*sin(pi*beta/2)/( gamma((1+beta)/2)*beta*2^((beta-1)/2) ) )^(1/beta);
Levy = @(x) 0.01*normrnd(0,1,d,x)*sigma./abs(normrnd(0,1,d,x)).^(1/beta);
%% algorithm procedure
tic;
for s = 1:S
    %% Initialization
    X = lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n);   % uniform random swarm within bounds
    for t = 1:T
        F = objective(X);
        [f_rabbit(s,t), i_rabbit] = min(F);     % best (rabbit) fitness this iteration
        x_rabbit(:,t,s) = X(:,i_rabbit);
        xr = x_rabbit(:,t,s);
        J = 2*(1-rand(d,1));                    % random jump strength of the rabbit
        E0 = 2*rand(1,n)-1;                     % initial escaping energy in [-1,1]
        E(t,:) = 2*E0*(1-t/T);                  % energy decays over the iterations
        absE = abs(E(t,:));
        p1 = absE>=1;                                   % exploration, eq(1)
        r = rand(1,n);
        p2 = (r>=0.5) & (absE>=0.5) & (absE<1);         % soft besiege, eq(4)
        p3 = (r>=0.5) & (absE<0.5);                     % hard besiege, eq(6)
        p4 = (r<0.5) & (absE>=0.5) & (absE<1);          % soft besiege with rapid dives, eq(10)
        p5 = (r<0.5) & (absE<0.5);                      % hard besiege with rapid dives, eq(11)
        %% update locations
        rh = randi([1,n],1,n);                  % indices of randomly chosen hawks
        flag1 = rand(1,n)>=0.5;
        Y = xr - E(t,:).*abs( J.*xr - X );      % rapid-dive candidate, eq(7)
        Z = Y + rand(d,n).*Levy(n);             % Levy-flight candidate, eq(8)
        flag2 = (objective(Y)<objective(Z)) & (objective(Y)<F);
        flag3 = (objective(Y)>objective(Z)) & (objective(Z)<F);
        flag4 = (~flag2) & (~flag3);
        X_ =    p1.*( (X(:,rh) - rand(1,n).*abs( X(:,rh) - 2*rand(1,n).*X )).*flag1 + ...
                ((X(:,rh) - mean(X,2)) - rand(1,n).*( lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n) )).*(~flag1) ) ...
            +   p2.*( xr - X - E(t,:).*abs( J.*xr - X ) ) ...
            +   p3.*( xr - E(t,:).*abs( xr - X ) ) ...
            +   p4.*( Y.*flag2 + Z.*flag3 + ( lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n) ).*flag4 ) ...
            +   p5.*( Y.*flag2 + Z.*flag3 + ( lmt(:,1) + (lmt(:,2) - lmt(:,1)).*rand(d,n) ).*flag4 );
        X_(:,i_rabbit) = xr;                    % elitism: keep the current best hawk
        X = X_;
    end
end
%% outputs
performance = [min(f_rabbit(:,T)); mean(f_rabbit(:,T)); std(f_rabbit(:,T))];
timecost = toc;
[fbst, ibst] = min(f_rabbit(:,T));
xbst = x_rabbit(:,T,ibst);
%% plot data
% Convergence Curve
figure('Name','Convergence Curve');
box on
semilogy(1:T,mean(f_rabbit,1),'b','LineWidth',1.5);
xlabel('Iteration','FontName','Arial');
ylabel('Fitness/Score','FontName','Arial');
title('Convergence Curve','FontName','Arial');
if d == 2
    % Trajectory of Global Optimal
    figure('Name','Trajectory of Global Optimal');
    x1 = linspace(lmt(1,1),lmt(1,2));
    x2 = linspace(lmt(2,1),lmt(2,2));
    [X1,X2] = meshgrid(x1,x2);
    V = reshape(objective([X1(:),X2(:)]'),[size(X1,1),size(X1,1)]);
    contour(X1,X2,log10(V),100); % notice log10(V)
    hold on
    plot(x_rabbit(1,:,1),x_rabbit(2,:,1),'r-x','LineWidth',1);
    hold off
    xlabel('\it{x}_1','FontName','Times New Roman');
    ylabel('\it{x}_2','FontName','Times New Roman');
    title('Trajectory of Global Optimal','FontName','Arial');
end
end
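A hypothetical call of the routine above (the sphere function and all parameter values below are illustrative choices, not from the original post). Note that `objective` must accept a d-by-n matrix of column vectors and return a 1-by-n row of fitness values, matching how `objective(X)` is used inside the loop:

```matlab
% Hypothetical usage of hho: minimize the 2-D sphere function.
objective = @(X) sum(X.^2, 1);   % d-by-n in, 1-by-n out
d   = 2;
lmt = [-10 10; -10 10];          % row k: [lower upper] bound of x_k
n   = 30;                        % swarm size
T   = 200;                       % iterations per run
S   = 5;                         % independent runs
[fbst, xbst, performance] = hho(objective, d, lmt, n, T, S);
```

Because `d == 2` here, the routine also draws the contour plot with the best-so-far trajectory of the first run.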
3 Simulation Results

4 References
[1] 楊建新, 蘭小平, 姚志強, et al. 基于郊狼算法優化的LSSVM多工序質量預測方法[J]. 制造業自動化, 2021, 43(12):5.
About the blogger: experienced in MATLAB simulation across intelligent optimization algorithms, neural network prediction, signal processing, cellular automata, image processing, path planning, UAVs, and related fields; MATLAB code questions are welcome by private message.
Parts of the theory are cited from online literature; please contact the blogger for removal in case of infringement.
