
【matlab】Stanford linear regression and logistic regression exercises

2013-04-21 12:41  Loull  Views: 566  Comments: 0

1. Define a cost function, costFunction, that measures the prediction error.

2. Fit the parameters theta so that costFunction is minimized: run gradient descent for n iterations, updating theta on each iteration so that the cost decreases.

3. With the fitted theta, make predictions.

I. Linear Regression

computeCost:

function J = computeCost(X, y, theta)
% Squared-error cost: J = 1/(2m) * sum_i (h(x_i) - y_i)^2
m = length(y);
J = 0;
for i = 1:m
    h = X(i,:) * theta;      % hypothesis for example i
    J = J + (h - y(i))^2;    % accumulate the squared error
end
J = J / (2*m);
end
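The same cost collapses to a single matrix product in vectorized form. As an illustration (not part of the original exercise), a minimal NumPy sketch, with `X`, `y`, `theta` mirroring the MATLAB names:

```python
import numpy as np

def compute_cost(X, y, theta):
    """Squared-error cost J(theta) = 1/(2m) * sum((X*theta - y)^2)."""
    m = len(y)
    residual = X @ theta - y          # h - y for every example at once
    return (residual @ residual) / (2 * m)
```

The explicit loop over examples disappears because `X @ theta` evaluates the hypothesis for all m rows in one step.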

Gradient descent, fitting the parameters theta:

for iter = 1:num_iters

    grad = zeros(size(theta,1), 1);  % renamed from "sum", which shadows the built-in
    for j = 1:size(theta,1)
        for i = 1:m
            h = X(i,:) * theta;
            grad(j) = grad(j) + (h - y(i)) * X(i,j);
        end
        % Wrong: theta(j) = theta(j) - alpha * grad(j) / m;
        % theta must be updated simultaneously, only after every partial
        % derivative has been computed with the old theta.
    end

    theta = theta - grad .* alpha ./ m;   % simultaneous update

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end
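The double loop over j and i is one transposed matrix product in vectorized form. A hedged NumPy sketch of the same batch update (illustrative names, not the course code):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent with a simultaneous update of theta."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        grad = X.T @ (X @ theta - y)       # sum_i (h_i - y_i) * x_ij for every j at once
        theta = theta - alpha * grad / m   # all components of theta updated together
        residual = X @ theta - y
        J_history[it] = (residual @ residual) / (2 * m)  # track the cost per iteration
    return theta, J_history
```

Because `theta` is replaced in a single assignment, the simultaneous-update pitfall flagged in the MATLAB comment cannot occur here.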

 

II. Logistic Regression

costFunctionReg (regularized cost and gradient):

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. the parameters.
%   h_fun(x, theta) is the hypothesis, i.e. sigmoid(x * theta).

% Initialize some useful values
m = length(y); % number of training examples

J = 0;
grad = zeros(size(theta));

% Unregularized cross-entropy cost
for i = 1:m
    h = h_fun(X(i,:), theta);
    J = J - y(i)*log(h) - (1-y(i))*log(1-h);
end
J = J / m;

% Regularization term; theta(1), the bias, is not penalized
reg = 0;
for j = 2:length(theta)   % note: size(theta) returns a vector, use length
    reg = reg + theta(j)^2;
end
J = J + reg * lambda / (2*m);

% Gradient
for j = 1:length(theta)
    for i = 1:m
        grad(j) = grad(j) + (h_fun(X(i,:), theta) - y(i)) * X(i,j);
    end
    grad(j) = grad(j) / m;
end
% Add the regularization term once, outside the loop over examples.
% (Adding lambda*theta(j)/m inside the i-loop, m times, happens to give
% the same result after dividing by m, but it obscures the formula.)
for j = 2:length(theta)
    grad(j) = grad(j) + lambda * theta(j) / m;
end

end
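Vectorized, with the regularization term kept entirely out of the loops, the same cost and gradient look like this in NumPy (`sigmoid` plays the role of the h_fun helper above; this is a sketch, not the course solution):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic cost and gradient; theta[0], the bias, is not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)                              # hypothesis for all examples
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m  # cross-entropy cost
    J += lam / (2 * m) * (theta[1:] @ theta[1:])        # regularization, skipping the bias
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]                     # regularize every term but the bias
    return J, grad
```

At theta = 0 the hypothesis is 0.5 everywhere, so the cost is log(2) regardless of the labels, which makes a convenient sanity check.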

Fitting the parameters:

% Initialize fitting parameters
initial_theta = zeros(size(X, 2), 1);

% Set the regularization parameter lambda (you should vary this)
lambda = 0;

% Set Options
options = optimset('GradObj', 'on', 'MaxIter', 400);

% Optimize
[theta, J, exit_flag] = ...
    fminunc(@(t)(costFunctionReg(t, X, y, lambda)), initial_theta, options);
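The fminunc call maps onto scipy.optimize.minimize with jac=True, since the cost function returns both the cost and its gradient. A hedged sketch on hypothetical toy data (the NumPy cost from the previous section is re-declared so the snippet stands alone):

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function_reg(theta, X, y, lam):
    """Regularized logistic cost and gradient; theta[0] is not penalized."""
    m = len(y)
    h = sigmoid(X @ theta)
    J = (-y @ np.log(h) - (1 - y) @ np.log(1 - h)) / m
    J += lam / (2 * m) * (theta[1:] @ theta[1:])
    grad = X.T @ (h - y) / m
    grad[1:] += lam / m * theta[1:]
    return J, grad

# Toy data: the label is 1 exactly when the feature is positive.
X = np.array([[1., -2.], [1., -1.], [1., 1.], [1., 2.]])
y = np.array([0., 0., 1., 1.])

initial_theta = np.zeros(X.shape[1])
lam = 1.0   # regularization strength; vary this, as the MATLAB comment suggests

# jac=True tells the optimizer the objective returns (cost, gradient),
# mirroring optimset('GradObj', 'on') for fminunc.
res = minimize(cost_function_reg, initial_theta, args=(X, y, lam),
               jac=True, method='BFGS', options={'maxiter': 400})
theta = res.x
```

As with fminunc, supplying the analytic gradient spares the optimizer from estimating it numerically.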

 

