
jwasham/machine-learning: Some notes on machine learning algorithms, mostly in M ...


Open source project name:

jwasham/machine-learning

Open source project URL:

https://github.com/jwasham/machine-learning

Open source language:

MATLAB 100.0%

Open source introduction:

Machine Learning Algorithms

This is a collection of notes and code for machine learning algorithms.

Most of these will be MATLAB/Octave files. I would like to add some Python/NumPy implementations later.

Linear Regression

Regularized linear regression has the following cost function:

$$J(\theta) = \frac{1}{2m}\left(\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2\right) + \frac{\lambda}{2m}\left(\sum_{j=1}^{n}\theta_j^2\right)$$

Correspondingly, the partial derivative of the regularized linear regression cost with respect to $\theta_j$ is defined as:

$$\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_0^{(i)} \quad \text{for } j = 0$$

$$\frac{\partial J(\theta)}{\partial \theta_j} = \left(\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}\right) + \frac{\lambda}{m}\theta_j \quad \text{for } j \ge 1$$
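
Below is a minimal Octave/MATLAB sketch of how the cost and gradient above could be computed in one vectorized function. The name linearRegCostFunction and its signature are assumptions for illustration, not necessarily the layout used in this repo; note that θ₀ (theta(1) in MATLAB's 1-based indexing) is excluded from regularization.

```matlab
function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
% Regularized linear regression cost and gradient (hypothetical helper).
% X      : m x (n+1) design matrix whose first column is all ones
% y      : m x 1 vector of targets
% theta  : (n+1) x 1 parameter vector
% lambda : regularization parameter

m   = length(y);       % number of training examples
h   = X * theta;       % hypothesis h_theta(x) for every example
err = h - y;           % residuals

% Cost: squared-error term plus regularization; theta(1), i.e. theta_0,
% is not included in the regularization sum.
J = (1 / (2 * m)) * (err' * err) + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);

% Gradient: common term for every theta_j, plus (lambda/m)*theta_j for j >= 1.
grad = (1 / m) * (X' * err);
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```

An optimizer such as fminunc can then minimize this cost by calling the function with a fixed λ and returning the fitted θ.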

To plot the learning curve, we need a training and cross validation set error for different training set sizes. To obtain different training set sizes, use different subsets of the original training set X. Specifically, for a training set size of i, you should use the first i examples (i.e., X(1:i,:) and y(1:i)).

You can use the trainLinearRegression() function to find the θ parameters. Note that λ is passed as a parameter to the learningCurve function. After learning the θ parameters, compute the error on the training and cross validation sets. Recall that the training error for a dataset is defined as:

$$J_{\text{train}}(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2\right]$$

In particular, note that the training error does not include the regularization term. One way to compute it is to use your existing cost function and set λ to 0, but only when using it to compute the training error and cross validation error. When computing the training set error, make sure you compute it on the training subset (i.e., X(1:i,:) and y(1:i)), not on the entire training set. For the cross validation error, however, you should compute it over the entire cross validation set.
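
The learning-curve procedure described above could look roughly like the following Octave/MATLAB sketch. trainLinearRegression() is the trainer mentioned in the text (its exact signature is an assumption), and linearRegCostFunction() is the hypothetical cost helper from the previous sketch, called with λ = 0 so the reported errors contain no regularization term.

```matlab
function [error_train, error_val] = learningCurve(X, y, Xval, yval, lambda)
% Training and cross validation error for training set sizes 1..m.
% The i-th point uses only the first i training examples.

m = size(X, 1);
error_train = zeros(m, 1);
error_val   = zeros(m, 1);

for i = 1:m
    Xtrain = X(1:i, :);
    ytrain = y(1:i);

    % Fit theta on the subset, using the given regularization parameter.
    theta = trainLinearRegression(Xtrain, ytrain, lambda);

    % Errors are computed with lambda = 0: training error on the subset,
    % cross validation error on the entire validation set.
    error_train(i) = linearRegCostFunction(Xtrain, ytrain, theta, 0);
    error_val(i)   = linearRegCostFunction(Xval,   yval,   theta, 0);
end
end
```

Plotting error_train and error_val against the training set sizes 1..m gives the learning curve.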

Logistic Regression

coming soon

Multi-class Classification

coming soon

Neural Networks

coming soon

Neural Network Learning

coming soon

Regularized Linear Regression

coming soon

Support Vector Machines

coming soon



