Open-source project: jwasham/machine-learning
Repository: https://github.com/jwasham/machine-learning
Language: MATLAB 100.0%

# Machine Learning Algorithms

This is a collection of notes and code for machine learning algorithms. Most of these are Matlab/Octave files; I would like to add some Python/NumPy implementations later.

## Linear regression

Regularized linear regression has the following cost function:

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2 + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$$

Correspondingly, the partial derivative of regularized linear regression's cost for $\theta_j$ is defined as:

$$\frac{\partial J(\theta)}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)} \qquad \text{for } j = 0$$

$$\frac{\partial J(\theta)}{\partial \theta_j} = \left(\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}\right) + \frac{\lambda}{m}\theta_j \qquad \text{for } j \ge 1$$

To plot the learning curve, we need a training and cross-validation set error for different training set sizes. To obtain different training set sizes, use different subsets of the original training set X. Specifically, for a training set size of i, you should use the first i examples (i.e., X(1:i,:) and y(1:i)).

You can use the trainLinearRegression() function to find the θ parameters. Note that lambda is passed as a parameter to the learningCurve function. After learning the θ parameters, you should compute the error on the training and cross-validation sets. Recall that the training error for a dataset is defined as:

$$J_{\text{train}}(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2$$

In particular, note that the training error does not include the regularization term. One way to compute it is to use your existing cost function and set λ to 0 when using it to compute the training and cross-validation errors. When you are computing the training set error, make sure you compute it on the training subset (i.e., X(1:i,:) and y(1:i)), instead of the entire training set. For the cross-validation error, however, you should compute it over the entire cross-validation set. (Minimal Octave sketches of these computations appear after the section list below.)

## Logistic Regression

Coming soon.

## Multi-class Classification

Coming soon.

## Neural Networks

Coming soon.

## Neural Network Learning

Coming soon.

## Regularized Linear Regression

Coming soon.

## Support Vector Machines

Coming soon.
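The repository's actual Matlab/Octave files are not reproduced here, so the following is a minimal sketch of the regularized cost and gradient defined above. The function name `linearRegCostFunction` is an assumption (the README only names `trainLinearRegression()` and `learningCurve()`); the formulas match the equations in the Linear regression section.

```matlab
function [J, grad] = linearRegCostFunction(X, y, theta, lambda)
% Regularized linear regression cost and gradient.
% X is m x (n+1) with a leading column of ones; theta is (n+1) x 1.
m = length(y);

h = X * theta;        % hypothesis h_theta(x) for all m examples
errors = h - y;

% Cost: squared-error term plus regularization (theta_0 is not regularized).
J = (1 / (2 * m)) * sum(errors .^ 2) ...
    + (lambda / (2 * m)) * sum(theta(2:end) .^ 2);

% Gradient: the unregularized term for all j, plus (lambda/m)*theta_j for j >= 1.
grad = (1 / m) * (X' * errors);
grad(2:end) = grad(2:end) + (lambda / m) * theta(2:end);
end
```

Calling this with `lambda = 0` yields exactly the training-error definition $J_{\text{train}}(\theta)$, since the regularization term vanishes.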
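Likewise, here is a sketch of the learning-curve computation described above, under the assumption that `trainLinearRegression(X, y, lambda)` returns the fitted θ and that `linearRegCostFunction` is the helper sketched previously; the real signatures in the repository may differ.

```matlab
function [error_train, error_val] = learningCurve(X, y, Xval, yval, lambda)
% Training and cross-validation error for training set sizes 1..m.
m = size(X, 1);
error_train = zeros(m, 1);
error_val   = zeros(m, 1);

for i = 1:m
    % Train on the first i examples only, with the given lambda.
    theta = trainLinearRegression(X(1:i, :), y(1:i), lambda);

    % Evaluate with lambda = 0 so the errors exclude the regularization
    % term: training error on the training subset, cross-validation
    % error on the entire validation set.
    error_train(i) = linearRegCostFunction(X(1:i, :), y(1:i), theta, 0);
    error_val(i)   = linearRegCostFunction(Xval, yval, theta, 0);
end
end
```

Plotting `error_train` and `error_val` against the training set size 1..m then gives the learning curve described in the section above.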