Overview
I am just getting started, so this is only a record of working through the CS229 materials (http://cs229.stanford.edu/materials.html). The code I wrote while studying is recorded here; I am skipping the theory and instead referring to: http://blog.csdn.net/yangliuy/article/details/18455525
Contents of linear_grad_ascent.m (despite the name, it performs gradient descent)
function [theta, J_history] = linear_grad_ascent(X, y, theta, alpha, num_iters)
%LINEAR_GRAD_ASCENT Performs batch gradient descent to learn theta
%   theta = LINEAR_GRAD_ASCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha
% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
% Two approaches. The first loops over the training examples explicitly:
% for iter = 1:num_iters
%     % Batch gradient descent
%     grad = 0;
%     for i = 1:m
%         grad = grad + (X(i,:) * theta - y(i)) * X(i, :)';
%     end
%     theta = theta - alpha * 2 * grad;
%     % Save the cost J in every iteration
%     J_history(iter) = computeCost(X, y, theta);
% end
% The second is vectorized, following ng's code:
x = X;
sample_num = m;
for iter = 1:num_iters
    grad = x' * (x * theta - y);
    theta = theta - alpha / sample_num * grad;
    % Save the cost J in every iteration; equivalent to
    % (1/(2*sample_num)) * (x*theta-y)' * (x*theta-y)
    J_history(iter) = computeCost(x, y, theta);
end
end
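The vectorized loop above calls computeCost, whose definition is not included in this post. A minimal sketch, assuming the standard squared-error cost from the ng exercises (the same quantity noted in the comment above), could look like this:
computeCost.m (sketch, assumed implementation)
function J = computeCost(X, y, theta)
%COMPUTECOST Squared-error cost for linear regression (assumed helper, not from the original post)
%   J = COMPUTECOST(X, y, theta) returns the squared-error cost of using
%   theta as the linear regression parameters on the data X, y
m = length(y);                                 % number of training examples
J = (1 / (2 * m)) * sum((X * theta - y) .^ 2); % average of squared residuals, halved
end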
Test code: linear_grad_ascent_test.m
function linear_grad_ascent_test
%LINEAR_GRAD_ASCENT_TEST Loads the ex2 data, fits theta, and plots the
%   fitted line and the cost history
x = load('ex2x.dat');
y = load('ex2y.dat');
plotData(x,y);
x = [ones(size(x,1),1),x];
alpha = 0.0001;
max_iters = 5000;
theta = [0.722254032225002;2.585491252616242]; %randn(2,1);
% andrew ng's code: linear_regress expects X as 2 x m and y as 1 x m
x = x';
y = y';
theta = linear_regress(x, y);
x = x';
% end of andrew ng's code
% my own code (uncomment to use it; the cost plot below needs the J it returns)
% [theta, J] = linear_grad_ascent(x, y, theta, alpha, max_iters);
y1 = x * theta;
hold on;
plot(x(:,2)',y1','b');
hold off;
t = 1:max_iters;
figure;
plot(t(1:20)', J(1:20)', 'b'); % J is only defined when the linear_grad_ascent call above is uncommented
xlabel('Number of iterations');
ylabel('Cost J');
end
function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
% PLOTDATA(x,y) plots the data points and gives the figure axes labels of
% population and profit.
% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
% "figure" and "plot" commands. Set the axes labels using
% the "xlabel" and "ylabel" commands. Assume the
% population and revenue data have been passed in
% as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
% appear as red crosses. Furthermore, you can make the
% markers larger by using plot(..., 'rx', 'MarkerSize', 10);
figure; % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 10); % Plot the data
% ============================================================
end
linear_regress.m (ng's code)
% linear_regress.m
function w_learned = linear_regress(X,y)
% linear regression model
% Plot the original data
epsilon = 0.0001;
max_iters = 5000;
% Use gradient descent to learn a set of parameters w_learned
% initialize w_learned randomly
w_learned = randn(2,1);
% iterate for max_iters # of iterations (could use other convergence
% criteria)
for iteration = 1:max_iters
    grad = 2 * sum(repmat(w_learned' * X - y, size(X, 1), 1) .* X, 2);
    w_learned = w_learned - epsilon * grad;   % step size epsilon = 0.0001
    err = sum((y - w_learned' * X) .^ 2);     % squared error (not used as a stopping criterion here)
end
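As a sanity check (not part of the original post), the gradient-descent result can be compared against the closed-form normal-equation solution on the same data. A minimal sketch, assuming the ex2x.dat/ex2y.dat files from the link below and the column-vector layout used by linear_grad_ascent; alpha and the iteration count are copied from the test above, and with such a small step the two answers will only roughly agree:
normal_equation_check.m (sketch, not from the original post)
function normal_equation_check
% Compare gradient descent against the closed-form normal-equation solution
x = load('ex2x.dat');
y = load('ex2y.dat');
X = [ones(size(x, 1), 1), x];            % add the intercept column
theta_closed = (X' * X) \ (X' * y);      % normal equation: inv(X'*X) * X' * y
[theta_gd, J_hist] = linear_grad_ascent(X, y, zeros(2, 1), 0.0001, 5000);
disp([theta_closed, theta_gd]);          % the two columns should be comparable
end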
Test data link:
http://openclassroom.stanford.edu/MainFolder/courses/DeepLearning/exercises/ex2materials/ex2Data.zip