```matlab
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
% theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
% taking num_iters gradient steps with learning rate alpha
% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
for iter = 1:num_iters
% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCostMulti) and gradient here.
%
%Vectorization fundamentals:
%The hypothesis for a single example x is
%  h_theta(x) = sum(theta(j) * x(j)) for j = 0 .. n
%             = theta' * x
%where theta and x are column vectors:
%  theta = [theta(0); theta(1); ...; theta(n)]
%  x     = [x(0); x(1); ...; x(n)]
%
%Gradient descent with multiple variables, update rule (simultaneously for all j):
%  theta(j) := theta(j) - alpha * (1/m) * sum((h_theta(x(i)) - y(i)) * x_j(i)) for i = 1 .. m
%Over all examples at once, X*theta gives the vector of predictions h_theta(x(i)),
%so the whole gradient vector is X' * (X*theta - y) / m.
vectorMultiple = X' * (X*theta - y);
theta = theta - (alpha/m) * (vectorMultiple);
square = (X * theta - y).^2;
J_history(iter) = (1/(2*m))*sum(square);
%read more about vectorization
%https://www.gnu.org/software/octave/doc/interpreter/Basic-Vectorization.html
% ============================================================
% The cost J is already saved above; equivalently:
% J_history(iter) = computeCostMulti(X, y, theta);
end
end
```
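The vectorized update above, `theta := theta - (alpha/m) * X' * (X*theta - y)`, translates directly to NumPy. Below is a minimal sketch on a made-up toy dataset: the matrix `X`, the targets `y`, and the generating parameters `[1, 1, 2]` are all hypothetical values chosen for illustration, not part of the original exercise.

```python
import numpy as np

# Toy data (hypothetical): m = 4 examples, an intercept column of ones
# plus 2 features. y was generated from theta = [1, 1, 2] with no noise.
X = np.array([[1.0, 1.0, 2.0],
              [1.0, 2.0, 1.0],
              [1.0, 3.0, 3.0],
              [1.0, 4.0, 2.0]])
y = np.array([6.0, 5.0, 10.0, 9.0])

def gradient_descent_multi(X, y, theta, alpha, num_iters):
    """Vectorized batch gradient descent, mirroring the Octave code above."""
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        # theta := theta - (alpha/m) * X' * (X*theta - y)
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        # Cost J(theta) = 1/(2m) * sum((X*theta - y).^2)
        J_history[it] = np.sum((X @ theta - y) ** 2) / (2 * m)
    return theta, J_history

theta, J_history = gradient_descent_multi(X, y, np.zeros(3),
                                          alpha=0.1, num_iters=2000)
# theta converges toward the generating parameters [1, 1, 2]
```

On real data the features typically need mean normalization first, so that a single learning rate alpha works well across all of them.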

Find more about me: https://ie.linkedin.com/in/iamabhishekchoudhary

## Wednesday, May 7, 2014

### Gradient Descent with Multiple Features Algorithm
