[Andrew Ng Machine Learning] Week 3 Programming Assignment: Logistic Regression
2022/1/28 1:04:32
This post works through the Week 3 programming assignment of Andrew Ng's Machine Learning course: logistic regression. It should be a useful reference if you are solving the exercise yourself, so let's walk through it together.
1. Visualizing the Data
plotData.m
function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be a Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%

% Find Indices of Positive and Negative Examples
pos = find(y == 1);
neg = find(y == 0);

plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

% =========================================================================

hold off;

end
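Here find(y == 1) returns the row indices where y equals 1, and indexing X with those rows selects the positive examples. A tiny sketch of the same pattern on made-up data:

y = [1; 0; 1];
X = [10 20; 30 40; 50 60];
pos = find(y == 1);   % pos = [1; 3]
X(pos, :)             % rows for the positive examples: [10 20; 50 60]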
ex2.m
%% Machine Learning Online Class - Exercise 2: Logistic Regression
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the logistic
%  regression exercise. You will need to complete the following functions
%  in this exercise:
%
%     sigmoid.m
%     costFunction.m
%     predict.m
%     costFunctionReg.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%

%% Initialization
clear; close all; clc

%% Load Data
%  The first two columns contain the exam scores and the third column
%  contains the label.

data = load('ex2data1.txt');
X = data(:, [1, 2]); y = data(:, 3);

%% ==================== Part 1: Plotting ====================
%  We start the exercise by first plotting the data to understand
%  the problem we are working with.

fprintf(['Plotting data with + indicating (y = 1) examples and o ' ...
         'indicating (y = 0) examples.\n']);

plotData(X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
%pause;
2. Computing the Cost Function and Gradient
costFunction.m
function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
% J = 0;
% grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

h = sigmoid(X * theta);        % hypothesis for all m examples at once
y1 = -y' * log(h);             % sums -y(i) * log(h(i)) over all examples
y2 = (1 - y)' * log(1 - h);    % sums (1 - y(i)) * log(1 - h(i))
J = 1/m * (y1 - y2);           % vectorized cross-entropy cost
grad = 1/m * X' * (h - y);     % vectorized gradient, same size as theta

% =============================================================

end
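For reference, the vectorized lines above implement the logistic regression cost and gradient from the lectures, with $h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}}$:

$$J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \big(1 - y^{(i)}\big) \log\big(1 - h_\theta(x^{(i)})\big) \right]$$

$$\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$$

costFunction.m also depends on sigmoid.m, which this post does not list; a minimal vectorized sketch following the exercise skeleton:

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) returns the sigmoid of z, applied element-wise, so z
%   may be a scalar, a vector, or a matrix such as X*theta.

g = 1 ./ (1 + exp(-z));

end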
Note: on scalars, ./ and / (and likewise .* and *) behave the same, which is why the scalar factor 1/m in costFunction.m is fine with plain /. On matrices the distinction matters: A ./ X divides corresponding elements of A and X; A / X is right matrix division (for invertible X, equivalent to A * inv(X)); A .* X multiplies corresponding elements of A and X; A * X performs ordinary matrix multiplication. A small example follows.
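A minimal sketch of the four operators (values chosen so that X is invertible):

A = [1 2; 3 4];
X = [2 4; 4 2];

A .* X   % element-wise product:   [ 2  8; 12  8]
A ./ X   % element-wise division:  [0.50 0.50; 0.75 2.00]
A * X    % matrix product:         [10  8; 22 20]
A / X    % right division, equivalent to A * inv(X)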
ex2.m
%% ============ Part 2: Compute Cost and Gradient ============
%  In this part of the exercise, you will implement the cost and gradient
%  for logistic regression. You need to complete the code in
%  costFunction.m

%  Setup the data matrix appropriately, and add ones for the intercept term
[m, n] = size(X);

% Add intercept term to x and X_test
X = [ones(m, 1) X];

% Initialize fitting parameters
initial_theta = zeros(n + 1, 1);

% Compute and display initial cost and gradient
[cost, grad] = costFunction(initial_theta, X, y);

fprintf('Cost at initial theta (zeros): %f\n', cost);
fprintf('Expected cost (approx): 0.693\n');
fprintf('Gradient at initial theta (zeros): \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n -0.1000\n -12.0092\n -11.2628\n');

% Compute and display cost and gradient with non-zero theta
test_theta = [-24; 0.2; 0.2];
[cost, grad] = costFunction(test_theta, X, y);

fprintf('\nCost at test theta: %f\n', cost);
fprintf('Expected cost (approx): 0.218\n');
fprintf('Gradient at test theta: \n');
fprintf(' %f \n', grad);
fprintf('Expected gradients (approx):\n 0.043\n 2.566\n 2.647\n');

fprintf('\nProgram paused. Press enter to continue.\n');
%pause;
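A quick sanity check on the expected initial cost: with initial_theta all zeros, X*theta is the zero vector, so h = sigmoid(0) = 0.5 for every example, and the cost collapses to

$$J(\mathbf{0}) = -\log(0.5) = \log 2 \approx 0.693$$

independent of the data, which matches the printed expected value.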
3. Advanced Optimization with fminunc
ex2.m
%% ============= Part 3: Optimizing using fminunc =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

% Print theta to screen
fprintf('Cost at theta found by fminunc: %f\n', cost);
fprintf('Expected cost (approx): 0.203\n');
fprintf('theta: \n');
fprintf(' %f \n', theta);
fprintf('Expected theta (approx):\n');
fprintf(' -25.161\n 0.206\n 0.201\n');

% Plot Boundary
plotDecisionBoundary(theta, X, y);

% Put some labels
hold on;
% Labels and Legend
xlabel('Exam 1 score')
ylabel('Exam 2 score')

% Specified in plot order
legend('Admitted', 'Not admitted')
hold off;

fprintf('\nProgram paused. Press enter to continue.\n');
%pause;
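Two details here are worth unpacking. optimset('GradObj', 'on', ...) tells fminunc that the objective function returns the gradient as its second output, so it uses our analytic gradient from costFunction instead of approximating it numerically. And @(t)(costFunction(t, X, y)) builds an anonymous function handle: fminunc only ever supplies the parameter vector t, while X and y are captured from the current workspace. A minimal sketch of the same pattern:

f = @(t) costFunction(t, X, y);   % handle capturing X and y from the workspace
[J0, g0] = f(initial_theta);      % identical to costFunction(initial_theta, X, y)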
Further reading:
- fminunc, MATLAB's solver for minimizing linear and nonlinear functions
- optimset
- Function handles (@) in MATLAB
That wraps up the Week 3 programming assignment on logistic regression. Hopefully it serves as a useful reference for working through the exercise.