Machine Learning -- (Week 4) One-vs-All Classification and Neural Network Prediction


One-vs-All Classification

Built-in functions

Randomly sampling examples

randperm(n)

Returns a row vector containing a random permutation of 1:n.
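For example, a quick illustration of typical randperm usage (the concrete permutation differs on every call):

p = randperm(5);               % e.g. p = [3 1 5 2 4]; some permutation of 1:5
rows = randperm(size(X, 1));   % random row order for a data matrix X
X_shuffled = X(rows, :);       % shuffle the training examples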

fmincg

Minimizes a function using a nonlinear conjugate gradient algorithm. Example usage below; here (y == c) converts the multi-class label vector into a binary 0/1 vector for class c:

[theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);

User-defined functions

Cost function

lrCostFunction() 

function [J, grad] = lrCostFunction(theta, X, y, lambda)

m = length(y);

J = 0;
grad = zeros(size(theta));

% regularized logistic regression cost; the bias parameter theta(1) is not regularized
J = 1./m * (-y'*log(sigmoid(X*theta)) - (1-y')*log(1-sigmoid(X*theta)));
J = J + lambda/(2*m) * (sum(theta.^2) - theta(1).^2);

% gradient, with regularization applied to all parameters except the bias term
grad = 1./m * X' * (sigmoid(X*theta) - y);
grad = grad + lambda/m * theta;
grad(1) = grad(1) - lambda/m*theta(1);

grad = grad(:);

end
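In equation form, the code above computes the regularized logistic regression cost and gradient, with h_\theta(x) = sigmoid(\theta^T x) and the bias parameter \theta_0 excluded from regularization:

$$ J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\Big[-y^{(i)}\log h_\theta(x^{(i)}) - (1-y^{(i)})\log\big(1-h_\theta(x^{(i)})\big)\Big] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2 $$

$$ \frac{\partial J}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_0^{(i)}, \qquad \frac{\partial J}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\big(h_\theta(x^{(i)}) - y^{(i)}\big)x_j^{(i)} + \frac{\lambda}{m}\theta_j \;\; (j \ge 1) $$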

One-vs-all training function (the cost is minimized with fmincg for each class)

function [all_theta] = oneVsAll(X, y, num_labels, lambda)

m = size(X, 1);

n = size(X, 2);

all_theta = zeros(num_labels, n + 1);

% add a column of ones (bias term) to the data matrix, so it matches the n+1 parameters
X = [ones(m, 1) X];

initial_theta = zeros(n+1, 1);
options = optimset('GradObj', 'on', 'MaxIter', 50);

% train one regularized logistic regression classifier per class;
% (y == c) is the binary label vector for class c
for c = 1:num_labels
    [theta] = fmincg(@(t)(lrCostFunction(t, X, (y == c), lambda)), initial_theta, options);
    all_theta(c,:) = theta';
end

end

The main workflow consists of steps 1, 2, 5, and 6 below.

Steps 3 and 4 are there to check that the hand-written functions behave as intended: step 3 visualizes a sample of the data, and step 4 compares the cost function's output against a known test case.

1. Set parameters

input_layer_size  = 400;  % 20x20 Input Images of Digits
num_labels = 10;          % 10 labels, from 1 to 10

2. Load the data

load('ex3data1.mat'); % training data stored in arrays X, y

m = size(X, 1);
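Assuming the standard course dataset in ex3data1.mat (an assumption, not shown in the listing itself), the loaded variables have the following shapes:

size(X)   % 5000 x 400: 5000 examples, each a 20x20 grayscale image unrolled into a row
size(y)   % 5000 x 1:   labels in 1..10, with the digit "0" mapped to label 10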

3. Randomly select 100 examples and display them

rand_indices = randperm(m);
sel = X(rand_indices(1:100), :);

displayData(sel);

4. Test the cost function lrCostFunction() against a known test case

% Test case for lrCostFunction
fprintf('\nTesting lrCostFunction() with regularization');

theta_t = [-2; -1; 1; 2];
X_t = [ones(5,1) reshape(1:15,5,3)/10];
y_t = ([1;0;1;0;1] >= 0.5);
lambda_t = 3;
[J grad] = lrCostFunction(theta_t, X_t, y_t, lambda_t);

5. Train the one-vs-all classifiers (compute the parameter matrix all_theta)

lambda = 0.1;
[all_theta] = oneVsAll(X, y, num_labels, lambda);

6. Compare the predictions with the true labels to obtain the classifier's accuracy on the training set

pred = predictOneVsAll(all_theta, X);

fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);
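predictOneVsAll is called here but was not listed above. A minimal sketch of what it might look like, assuming all_theta is the num_labels x (n+1) matrix returned by oneVsAll; each example is assigned the class whose classifier produces the largest sigmoid output:

function p = predictOneVsAll(all_theta, X)
m = size(X, 1);
p = zeros(m, 1);

X = [ones(m, 1) X];                  % add the bias column, matching oneVsAll

scores = sigmoid(X * all_theta');    % m x num_labels matrix of class scores
[maxval, p] = max(scores, [], 2);    % pick the class with the highest score
end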

Prediction with a Neural Network

Built-in functions

[c, p] = max(a3, [], 2);

max(A, [], 2) takes the maximum along each row of A, returning the maxima in c and the column index of each maximum in p. For a vector:

[x, ix] = max([1, 3, 5, 2, 5])
⇒ x = 5, ix = 3
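Applied along the second dimension of a matrix, as in the prediction function below, max picks the largest entry of each row; a small made-up example:

A = [0.1 0.7 0.2;
     0.9 0.05 0.05];
[c, p] = max(A, [], 2)
% c = [0.7; 0.9]   largest value in each row
% p = [2; 1]       column index of that value, i.e. the predicted class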

User-defined functions

Prediction function

function p = predict(Theta1, Theta2, X)

m = size(X, 1);

num_labels = size(Theta2, 1);

p = zeros(size(X, 1), 1);

X = [ones(m,1) X];               % add the bias unit to the input layer

a2 = sigmoid(X*Theta1');         % hidden layer activations
a2 = [ones(size(a2,1),1) a2];    % add the bias unit to the hidden layer

a3 = sigmoid(a2*Theta2');        % output layer activations, one column per class

[c, p] = max(a3, [], 2);         % predicted label = index of the largest output

end
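In matrix form, the function performs a single forward propagation pass; with g the sigmoid function and a bias unit prepended to each layer's activations:

$$ a^{(1)} = x, \qquad a^{(2)} = g\big(\Theta^{(1)} a^{(1)}\big), \qquad a^{(3)} = g\big(\Theta^{(2)} a^{(2)}\big), \qquad p = \arg\max_{k} a^{(3)}_k $$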

1. Set parameters

input_layer_size  = 400;  % 20x20 Input Images of Digits
hidden_layer_size = 25;   % 25 hidden units
num_labels = 10;          % 10 labels, from 1 to 10
                          % (note that we have mapped "0" to label 10)

2. Load the data and display a random subset

fprintf('Loading and Visualizing Data ...\n')

load('ex3data1.mat');
m = size(X, 1);

% Randomly select 100 data points to display
sel = randperm(size(X, 1));
sel = sel(1:100);

displayData(X(sel, :));

fprintf('Program paused. Press enter to continue.\n');

3. Load the neural network weights: Theta1 (input layer to hidden layer) and Theta2 (hidden layer to output layer)

% Load the weights into variables Theta1 and Theta2
load('ex3weights.mat');
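As a quick sanity check (assuming ex3weights.mat uses the standard layer-by-layer layout from the exercise), the weight matrices should match the layer sizes set in step 1:

size(Theta1)   % expected: 25 x 401 = hidden_layer_size x (input_layer_size + 1)
size(Theta2)   % expected: 10 x 26  = num_labels x (hidden_layer_size + 1)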

4. Predict

pred = predict(Theta1, Theta2, X);
fprintf('\nTraining Set Accuracy: %f\n', mean(double(pred == y)) * 100);

5. Display one handwritten digit image at a time, in random order, and print the network's prediction for it

rp = randperm(m);

for i = 1:m
    % Display
    fprintf('\nDisplaying Example Image\n');
    displayData(X(rp(i), :));

    pred = predict(Theta1, Theta2, X(rp(i),:));
    fprintf('\nNeural Network Prediction: %d (digit %d)\n', pred, mod(pred, 10));

    % Pause with quit option
    s = input('Paused - press enter to continue, q to exit:','s');
    if s == 'q'
      break
    end
end

