Andrew Ng Machine Learning - Dimensionality Reduction with PCA


2018-06-25 13:08:17 · 离殇灬孤狼 · Reads: 152 · Category: 吴恩达机器学习
Copyright notice: if you find this useful, please credit the source when reposting: https://blog.csdn.net/wyg1997/article/details/80800514

Exercise download: https://s3.amazonaws.com/spark-public/ml/exercises/on-demand/machine-learning-ex7.zip

Notes:
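The note images from the original post are not reproduced here. As a brief recap (my own summary, not the original notes) of the pipeline the code below implements, in Octave notation: with X mean-normalized,

    Sigma = (1/m) * X' * X;    % covariance matrix, n x n
    [U, S, V] = svd(Sigma);    % columns of U are the principal directions
    Z = X * U(:, 1:K);         % project onto the top K directions
    X_rec = Z * U(:, 1:K)';    % approximate reconstruction

A common rule of thumb is to pick the smallest K with sum(s(1:K))/sum(s) >= 0.99, where s = diag(S), so that 99% of the variance is retained.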


Data visualization (figure in the original post):

Computing the matrices U and S (pca.m):

function [U, S] = pca(X)
%PCA Run principal component analysis on the dataset X
%   [U, S] = pca(X) computes eigenvectors of the covariance matrix of X
%   Returns the eigenvectors U, the eigenvalues (on diagonal) in S
%

% Useful values
[m, n] = size(X);

% You need to return the following variables correctly.
U = zeros(n);
S = zeros(n);

% ====================== YOUR CODE HERE ======================
% Instructions: You should first compute the covariance matrix. Then, you
%               should use the "svd" function to compute the eigenvectors
%               and eigenvalues of the covariance matrix.
%
% Note: When computing the covariance matrix, remember to divide by m (the
%       number of examples).
%

sigma = X'*X./m;         % covariance matrix (X is assumed mean-normalized)
[U, S, ~] = svd(sigma);  % columns of U are the principal directions

% =========================================================================

end
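A minimal usage sketch (the data file ex7data1.mat and the helper featureNormalize come with the exercise download; treat the names as assumptions if your copy differs). PCA expects the features to be mean-normalized first:

    load('ex7data1.mat');                        % loads the example matrix X
    [X_norm, mu, sigma] = featureNormalize(X);   % zero mean, unit variance per feature
    [U, S] = pca(X_norm);                        % U(:, 1) is the top principal direction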

Dimensionality reduction (projectData.m):

function Z = projectData(X, U, K)
%PROJECTDATA Computes the reduced data representation when projecting only
%on to the top k eigenvectors
%   Z = projectData(X, U, K) computes the projection of
%   the normalized inputs X into the reduced dimensional space spanned by
%   the first K columns of U. It returns the projected examples in Z.
%

% You need to return the following variables correctly.
Z = zeros(size(X, 1), K);

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the projection of the data using only the top K
%               eigenvectors in U (first K columns).
%               For the i-th example X(i,:), the projection on to the k-th
%               eigenvector is given as follows:
%                    x = X(i, :)';
%                    projection_k = x' * U(:, k);
%

U_reduce = U(:, 1:K);   % keep only the first K principal directions
Z = X * U_reduce;       % vectorized projection of all examples at once

% =============================================================

end
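Continuing the sketch above, projecting the normalized 2-D data onto a single component (the expected value is the one the exercise script prints; its sign may be flipped depending on svd):

    K = 1;
    Z = projectData(X_norm, U, K);   % each 2-D example becomes a single number
    % The exercise expects Z(1) to be about 1.48.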

Reconstructing the compressed data (where the projected points land) (recoverData.m):

function X_rec = recoverData(Z, U, K)
%RECOVERDATA Recovers an approximation of the original data when using the
%projected data
%   X_rec = RECOVERDATA(Z, U, K) recovers an approximation of the
%   original data that has been reduced to K dimensions. It returns the
%   approximate reconstruction in X_rec.
%

% You need to return the following variables correctly.
X_rec = zeros(size(Z, 1), size(U, 1));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the approximation of the data by projecting back
%               onto the original space using the top K eigenvectors in U.
%
%               For the i-th example Z(i,:), the (approximate)
%               recovered data for dimension j is given as follows:
%                    v = Z(i, :)';
%                    recovered_j = v' * U(j, 1:K)';
%
%               Notice that U(j, 1:K) is a row vector.
%

X_rec = Z * U(:, 1:K)';  % map the K-dimensional points back to n dimensions

% =============================================================

end
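Mapping the 1-D points back to the (normalized) 2-D space, plus a standard variance-retained check (the check is my addition, not part of the exercise script):

    X_rec = recoverData(Z, U, K);    % approximate positions in the original space
    s = diag(S);                     % eigenvalues of the covariance matrix
    retained = sum(s(1:K)) / sum(s); % fraction of variance kept by the top K components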
Result figure (shown in the original post):


Then come the two applications.

The first is compressing face-image data, which can speed up other learning algorithms.
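A hedged sketch of that experiment (ex7faces.mat comes with the exercise download and stores each 32x32 = 1024-pixel face as a row of X):

    load('ex7faces.mat');                        % X: 5000 x 1024 grayscale faces
    [X_norm, mu, sigma] = featureNormalize(X);
    [U, S] = pca(X_norm);
    K = 100;                                     % 1024 -> 100 dimensions, roughly 10x smaller
    Z = projectData(X_norm, U, K);               % compressed faces for a downstream algorithm
    X_rec = recoverData(Z, U, K);                % approximate faces, e.g. for display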

The second is data visualization: compressing 3-D data down to 2-D makes it much easier to view.

As shown in the figures in the original post.
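A hedged sketch of the visualization idea (X3 is a hypothetical name for an m x 3 data matrix; in the exercise it holds sampled pixel colors of an image):

    [X_norm, mu, sigma] = featureNormalize(X3);  % X3: m x 3 (hypothetical name)
    [U, S] = pca(X_norm);
    Z = projectData(X_norm, U, 2);               % keep the top two principal directions
    plot(Z(:, 1), Z(:, 2), 'bo');                % 2-D scatter plot of the 3-D cloud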
