Machine Learning Algorithm Notes: Understanding Principal Component Analysis (PCA) in Depth


Reference articles in this series:

https://blog.csdn.net/shizhixin/article/details/51181379

https://blog.csdn.net/shizhixin/article/details/51192031

Related Python code

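The script below walks through PCA on a small 2-D toy data set: center the data on its per-feature mean, compute the covariance matrix, take its eigen-decomposition, project the samples onto the leading eigenvector (Y = P^T * X), reconstruct the points in the original space (X = P * Y), and plot both the original and the reconstructed samples.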
import numpy as np
import matplotlib.pyplot as plt

# Toy data set: 2 features (rows) x 5 samples (columns).
sampleDataSet = np.array([[1, 1, 2, 4, 2],
                          [1, 3, 3, 4, 4]])

# Subtract the per-feature mean so each row is zero-centered.
mean_data = np.mean(sampleDataSet, 1)
move_mean_sample = (sampleDataSet.transpose() - mean_data).transpose()

# Left panel: original samples; right panel: mean-centered samples.
p1 = plt.subplot(121)
p1.plot(sampleDataSet[0, :], sampleDataSet[1, :], '*')
p1.axis([0, 5, 0, 5])
p2 = plt.subplot(122)
p2.plot(move_mean_sample[0, :], move_mean_sample[1, :], '*')
p2.axis([-5, 5, -5, 5])

# Covariance matrix of the centered data (rows are variables).
np_cov = np.cov(move_mean_sample, rowvar=1)
print('sampleDataSet =\n', sampleDataSet)
print('move_mean_sample =\n', move_mean_sample)
print('np_cov =\n', np_cov)

# Eigen-decomposition of the symmetric covariance matrix.
# np.linalg.eigh returns eigenvalues in ascending order, so the last
# column of eigVector is the principal direction.
mat_cov = np.mat(np_cov)
(eigV, eigVector) = np.linalg.eigh(mat_cov)
pca_mat = eigVector[:, -1]

# Project onto the principal direction: Y = P^T * X.
pca_data = pca_mat.T * np.mat(move_mean_sample)

# Reconstruct in the original space and undo the centering: X = P * Y.
recon_data = ((pca_mat * pca_data).transpose() + mean_data).transpose()
p1.plot(recon_data[0, :], recon_data[1, :], 'o')
p1.axis([0, 5, 0, 5])

# Draw the line spanned by the principal direction through the reconstructed points.
k = pca_mat[1, 0] / pca_mat[0, 0]
b = recon_data[1, 0] - k * recon_data[0, 0]
xx = np.array([0, 5])
yy = k * xx + b
p1.plot(xx, yy)

print('eigV =\n', eigV)
print('eigVector =\n', eigVector)
print('pca_data =\n', pca_data)
plt.show()
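For comparison, the same principal direction can also be obtained from a singular value decomposition of the centered data, without forming the covariance matrix at all. The short sketch below is not from the original post; the variable names (X, Xc, scores) are mine, and the projected scores should agree with pca_data above up to the sign of the eigenvector.

import numpy as np

# Same toy data set as above: 2 features (rows) x 5 samples (columns).
# This is an illustrative cross-check, not part of the original script.
X = np.array([[1, 1, 2, 4, 2],
              [1, 3, 3, 4, 4]], dtype=float)

# Center each feature (row) on its mean.
Xc = X - X.mean(axis=1, keepdims=True)

# Thin SVD of the centered data: the columns of U are the principal
# directions, and S**2 / (n_samples - 1) equals the covariance eigenvalues.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

explained_variance = S**2 / (X.shape[1] - 1)
print('principal direction =', U[:, 0])
print('explained variance  =', explained_variance)

# Project onto the first principal direction (equivalent to pca_data above,
# up to the sign of the eigenvector).
scores = U[:, :1].T @ Xc
print('scores =', scores)

The SVD route is numerically preferable when the number of features is large, since it never forms the covariance matrix explicitly.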
