Implementing linear regression, softmax regression, and a BP neural network with TensorFlow


As a first step, let's implement a simple linear regression, fitting by least squares with TensorFlow. The code is as follows:

```python
# encoding: utf-8
import tensorflow as tf
import numpy as np

# synthetic data: the true parameters are W = 0.1, b = 0.55
x_data = np.random.rand(100).astype(np.float32)
y_data = x_data * 0.1 + 0.55

# create the TensorFlow structure
Weights = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
biases = tf.Variable(tf.zeros([1]))
y = Weights * x_data + biases

loss = tf.reduce_mean(tf.square(y - y_data))
op = tf.train.GradientDescentOptimizer(0.5)
train = op.minimize(loss)

init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
for i in range(250):
    sess.run(train)
    if i % 20 == 0:
        print(i, sess.run(Weights), sess.run(biases))
```

The output is:

```
('0', array([-0.17141712], dtype=float32), array([ 0.96016634], dtype=float32))
('20', array([-0.00157562], dtype=float32), array([ 0.60436034], dtype=float32))
('40', array([ 0.07296405], dtype=float32), array([ 0.56446886], dtype=float32))
('60', array([ 0.09280396], dtype=float32), array([ 0.55385113], dtype=float32))
('80', array([ 0.09808466], dtype=float32), array([ 0.55102503], dtype=float32))
('100', array([ 0.09949023], dtype=float32), array([ 0.55027282], dtype=float32))
('120', array([ 0.09986429], dtype=float32), array([ 0.55007261], dtype=float32))
('140', array([ 0.09996392], dtype=float32), array([ 0.55001932], dtype=float32))
('160', array([ 0.09999041], dtype=float32), array([ 0.55000514], dtype=float32))
('180', array([ 0.09999743], dtype=float32), array([ 0.55000138], dtype=float32))
('200', array([ 0.09999929], dtype=float32), array([ 0.55000037], dtype=float32))
('220', array([ 0.0999998], dtype=float32), array([ 0.55000013], dtype=float32))
('240', array([ 0.09999982], dtype=float32), array([ 0.55000013], dtype=float32))
```
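As a sanity check on the gradient-descent result, the same fit can be computed in closed form from the normal equations. This is an illustrative NumPy-only sketch (independent of TensorFlow); since the synthetic data has no noise, least squares recovers the true parameters exactly:

```python
import numpy as np

np.random.seed(0)
x = np.random.rand(100).astype(np.float32)
y = x * 0.1 + 0.55  # same synthetic data: true W = 0.1, b = 0.55

# Least squares: stack a column of ones for the bias and solve A @ [W, b] = y.
A = np.stack([x, np.ones_like(x)], axis=1)
(W, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(W, b)  # recovers W ≈ 0.1, b ≈ 0.55
```

Gradient descent should converge to the same values, which is exactly what the printed trace above shows.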

The softmax regression case (MNIST classification) is as follows:

```python
from tensorflow.examples.tutorials.mnist import input_data
import tensorflow as tf

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

x = tf.placeholder("float", [None, 784])
w = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, w) + b)    # model prediction
y_ = tf.placeholder("float", [None, 10])  # ground-truth labels

cross_entropy = -tf.reduce_sum(y_ * tf.log(y))
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
for i in range(10000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
    if i % 1000 == 0:
        # Evaluate the model. tf.argmax gives the index of the largest value of a
        # tensor along a dimension; since the labels are one-hot vectors, that
        # index is the position of the 1.
        correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
        # tf.equal returns booleans, so tf.cast converts them to floats and
        # tf.reduce_mean averages them: the fraction of correct predictions.
        accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))
        # Run accuracy in the session on the MNIST test set.
        print("accuracy at step " + str(i) + ":",
              sess.run(accuracy, feed_dict={x: mnist.test.images, y_: mnist.test.labels}))
```

The output is:

```
accuracy at step 0: 0.4124
accuracy at step 1000: 0.917
accuracy at step 2000: 0.9198
accuracy at step 3000: 0.9206
accuracy at step 4000: 0.9228
accuracy at step 5000: 0.9198
accuracy at step 6000: 0.9119
accuracy at step 7000: 0.9199
accuracy at step 8000: 0.9232
accuracy at step 9000: 0.9218
```
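The two operations at the heart of this model, `tf.nn.softmax` and the cross-entropy loss, are easy to reproduce by hand. Here is a minimal NumPy sketch (an illustrative re-implementation, not TensorFlow's internal code) showing both, including the max-subtraction trick that keeps the exponentials numerically stable:

```python
import numpy as np

def softmax(z):
    # subtract the row max for numerical stability, then exponentiate and normalize
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(y_pred, y_true):
    # -sum(y_ * log(y)), matching the loss built in the TensorFlow graph above
    return -np.sum(y_true * np.log(y_pred))

logits = np.array([[2.0, 1.0, 0.1]])      # one sample, three classes
probs = softmax(logits)
print(probs)                              # each row sums to 1
labels = np.array([[1.0, 0.0, 0.0]])      # one-hot label for class 0
print(cross_entropy(probs, labels))       # small when the right class gets high probability
```

Because the labels are one-hot, `np.argmax` on a row of `probs` plays the same role as `tf.argmax(y, 1)` in the accuracy computation.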

BP (backpropagation) neural network:

```python
import tensorflow as tf
import numpy as np

def add_layer(inputs, in_size, out_size, activation_function=None):
    Weights = tf.Variable(tf.random_normal([in_size, out_size]))
    biases = tf.Variable(tf.zeros([1, out_size]) + 0.1)
    Wx_plus_b = tf.matmul(inputs, Weights) + biases
    if activation_function is None:
        outputs = Wx_plus_b
    else:
        outputs = activation_function(Wx_plus_b)
    return outputs

x_data = np.linspace(-1, 1, 300)[:, np.newaxis]  # shape 300x1: a single input neuron
noise = np.random.normal(0, 0.05, x_data.shape)
y_data = np.square(x_data) - 0.5 + noise

xs = tf.placeholder(tf.float32, [None, 1])
ys = tf.placeholder(tf.float32, [None, 1])

# define the hidden layer, with 10 neurons
l1 = add_layer(xs, 1, 10, activation_function=tf.nn.relu)
# define the output layer, with no activation function
prediction = add_layer(l1, 10, 1, activation_function=None)

loss = tf.reduce_mean(tf.reduce_sum(tf.square(ys - prediction), reduction_indices=[1]))
train_step = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    for i in range(3000):
        sess.run(train_step, feed_dict={xs: x_data, ys: y_data})
        if i % 100 == 0:
            print(sess.run(loss, feed_dict={xs: x_data, ys: y_data}))
```

The output:

```
0.898771
0.00805174
0.00672437
0.00613227
0.00571695
0.00533702
0.00494347
0.00450889
0.00418779
0.00394219
0.00373838
```
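TensorFlow computes the gradients for `GradientDescentOptimizer` automatically, which hides what "BP" actually does. For intuition, here is a minimal NumPy sketch of the same 1-10-1 ReLU network trained with hand-written backpropagation (an illustrative re-implementation under the same shapes and learning rate; noise is omitted so the loss check is clean):

```python
import numpy as np

np.random.seed(1)
x = np.linspace(-1, 1, 300)[:, np.newaxis]
y = np.square(x) - 0.5  # target function, without noise

# 1 -> 10 -> 1 network, matching the add_layer shapes above
W1 = np.random.randn(1, 10); b1 = np.zeros((1, 10)) + 0.1
W2 = np.random.randn(10, 1); b2 = np.zeros((1, 1)) + 0.1
lr = 0.1

for step in range(3000):
    # forward pass
    z1 = x @ W1 + b1
    h = np.maximum(z1, 0)                    # hidden layer, ReLU
    pred = h @ W2 + b2                       # linear output layer
    loss = np.mean(np.sum((y - pred) ** 2, axis=1))

    # backward pass (chain rule)
    d_pred = 2 * (pred - y) / x.shape[0]
    dW2 = h.T @ d_pred; db2 = d_pred.sum(axis=0, keepdims=True)
    d_h = d_pred @ W2.T
    d_h[z1 <= 0] = 0                         # ReLU gradient mask
    dW1 = x.T @ d_h; db1 = d_h.sum(axis=0, keepdims=True)

    # gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(loss)  # drops from ~1 to well under the initial loss, as in the trace above
```

Each step is exactly what `train_step` does under the hood: a forward pass, gradients by the chain rule, then a parameter update of `-lr * gradient`.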
Reprinted content; original source: https://www.6miu.com/read-42622.html
