# tf.train

xiaoxiao · 2021-02-28


## Preface

tf.train belongs to TensorFlow's Training module and is generally used for gradient-based optimization. Since I needed it for my own work, this post explores the basic usage of tf.train.

## Official documentation

### 1. tf.train.GradientDescentOptimizer

Optimizer that implements the gradient descent algorithm.

The constructor builds a new gradient descent optimizer.

Args:

- learning_rate: A Tensor or a floating point value. The learning rate to use.
- use_locking: If True, use locks for update operations.
- name: Optional name prefix for the operations created when applying gradients. Defaults to "GradientDescent".
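As a quick illustration of these constructor arguments, the sketch below (TF 1.x API, written against `tf.compat.v1` so it also runs on a TF 2 install) builds two optimizers and inspects the name prefix; `get_name()` is the TF 1.x accessor for that name:

```python
import tensorflow.compat.v1 as tf

# Default name prefix is "GradientDescent"
opt = tf.train.GradientDescentOptimizer(learning_rate=0.5)
print(opt.get_name())  # GradientDescent

# use_locking and name are both optional keyword arguments
opt_locked = tf.train.GradientDescentOptimizer(0.5, use_locking=True, name="MySGD")
print(opt_locked.get_name())  # MySGD
```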

### 2. tf.train.Optimizer.minimize

Add operations to minimize 'loss' by updating 'var_list'.

Args:

- loss: A Tensor containing the value to minimize.
- global_step: Optional Variable to increment by one after the variables have been updated.
- var_list: Optional list of Variable objects to update to minimize 'loss'. Defaults to the list of variables collected in the graph under the key GraphKeys.TRAINABLE_VARIABLES.
- gate_gradients: How to gate the computation of gradients. Can be GATE_NONE, GATE_OP, or GATE_GRAPH.
- name: Optional name for the returned operation.

Returns: An Operation that updates the variables in 'var_list'. If 'global_step' was not None, that operation also increments global_step.
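The optional global_step and var_list arguments can be sketched as follows (TF 1.x graph-mode API via `tf.compat.v1`; the variable names here are my own). Each run of the returned op updates the listed variables and bumps global_step by one:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

x = np.random.rand(100).astype(np.float32)
y_true = x * 0.1 + 0.3

w = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
loss = tf.reduce_mean(tf.square(w * x + b - y_true))

# global_step is incremented by one each time the returned train op runs
global_step = tf.Variable(0, trainable=False, name="global_step")
train_op = tf.train.GradientDescentOptimizer(0.5).minimize(
    loss, global_step=global_step, var_list=[w, b])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(10):
        sess.run(train_op)
    print(sess.run(global_step))  # 10
```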

## Example

```python
import tensorflow as tf
import numpy as np

# Create data
X_Data = np.random.rand(100).astype(np.float32)
Y_Data = X_Data * 0.1 + 0.3

# Create TF structure: start
# Weight must be wrapped in tf.Variable so the optimizer can update it
Weight = tf.Variable(tf.random_uniform([1], -1.0, 1.0))
biases = tf.Variable(tf.zeros([1]))
y = Weight * X_Data + biases
loss = tf.reduce_mean(tf.square(y - Y_Data))
optimizer = tf.train.GradientDescentOptimizer(0.5)
train = optimizer.minimize(loss)
init = tf.global_variables_initializer()
# Create TF structure: end

sess = tf.Session()
sess.run(init)  # important
for step in range(201):
    sess.run(train)
    if step % 20 == 0:
        print(step, sess.run(Weight), sess.run(biases))
```

## Understanding

The loss argument passed to tf.train.Optimizer.minimize is generally computed with tf.reduce_mean, e.g. the mean of the squared errors over the batch.
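Concretely, tf.reduce_mean(tf.square(y - Y_Data)) is just the mean squared error. The same computation in plain NumPy (sample values below are my own, for illustration):

```python
import numpy as np

y_pred = np.array([0.32, 0.41, 0.58], dtype=np.float32)
y_true = np.array([0.30, 0.40, 0.60], dtype=np.float32)

# Equivalent of tf.reduce_mean(tf.square(y_pred - y_true)):
# square each residual, then average
mse = np.mean(np.square(y_pred - y_true))
print(mse)  # ~0.0003
```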