MNIST dataset training problem


(I'm sorry, new users can only upload one picture, so I've combined four pictures into one; please click on the picture.)

I am training on the MNIST data. The cross entropy gets smaller and smaller over 500,000 training steps (figure 2).
But the cross entropy changes to NaN after 900,000 steps (only the number of iterations was changed) (figure 4),
and the graph looks very confusing. It's so strange!

The code:

```python
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
from tensorflow.examples.tutorials.mnist import input_data

# Load the data set (this line was missing from the original snippet)
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

x = tf.placeholder(tf.float32, [None, 784])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b
y_ = tf.placeholder(tf.float32, [None, 10])

# Numerically stable built-in version (commented out):
#cross_entropy = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y))
y_softmax = tf.nn.softmax(y)
cross_entropy = -tf.reduce_sum(y_ * tf.log(y_softmax))

# train_step was not shown in the snippet; the tutorial's optimizer is assumed here
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

sess = tf.Session()
init = tf.global_variables_initializer()
sess.run(init)

tf.summary.scalar('Cross_Entropy' , cross_entropy)
merged = tf.summary.merge_all()
writer = tf.summary.FileWriter("logs/", sess.graph)

for i in range(500000):
    batch_xs, batch_ys = mnist.train.next_batch(10)
    # sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
    _, rs = sess.run([train_step, merged], feed_dict={x: batch_xs, y_: batch_ys})
    writer.add_summary(rs, i)  # record the merged summary for TensorBoard
```
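For reference, a likely source of the NaN (my guess; the names and values in this sketch are mine, not taken from the graph above): the hand-rolled loss takes `tf.log` of softmax probabilities, and once a probability underflows to exactly 0, the log is `-inf` and the loss becomes NaN. A minimal NumPy sketch of the failure mode, plus a log-space version that stays finite:

```python
import numpy as np

def softmax(z):
    # Subtract the max so the exponentials themselves cannot overflow
    e = np.exp(z - z.max())
    return e / e.sum()

# With large enough logits, the losing classes underflow to exactly 0 in float32
logits = np.array([200.0, 0.0, 0.0], dtype=np.float32)
p = softmax(logits)                                  # p == [1., 0., 0.]
label = np.array([0.0, 1.0, 0.0], dtype=np.float32)  # one-hot target

# log(0) -> -inf, and 0 * -inf -> nan, so the whole sum becomes NaN
loss = -np.sum(label * np.log(p))
print(loss)  # nan

# Stable version: compute log-softmax directly instead of log(softmax(z)),
# which is what softmax_cross_entropy_with_logits does internally
m = logits.max()
log_p = logits - (m + np.log(np.sum(np.exp(logits - m))))
stable_loss = -np.sum(label * log_p)
print(stable_loss)  # 200.0
```

This is why the commented-out `tf.nn.softmax_cross_entropy_with_logits` line is generally preferred over composing `tf.nn.softmax` with `tf.log` by hand.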