::: Research Board :::


Name   신병춘
Subject   MNIST TF training script using mnist_ready
# mnist_tf.py
# One-layer ANN

import numpy as np
import tensorflow as tf  # written against the TensorFlow 1.x API (tf.placeholder, tf.Session)
# The old tutorial loader below was deprecated and later removed from TensorFlow:
# from tensorflow.examples.tutorials.mnist import input_data
# mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

from mnist_ready import load_mnist  # load dataset from mnist_ready.py
mnist = load_mnist(normalize=True, one_hot_label=True)
# split into train/test groups: X_train, t_train, X_test, t_test
X_train, t_train = mnist['train_img'], mnist['train_label']
X_test, t_test = mnist['test_img'], mnist['test_label']

x = tf.placeholder("float", [None, 784])
W = tf.Variable(tf.zeros([784,10]))
b = tf.Variable(tf.zeros([10]))

y = tf.nn.softmax(tf.matmul(x,W) + b)

y_ = tf.placeholder("float", [None,10])
cross_entropy = -tf.reduce_sum(y_ * tf.log(tf.clip_by_value(y, 1e-10, 1.0)))  # clip to avoid log(0)

learning_rate = 0.01
train_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(cross_entropy)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

train_size = X_train.shape[0]
batch_size = 100
N_epochs = 20
itr_per_epoch = int(max(train_size / batch_size, 1))

# build the evaluation ops once, outside the training loop,
# so new nodes are not added to the graph every epoch
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, "float"))

for epoch in range(N_epochs):
    for i in range(itr_per_epoch):
        batch_mask = np.random.choice(train_size, batch_size)
        batch_xs, batch_ys = X_train[batch_mask], t_train[batch_mask]
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})

    acc1 = sess.run(accuracy, feed_dict={x: X_train, y_: t_train})
    acc2 = sess.run(accuracy, feed_dict={x: X_test, y_: t_test})
    print('After epoch {}: accuracy train {:.4f}, test {:.4f}'
          .format(epoch, acc1, acc2))
    if (acc1 + acc2) / 2 > 0.97:
        break
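What the script above asks TensorFlow to do can be checked with a small NumPy-only sketch: the same softmax, the same summed cross-entropy, and one manual gradient-descent step (for softmax + cross-entropy the gradient of the summed loss with respect to the logits is simply y - t). The 5-sample random batch, the seed, and the learning rate here are illustrative assumptions, not values taken from the script.

```python
import numpy as np

def softmax(z):
    # shift by the row max for numerical stability (softmax is shift-invariant)
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(y, t):
    # same summed form as the TF loss above: -sum(t * log(y))
    return -np.sum(t * np.log(y + 1e-10))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 784))            # 5 fake flattened "images"
t = np.eye(10)[rng.integers(0, 10, 5)]   # one-hot labels
W = np.zeros((784, 10))                  # zero-initialized, as in the script
b = np.zeros(10)

y = softmax(X @ W + b)
loss0 = cross_entropy(y, t)              # uniform prediction: -5*log(0.1)

# one gradient-descent step, by hand
lr = 0.01
W -= lr * X.T @ (y - t)
b -= lr * (y - t).sum(axis=0)

loss1 = cross_entropy(softmax(X @ W + b), t)
print(loss0, loss1)   # loss1 < loss0
```

This is the whole update that GradientDescentOptimizer performs per batch; the TF version only adds automatic differentiation and graph execution on top of it.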

DATE: 2019.07.10 - 15:39

