Linear Regression example code for H(x) = Wx + b
import tensorflow as tf
# X and Y data
x_train = [1,2,3]
y_train = [1,2,3]
W = tf.Variable(tf.random_normal([1]), name = 'weight')
b = tf.Variable(tf.random_normal([1]), name = 'bias')
# Our hypothesis XW + b
hypothesis = x_train * W + b
# cost/Loss function
cost = tf.reduce_mean(tf.square(hypothesis - y_train))
# Minimize
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(cost)
# Launch the graph in a session.
sess = tf.Session()
# Initializes global variables in the graph.
sess.run(tf.global_variables_initializer())
# Fit the line
for step in range(2001):
    sess.run(train)
    if step % 20 == 0:
        print(step, sess.run(cost), sess.run(W), sess.run(b))
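To see what the TensorFlow graph above is actually computing, here is a plain-Python sketch of the same gradient descent loop (a hand-derived version, not part of the original example): the MSE gradients for W and b are written out from the chain rule, and the same learning rate and step count are used. Starting from W = b = 0 instead of random initialization, W converges toward 1 and b toward 0, just as in the TF run.

```python
# Plain-Python gradient descent for H(x) = W*x + b on the same data.
x_train = [1, 2, 3]
y_train = [1, 2, 3]

W, b = 0.0, 0.0          # fixed initial guesses (the TF code uses random init)
learning_rate = 0.01
n = len(x_train)

for step in range(2001):
    # cost = mean((W*x + b - y)^2); gradients follow from the chain rule
    grad_W = sum(2 * (W * x + b - y) * x for x, y in zip(x_train, y_train)) / n
    grad_b = sum(2 * (W * x + b - y) for x, y in zip(x_train, y_train)) / n
    W -= learning_rate * grad_W
    b -= learning_rate * grad_b

cost = sum((W * x + b - y) ** 2 for x, y in zip(x_train, y_train)) / n
print(W, b, cost)  # W near 1.0, b near 0.0, cost near 0
```

This is exactly the update that optimizer.minimize(cost) performs each time sess.run(train) is called; TensorFlow just derives the gradients automatically instead of by hand.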
-Reference-
https://github.com/hunkim/DeepLearningZeroToAll/