Review By Artineer
Review  Derivative (power rule): $f = x^n \;\Rightarrow\; \frac{\partial f}{\partial x} = n x^{n-1}$
Review  Derivative (chain rule): $f = a(b(x)) \;\Rightarrow\; \frac{\partial f}{\partial x} = a'(b(x)) \times b'(x)$
Review  Derivative (partial derivatives): $f = 2x + y \;\Rightarrow\; \frac{\partial f}{\partial x} = 2, \quad \frac{\partial f}{\partial y} = 1$
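The three rules above can be sanity-checked numerically with a central-difference approximation. This is a minimal Python sketch; the functions and test points are illustrative, not from the slides:

import math

def numerical_diff(f, x, h=1e-6):
    # Central difference: approximates df/dx at x.
    return (f(x + h) - f(x - h)) / (2 * h)

# Power rule: f(x) = x^3, expect 3 * x^2 = 12 at x = 2.
print(numerical_diff(lambda x: x**3, 2.0))              # ~12.0

# Chain rule: f(x) = sin(x^2), expect cos(x^2) * 2x ~ 1.0806 at x = 1.
print(numerical_diff(lambda x: math.sin(x**2), 1.0))

# Partial derivative: f(x, y) = 2x + y with y held fixed at 5, expect 2.
print(numerical_diff(lambda x: 2*x + 5.0, 3.0))         # ~2.0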
Review  Hypothesis: $y = ax + b$, rewritten with the weight notation as $y = wx + b$
Review  Cost Function: $e = \frac{1}{2 \cdot 10} \sum_{i=1}^{10} \left( y_i - (w x_i + b) \right)^2$
Review wn+1 = wn + 𝛼 𝜕𝑒 𝜕𝑤 bn+1 = bn + 𝛼 𝜕𝑒 𝜕𝑏 Gradient Descent Cost Function wn+1 = wn + 𝛼 𝜕𝑒 𝜕𝑤 bn+1 = bn + 𝛼 𝜕𝑒 𝜕𝑏
Multiple Linear Regression By Artineer
Concept
Simple Linear Regression: a single input $x \rightarrow y$
Multiple Linear Regression: several inputs $x_1, x_2, x_3, \ldots \rightarrow y$

Concept
Simple Linear Regression: $y = wx + b$
Multiple Linear Regression: $y = w_1 x_1 + w_2 x_2 + w_3 x_3 + \cdots + b$
Hypothesis: $Y = w_1 x_1 + w_2 x_2 + w_3 x_3 + \cdots + b$. Writing every term out quickly becomes unwieldy.
Hypothesis: the same expression collapses into matrix form:
$Y = \begin{pmatrix} x_1 & x_2 & x_3 & \cdots \end{pmatrix} \begin{pmatrix} w_1 \\ w_2 \\ w_3 \\ \vdots \end{pmatrix} + b = XW + b$
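A quick NumPy check that the matrix form $XW + b$ matches the written-out sum; the feature and weight values here are made up for illustration:

import numpy as np

X = np.array([1.0, 2.0, 3.0])       # row vector (x1 x2 x3)
W = np.array([0.5, -1.0, 2.0])      # weights (w1 w2 w3)
b = 0.1

expanded = 0.5*1.0 + (-1.0)*2.0 + 2.0*3.0 + b   # w1*x1 + w2*x2 + w3*x3 + b
matrix_form = X @ W + b                          # Y = XW + b
print(expanded, matrix_form)                     # both print 4.6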
Cost Function
Simple Linear Regression: $e = \frac{1}{2} \sum_{i=1}^{10} \left( y_i - (w x_i + b) \right)^2$
Multiple Linear Regression: $e = \frac{1}{2} \sum_{i=1}^{10} \left( y_i - (w_1 x_{1i} + w_2 x_{2i} + w_3 x_{3i} + \cdots + b) \right)^2$
($n$ = the number of inputs $x$, i.e. one weight per input)
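A short sketch of evaluating this cost in NumPy; the data is randomly generated for illustration and is not from the slides:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))                  # 10 samples, 3 features x1..x3
y = X @ np.array([1.0, 2.0, 3.0]) + 0.5       # generated with known weights, b = 0.5

def cost(W, b):
    # e = 1/2 * sum_i (y_i - (w1*x1i + w2*x2i + w3*x3i + b))^2
    residual = y - (X @ W + b)
    return 0.5 * np.sum(residual ** 2)

print(cost(np.array([1.0, 2.0, 3.0]), 0.5))   # 0 at the generating parameters
print(cost(np.zeros(3), 0.0))                 # larger for a poor guess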
Gradient Descent
Simple Linear Regression: $w_{n+1} = w_n - \alpha \frac{\partial e}{\partial w}, \quad b_{n+1} = b_n - \alpha \frac{\partial e}{\partial b}$
Multiple Linear Regression: the same rule, applied to every weight separately: $w_{k,n+1} = w_{k,n} - \alpha \frac{\partial e}{\partial w_k}$ for each input $x_k$, and $b_{n+1} = b_n - \alpha \frac{\partial e}{\partial b}$
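A minimal NumPy sketch of these per-weight updates, reusing the randomly generated data from the cost sketch above; the learning rate and step count are illustrative:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(10, 3))
y = X @ np.array([1.0, 2.0, 3.0]) + 0.5

W = np.zeros(3)
b = 0.0
alpha = 0.02

for step in range(2000):
    residual = y - (X @ W + b)
    dW = -X.T @ residual        # de/dw_k = -sum_i (y_i - pred_i) * x_ki, all k at once
    db = -np.sum(residual)      # de/db
    W -= alpha * dW             # one update per weight
    b -= alpha * db

print(W, b)                     # approaches [1, 2, 3] and 0.5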
Tensorflow

import tensorflow as tf

# Training data: y = 2x, so the fit should approach a = 2, b = 0.
x_data = [1, 2, 3]
y_data = [2, 4, 6]

# Parameters, randomly initialized.
a = tf.Variable(tf.random_normal([1]), name='a')
b = tf.Variable(tf.random_normal([1]), name='b')

# Hypothesis and cost (mean squared error).
hypothesis = a * x_data + b
e = tf.reduce_mean(tf.square(hypothesis - y_data))

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.01)
train = optimizer.minimize(e)

sess = tf.Session()
sess.run(tf.global_variables_initializer())

for step in range(2001):
    sess.run(train)
    if step % 20 == 0:
        print("iteration : ", step, "e : ", sess.run(e),
              " ( y = ", sess.run(a), "x + ", sess.run(b), " )")
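The code above trains the simple model. A possible extension to the multiple-regression hypothesis $Y = XW + b$, sketched with the same TF1 session API; the feature matrix, target values, and hyperparameters below are made-up illustrations, not from the slides:

import tensorflow as tf

# Made-up data: 5 samples, 3 features, generated from W = [1, 2, 3], b = 0.5.
x_data = [[1., 0., 3.], [0., 2., 1.], [3., 1., 0.], [2., 2., 2.], [1., 1., 1.]]
y_data = [[10.5], [7.5], [5.5], [12.5], [6.5]]

X = tf.placeholder(tf.float32, shape=[None, 3])
Y = tf.placeholder(tf.float32, shape=[None, 1])

W = tf.Variable(tf.random_normal([3, 1]), name='W')   # one weight per input
b = tf.Variable(tf.random_normal([1]), name='b')

hypothesis = tf.matmul(X, W) + b                      # Y = XW + b in matrix form
e = tf.reduce_mean(tf.square(hypothesis - Y))

train = tf.train.GradientDescentOptimizer(learning_rate=0.01).minimize(e)

sess = tf.Session()
sess.run(tf.global_variables_initializer())
for step in range(2001):
    sess.run(train, feed_dict={X: x_data, Y: y_data})
    if step % 200 == 0:
        print(step, sess.run(e, feed_dict={X: x_data, Y: y_data}))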