  • tensorflow v1
    Computer Science/Development 2024. 4. 23. 14:49

    Installation

    Development environment: Ubuntu 20.04.6 LTS

     

    1. conda create -n tfv1 python=3.7

    (TensorFlow v1 cannot be installed on Python 3.8 or later)

     

    2. python -m pip install --upgrade pip

     

    3. pip3 install tensorflow==1.15.2

    (this TensorFlow build is the CPU-only version)

     

    4. python3

    >>> import tensorflow as tf

    >>> print(tf.__version__)

    1.15.2

     

    Basic TensorFlow code (v1)

    import numpy as np
    import tensorflow as tf

    N, D, H = 64, 1000, 1000  # batch size, input dimension, hidden nodes
    x = tf.placeholder(tf.float32, shape=(N, D))
    y = tf.placeholder(tf.float32, shape=(N, D))
    init = tf.contrib.layers.xavier_initializer()

    # 2-layer perceptron
    h = tf.layers.dense(inputs=x, units=H, activation=tf.nn.relu, kernel_initializer=init)
    y_pred = tf.layers.dense(inputs=h, units=D, kernel_initializer=init)

    # define the loss, completing the computational graph
    loss = tf.losses.mean_squared_error(labels=y, predictions=y_pred)
    optimizer = tf.train.GradientDescentOptimizer(1e-5)
    updates = optimizer.minimize(loss)

     

    tensorflow์—์„œ tf.layers ๋Š” layer ์ข…๋ฅ˜์— ๋”ฐ๋ผ ์ž๋™์ ์œผ๋กœ weight ๋ฐ bias๋ฅผ ์ƒ์„ฑํ•ด์ค€๋‹ค.

    tf.layers.dense์˜ ๊ฒฝ์šฐ dense layer (fc layer + *activation function)์— ๋Œ€ํ•œ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ์ƒ์„ฑํ•œ๋‹ค.

     

    ์ฒซ๋ฒˆ์งธ ๋ ˆ์ด์–ด: 

     

    inputs=x: the input to the layer (D input nodes)

    units=H: the number of output nodes is H

    activation=tf.nn.relu: applies the ReLU function to the node outputs

    kernel_initializer=init: sets the weight initialization method

     

    ๋‘๋ฒˆ์งธ ๋ ˆ์ด์–ด:

    inputs=h ๋กœ ์ฒซ๋ฒˆ์งธ ๋ ˆ์ด์–ด์˜ ์ถœ๋ ฅ๋…ธ๋“œ๋งŒํผ ์ž…๋ ฅ์œผ๋กœ ๋ฐ›๊ณ ,

    D๊ฐœ์˜ ์ถœ๋ ฅ ๋…ธ๋“œ๋ฅผ ๊ฐ€์ง€๋ฉฐ, ๋ณ„๋„์˜ ํ™œ์„ฑํ™” ํ•จ์ˆ˜๋Š” ์ ์šฉํ•˜์ง€ ์•Š์Œ

     

    TensorFlow v1์€ Pytorch๋‚˜ TensorFlow v2์ฒ˜๋Ÿผ eager execution์„ ์ง€์›ํ•˜์ง€ ์•Š๊ธฐ๋•Œ๋ฌธ์— ์—ฌ๊ธฐ๊นŒ์ง€๋Š” Computational graph๋ฅผ ํ˜•์„ฑํ•˜๊ธฐ ์œ„ํ•œ Construction phase ์˜€๋‹ค. 

     

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        values = {x: np.random.randn(N, D),
                  y: np.random.randn(N, D)}
        losses = []
        for t in range(50):
            loss_val, _ = sess.run([loss, updates],
                                   feed_dict=values)
            losses.append(loss_val)
            print(loss_val)
     

    Wrapping the work in a tf.Session and calling sess.run executes the operations of the computational graph built above.

    The operations to run are passed as arguments; by providing both loss and updates,

    each call performs both the forward pass and the backward pass.
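    For intuition, the same 50-step loop can be replayed eagerly in NumPy with a hand-written backward pass (a sketch under smaller dimensions; the manual gradients stand in for what GradientDescentOptimizer derives from the graph):

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, H = 64, 100, 100   # smaller than the post's 1000s so the sketch runs fast
x = rng.standard_normal((N, D))
y = rng.standard_normal((N, D))
# Xavier-style scaling for the two weight matrices (biases omitted for brevity)
w1 = rng.standard_normal((D, H)) * np.sqrt(2.0 / (D + H))
w2 = rng.standard_normal((H, D)) * np.sqrt(2.0 / (H + D))

lr = 1e-5  # same learning rate as GradientDescentOptimizer(1e-5) above
losses = []
for t in range(50):
    # forward pass (what sess.run evaluates for `loss`)
    a = x @ w1
    h = np.maximum(a, 0.0)          # ReLU
    y_pred = h @ w2
    diff = y_pred - y
    loss = np.mean(diff ** 2)       # MSE, as in tf.losses.mean_squared_error
    losses.append(loss)
    # backward pass and update (what `updates` triggers)
    grad_y_pred = 2.0 * diff / diff.size
    grad_w2 = h.T @ grad_y_pred
    grad_h = grad_y_pred @ w2.T
    grad_a = grad_h * (a > 0.0)
    grad_w1 = x.T @ grad_a
    w1 -= lr * grad_w1
    w2 -= lr * grad_w2
```

    With the tiny learning rate the loss shrinks only gradually, mirroring the printout from the Session loop.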
