TensorFlow tf.GradientTape(): watching parameter updates at each epoch

Overview

This is a very simple example: fitting y = x * 3 + 2 from noisy samples, where x is a one-dimensional input and y is a real-valued output.
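
Before the full fitting script, a minimal warm-up sketch (not from the original post) of what tf.GradientTape does: it records the operations executed inside its with block and can then replay them backwards to compute gradients. A tape only tracks trainable tf.Variable objects automatically; a plain tensor has to be watched explicitly:

import tensorflow as tf  # assumes eager execution is active

x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)   # x is not a tf.Variable, so it must be watched explicitly
    y = x * x       # y = x^2 is recorded on the tape
dy_dx = tape.gradient(y, x)  # d(x^2)/dx at x = 3  ->  6.0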

import tensorflow as tf

# TensorFlow 1.x: eager execution must be switched on explicitly
# (in TensorFlow 2.x it is the default and these two calls are unnecessary).
tf.enable_eager_execution()
assert tf.executing_eagerly()

NUM_EXAMPLES = 1000
# Synthetic training data: y = 3x + 2 plus unit-variance Gaussian noise.
training_inputs = tf.random_normal([NUM_EXAMPLES])
noise = tf.random_normal([NUM_EXAMPLES])
training_outputs = training_inputs * 3 + 2 + noise

def prediction(input, weight, bias):
    return input * weight + bias

def loss(weights, biases):
    # Mean squared error over the full training set.
    error = prediction(training_inputs, weights, biases) - training_outputs
    return tf.reduce_mean(tf.square(error))

def grad(weights, biases):
    # Record the forward pass on the tape, then differentiate the loss
    # with respect to both trainable variables.
    with tf.GradientTape() as tape:
        loss_value = loss(weights, biases)
    return tape.gradient(loss_value, [weights, biases])

training_steps = 200
learning_rate = 0.01
W = tf.Variable(0.)
B = tf.Variable(0.)

print("Initial loss: {:.3f}".format(loss(W, B)))
for i in range(training_steps):
    dW, dB = grad(W, B)
    # Plain gradient descent: parameter -= learning_rate * gradient.
    W.assign_sub(dW * learning_rate)
    B.assign_sub(dB * learning_rate)
    if i % 20 == 0:
        print("Loss at step {:03d}: {:.3f}".format(i, loss(W, B)))
print("Final loss: {:.3f}".format(loss(W, B)))
print("W = {}, B = {}".format(W.numpy(), B.numpy()))

Program output

Initial loss: 14.057
Loss at step 000: 13.534
Loss at step 020: 6.526
Loss at step 040: 3.425
Loss at step 060: 2.053
Loss at step 080: 1.446
Loss at step 100: 1.177
Loss at step 120: 1.058
Loss at step 140: 1.005
Loss at step 160: 0.982
Loss at step 180: 0.972
Final loss: 0.967
W = 2.950300931930542, B = 1.959231972694397
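
The learned parameters W ≈ 2.95 and B ≈ 1.96 are close to the true values 3 and 2, and the final loss of about 0.967 is roughly the irreducible floor left by the unit-variance noise.

The script above relies on TensorFlow 1.x APIs (tf.enable_eager_execution, tf.contrib, tf.random_normal) that were removed in TensorFlow 2.x. A minimal sketch of the same loop under TensorFlow 2.x (an assumption, not the original post's code), swapping the manual assign_sub updates for a Keras SGD optimizer:

import tensorflow as tf  # TensorFlow 2.x: eager execution is on by default

NUM_EXAMPLES = 1000
training_inputs = tf.random.normal([NUM_EXAMPLES])
noise = tf.random.normal([NUM_EXAMPLES])
training_outputs = training_inputs * 3 + 2 + noise

W = tf.Variable(0.)
B = tf.Variable(0.)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

for i in range(200):
    with tf.GradientTape() as tape:
        error = training_inputs * W + B - training_outputs
        loss_value = tf.reduce_mean(tf.square(error))
    grads = tape.gradient(loss_value, [W, B])
    # The optimizer applies the same update that assign_sub did above.
    optimizer.apply_gradients(zip(grads, [W, B]))
    if i % 20 == 0:
        print("Loss at step {:03d}: {:.3f}".format(i, loss_value.numpy()))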
