TensorFlow API: tf.set_random_seed and keeping random values fixed with the same seed
Overview
- Op-level seed
- Graph-level seed
- The same session gives different results on two runs
- Case 1: the seed argument is set, but two identical runs give different results
- Case 2: two tensors defined with the same random op and the same seed give the same result
- To make the values match, fetch everything in a single sess.run
- Define a tf.Variable to get a result that is reusable within the same session
Op-level seed
Setting the seed argument of a random op (the op-level seed) makes that op generate the same sequence of random numbers across sessions:
- Example:
import tensorflow as tf

tf.reset_default_graph()
a = tf.random_uniform([1], seed=1)
b = tf.random_normal([1])
# Repeatedly running this block with the same graph will generate the same
# sequence of values for 'a', but different sequences of values for 'b'.
print("Session 1")
with tf.Session() as sess1:
    print(sess1.run(a))  # generates 'A1'
    print(sess1.run(a))  # generates 'A2'
    print(sess1.run(b))  # generates 'B1'
    print(sess1.run(b))  # generates 'B2'
print("Session 2")
with tf.Session() as sess2:
    print(sess2.run(a))  # generates 'A1'
    print(sess2.run(a))  # generates 'A2'
    print(sess2.run(b))  # generates 'B3'
    print(sess2.run(b))  # generates 'B4'
Result: the values of a are the same across the two sessions, while those of b differ.
Session 1
[0.2390374]
[0.22267115]
[0.9374042]
[0.57995176]
Session 2
[0.2390374]
[0.22267115]
[-1.6857139]
[0.6809292]
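For completeness, here is a minimal sketch (my own addition, not part of the original example): giving b its own op-level seed makes its sequence repeat across sessions as well. The seed value 2 below is an arbitrary choice for illustration.

import tensorflow as tf

tf.reset_default_graph()
a = tf.random_uniform([1], seed=1)
b = tf.random_normal([1], seed=2)  # seed value 2 is arbitrary, chosen only for illustration

with tf.Session() as sess1:
    print(sess1.run(a), sess1.run(b))
with tf.Session() as sess2:
    print(sess2.run(a), sess2.run(b))  # prints the same pair of values as sess1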
Graph-level seed
Calling tf.set_random_seed sets a graph-level seed; all random ops defined afterwards generate the same random numbers across sessions.
* Example:
import tensorflow as tf

tf.reset_default_graph()
tf.set_random_seed(1234)
a = tf.random_uniform([1])
b = tf.random_normal([1])
# Repeatedly running this block with the same graph will generate the same
# sequences of 'a' and 'b'.
print("Session 1")
with tf.Session() as sess1:
    print(sess1.run(a))  # generates 'A1'
    print(sess1.run(a))  # generates 'A2'
    print(sess1.run(b))  # generates 'B1'
    print(sess1.run(b))  # generates 'B2'
print("Session 2")
with tf.Session() as sess2:
    print(sess2.run(a))  # generates 'A1'
    print(sess2.run(a))  # generates 'A2'
    print(sess2.run(b))  # generates 'B1'
    print(sess2.run(b))  # generates 'B2'
Result:
Session 1
[0.96046877]
[0.8362156]
[0.4987599]
[0.54880583]
Session 2
[0.96046877]
[0.8362156]
[0.4987599]
[0.54880583]
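One caveat worth adding (my own note, not from the original post): the graph-level seed is read when each random op is constructed, so tf.set_random_seed only affects ops defined after the call. A minimal sketch:

import tensorflow as tf

tf.reset_default_graph()
c = tf.random_uniform([1])   # created before the graph-level seed is set
tf.set_random_seed(1234)
d = tf.random_uniform([1])   # created after the graph-level seed is set

# Re-running this block: 'd' repeats its value across sessions,
# while 'c' generally does not, because it never picked up the graph-level seed.
with tf.Session() as sess:
    print(sess.run(c), sess.run(d))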
The same session gives different results on two runs
Case 1: the seed argument is set, but running the same op twice gives different results:
import tensorflow as tf
tf.reset_default_graph()
embedding1 = tf.random_uniform(seed=1234,minval=0,maxval=10,shape=(5,5))
with tf.Session() as sess:
    print("embedding1:",sess.run(embedding1))
    print("embedding1:",sess.run(embedding1))
Result:
embedding1: [[8.48307 3.2357132 3.067001 0.69699764 9.138565 ]
[1.7047906 2.833712 3.5627055 5.4155626 0.75256824]
[0.7449007 8.65954 5.5905952 0.09341955 0.67013025]
[4.9774456 2.989055 8.094423 6.584983 4.8154235 ]
[1.5170729 5.910175 3.3441043 1.5515935 2.4549198 ]]
embedding1: [[9.550312 0.6379819 1.1600447 9.080754 1.3742483 ]
[6.237757 5.177211 2.1648896 5.644026 5.9398937 ]
[1.3058889 5.9784565 1.05986 5.3199635 4.053446 ]
[4.7531548 2.615583 5.401784 5.989144 3.236668 ]
[8.574158 1.7371535 0.89173794 7.255272 8.292801 ]]
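Each sess.run call evaluates the random op again, so consecutive calls simply walk further along the seeded sequence. As a small sketch (my addition, reusing embedding1 from the block above): fetching the same tensor twice inside a single sess.run call evaluates the op only once, so both fetches return identical arrays.

with tf.Session() as sess:
    v1, v2 = sess.run([embedding1, embedding1])  # one evaluation, two fetches
    print((v1 == v2).all())  # True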
Case 2: two tensors defined with the same random op and the same seed give the same result:
import tensorflow as tf
tf.reset_default_graph()
embedding1 = tf.random_uniform(seed=1234,minval=0,maxval=10,shape=(5,5))
embedding2 = tf.random_uniform(seed=1234,minval=0,maxval=10,shape=(5,5))
sess = tf.Session()
print("embedding1:",sess.run(embedding1))
print("embedding2:",sess.run(embedding2))
Result:
embedding1: [[8.48307 3.2357132 3.067001 0.69699764 9.138565 ]
[1.7047906 2.833712 3.5627055 5.4155626 0.75256824]
[0.7449007 8.65954 5.5905952 0.09341955 0.67013025]
[4.9774456 2.989055 8.094423 6.584983 4.8154235 ]
[1.5170729 5.910175 3.3441043 1.5515935 2.4549198 ]]
embedding2: [[8.48307 3.2357132 3.067001 0.69699764 9.138565 ]
[1.7047906 2.833712 3.5627055 5.4155626 0.75256824]
[0.7449007 8.65954 5.5905952 0.09341955 0.67013025]
[4.9774456 2.989055 8.094423 6.584983 4.8154235 ]
[1.5170729 5.910175 3.3441043 1.5515935 2.4549198 ]]
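By contrast (my own sketch, not in the original post), if the two ops rely only on a graph-level seed and have no explicit seed argument, each op derives a different op-level seed from its position in the graph, so their values differ even though each op is still reproducible across sessions:

import tensorflow as tf

tf.reset_default_graph()
tf.set_random_seed(1234)
e1 = tf.random_uniform(minval=0, maxval=10, shape=(5, 5))
e2 = tf.random_uniform(minval=0, maxval=10, shape=(5, 5))
with tf.Session() as sess:
    v1, v2 = sess.run([e1, e2])
    print((v1 == v2).all())  # False: the two ops get different derived seeds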
To make the values match, fetch everything in a single sess.run call:
import tensorflow as tf
tf.reset_default_graph()
tf.set_random_seed(1234)
embedding1 = tf.random_uniform(minval=0,maxval=10,shape=(5,5))
embedding2 = tf.random_uniform(minval=0,maxval=10,shape=(5,5))
ids = [2,1,0]
some_embedding = tf.nn.embedding_lookup(embedding1,ids=ids)
with tf.Session() as sess:
print("embedding table:{}n result:{}".format(*sess.run([embedding1,some_embedding])))
#print("result:",sess.run(some_embedding))
#print(*sess.run([embedding1,some_embedding]))
Result:
embedding table:[[9.604688 5.811516 6.4159 9.621765 0.5434954 ]
[4.1893444 5.8865128 7.9785547 8.296125 8.388672 ]
[0.41017294 5.350975 4.223858 9.372683 9.035423 ]
[1.5520871 1.4448678 3.6297297 8.929963 5.167904 ]
[1.5287185 6.8655777 8.099522 1.5997577 6.136037 ]]
result:[[0.41017294 5.350975 4.223858 9.372683 9.035423 ]
[4.1893444 5.8865128 7.9785547 8.296125 8.388672 ]
[9.604688 5.811516 6.4159 9.621765 0.5434954 ]]
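An alternative sketch (my addition): evaluate the random op once, keep the resulting NumPy array on the Python side, and index it there, so the table cannot change between uses.

with tf.Session() as sess:
    table = sess.run(embedding1)  # the random op is evaluated exactly once
print(table)       # the fixed embedding table
print(table[ids])  # the same rows that embedding_lookup returned above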
Define it as a tf.Variable to get a result that can be reused within the same session:
import tensorflow as tf

f_tf = tf.Variable(tf.random_normal([1, 3, 1, 1]))
# ...
init_op = tf.global_variables_initializer()
# ...
with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(f_tf))
    print(sess.run(f_tf))
Result:
[[[[1.0966153 ]]
[[0.2081248 ]]
[[0.15507936]]]]
[[[[1.0966153 ]]
[[0.2081248 ]]
[[0.15507936]]]]
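Note that the Variable simply stores the value sampled when the initializer ran; the underlying random op itself is still stateful. A small sketch (my addition) showing that re-running the initializer inside the same session resamples and overwrites the stored value:

with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(f_tf))  # value sampled by the first initialization
    sess.run(init_op)      # re-running the initializer draws new random values
    print(sess.run(f_tf))  # generally different from the first print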
Reason: TensorFlow has several ops that create random tensors with different distributions. The random ops are stateful and create new random values each time they are evaluated.
Reference: https://stackoverflow.com/questions/41637974/running-session-multiple-times-with-tf-random-returns-different-values-for-conv2