
Tensorflow: Common Usage of tf.get_variable_scope()

We know that a TensorFlow graph, once constructed, cannot be changed during training. Once an input is added to the graph, data flows through a series of parameters, and those parameters act only on data fed through that input interface. If we later get another batch or queue of data, we may want it to flow through the same route as the earlier input. In other words, we want to do something like this:

input_1 = tf.placeholder(...)
out_1 = model(input_1)
input_2 = tf.placeholder(...)
out_2 = model(input_2)

Written naively, this raises an error. Consider the following code:

import tensorflow as tf
import vgg

inputs_1 = tf.random_normal([10, 224, 224, 3])
inputs_2 = tf.random_normal([10, 224, 224, 3])
with tf.variable_scope('vgg_16'):
    net, end_points = vgg.vgg_16(inputs_1, 100, False)
    # tf.get_variable_scope().reuse_variables()
    net_, end_points_ = vgg.vgg_16(inputs_2, 100, False)

with tf.Session() as sess:
    print("no error")

If the above code were correct, it would print no error.

Instead, it fails with:

ValueError: Variable vgg_16/vgg_16/conv1/conv1_1/weights already exists, disallowed. 
Did you mean to set reuse=True or reuse=tf.AUTO_REUSE in VarScope? Originally defined at:

This means the variables already exist and cannot be created a second time. The first call to vgg_16 creates a set of variables bound to the first input; when the second call tries to create variables with the same names for the second input, TensorFlow refuses. The solution is to uncomment that line:

import tensorflow as tf
import vgg

inputs_1 = tf.random_normal([10, 224, 224, 3])
inputs_2 = tf.random_normal([10, 224, 224, 3])
with tf.variable_scope('vgg_16'):
    net, end_points = vgg.vgg_16(inputs_1, 100, False)
    tf.get_variable_scope().reuse_variables()
    net_, end_points_ = vgg.vgg_16(inputs_2, 100, False)

with tf.Session() as sess:
    print("no error")
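The same fix can be seen in a minimal, self-contained sketch that does not need the vgg module. The model and variable names below are my own illustration, and the snippet goes through tf.compat.v1 so it runs under both TF 1.x (1.14+) and TF 2.x:

```python
import tensorflow as tf

# TF 1.x-style API; on TF 2.x it lives under tf.compat.v1
tf1 = tf.compat.v1
tf1.disable_eager_execution()

def model(x):
    # get_variable creates 'w' the first time and returns the
    # existing 'w' when the enclosing scope has reuse=True
    w = tf1.get_variable('w', shape=[3, 1])
    return tf.matmul(x, w)

inputs_1 = tf1.placeholder(tf.float32, [None, 3])
inputs_2 = tf1.placeholder(tf.float32, [None, 3])

with tf1.variable_scope('shared'):
    out_1 = model(inputs_1)                     # creates shared/w
    tf1.get_variable_scope().reuse_variables()  # reuse=True from here on
    out_2 = model(inputs_2)                     # reuses shared/w

# Only one weight variable exists; both inputs share it.
print([v.name for v in tf1.global_variables()])
```

Listing the global variables confirms that the second call created nothing new: a single `shared/w` serves both inputs.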

tf.get_variable_scope().reuse_variables() sets reuse=True on the current variable_scope. As the name suggests, after this call the two inputs can share the same variables.
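As the error message hints, another option is to open the scope with reuse=tf.AUTO_REUSE, which creates a variable when it does not yet exist and reuses it when it does, with no explicit reuse_variables() call. A minimal sketch with my own illustrative names, again via tf.compat.v1 so it runs on TF 1.x (1.14+) and 2.x:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()

def model(x):
    w = tf1.get_variable('w', shape=[3, 1])
    return tf.matmul(x, w)

inputs_1 = tf1.placeholder(tf.float32, [None, 3])
inputs_2 = tf1.placeholder(tf.float32, [None, 3])

# AUTO_REUSE: create the variable on the first call,
# silently reuse it on every later call in this scope
with tf1.variable_scope('shared', reuse=tf1.AUTO_REUSE):
    out_1 = model(inputs_1)
    out_2 = model(inputs_2)

print([v.name for v in tf1.global_variables()])
```

AUTO_REUSE is convenient when the same model function may be called any number of times, but it also hides accidental double-creation, so the explicit reuse_variables() call can be the safer choice when you want the first call to be the only one that creates variables.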