Modular Programming in TensorFlow

Hi,

A very basic question:

I like the idea of modular programming in TensorFlow: if you reuse a name scope, it automatically appends "_1", "_2", and so on, which keeps the code small and clean. This worked for simple variables and constants, but it fails when using layer functions such as:

# default 3x3 kernel with no skips and ReLU activation
def create_convolution_layer(source, fmaps, kernel_size=3, strides=1,
                             padding="SAME", activation=tf.nn.relu):
    with tf.name_scope("convolution"):
        return tf.layers.conv2d(source, filters=fmaps, kernel_size=kernel_size,
                                strides=strides, padding=padding,
                                activation=activation, name="layer")

Calling the function above a second time simply throws an error stating that a variable inside the layer (layer/kernel) already exists.

Am I doing something wrong?

Note: It works if I remove the inner

name="layer"

part from the code above. Maybe that's the correct way to do it!?

Regards,
Saurabh Singh Parihar

Hi Saurabh,

You are right: the correct way is to create the layer without passing an explicit name. tf.name_scope only uniquifies operation names, not variable names, so the "convolution_1" scope does not protect the layer's variables. tf.layers.conv2d creates its variables under a variable scope taken from its name argument, and that scope is not uniquified; with name="layer", the second call tries to re-create layer/kernel and fails. If you omit the name, TensorFlow generates a unique one for each call (conv2d, conv2d_1, ...). If you actually want to share the weights between calls, wrap the calls in tf.variable_scope with reuse=tf.AUTO_REUSE instead.
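To make the uniquification behaviour concrete, here is a minimal pure-Python sketch of the idea. The NameUniquifier class is hypothetical, not part of TensorFlow; it only mimics how repeated scope names receive "_1", "_2", ... suffixes, which is the mechanism an explicit name="layer" bypasses:

```python
# Hypothetical sketch of TensorFlow's scope-name uniquification
# (not TensorFlow code): the first request for a name keeps the
# bare name, and each later request gets a numeric suffix.
class NameUniquifier:
    def __init__(self):
        self._counts = {}  # name -> number of times it was requested

    def unique(self, name):
        n = self._counts.get(name, 0)
        self._counts[name] = n + 1
        return name if n == 0 else "%s_%d" % (name, n)

scopes = NameUniquifier()
print(scopes.unique("convolution"))  # convolution
print(scopes.unique("convolution"))  # convolution_1
print(scopes.unique("convolution"))  # convolution_2
```

A fixed name, by contrast, is like always returning the bare string: the second use collides, which is exactly what happens to layer/kernel.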


Thanks for clarifying!