TensorFlow doubt

Hi,

1) In TensorFlow section 4:
theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name="theta")
In the above code, why are we writing [n + 1, 1] and -1.0, 1.0?

2) In TensorFlow section 3:
for epoch in range(n_epochs):
    if epoch % 10 == 0:
        print("Epoch", epoch, "MSE =", mse.eval())
    x = sess.run(training_op)
best_theta = theta.eval()
What is the significance of each line in this loop?

Please explain the above doubts.

Hi, Queen.

  1. theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0, seed=42), name="theta")

a) Here we create a tensor of shape (n + 1, 1), i.e. n + 1 rows and 1 column (one weight per feature plus one bias term), whose values are drawn uniformly at random from the range -1.0 to 1.0.
b) We pass seed=42 so that the random initial values are reproducible: every run or session of the model starts from the same initialization.
c) The result is stored in a TensorFlow variable named "theta".
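The same idea can be sketched in plain NumPy (this just mirrors what tf.random_uniform does; the value n = 8 is made up for illustration, it is not from the course code):

```python
import numpy as np

n = 8  # number of features (illustrative value only)

# Shape (n + 1, 1): n weights plus one bias term, as a column vector --
# the same shape that [n + 1, 1] requests in tf.random_uniform.
rng = np.random.default_rng(seed=42)  # fixed seed => reproducible draws
theta = rng.uniform(-1.0, 1.0, size=(n + 1, 1))

print(theta.shape)                                  # (9, 1)
print(theta.min() >= -1.0 and theta.max() <= 1.0)   # True

# Re-creating the generator with the same seed gives identical values,
# which is exactly why seed=42 makes runs repeatable:
rng2 = np.random.default_rng(seed=42)
theta2 = rng2.uniform(-1.0, 1.0, size=(n + 1, 1))
print(np.array_equal(theta, theta2))                # True
```

So -1.0 and 1.0 are simply the lower and upper bounds of the uniform distribution, and [n + 1, 1] is the shape of the parameter vector being initialized.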

  2. for epoch in range(n_epochs):
         if epoch % 10 == 0: --> This condition logs the loss only on every 10th epoch (0, 10, 20, 30, ...), so the output stays readable instead of printing on every iteration.

print("Epoch", epoch, "MSE =", mse.eval()) --> This evaluates the mse tensor in the current session and prints the current value of the loss.

x = sess.run(training_op) --> This runs one step of the training operation (one gradient-descent update) in the session. Note that an op like training_op returns no value, so x will be None; the line is usually written as just sess.run(training_op).

best_theta = theta.eval() --> After the loop finishes, this evaluates the theta variable and stores the final trained parameter values in best_theta.
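To make the loop's role concrete, here is a minimal plain-NumPy sketch of the same pattern: gradient descent on an MSE loss, printing every 10th epoch, then keeping the final parameters. The toy data, learning_rate, and n_epochs = 50 are made-up illustrative values, not from the course code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy linear-regression data: m samples, n features, plus a bias column of ones.
m, n = 100, 3
X = np.c_[np.ones((m, 1)), rng.uniform(-1, 1, size=(m, n))]
true_theta = rng.uniform(-1, 1, size=(n + 1, 1))
y = X @ true_theta + 0.01 * rng.standard_normal((m, 1))

theta = rng.uniform(-1.0, 1.0, size=(n + 1, 1))  # random init, as in the course code
learning_rate = 0.1
n_epochs = 50

for epoch in range(n_epochs):
    error = X @ theta - y
    mse = float(np.mean(error ** 2))
    if epoch % 10 == 0:                  # log only every 10th epoch
        print("Epoch", epoch, "MSE =", mse)
    gradients = 2 / m * X.T @ error      # the update that training_op performs internally
    theta = theta - learning_rate * gradients  # one gradient-descent step

best_theta = theta                        # final trained parameters, kept after the loop
```

The structure is identical: the if block is pure logging, the update line is the actual training step repeated n_epochs times, and best_theta is read once at the end.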

All the best!