Backpropagation. Indicate whether each of the following statements is True or False (1 pt each).

(a) Backpropagation relies on repeated application of the chain rule to compute gradients.
(b) In backpropagation, gradients are propagated from the input layer to the output layer.
(c) In PyTorch, the purpose of calling optimizer.zero_grad() in a training loop is to retain and accumulate the gradients from the previous backward pass.
(d) The softmax function is differentiable and thus compatible with backpropagation.
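For reference, below is a minimal sketch of a standard PyTorch training loop showing where optimizer.zero_grad() and loss.backward() are typically called. The model, data, and hyperparameters are hypothetical placeholders chosen only for illustration, not part of the exam.

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 2)                               # hypothetical model
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    x = torch.randn(4, 10)                                 # hypothetical batch
    y = torch.tensor([0, 1, 0, 1])

    for step in range(3):
        optimizer.zero_grad()  # clear the .grad buffers left over from the
                               # previous step; otherwise gradients accumulate
        logits = model(x)
        loss = loss_fn(logits, y)
        loss.backward()        # backpropagation: chain rule applied from the
                               # loss backward through the network
        optimizer.step()       # update parameters using the fresh gradients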