Vanishing gradient problem. Indicate whether each of the following statements is True or False (1 pt each).
(a) The vanishing gradient problem is more likely to occur in shallow networks than in deep networks.
(b) Activation functions like sigmoid and tanh are more likely to cause vanishing gradients than ReLU.
(c) Both batch normalization and residual connections help mitigate the vanishing gradient problem.
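For reference, a minimal sketch (using NumPy and a hypothetical 20-layer chain of scalar units, not part of the question) of why saturating activations shrink gradients layer by layer while ReLU does not:

import numpy as np

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)   # peaks at 0.25 (at x = 0), so each layer shrinks the gradient

def relu_grad(x):
    return (x > 0).astype(float)   # exactly 1 on active units, so no shrinkage

# Hypothetical pre-activations for a 20-layer chain of scalar units,
# kept positive here so every ReLU unit is active.
rng = np.random.default_rng(0)
pre_acts = rng.uniform(0.1, 2.0, size=20)

# By the chain rule, the gradient reaching the first layer is the product
# of the per-layer activation derivatives.
print("sigmoid chain gradient:", np.prod(sigmoid_grad(pre_acts)))  # <= 0.25**20, vanishingly small
print("ReLU chain gradient:   ", np.prod(relu_grad(pre_acts)))     # exactly 1.0

Because the sigmoid derivative never exceeds 0.25, the product decays geometrically with depth, which is why the problem worsens in deeper networks.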

Complete the code for a program that takes a list of words and prints only the words that start with a vowel (a, e, i, o, u).

Sample input: words =
Output:
apple
orange
umbrella

Code:
words =
for ______________________________________
    if _____________________________________
        _____________________________________________
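One way the blanks could be filled in, as a sketch; the sample word list below is hypothetical, since the question's input line is left blank:

# A possible completion of the skeleton above. The word list is
# hypothetical; it is chosen so the output matches the sample.
words = ["apple", "banana", "orange", "grape", "umbrella"]
for word in words:
    # Check the first letter against the vowels, case-insensitively.
    if word[0].lower() in "aeiou":
        print(word)

Running this prints apple, orange, and umbrella, one per line, matching the sample output.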