Part A: Trait Analysis Table

You are given five organisms:
- Brook trout
- Desert spadefoot toad
- Spruce tree
- Red-tailed hawk
- Black bear

Using the list of traits below, create your own summary table that indicates whether each trait is present or absent in each organism. Your table should clearly indicate which organisms share each trait.

Traits to consider: vertebrae, legs, shelled egg, feathers, warm-blooded (endothermy)

Instructions: For each trait, show whether it is present or absent in each organism by selecting from the dropdown in each cell.

Trait          Brook Trout   Spadefoot Toad   Spruce Tree   Red-tailed Hawk   Black Bear
Vertebrae
Legs
Shelled Egg
Feathers
Warm-blooded

Part B: Trait Organization

Below is a simplified phylogenetic tree showing the relationships among five organisms. The branching points are labeled Trait 1–4, but the evolutionary traits that define each branch are missing.

        ┌── Red-tailed Hawk        (Present)
        │
    Trait 4
        │
        └── Black Bear
        │
    Trait 3
        │
        └── Spadefoot Toad
        │
    Trait 2
        │
        └── Brook Trout
        │
    Trait 1
        │
        └── Spruce Tree            (Toward Common Ancestor)

Instructions: Below is a list of five traits, but only four of them correctly describe the evolutionary branching points shown in the tree. Match each branching point (Trait 1–4) with the most appropriate trait from the list. Use each selected trait only once. One trait will not be used.

Trait Options (one trait does not fit in the tree, so choose carefully):
A. Legs
B. Warm-blooded
C. Feathers
D. Vertebrae
E. Shelled egg

Branching Point    Trait Name
Point 1
Point 2
Point 3
Point 4

Multi-layer perceptron. Consider the following neural network defined in PyTorch.

    import torch
    from torch import nn

    class NeuralNetwork(nn.Module):
        def __init__(self):
            super().__init__()
            self.linear_relu_stack = nn.Sequential(
                nn.Linear(20, 100),
                nn.ReLU(),
                nn.Linear(100, 100),
                nn.ReLU(),
                nn.Linear(100, 3),
            )

        def forward(self, x):
            logits = self.linear_relu_stack(x)
            return logits

(a) (2 pts) How many learnable layers does the neural network have? Count only layers that contain trainable parameters.

(b) (2 pts) How many parameters does the neural network have? You may disregard bias/offset terms.
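A way to sanity-check your arithmetic for part (b): each nn.Linear(in_features, out_features) layer contributes a weight matrix with in_features × out_features entries, so the bias-free total is just the sum of those products. A minimal stdlib-only sketch, with the layer sizes taken from the network above:

```python
# (in_features, out_features) for the three nn.Linear layers in the network above
linear_layers = [(20, 100), (100, 100), (100, 3)]

# Each Linear layer's weight matrix has in_features * out_features entries;
# bias terms are disregarded, as the problem allows.
total_weights = sum(n_in * n_out for n_in, n_out in linear_layers)
print(total_weights)
```

Note that the nn.ReLU() modules contribute nothing to the sum: they have no trainable parameters, which is also the key observation for part (a).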

Backpropagation. Indicate whether each of the following statements is True or False (1 pt each).

(a) Backpropagation relies on repeated application of the chain rule to compute gradients.

(b) In backpropagation, gradients are propagated from the input layer to the output layer.

(c) In PyTorch, the purpose of calling optimizer.zero_grad() in a training loop is to retain and accumulate the gradients from the previous backward pass.

(d) The softmax function is differentiable and thus compatible with backpropagation.
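As context for statements (a) and (b), a toy one-variable composition shows how backpropagation repeatedly applies the chain rule, starting at the output and working back toward the input. A minimal pure-Python sketch (a hypothetical example, not the network from the previous problem):

```python
# Toy computation: y = (3x)^2. Backpropagation computes dy/dx via the
# chain rule, moving from the output back toward the input.
x = 2.0

# Forward pass (input -> output)
u = 3.0 * x        # inner function g(x) = 3x
y = u ** 2         # outer function f(u) = u^2

# Backward pass (output -> input): repeated chain rule
dy_dy = 1.0             # seed gradient at the output
dy_du = dy_dy * 2 * u   # d(u^2)/du = 2u
dy_dx = dy_du * 3.0     # d(3x)/dx = 3
print(dy_dx)            # analytically, y = 9x^2 gives dy/dx = 18x
```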

Vanishing gradient problem. Indicate whether each of the following statements is True or False (1 pt each).

(a) The vanishing gradient problem is more likely to occur in shallow networks than in deep networks.

(b) Activation functions like sigmoid and tanh are more likely to cause vanishing gradients than ReLU.

(c) Both batch normalization and residual connections help mitigate the vanishing gradient problem.
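Useful background for statement (b): the derivative of the sigmoid is at most 0.25 (attained at z = 0), so a backpropagated gradient through many sigmoid layers is a product of factors no larger than 0.25 and shrinks geometrically with depth, whereas ReLU's derivative on active units is exactly 1. A small illustrative sketch, using a hypothetical 10-layer chain:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)  # maximum value 0.25, attained at z = 0

depth = 10

# Best case for sigmoid: every layer contributes its maximum factor, 0.25
sigmoid_chain = sigmoid_grad(0.0) ** depth

# ReLU on active units contributes a factor of exactly 1, so the chain does not shrink
relu_chain = 1.0 ** depth

print(sigmoid_chain)  # about 1e-6 even in the best case: the gradient has nearly vanished
print(relu_chain)
```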