The above figure illustrates the gradient boosting algorithm for regression.

Step 1. A new decision tree (DT) is trained with feature X and label r (i.e., the residual) to predict the residual.

Step 2. The predicted residual from Step 1 is multiplied by the learning rate and added to the prior predicted Y. The learning rate is between 0 and 1; a small value slows learning to avoid overfitting.

Step 3. The residual is updated by subtracting the new DT's prediction from Step 1, multiplied by the learning rate.

Step 4. The final predicted Y in gradient boosting is the additive sum of the DTs' predictions, each multiplied by the learning rate, over all stages.

Overall, gradient boosting is a sequential algorithm; therefore, its computation may be slower than random forest's. In addition, the new decision tree at each stage is built using information from the prior trees to improve performance.
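The steps above can be sketched in code. This is a minimal illustration, not a production implementation: it assumes a single feature and uses a one-split decision stump in place of a full decision tree, but the residual-fitting loop, the learning-rate update, and the final additive model follow Steps 1 through 4 directly.

```python
import numpy as np

def fit_stump(X, r):
    # A one-split regression "tree": find the threshold on the first
    # feature that minimizes the sum of squared errors of the residual r.
    x = X[:, 0]
    best = None
    for t in np.unique(x)[:-1]:
        left, right = r[x <= t], r[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda Xnew: np.where(Xnew[:, 0] <= t, left_val, right_val)

def gradient_boost(X, y, n_stages=300, lr=0.1):
    base = y.mean()                       # initial prediction F0
    pred = np.full(y.shape[0], base)
    stumps = []
    for _ in range(n_stages):
        r = y - pred                      # current residual
        h = fit_stump(X, r)               # Step 1: fit a DT to the residual
        pred = pred + lr * h(X)           # Step 2: add lr * prediction
        stumps.append(h)                  # (Step 3: residual is recomputed
                                          #  from pred at the top of the loop)
    def predict(Xnew):
        # Step 4: additive function of all DTs, each scaled by lr
        out = np.full(Xnew.shape[0], base)
        for h in stumps:
            out += lr * h(Xnew)
        return out
    return predict
```

Because each stump is fit to the residual left by all previous stages, the stages cannot run in parallel; this is the sequential property that makes gradient boosting slower to train than random forest.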