Wiki Cram


How do L1 and L2 regularization differ?

A. L1 shrinks weights uniformly while L2 ignores small weights during optimization
B. L1 promotes sparsity by setting weights to zero, L2 reduces weights without eliminating them
C. L1 and L2 both eliminate all neurons with small values to improve training performance
D. L1 and L2 are identical in behavior but applied to different layers during backpropagation
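Option B states the standard distinction: for a one-dimensional quadratic loss, the L1 proximal update is soft-thresholding, which sets small weights exactly to zero, while the L2 (ridge) update only rescales weights toward zero. A minimal sketch of those two closed-form updates (the helper names `l1_update` and `l2_update` are illustrative, not from any particular library):

```python
def l1_update(a, lam):
    # Soft-thresholding: minimizer of 0.5*(w - a)**2 + lam*|w|.
    # Any weight with |a| <= lam is set exactly to zero -> sparsity.
    if a > lam:
        return a - lam
    if a < -lam:
        return a + lam
    return 0.0

def l2_update(a, lam):
    # Closed-form minimizer of 0.5*(w - a)**2 + 0.5*lam*w**2.
    # Every weight is shrunk by the same factor, but never reaches zero.
    return a / (1.0 + lam)

weights = [3.0, 0.4, -0.2, -2.5]
lam = 0.5
l1 = [l1_update(w, lam) for w in weights]
l2 = [l2_update(w, lam) for w in weights]
print("L1:", l1)  # the two small weights become exactly 0.0
print("L2:", l2)  # all weights shrink, none are eliminated
```

Running this with the sample weights shows why L1 "promotes sparsity" while L2 "reduces weights without eliminating them": the L1 result contains exact zeros for the two small coefficients, whereas the L2 result keeps every coefficient nonzero.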


Posted on: November 26, 2025 Last updated on: November 26, 2025 Written by: Anonymous Categorized in: Uncategorized
