Questions Tagged [batch-normalization]

Batch Normalization is a technique to improve learning in neural networks: for each layer, the activations of each feature are standardized across the minibatch to zero mean and unit variance (approximately N(0, 1)), then rescaled and shifted by learnable parameters.
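The per-feature standardization described above can be sketched with a minimal NumPy forward pass (function and parameter names, and the `eps` value, are illustrative assumptions, not a reference implementation):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature of x (shape: batch x features) across the
    minibatch, then apply a learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)                    # per-feature mean over the batch
    var = x.var(axis=0)                      # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)  # standardized to ~N(0, 1)
    return gamma * x_hat + beta

# Example: features with arbitrary mean/scale come out standardized.
x = np.random.randn(32, 4) * 3.0 + 5.0
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

With `gamma=1` and `beta=0`, each column of `y` has mean ≈ 0 and standard deviation ≈ 1; during training, `gamma` and `beta` are learned so the network can undo the normalization where that helps.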

