Точка Синхронізації

AI Archive of Human History

🌐 Entity

Batch normalization

A method for improving the training of artificial neural networks

📊 Rating

1 news mention · 👍 0 likes · 👎 0 dislikes

📌 Topics

  • Artificial Intelligence (1)
  • Machine Learning (1)
  • Reinforcement Learning (1)

🏷️ Keywords

PPO (1) · Batch Normalization (1) · Reinforcement Learning (1) · arXiv (1) · Neural Networks (1) · Policy Optimization (1) · Algorithm Stability (1)

📖 Key Information

In artificial neural networks, batch normalization (also known as batch norm) is a normalization technique that makes training faster and more stable by normalizing the inputs to each layer: re-centering them around zero and re-scaling them to unit variance, followed by a learnable scale and shift. It was introduced by Sergey Ioffe and Christian Szegedy in 2015. Experts still debate why batch normalization works so well.
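The re-centering and re-scaling described above can be sketched as a forward pass in training mode. This is a minimal illustration in NumPy, not any particular library's implementation; the function name, shapes, and the `eps` stabilizer are assumptions for the example.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Minimal batch-normalization forward pass (training mode).

    x:     activations of shape (batch, features)
    gamma: learnable per-feature scale, shape (features,)
    beta:  learnable per-feature shift, shape (features,)
    eps:   small constant to avoid division by zero
    """
    mean = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                       # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)   # re-center around zero, re-scale to unit variance
    return gamma * x_hat + beta               # learnable affine transform restores expressiveness

# Usage: inputs with arbitrary mean/scale come out roughly zero-mean, unit-variance
x = np.random.randn(64, 8) * 3.0 + 5.0
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

At inference time, real implementations replace the per-batch statistics with running averages accumulated during training, so single examples can be normalized consistently.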

📰 Related News (1)

🔗 Entity Intersection Graph

People and organizations frequently mentioned alongside Batch normalization:

🔗 External Links