Backpropagation
Optimization algorithm for artificial neural networks
📊 Rating
1 news mention · 👍 0 likes · 👎 0 dislikes
📌 Topics
- Artificial Intelligence (1)
- Machine Learning (1)
- Optimization (1)
🏷️ Keywords
Zeroth-Order Optimization (1) · Deep Neural Networks (1) · HZO (1) · Machine Learning (1) · Backpropagation (1) · arXiv (1) · Algorithm (1)
📖 Key Information
In machine learning, backpropagation is a gradient-computation method commonly used when training a neural network to compute parameter updates.
It is an efficient application of the chain rule to neural networks. Backpropagation computes the gradient of a loss function with respect to the weights of the network for a single input–output example, and does so efficiently, computing the gradient one layer at a time, iterating backward from the last layer to avoid redundant calculations of intermediate terms in the chain rule; this can be derived through dynamic programming.
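The layer-by-layer backward pass described above can be sketched for a tiny one-hidden-unit network. This is a minimal illustration, not a production implementation: the network shape, sigmoid activation, and squared-error loss are assumptions chosen for brevity. Note how the backward pass reuses each upstream derivative (`dz2`, `da1`) instead of re-deriving it, which is the dynamic-programming aspect the text mentions.

```python
import math

def forward(w1, w2, x, y):
    # Layer 1: pre-activation and sigmoid activation
    z1 = w1 * x
    a1 = 1.0 / (1.0 + math.exp(-z1))
    # Layer 2: linear output
    z2 = w2 * a1
    # Squared-error loss for a single input-output example
    loss = 0.5 * (z2 - y) ** 2
    return z1, a1, z2, loss

def backprop(w1, w2, x, y):
    # Forward pass caches the intermediate values the backward pass needs
    z1, a1, z2, loss = forward(w1, w2, x, y)
    # Backward pass: chain rule applied one layer at a time,
    # iterating from the last layer toward the first
    dz2 = z2 - y                 # dL/dz2
    dw2 = dz2 * a1               # dL/dw2
    da1 = dz2 * w2               # dL/da1 (reuses dz2 -- no recomputation)
    dz1 = da1 * a1 * (1.0 - a1)  # chain through sigmoid'(z1) = a1*(1-a1)
    dw1 = dz1 * x                # dL/dw1
    return dw1, dw2, loss
```

A quick sanity check is to compare these analytic gradients against central finite differences of the loss; the two should agree to high precision.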
📰 Related News (1)
- 🇺🇸 Hierarchical Zero-Order Optimization for Deep Neural Networks
arXiv:2602.10607v1 Announce Type: cross Abstract: Zeroth-order (ZO) optimization has long been favored for its biological plausibility and its capaci...
🔗 Entity Intersection Graph
People and organizations frequently mentioned alongside Backpropagation:
- 🌐 Machine learning (1 shared article)
- 🏢 HZO (1 shared article)
- 🌐 Algorithm (1 shared article)