Shapley value
Concept in game theory
📊 Rating
1 news mention · 👍 0 likes · 👎 0 dislikes
📌 Topics
- Artificial Intelligence (1)
- Machine Learning (1)
- Multi-Agent Systems (1)
🏷️ Keywords
SHARP optimization (1) · Large Language Models (1) · Shapley value (1) · Credit assignment (1) · Multi-agent systems (1) · arXiv (1) · AI training (1)
📖 Key Information
In cooperative game theory, the Shapley value is a solution concept for fairly distributing the total gains or costs among a group of players who have collaborated. For example, in a team project where each member contributed differently, the Shapley value provides a principled way to determine how much credit or blame each member deserves. It is named after Lloyd Shapley, who introduced it in 1951 and was awarded the 2012 Nobel Memorial Prize in Economic Sciences in part for this work.
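The definition above can be made concrete: a player's Shapley value is their marginal contribution averaged over all orderings in which the players could join the coalition. The sketch below is a minimal, exact (exponential-time) implementation; the three-player "team project" game and its coalition worths are hypothetical values chosen for illustration, not taken from the source.

```python
from itertools import permutations

def shapley_values(players, v):
    """Exact Shapley values: average each player's marginal
    contribution v(S ∪ {i}) - v(S) over all join orderings."""
    values = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            # Marginal contribution of p joining the current coalition
            values[p] += v(with_p) - v(coalition)
            coalition = with_p
    return {p: total / len(orderings) for p, total in values.items()}

# Hypothetical 3-member team project: the characteristic function
# assigns a worth to every coalition (grand coalition worth 100).
worth = {
    frozenset(): 0,
    frozenset({"A"}): 10, frozenset({"B"}): 20, frozenset({"C"}): 30,
    frozenset({"A", "B"}): 50, frozenset({"A", "C"}): 60,
    frozenset({"B", "C"}): 70, frozenset({"A", "B", "C"}): 100,
}
phi = shapley_values(["A", "B", "C"], lambda s: worth[frozenset(s)])
```

By the efficiency axiom, the three shares sum exactly to the grand coalition's worth of 100; this exhaustive enumeration costs n! orderings, which is why practical uses (e.g. credit assignment in multi-agent systems) typically rely on sampled approximations.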
📰 Related News (1)
🇺🇸 Who Deserves the Reward? SHARP: Shapley Credit-based Optimization for Multi-Agent System
arXiv:2602.08335v1 Announce Type: new Abstract: Integrating Large Language Models (LLMs) with external tools via multi-agent systems offers a promisi...
🔗 Entity Intersection Graph
People and organizations frequently mentioned alongside Shapley value:
- 🌐 Machine learning (1 shared article)
- 🌐 Large language model (1 shared article)