BravenNow

Rethinking Disentanglement under Dependent Factors of Variation

#Disentanglement #Representation Learning #Factors of Variation #Information Theory #Machine Learning #Independent Factors #Dependent Factors

📌 Key Takeaways

  • Researchers developed a new definition of disentanglement valid for dependent factors of variation
  • The paper connects this definition to the Information Bottleneck Method
  • A novel measurement method for disentanglement with dependent factors is proposed
  • Experiments show the new method works where existing approaches fail

📖 Full Retelling

Computer scientists Antonio Almudévar and Alfonso Ortega published the third revision of their research paper 'Rethinking Disentanglement under Dependent Factors of Variation' on arXiv on February 24, 2026. The work addresses a fundamental limitation in representation learning: definitions of disentanglement, and the metrics used to measure it, conventionally assume that the factors of variation are independent of each other, an assumption the authors note is generally false in the real world. This has restricted the practical use of existing definitions and metrics to very specific and controlled environments.

To address this, the researchers introduce a new definition of disentanglement based on information theory that remains valid even when the factors of variation are dependent. They establish connections between this definition and the Information Bottleneck Method, and they propose a novel way of measuring disentanglement that works with dependent factors.

Through comprehensive experiments, the authors demonstrate that their proposed method accurately measures disentanglement in scenarios with non-independent factors, while existing approaches fail under such realistic conditions.
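To see why dependent factors break independence-based metrics, consider a toy sketch (this is an illustration of the problem, not the authors' proposed metric; the factor and latent names are hypothetical). Two discrete factors are deliberately correlated, each latent dimension cleanly encodes one factor, yet every latent still shares mutual information with both factors simply because the factors share information with each other:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two *dependent* discrete factors of variation: f2 copies f1 70% of the time.
f1 = rng.integers(0, 4, size=n)
f2 = np.where(rng.random(n) < 0.7, f1, rng.integers(0, 4, size=n))

# A toy "representation": latent z0 encodes f1, latent z1 encodes f2.
z0 = f1 + rng.normal(0.0, 0.1, n)
z1 = f2 + rng.normal(0.0, 0.1, n)

def discretize(x, bins=8):
    """Map a continuous latent to integer bins for discrete MI estimation."""
    return np.digitize(x, np.histogram_bin_edges(x, bins=bins)[1:-1])

def mutual_info(x, y):
    """Discrete mutual information I(X; Y) in nats from empirical joint counts."""
    joint = np.zeros((x.max() + 1, y.max() + 1))
    np.add.at(joint, (x, y), 1.0)
    p = joint / joint.sum()
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float(np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz])))

factors = [f1, f2]
latents = [discretize(z0), discretize(z1)]
mi = np.array([[mutual_info(z, f) for f in factors] for z in latents])

# Each latent shares information with BOTH factors purely because the
# factors themselves are dependent; a metric that equates disentanglement
# with zero cross-MI would wrongly penalize this representation.
print(np.round(mi, 3))
```

The off-diagonal entries of the MI matrix are strictly positive even though each latent encodes exactly one factor, which is the failure mode the paper's definition is designed to avoid.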

🏷️ Themes

Machine Learning, Representation Learning, Information Theory

📚 Related People & Topics

Information theory

Scientific study of digital information

Information theory is the mathematical study of the quantification, storage, and communication of a particular type of mathematically defined information. The field was established and formalized by Claude Shannon in the 1940s, though early contributions were made in the 1920s through the works of H...


Machine learning

Study of algorithms that improve automatically through experience

Machine learning (ML) is a field of study in artificial intelligence concerned with the development and study of statistical algorithms that can learn from data and generalize to unseen data, and thus perform tasks without explicit instructions. Within a subdiscipline in machine learning, advances i...


Original Source
Computer Science > Machine Learning
arXiv:2408.07016 [Submitted on 13 Aug 2024 (v1), last revised 24 Feb 2026 (this version, v3)]

Title: Rethinking Disentanglement under Dependent Factors of Variation
Authors: Antonio Almudévar, Alfonso Ortega

Abstract: Representation learning is an approach that allows to discover and extract the factors of variation from the data. Intuitively, a representation is said to be disentangled if it separates the different factors of variation in a way that is understandable to humans. Definitions of disentanglement and metrics to measure it usually assume that the factors of variation are independent of each other. However, this is generally false in the real world, which limits the use of these definitions and metrics to very specific and unrealistic scenarios. In this paper we give a definition of disentanglement based on information theory that is also valid when the factors of variation are not independent. Furthermore, we relate this definition to the Information Bottleneck Method. Finally, we propose a method to measure the degree of disentanglement from the given definition that works when the factors of variation are not independent. We show through different experiments that the method proposed in this paper correctly measures disentanglement with non-independent factors of variation, while other methods fail in this scenario.

Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Machine Learning (stat.ML)
Cite as: arXiv:2408.07016 [cs.LG] (or arXiv:2408.07016v3 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2408.07016

Submission history (from Antonio Almudévar):
[v1] Tue, 13 Aug 2024 16:30:36 UTC (87 KB)
[v2] Thu, 11 Sep 2025 08:55:14 UTC (2,542 KB)
[v3] Tue, 24 Feb 2026 17:19:33 UTC...
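For context on the connection the abstract mentions: the Information Bottleneck Method seeks an encoding Z of an input X that is as compressed as possible while retaining information about a target Y. In its standard formulation (due to Tishby et al.; this is the general objective, not an equation taken from this paper), one minimizes over stochastic encoders:

```latex
\min_{p(z \mid x)} \; I(X; Z) \;-\; \beta \, I(Z; Y)
```

Here $I(\cdot\,;\cdot)$ is mutual information and $\beta > 0$ trades compression of X against preservation of information about Y.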
Read full article at source

Source

arxiv.org
