ECG-MoE: Mixture-of-Expert Electrocardiogram Foundation Model


#ECG-MoE #mixture-of-experts #electrocardiogram #foundation model #cardiac diagnostics #AI healthcare #ECG analysis

📌 Key Takeaways

  • ECG-MoE is a new foundation model for electrocardiogram analysis using a mixture-of-experts architecture.
  • The model aims to improve accuracy and efficiency in interpreting ECG data for medical diagnostics.
  • It leverages specialized sub-networks to handle diverse cardiac conditions and signal variations.
  • ECG-MoE represents an advancement in AI-driven healthcare tools for cardiovascular assessment.

📖 Full Retelling

arXiv:2603.04589v1 (new announcement). Abstract: Electrocardiography (ECG) analysis is crucial for cardiac diagnosis, yet existing foundation models often fail to capture the periodicity and diverse features required for varied clinical tasks. We propose ECG-MoE, a hybrid architecture that integrates multi-model temporal features with a cardiac period-aware expert module. Our approach uses a dual-path Mixture-of-Experts to separately model beat-level morphology and rhythm, combined with a hierarchical fusion network using LoRA for efficient inference. Evaluated on five public clinical tasks, ECG-MoE achieves state-of-the-art performance with 40% faster inference than multi-task baselines.
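The dual-path Mixture-of-Experts idea from the abstract can be illustrated with a small sketch. This is not the authors' implementation: the linear "experts", the softmax gating, the morphology/rhythm split, and fusion by concatenation are all simplifying assumptions here; in the actual model the experts would be neural sub-networks and the fusion hierarchical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for the gating weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    """Toy Mixture-of-Experts layer: a gating network produces a weight
    per expert, and the output is the gate-weighted sum of expert outputs.
    Experts are single linear maps here purely for illustration."""
    def __init__(self, dim_in, dim_out, n_experts, seed=0):
        rng = np.random.default_rng(seed)
        self.experts = [rng.standard_normal((dim_in, dim_out)) * 0.1
                        for _ in range(n_experts)]
        self.gate = rng.standard_normal((dim_in, n_experts)) * 0.1

    def __call__(self, x):
        w = softmax(x @ self.gate)                       # (batch, n_experts)
        outs = np.stack([x @ E for E in self.experts],   # (batch, n_experts, dim_out)
                        axis=1)
        return np.einsum("be,bed->bd", w, outs)          # gate-weighted mix

# Two separate paths, loosely mirroring the paper's beat-level morphology
# vs. rhythm modeling, fused here by simple concatenation (an assumption).
morphology_path = MoELayer(16, 8, n_experts=4, seed=1)
rhythm_path = MoELayer(16, 8, n_experts=4, seed=2)

x = np.random.default_rng(3).standard_normal((5, 16))  # 5 fake ECG feature vectors
fused = np.concatenate([morphology_path(x), rhythm_path(x)], axis=-1)
print(fused.shape)  # (5, 16)
```

The gating weights sum to one per input, so each path's output is a convex combination of its experts; routing different signal aspects to specialized experts is the core idea the paper builds on.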

🏷️ Themes

Healthcare AI, Medical Diagnostics

📚 Related People & Topics

Electrocardiography

Examination of the heart's electrical activity

Electrocardiography is the process of using an electrocardiograph (a device) to produce an electrocardiogram (a recording, often called an ECG or EKG) that shows a line graph of the heart's electrical activity through repeated cardiac cycles. It is an electrogram of the heart which is a graph of vol...



Original Source

Computer Science > Artificial Intelligence
arXiv:2603.04589 [cs.AI] (arXiv:2603.04589v1 for this version) — https://doi.org/10.48550/arXiv.2603.04589

Title: ECG-MoE: Mixture-of-Expert Electrocardiogram Foundation Model
Authors: Yuhao Xu, Xiaoda Wang, Yi Wu, Wei Jin, Xiao Hu, Carl Yang
Submission history: [v1] Wed, 4 Mar 2026 20:36:05 UTC (559 KB), from Yuhao Xu

Abstract: Electrocardiography analysis is crucial for cardiac diagnosis, yet existing foundation models often fail to capture the periodicity and diverse features required for varied clinical tasks. We propose ECG-MoE, a hybrid architecture that integrates multi-model temporal features with a cardiac period-aware expert module. Our approach uses a dual-path Mixture-of-Experts to separately model beat-level morphology and rhythm, combined with a hierarchical fusion network using LoRA for efficient inference. Evaluated on five public clinical tasks, ECG-MoE achieves state-of-the-art performance with 40% faster inference than multi-task baselines.

Source

arxiv.org
