BravenNow

Rational Neural Networks have Expressivity Advantages

#Rational Activation Functions #Neural Networks #Parameter Efficiency #Expressivity #Approximation Theory #Machine Learning #arXiv Research

📌 Key Takeaways

  • Neural networks with trainable rational activation functions are more expressive than networks using modern activations such as ReLU
  • Rational activation functions can reach a given error target with fewer parameters
  • The study establishes approximation-theoretic separations between rational and standard activation functions
  • The results could lead to more efficient neural network architectures

📖 Full Retelling

On February 12, 2026, researchers posted a study to the arXiv preprint server demonstrating that neural networks with trainable low-degree rational activation functions offer significant advantages in expressivity and parameter efficiency over modern activation functions, establishing new theoretical foundations for neural network architecture design.

The study systematically compares rational activations against a comprehensive list of modern alternatives: ELU, LeakyReLU, LogSigmoid, PReLU, ReLU, SELU, CELU, Sigmoid, SiLU, Mish, Softplus, Tanh, Softmin, Softmax, and LogSoftmax. The analysis shows that rational activation functions can reach the same error targets with fewer parameters, potentially enabling more efficient neural network architectures across a range of computational domains.

The researchers establish approximation-theoretic separations, proving that networks built from the standard activation functions require more resources to match the accuracy of networks using rational activations, with implications for future work in artificial intelligence and machine learning.
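The central object of the paper, a trainable low-degree rational activation, can be sketched in a few lines. The degree-(3, 2) Padé form, the coefficient initialization, and the absolute-value trick that keeps the denominator pole-free are our illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

class RationalActivation:
    """Pointwise rational activation r(x) = P(x) / Q(x).

    A minimal sketch using a degree-(3, 2) Pade form; the initialization
    and the pole-free denominator are illustrative assumptions, not the
    paper's exact construction.
    """

    def __init__(self):
        # Numerator P(x) = a0 + a1*x + a2*x^2 + a3*x^3, started near the identity.
        self.a = np.array([0.0, 1.0, 0.0, 0.1])
        # Denominator Q(x) = 1 + |b1*x + b2*x^2| stays >= 1, so r(x)
        # has no poles on the real line regardless of training.
        self.b = np.array([0.0, 0.5])

    def __call__(self, x):
        x = np.asarray(x, dtype=float)
        p = self.a[0] + self.a[1] * x + self.a[2] * x**2 + self.a[3] * x**3
        q = 1.0 + np.abs(self.b[0] * x + self.b[1] * x**2)
        return p / q
```

In training, the coefficients `a` and `b` would be optimized jointly with the network weights; this sketch shows only the forward evaluation.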

🏷️ Themes

Neural Network Architecture, Computational Efficiency, Theoretical Computer Science

📚 Related People & Topics

Expressivity

Expressivity, expressiveness, and expressive power are closely related terms; in the context of neural networks, expressivity refers to the range of functions an architecture can represent.

View Profile → Wikipedia ↗

Neural network

Structure in biology and artificial intelligence

A neural network is a group of interconnected units called neurons that send signals to one another. Neurons can be either biological cells or mathematical models. While individual neurons are simple, many of them together in a network can perform complex tasks.

View Profile → Wikipedia ↗

Approximation theory

Theory of getting acceptably close inexact mathematical calculations

In mathematics, approximation theory is concerned with how functions can best be approximated by simpler functions, and with quantitatively characterizing the errors introduced thereby. What counts as "best" and "simpler" depends on the application.

View Profile → Wikipedia ↗
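A classic example makes the parameter-efficiency gap concrete: Runge's function 1/(1 + 25x²) is itself a degree-(0, 2) rational with three coefficients, so a rational model represents it exactly, while a degree-10 polynomial with eleven coefficients only approximates it. This comparison is our illustration and is not taken from the paper:

```python
import numpy as np

# Runge's function is itself a low-degree rational: f(x) = 1 / (1 + 25 x^2).
xs = np.linspace(-1.0, 1.0, 2001)
f = 1.0 / (1.0 + 25.0 * xs**2)

# Degree-10 least-squares polynomial fit: 11 parameters, approximate only.
coeffs = np.polyfit(xs, f, deg=10)
poly_err = np.max(np.abs(np.polyval(coeffs, xs) - f))

# Degree-(0, 2) rational model: 3 parameters, exact by construction.
rational = 1.0 / (1.0 + 25.0 * xs**2)
rational_err = np.max(np.abs(rational - f))

print(f"degree-10 polynomial max error:  {poly_err:.3f}")
print(f"degree-(0,2) rational max error: {rational_err:.1e}")
```

The degree-10 fit leaves a visible maximum error, whereas the rational form is exact; separations of this kind, quantified over whole network classes, are what the paper establishes.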

Original Source
arXiv:2602.12390v1 Announce Type: cross Abstract: We study neural networks with trainable low-degree rational activation functions and show that they are more expressive and parameter-efficient than modern piecewise-linear and smooth activations such as ELU, LeakyReLU, LogSigmoid, PReLU, ReLU, SELU, CELU, Sigmoid, SiLU, Mish, Softplus, Tanh, Softmin, Softmax, and LogSoftmax. For an error target of $\varepsilon>0$, we establish approximation-theoretic separations: Any network built from stand
Read full article at source

Source

arxiv.org
