Finite basis Kolmogorov-Arnold networks: domain decomposition for data-driven and physics-informed problems
Main Authors |  |
---|---|
Format | Journal Article |
Language | English |
Published | 28.06.2024 |
Subjects |  |
Online Access | Get full text |
Summary: | Kolmogorov-Arnold networks (KANs) have attracted attention recently as an alternative to multilayer perceptrons (MLPs) for scientific machine learning. However, KANs can be expensive to train, even for relatively small networks. Inspired by finite basis physics-informed neural networks (FBPINNs), in this work, we develop a domain decomposition method for KANs that allows for several small KANs to be trained in parallel to give accurate solutions for multiscale problems. We show that finite basis KANs (FBKANs) can provide accurate results with noisy data and for physics-informed training. |
---|---|
DOI: | 10.48550/arxiv.2406.19662 |
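
The summary above describes blending several small, locally supported networks over overlapping subdomains, in the spirit of FBPINNs. Below is a minimal 1D sketch of that partition-of-unity blending idea, not the authors' implementation: the cosine window functions, the Gaussian-basis local models (used here as simple stand-ins for small KANs), the `target` function, and all parameter values are illustrative assumptions, and the fit is done by plain least squares on noisy data rather than the training procedure used in the paper.

```python
# Sketch of the finite-basis / partition-of-unity idea: each overlapping
# subdomain owns a small local model, and smooth window functions that sum
# to one blend the local outputs into a single global model.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Multiscale target: a slow sine plus a smaller, faster oscillation.
    return np.sin(2 * np.pi * x) + 0.1 * np.sin(20 * np.pi * x)

x_train = np.linspace(0.0, 1.0, 2000)
y_train = target(x_train) + 0.05 * rng.standard_normal(x_train.shape)  # noisy data

n_sub = 8
centers = np.linspace(0.0, 1.0, n_sub)
width = 1.5 / (n_sub - 1)          # subdomains overlap their neighbours

def raw_window(x, c):
    # Smooth bump supported on [c - width, c + width].
    t = np.clip((x - c) / width, -1.0, 1.0)
    return np.cos(0.5 * np.pi * t) ** 2

def windows(x):
    # Normalise the bumps so they form a partition of unity on [0, 1].
    w = np.stack([raw_window(x, c) for c in centers])        # (n_sub, n_pts)
    return w / np.maximum(w.sum(axis=0, keepdims=True), 1e-12)

n_basis = 16
def local_features(x, c):
    # Local model features: Gaussians on knots inside the subdomain
    # (a linear stand-in for a small subdomain network such as a KAN).
    knots = np.linspace(c - width, c + width, n_basis)
    return np.exp(-((x[:, None] - knots[None, :]) / (0.25 * width)) ** 2)

def design_matrix(x):
    # Global model: sum_j window_j(x) * local_model_j(x), linear in coefficients.
    w = windows(x)
    return np.hstack([w[j][:, None] * local_features(x, centers[j])
                      for j in range(n_sub)])

coef, *_ = np.linalg.lstsq(design_matrix(x_train), y_train, rcond=None)

x_test = np.linspace(0.0, 1.0, 3000)
y_pred = design_matrix(x_test) @ coef
rmse = np.sqrt(np.mean((y_pred - target(x_test)) ** 2))
print(f"test RMSE vs noise-free target: {rmse:.4f}")
```

In the FBKANs described by the abstract, the per-subdomain models would be small KANs trained with gradient-based optimization on data or physics-informed losses rather than the linear least-squares fit used here; the compactly supported windows are what localise each small network's contribution and allow the subdomain models to be trained in parallel.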