Data-driven constitutive model of complex fluids using recurrent neural networks

Bibliographic Details
Published in: Rheologica Acta, Vol. 62, No. 10, pp. 569-586
Main Authors: Jin, Howon; Yoon, Sangwoong; Park, Frank C.; Ahn, Kyung Hyun
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.10.2023
Summary: This study introduces the Constitutive Neural Network (ConNN) model, a machine learning algorithm that accurately predicts the temporal response of complex fluids under specific deformations. The ConNN model uses a recurrent neural network architecture to capture time-dependent stress responses, and its recurrent units are specifically designed to reflect the characteristics of complex fluids (fading memory, finite elastic deformation, and a relaxation spectrum) without presuming any equation of motion for the fluid. We demonstrate that the ConNN model can effectively replicate the temporal data generated by the Giesekus model and the Thixotropic-Elasto-Visco-Plastic (TEVP) fluid model under varying shear rates. To test the performance of the trained model, we subject it to an oscillatory shear flow with periodic reversals in flow direction, a condition it has not been trained on. The ConNN model successfully replicates the shear moduli of the original models, and the trained values of the recurrent parameters match the physical predictions of the original models. However, we do observe a slight deviation in the normal stresses, indicating that further improvements are necessary to enforce more rigorous physical symmetry and improve the model's predictions.
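
The abstract describes recurrent units built around fading memory and a relaxation spectrum, driven by the applied deformation history. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' ConNN implementation: the cell name (RelaxingCell), layer sizes, the exponential-decay update rule, and all parameter names are assumptions made for illustration only.

# Hypothetical sketch: a recurrent cell whose hidden modes decay with learnable
# relaxation times (fading memory / relaxation spectrum) and are excited by the
# instantaneous shear rate; stress is a linear readout of the internal modes.
import torch
import torch.nn as nn

class RelaxingCell(nn.Module):
    """One step: h_{t+1} = exp(-dt / tau) * h_t + drive(shear_rate_t)."""
    def __init__(self, n_modes: int, dt: float):
        super().__init__()
        self.log_tau = nn.Parameter(torch.zeros(n_modes))  # learnable relaxation times
        self.drive = nn.Linear(1, n_modes)                  # how the shear rate excites each mode
        self.readout = nn.Linear(n_modes, 1)                # stress from internal modes
        self.dt = dt

    def forward(self, gamma_dot: torch.Tensor) -> torch.Tensor:
        # gamma_dot: (batch, time, 1) shear-rate history
        batch, steps, _ = gamma_dot.shape
        h = gamma_dot.new_zeros(batch, self.log_tau.numel())
        decay = torch.exp(-self.dt / torch.exp(self.log_tau))  # per-mode fading memory
        stresses = []
        for t in range(steps):
            h = decay * h + self.drive(gamma_dot[:, t, :])
            stresses.append(self.readout(h))
        return torch.stack(stresses, dim=1)  # (batch, time, 1) predicted shear stress

# Usage: such a model would be fit to stress transients generated by a reference
# constitutive model (e.g. Giesekus or TEVP) under varying shear-rate protocols.
model = RelaxingCell(n_modes=8, dt=1e-2)
gamma_dot = torch.randn(4, 200, 1)   # synthetic shear-rate histories
sigma_pred = model(gamma_dot)        # predicted stress response

A design choice of this kind keeps the recurrent state physically interpretable: each mode carries its own relaxation time, so the learned parameters can be compared against the relaxation spectrum of the reference model, which is the kind of check the abstract reports for the trained ConNN.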
ISSN: 0035-4511, 1435-1528
DOI: 10.1007/s00397-023-01405-z