Transfer learning for photonic delay-based reservoir computing to compensate parameter drift

Bibliographic Details
Published in: Nanophotonics (Berlin, Germany), Vol. 12, No. 5, pp. 949-961
Main Authors: Bauwens, Ian; Harkhoe, Krishan; Bienstman, Peter; Verschaffelt, Guy; Van der Sande, Guy
Format: Journal Article
Language: English
Published: Berlin: De Gruyter (Walter de Gruyter GmbH), 10.03.2023

Summary: Photonic reservoir computing has been demonstrated to be able to solve various complex problems. Although training a reservoir computing system is much simpler than training other neural network approaches, it still requires a considerable amount of resources, which becomes an issue when retraining is required. Transfer learning is a technique that allows us to re-use information between tasks, thereby reducing the cost of retraining. We propose transfer learning as a viable technique to compensate for the unavoidable parameter drift in experimental setups. Correcting this parameter drift usually requires retraining the system, which is very time- and energy-consuming. Based on numerical studies of a delay-based reservoir computing system with semiconductor lasers, we investigate the use of transfer learning to mitigate these parameter fluctuations. Additionally, we demonstrate that transfer learning applied to two slightly different tasks allows us to reduce the number of input samples required for training the second task, thus reducing the amount of retraining.
ISSN: 2192-8606, 2192-8614
DOI: 10.1515/nanoph-2022-0399
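
The summary does not spell out the training procedure, but reservoir-computing readouts are conventionally trained by ridge regression, and one generic way to transfer a readout across a parameter drift or between related tasks is to reuse the previously trained weights as the starting point for a short, low-data refinement. The sketch below illustrates only that general idea under those assumptions; the function names (train_readout, transfer_readout), the gradient-descent refinement, and all hyperparameters are hypothetical and not taken from the paper.

```python
import numpy as np

def train_readout(states, targets, ridge=1e-6):
    """Closed-form ridge-regression readout, the standard way a reservoir
    computer's linear output layer is trained.
    states: (n_samples, n_nodes) reservoir responses; targets: (n_samples,)."""
    X = np.hstack([states, np.ones((states.shape[0], 1))])   # append bias column
    A = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ targets)

def predict(states, weights):
    X = np.hstack([states, np.ones((states.shape[0], 1))])
    return X @ weights

def transfer_readout(w_source, states_new, targets_new, lr=0.1, epochs=500):
    """Illustrative transfer step: start from the readout trained before the
    drift (or on the first task) and refine it with a small number of new
    samples via gradient descent on the squared error, instead of retraining
    from scratch."""
    w = w_source.copy()
    X = np.hstack([states_new, np.ones((states_new.shape[0], 1))])
    for _ in range(epochs):
        grad = X.T @ (X @ w - targets_new) / X.shape[0]       # MSE gradient
        w -= lr * grad
    return w

# Toy usage: weights trained on pre-drift reservoir states are reused as the
# starting point after a simulated drift, using only 200 new samples.
rng = np.random.default_rng(0)
states = rng.normal(size=(2000, 100))                         # placeholder reservoir states
targets = states @ rng.normal(size=100)                       # placeholder target signal
w0 = train_readout(states, targets)
drifted = states + 0.05 * rng.normal(size=states.shape)       # simulated parameter drift
w1 = transfer_readout(w0, drifted[:200], targets[:200])
```

In the paper's setting the reservoir states would come from the semiconductor-laser delay system before and after the drift; the random arrays above are placeholders purely to make the snippet runnable.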