Jack and Masters of All Trades: One-Pass Learning Sets of Model Sets From Large Pre-Trained Models

Bibliographic Details
Published in: arXiv.org
Main Authors: Choong, Han Xiang; Ong, Yew-Soon; Gupta, Abhishek; Chen, Caishun; Lim, Ray
Format: Paper; Journal Article
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 21.06.2024

Summary: For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These large pre-trained models or Jacks of All Trades (JATs), when fine-tuned for downstream tasks, are gaining importance in driving deep learning advancements. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements, could limit the real-world utility of a singular JAT. Hence, in tandem with current trends towards building increasingly large JATs, this paper conducts an initial exploration into concepts underlying the creation of a diverse set of compact machine learning model sets. Composed of many smaller and specialized models, the Set of Sets is formulated to simultaneously fulfil many task settings and environmental conditions. A means to arrive at such a set tractably in one pass of a neuroevolutionary multitasking algorithm is presented for the first time, bringing us closer to models that are collectively Masters of All Trades.
ISSN: 2331-8422
DOI: 10.48550/arxiv.2205.00671
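
The abstract's central claim, that one pass of a neuroevolutionary multitasking algorithm can yield a whole set of small specialized models, can be illustrated with a toy loop. The sketch below is a minimal, generic caricature assuming invented linear "tasks"; the task setup, the champion-mutation scheme, and every name in it are hypothetical and do not reproduce the paper's actual algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup: each "task" is solved by matching a random target vector.
D, N_TASKS, POP, GENS = 8, 3, 40, 200
targets = rng.normal(size=(N_TASKS, D))

def fitness(w, t):
    # Higher is better: negative squared error against task t's target.
    return -np.sum((w - targets[t]) ** 2)

population = rng.normal(size=(POP, D))  # one shared population for all tasks
champions = np.zeros((N_TASKS, D))      # per-task specialists: the "set of models"
best = np.full(N_TASKS, -np.inf)

for gen in range(GENS):
    # Evaluate every candidate on every task; a single individual may
    # advance the champions of several tasks at once (implicit transfer).
    for w in population:
        for t in range(N_TASKS):
            f = fitness(w, t)
            if f > best[t]:
                best[t], champions[t] = f, w
    # Next generation: mutate task champions sampled uniformly at random.
    parents = champions[rng.integers(0, N_TASKS, size=POP)]
    population = parents + 0.1 * rng.normal(size=(POP, D))

print("final per-task fitness:", np.round(best, 3))

One shared population evaluated against all tasks is the simplest way to show the multitasking idea: good genetic material found for one task is immediately available to every other task, which is the kind of cross-task transfer a one-pass formulation would exploit.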