BioGPT: generative pre-trained transformer for biomedical text generation and mining
Pre-trained language models have attracted increasing attention in the biomedical domain, inspired by their great success in the general natural language domain. Among the two main branches of pre-trained language models in the general language domain, i.e. BERT (and its variants) and GPT (and its v...
Published in | Briefings in Bioinformatics, Vol. 23, No. 6
---|---
Main Authors | , , , , , ,
Format | Journal Article
Language | English
Published | England, 19.11.2022