Pre-trained Model


A pre-trained model is a saved network that someone else has already created and trained on a large dataset to solve a problem similar to the one at hand. AI teams can use a pre-trained model as a starting point instead of building a model from scratch. Examples of successful large-scale pre-trained language models are Bidirectional Encoder Representations from Transformers (BERT) and the Generative Pre-trained Transformer (GPT-n) series.
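The idea can be sketched in miniature: initialize from saved weights rather than from scratch, then fine-tune on the downstream task. This is a toy, pure-Python illustration, not a real framework; the `PRETRAINED` weights and the linear model are hypothetical stand-ins for what, in practice, would be weights loaded from a library such as Hugging Face Transformers or torchvision.

```python
# Hypothetical "pretrained" weights for a 1-feature linear model,
# imagined as saved by another team after training on a large dataset.
PRETRAINED = {"w": 1.9, "b": 0.1}

def train(data, w, b, lr=0.05, epochs=20):
    """Fine-tune a linear model y = w*x + b with plain gradient descent."""
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

def mse(data, w, b):
    return sum(((w * x + b) - y) ** 2 for x, y in data) / len(data)

# Small downstream dataset for a similar task (true relation: y = 2x).
data = [(x, 2 * x) for x in [0.0, 0.5, 1.0, 1.5, 2.0]]

# Pre-trained initialization starts much closer to the target than
# training from scratch (zero initialization).
loss_pretrained_start = mse(data, PRETRAINED["w"], PRETRAINED["b"])
loss_scratch_start = mse(data, 0.0, 0.0)

w_ft, b_ft = train(data, PRETRAINED["w"], PRETRAINED["b"])
```

The same pattern applies at scale: a model like BERT supplies the pre-trained weights, and only a small task-specific head (plus, optionally, the backbone) is fine-tuned on the downstream dataset.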
