Pre-trained Model


A pre-trained model is a model, or saved network, that was created by someone else and trained on a large dataset to solve a problem similar to the one at hand. AI teams can use a pre-trained model as a starting point instead of building a model from scratch. Examples of successful large-scale pre-trained language models include Bidirectional Encoder Representations from Transformers (BERT) and the Generative Pre-trained Transformer (GPT-n) series.
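For instance, a pre-trained BERT checkpoint can be downloaded and reused in a few lines with the Hugging Face transformers library. The snippet below is a minimal sketch that assumes the publicly available bert-base-uncased checkpoint and uses its contextual embeddings directly, without any fine-tuning.

from transformers import AutoTokenizer, AutoModel

# Download the pre-trained BERT checkpoint and its tokenizer from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Encode a sentence and extract contextual embeddings from the pre-trained weights.
inputs = tokenizer("Pre-trained models save training time.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)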

Loading a Pre-trained Model

The Hugging Face transformers library makes it straightforward to load a pre-trained model. The example below loads a ResNet-50 checkpoint pre-trained on ImageNet for image classification:
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

# Download the pre-trained ResNet-50 checkpoint and its matching feature extractor
# from the Hugging Face Hub (both are cached locally after the first call).
extractor = AutoFeatureExtractor.from_pretrained("microsoft/resnet-50")

model = AutoModelForImageClassification.from_pretrained("microsoft/resnet-50")
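As a quick illustration of reusing the loaded checkpoint, the sketch below runs a single prediction. The image path is a placeholder for any local image file.

from PIL import Image
import torch

# Open an example image (placeholder path) and preprocess it with the extractor.
image = Image.open("example.jpg")
inputs = extractor(images=image, return_tensors="pt")

# Run the pre-trained classifier and map the top logit to an ImageNet label.
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])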