BART
Denoising sequence-to-sequence pre-training for natural language generation, translation, and comprehension.
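The denoising objective works by corrupting input text and training the model to reconstruct the original. One of BART's corruption schemes is text infilling, where a whole span of tokens is replaced by a single mask token. A minimal sketch (the `text_infilling` helper is hypothetical, not the released implementation):

```python
def text_infilling(tokens, start, length, mask_token="<mask>"):
    """BART-style text infilling: replace a span of `length` tokens
    with a single mask token. The pre-training task is to reconstruct
    the original sequence, so the model must also infer how many
    tokens the span contained. (In the paper, span lengths are drawn
    from a Poisson distribution with lambda = 3.)"""
    return tokens[:start] + [mask_token] + tokens[start + length:]

corrupted = text_infilling(["the", "cat", "sat", "on", "the", "mat"], 1, 2)
# corrupted == ["the", "<mask>", "on", "the", "mat"]
```

The decoder is then trained to emit the full original sequence from this corrupted input, which combines the span-masking idea of BERT with autoregressive generation.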
Model Intelligence
Benchmarkable: No
Model level: family