BART
Denoising sequence-to-sequence pre-training for language tasks.
BART is a sequence-to-sequence language model pre-trained with a denoising objective, applied to text generation, translation, and comprehension tasks.
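The denoising objective corrupts input text and trains the model to reconstruct the original. The BART paper's most effective corruption is text infilling: contiguous token spans, with lengths drawn from a Poisson distribution (λ = 3), are each replaced by a single mask token. A minimal Python sketch of that noising step follows; the `<mask>` string, the 30% masking ratio, and the function names are illustrative choices here, not fairseq's actual implementation (which also allows 0-length spans, i.e. pure mask insertions, skipped below for simplicity):

```python
import math
import random

MASK = "<mask>"

def sample_poisson(lam, rng):
    """Draw from Poisson(lam) via Knuth's method; the BART paper
    uses lambda = 3 for span lengths in text infilling."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def text_infill(tokens, mask_ratio=0.3, lam=3.0, seed=0):
    """Corrupt a token list BART-style: sample contiguous spans and
    replace each whole span with a single <mask> token, until roughly
    mask_ratio of the original tokens have been masked."""
    rng = random.Random(seed)
    out = list(tokens)
    budget = int(round(len(out) * mask_ratio))
    masked = 0
    while masked < budget:
        # Cap the span so we never exceed the budget or the sequence.
        span = max(1, min(sample_poisson(lam, rng), budget - masked, len(out)))
        start = rng.randrange(0, len(out) - span + 1)
        out[start:start + span] = [MASK]  # whole span -> one mask token
        masked += span
    return out

print(text_infill("the quick brown fox jumps over the lazy dog".split()))
```

During pre-training, the corrupted sequence is fed to the encoder and the decoder is trained to regenerate the uncorrupted original.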
Pricing
Official site · May 10, 2026, 1:06 PM
Pricing notes were collected, but there are no normalized numeric fields to display yet.
Meta has disclosed no public paid pricing for BART. The official source is the open-source fairseq repository rather than a billing or pricing page: BART is distributed there as model code and checkpoints, with no Meta pricing page, API price sheet, or other public commercial pricing. The public offering is an open-source model release, not a metered paid text API.
Model Intelligence
Benchmarkable
No
Model level
family
Recent stories
0 linked stories