AI Primer
release

Meta ships SAM 3.1 with object multiplexing for 16 tracked objects

SAM 3.1 is a drop-in update that shares video computation across up to 16 tracked objects instead of rerunning most of the model per object. Meta's H100 numbers show roughly 30 FPS at 16 objects versus under 10 FPS for SAM 3, which cuts multi-object video tracking cost.

2 min read

TL;DR

  • Meta's release post positions SAM 3.1 as a "drop-in update" to SAM 3, with object multiplexing as the main change for multi-object video processing.
  • In practice, the performance thread says SAM 3.1 can track up to 16 objects in one forward pass instead of rerunning the model once per object, which cuts repeated video compute.
  • The H100 chart shows the biggest gain under heavier workloads: about 30.16 FPS at 16 objects versus 9.77 FPS for SAM 3, and 11.46 FPS at 128 objects versus 1.57 FPS.

What changed in SAM 3.1?

The architectural shift is narrow but useful. Meta's release post describes SAM 3.1 as a drop-in update, while the workflow diagram shows how it now multiplexes multiple tracked objects into a single shared computation and then demultiplexes the outputs. In the older SAM 3 path, each object triggered a separate pass, so most of the same frame-level work was repeated.

That matters because the bottleneck was less raw vision quality than systems efficiency. As the thread puts it, SAM 3.1 "shares that heavy work across objects," and the attached H100 chart shows throughput holding up much better as object count rises: 33.77 FPS at one object, 30.16 at 16, and 11.46 at 128, versus 26.47, 9.77, and 1.57 for SAM 3.
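The shape of those numbers is what you would expect from a shared-encoder design: if the heavy per-frame work runs once and only a light per-object step scales with object count, throughput degrades slowly until the light steps dominate. A minimal cost-model sketch, with entirely hypothetical cost constants (Meta has not published the actual split), illustrates the difference between the two paths:

```python
# Toy cost model for object multiplexing vs. per-object reruns.
# HEAVY_COST and LIGHT_COST are made-up units, not measured values.

HEAVY_COST = 100  # shared frame-level encoding (run once per frame)
LIGHT_COST = 3    # lightweight per-object decode

def per_object_cost(n_objects: int) -> int:
    """SAM 3-style path: the full pipeline reruns for every object."""
    return n_objects * (HEAVY_COST + LIGHT_COST)

def multiplexed_cost(n_objects: int) -> int:
    """SAM 3.1-style path: one shared encode, then n light decodes."""
    return HEAVY_COST + n_objects * LIGHT_COST

for n in (1, 16, 128):
    print(f"{n:>3} objects: rerun={per_object_cost(n):>6}  "
          f"multiplexed={multiplexed_cost(n):>5}")
```

Under these toy constants the multiplexed path costs 148 units at 16 objects versus 1,648 for per-object reruns, mirroring the roughly 3x and 7x FPS gaps in the chart at 16 and 128 objects.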
