Modular
Inference from Kernel to Cloud.
An AI inference platform that accelerates model serving across self-hosted, managed cloud, and BYOC (bring-your-own-cloud) deployments, exposing OpenAI-compatible endpoints and applying hardware-specific optimization.
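Because the endpoints are OpenAI-compatible, existing OpenAI client code can typically be pointed at a self-hosted server just by changing the base URL. A minimal sketch of the request shape such an endpoint accepts; the base URL, port, and model name here are illustrative assumptions, not Modular-specific values:

```python
import json

# Hypothetical self-hosted server exposing the OpenAI-compatible API surface.
BASE_URL = "http://localhost:8000/v1"
ENDPOINT = f"{BASE_URL}/chat/completions"

# Standard OpenAI chat-completions request body; "my-served-model" is a
# placeholder for whatever model id the server is configured to serve.
payload = {
    "model": "my-served-model",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
}

body = json.dumps(payload)  # what a client would POST to ENDPOINT
print(ENDPOINT)
```

In practice an off-the-shelf OpenAI SDK can send this same payload by constructing its client with the self-hosted base URL instead of the default.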
