OpenClaw's beta added live control of a real Chrome session through the Chrome DevTools MCP server; the release also brought native provider support for SGLang (a high-performance serving framework for LLM and VLM workloads) and ongoing work on parallel tool calling. Worth trying if you need self-hosted agents to handle authenticated browser flows against a local inference backend.
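As a rough sketch of how the two pieces might be wired together, the fragment below follows the common `mcpServers` JSON convention used by many MCP clients and points a provider at SGLang's OpenAI-compatible endpoint. The exact config file location, key names, and whether OpenClaw uses this schema are assumptions; the `chrome-devtools-mcp` package and SGLang's default `/v1` endpoint on port 30000 are real, but the model name shown is only a placeholder.

```json
{
  "mcpServers": {
    "chrome-devtools": {
      "command": "npx",
      "args": ["chrome-devtools-mcp@latest"]
    }
  },
  "providers": {
    "sglang-local": {
      "baseUrl": "http://localhost:30000/v1",
      "model": "Qwen/Qwen2.5-7B-Instruct"
    }
  }
}
```

With a setup along these lines, the agent talks to a locally served model for inference while the MCP server drives a live Chrome instance, so authenticated sessions (cookies, logins) stay in the real browser profile rather than a headless sandbox.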