A MotionDesign post claimed Blender-style easing wastes 40-70% of frames and argued for deterministic motion control instead. Motion artists should treat the claim as a debate prompt and test the method against their own motion and pipeline needs.

You can inspect Blender's own F-Curve docs, check the editing page for interpolation and easing controls, and compare that with a motion-control vendor's description of repeatable programmed camera moves. The gap between those sources is the whole story: one side is talking about animation ergonomics, the other is talking like a CNC controller.
The post's headline number came from a simulated 90-degree rotation over 1.5 seconds at 96 FPS. In the code attached to the Reddit post, frames count as useless when the per-frame angle change drops below 0.5 degrees and as critical when it exceeds 3 degrees.
The script also hardcodes a "CNC" comparison as a constant linear sweep, then prints outputs like effective FPS in fast segments, average useless-frame percentage across scenarios, and an example of wasted render time if each frame took two minutes to render. Those numbers are presented as proof of a bad distribution problem, not as measurements from a Blender project file.
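The post's script is not reproduced here, but the described metric is simple enough to re-implement. The sketch below is an assumption-laden stand-in: it uses a cubic ease-in-out profile (a rough analogue of Blender's default eased F-Curve, not the post's exact curve) with the stated parameters of 90 degrees over 1.5 seconds at 96 FPS, and the 0.5- and 3-degree thresholds. Function names are illustrative, not the author's code.

```python
def smoothstep(t):
    """Cubic ease-in-out: zero velocity at both endpoints,
    a rough stand-in for an eased keyframe curve."""
    return 3 * t ** 2 - 2 * t ** 3

def classify_frames(fps=96, duration=1.5, total_deg=90.0,
                    low=0.5, high=3.0):
    """Count frames whose per-frame angle delta falls below `low`
    ("useless" in the post's terms) or above `high` ("critical")."""
    n = int(fps * duration)  # 144 frames for the post's scenario
    angles = [total_deg * smoothstep(i / n) for i in range(n + 1)]
    deltas = [angles[i + 1] - angles[i] for i in range(n)]
    useless = sum(1 for d in deltas if d < low)
    critical = sum(1 for d in deltas if d > high)
    return n, useless, critical
```

Under this particular profile the peak per-frame delta is only about 0.94 degrees, so no frame crosses the 3-degree "critical" line, while roughly a third of the frames fall under the 0.5-degree floor; a sharper ease pushes that share toward the post's 40-70% range, which is exactly why the numbers depend on the chosen curve.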
Blender's official F-Curve introduction frames interpolation as a convenience feature: set a few keyframes, let the curve calculate the in-between values, avoid touching every frame manually. Its Graph Editor editing docs separate interpolation mode from easing type, which is a useful reminder that the Reddit post is arguing against a specific motion profile, not against keyframing as a category.
That distinction matters because the O.C.E.A.N test is really a threshold test. Change the thresholds, the motion path, or the interpolation choice, and the waste number moves with it.
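That sensitivity is easy to demonstrate directly. The sketch below (hypothetical names, not the post's script) scores the same assumed 90-degree, 1.5-second, 96-FPS move with two motion profiles and two "useless" thresholds; the linear sweep, standing in for the post's CNC baseline, scores zero waste by construction because every frame moves exactly 0.625 degrees.

```python
def waste_pct(profile, low, fps=96, duration=1.5, total_deg=90.0):
    """Percentage of frames whose per-frame angle delta is below `low` degrees."""
    n = int(fps * duration)
    angles = [total_deg * profile(i / n) for i in range(n + 1)]
    return 100.0 * sum(angles[i + 1] - angles[i] < low for i in range(n)) / n

linear = lambda t: t                     # constant-speed "CNC" sweep
ease = lambda t: 3 * t**2 - 2 * t**3     # cubic ease-in-out

waste_pct(linear, 0.5)   # 0.0: every frame moves 0.625 degrees
waste_pct(ease, 0.5)     # roughly a third of frames under the floor
waste_pct(ease, 0.3)     # lower the threshold and the "waste" shrinks
```

Nothing about the motion changed between the last two calls; only the definition of "useless" did, which is the sense in which the headline figure is a property of the test, not of keyframing.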
The author's replies in the Reddit thread compared digital animation to robots, CNC machines, and motion-control rigs. A basic motion-control explainer makes the same core point about physical rigs: they are prized for precise, repeatable camera moves.
Where the thread actually split was simpler: several commenters replied that precision is not the same thing as a better artistic workflow, and one of the most upvoted responses said the pitch needs shot examples, not charts. At the time of the post, the only promised next step was a Discord link "soon" on O-C-E-A-N.de, plus the attached plots and code.
The thread landed less like a product launch than a live argument about what counts as proof. One commenter said, "pure numbers and stats don't define if a tool or workflow is better," while another asked for outputs that show why artists would want the system.
That leaves the post in an unusual spot. It introduced a concrete test harness, a strong anti-keyframe thesis, and a vocabulary borrowed from industrial control, but the community response focused on missing demos and missing context, not on the math alone.