Introducing The Pulse of Motion
Measuring Physical Frame Rate from Visual Dynamics
Texas A&M University
The PoM Leaderboard is now live!
Join us and test if your generated videos align with the pulse of motion.
Modern video generators produce stunning visuals — yet they lack a reliable internal clock. A hummingbird might flap in extreme slow motion. A falling person might defy gravity. We call this Chronometric Hallucination.
Pulse of Motion (PoM) recovers the true physical frame rate (PhyFPS) directly from visual motion, enabling corrections that dramatically improve perceived naturalness.
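To make the correction concrete, here is a minimal re-timing sketch. It assumes a single predicted PhyFPS per clip and uses ffmpeg's setpts filter to compress presentation timestamps; the function name and file paths are hypothetical, and this is our illustration rather than the paper's released tooling.

```python
import subprocess

def retime_to_phyfps(src: str, dst: str, nominal_fps: float, phyfps: float) -> None:
    """Re-time a clip so its motion plays at the predicted physical rate.

    A video rendered at `nominal_fps` whose motion actually corresponds to
    `phyfps` real frames per second plays at nominal_fps / phyfps times real
    speed. Dividing every presentation timestamp by phyfps / nominal_fps
    restores the natural pace without touching the frames themselves.
    """
    speed = phyfps / nominal_fps  # e.g. 35.8 / 24 ≈ 1.49x speed-up
    subprocess.run(
        ["ffmpeg", "-i", src,
         "-filter:v", f"setpts=PTS/{speed:.4f}",  # shrink timestamps by `speed`
         "-an",  # drop audio; re-timed audio needs separate handling
         dst],
        check=True,
    )

# First gallery example below: nominal 24 fps, predicted PhyFPS 35.8.
retime_to_phyfps("pomeranian.mp4", "pomeranian_corrected.mp4", 24.0, 35.8)
```

Note that the correction is not always a speed-up: the struggling-fish example below (24 fps → 15.0 fps) yields a factor below 1, i.e. the clip is slowed down.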
Original AI-generated videos vs. PhyFPS-corrected versions; each caption below lists the nominal frame rate and the predicted PhyFPS used for re-timing. Users overwhelmingly preferred the corrected versions in every case.
A Pomeranian dog chasing a soccer ball across a lawn. (24 fps → 35.8 fps)
A snake slithering across polished wooden floorboards. (24 fps → 60.2 fps)
A detailed view of the churning white wake trailing behind a large ship. (16 fps → 44.9 fps)
A continuous tracking shot moving steadily through a brightly lit subway tunnel. (16 fps → 36.7 fps)
Martial arts students performing synchronized stretching exercises. (24 fps → 51.5 fps)
A chef tossing a crab in a flaming wok filled with hot oil. (24 fps → 49.8 fps)
Onion rings frying in bubbling hot oil. (24 fps → 58.9 fps)
A chameleon shooting its tongue out to catch an ant. (24 fps → 52.5 fps)
Captured fish struggling inside a fishing net. (24 fps → 15.0 fps)
Raindrops falling and hitting green leaves. (16 fps → 21.1 fps)
State-of-the-art generators exhibit large gaps between nominal frame rate and actual physical motion speed.
Physical speed fluctuates both across prompts and within individual videos — even under identical settings (a variable re-timing sketch follows these findings).
Re-timing to predicted PhyFPS significantly improves human-perceived naturalness in controlled user studies.
General-purpose Vision-Language Models are unreliable temporal judges — a dedicated predictor is essential.
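Because the predicted physical speed can drift within a single clip, one global factor may under-correct. The sketch below remaps frame timestamps from per-segment PhyFPS estimates; the segment format, function, and numbers are hypothetical illustrations of the idea, not an interface described by the paper.

```python
def variable_retime(
    frame_times: list[float],                    # original timestamps in seconds
    segments: list[tuple[float, float, float]],  # (start_s, end_s, phyfps)
    nominal_fps: float,
) -> list[float]:
    """Remap timestamps when the predicted PhyFPS varies across a clip.

    Each segment's speed factor is phyfps / nominal_fps; the corrected
    timeline is built by scaling every inter-frame interval by 1 / speed,
    so stretches that should run faster occupy proportionally less time.
    """
    corrected, elapsed, prev = [], 0.0, frame_times[0]
    for t in frame_times:
        mid = 0.5 * (prev + t)  # locate this interval's segment by its midpoint
        speed = next(
            (fps / nominal_fps for start, end, fps in segments if start <= mid < end),
            1.0,  # outside all segments: keep the original pace
        )
        elapsed += (t - prev) / speed
        corrected.append(elapsed)
        prev = t
    return corrected

# Hypothetical 2 s clip at 24 fps whose first half should run twice as fast.
times = [i / 24.0 for i in range(48)]
new_times = variable_retime(times, [(0.0, 1.0, 48.0), (1.0, 2.0, 24.0)], 24.0)
```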
@article{gao2026pulse,
  title={The Pulse of Motion: Measuring Physical Frame Rate from Visual Dynamics},
  author={Gao, Xiangbo and Wu, Mingyang and Yang, Siyuan and Yu, Jiongze and Taghavi, Pardis and Lin, Fangzhou and Tu, Zhengzhong},
  journal={arXiv preprint arXiv:2603.14375},
  year={2026}
}