
I have a real-time 30 fps video, and I wish to convert it to a 30 fps timelapse so that each frame of the target video is the arithmetic mean of 10 original frames, computed in a linear color space. I thought of using the framerate filter, but it is designed to interpolate frames only when increasing the frame rate. When I reduce the frame rate it simply drops roughly 9 out of every 10 frames, producing jerky movement of fast-moving objects instead of a nice, smooth motion blur.

Is there any way to achieve this with command-line ffmpeg?
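
For concreteness, here is a minimal sketch of one way to do the averaging step, assuming an ffmpeg build recent enough to have the `tmix` filter (filenames and encoder settings are placeholders): `tmix=frames=10` replaces each frame with the mean of a 10-frame window, `framestep=10` keeps every 10th of those means so the windows don't overlap (apart from the very first frame while the window fills), and `setpts` compresses the timestamps so the result plays back as a 30 fps timelapse. Note this averages the values as decoded, i.e. typically gamma-encoded; the linear-space aspect is discussed in the comments below.

```
# Sketch only: requires an ffmpeg with the tmix filter.
# tmix=frames=10  -> each frame becomes the mean of a 10-frame window
# framestep=10    -> keep every 10th averaged frame (non-overlapping windows,
#                    except the very first frame while tmix's window fills)
# setpts=PTS/10   -> compress timestamps so playback is 10x faster, i.e. 30 fps
ffmpeg -i input.mp4 \
       -vf "tmix=frames=10,framestep=10,setpts=PTS/10" \
       -c:v libx264 -crf 18 output_timelapse.mp4
```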

ybungalobill
  • See if you can adapt the basic method [here](http://video.stackexchange.com/q/16552/1871). – Gyan Mar 20 '16 at 14:13
  • I'm so unsure what exactly it does that it would be easier to write in C with libav directly... – ybungalobill Mar 20 '16 at 23:17
  • Each frame in the output is the mean of 4 frames, i.e. output frame 1 = mean of input frames 1,2,3,4; output frame 2 = mean of input frames 5,6,7,8... – Gyan Mar 21 '16 at 04:54
  • @Mulvya This is obvious, but in what color space does it work? For a nice motion blur that is really important. Besides, this won't get me beyond power-of-two factors. – ybungalobill Mar 21 '16 at 07:29
  • It performs arithmetic upon the values as decoded. If those are gamma-weighted, so is the result. If you want a linear space, you can apply a pixel format change beforehand (with an expanded depth), then use the [lut](https://ffmpeg.org/ffmpeg-all.html#lut_002c-lutrgb_002c-lutyuv) or other applicable filter and adjust gamma. – Gyan Mar 21 '16 at 07:47
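
Building on the last comment, a hedged sketch of the linear-light variant: switch to a 16-bit planar RGB intermediate, undo an assumed gamma of 2.2 with the `gammaval()` helper of `lutrgb`, average, then re-apply the encoding gamma and convert back. Treating the source as plain gamma 2.2 is only an approximation of the sRGB/BT.709 transfer curve, and the filenames and encoder settings are placeholders; builds compiled with libzimg could use `zscale=transfer=linear` instead of the `lutrgb` steps. The backslash-newlines inside the quotes are ordinary bash line continuations.

```
# Sketch only: assumes a gamma-2.2 source (rough stand-in for sRGB/BT.709)
# and an ffmpeg with the tmix filter.
ffmpeg -i input.mp4 \
       -vf "format=gbrp16le,\
lutrgb=r=gammaval(2.2):g=gammaval(2.2):b=gammaval(2.2),\
tmix=frames=10,framestep=10,setpts=PTS/10,\
lutrgb=r=gammaval(0.4545):g=gammaval(0.4545):b=gammaval(0.4545),\
format=yuv420p" \
       -c:v libx264 -crf 18 output_timelapse_linear.mp4
```

The `gammaval(2.2)` / `gammaval(0.4545)` pair raises the decoded values to the power 2.2 (to linearize) and then to roughly 1/2.2 (to re-encode), and the 16-bit intermediate format is there to limit banding from that round trip.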

0 Answers