I have to create a 60-second video in ffmpeg from 5 images, with each image displayed for 15 seconds. After 15 seconds, the first image has to fade out and the second image has to fade in; then the second image fades out and the third fades in, and so on. Please guide me on how I can achieve this using ffmpeg commands.
2 Answers
Dip/fade to black
Scroll down for crossfade method.

Example where each image is displayed for 5 seconds and each fade lasts 1 second. Each image input has the same width, height, and sample aspect ratio; if they vary in size, see example #3 below.
MP4 output
ffmpeg \
-loop 1 -t 5 -i input0.png \
-loop 1 -t 5 -i input1.png \
-loop 1 -t 5 -i input2.png \
-loop 1 -t 5 -i input3.png \
-loop 1 -t 5 -i input4.png \
-filter_complex \
"[0:v]fade=t=out:st=4:d=1[v0]; \
[1:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v1]; \
[2:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v2]; \
[3:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v3]; \
[4:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v4]; \
[v0][v1][v2][v3][v4]concat=n=5:v=1:a=0,format=yuv420p[v]" -map "[v]" out.mp4
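Adapting this to the question's timings is mechanical: with `concat`, each segment's clock restarts at 0, so every fade-out starts at display − fade = 15 − 1 = 14 seconds, and each input gets `-t 15`. A small POSIX shell sketch (an illustration, not part of the original answer) that builds the matching filtergraph:

```shell
# Sketch: build the dip-to-black filtergraph for the question's timings
# (5 images, 15 s each, 1 s fades). Pass the printed string to
# -filter_complex and change every input's "-t 5" to "-t 15".
N=5; DISPLAY=15; FADE=1
ST=$((DISPLAY - FADE))    # fade-out start within each segment: 14
graph="[0:v]fade=t=out:st=${ST}:d=${FADE}[v0]"
labels="[v0]"
i=1
while [ "$i" -lt "$N" ]; do
  graph="${graph};[${i}:v]fade=t=in:st=0:d=${FADE},fade=t=out:st=${ST}:d=${FADE}[v${i}]"
  labels="${labels}[v${i}]"
  i=$((i + 1))
done
graph="${graph};${labels}concat=n=${N}:v=1:a=0,format=yuv420p[v]"
printf '%s\n' "$graph"
```

As an aside, the question's numbers conflict: five 15-second images give a 75-second video, not 60. For exactly 60 seconds, use 12 seconds per image.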
With audio
Same as above but with audio:
ffmpeg \
-loop 1 -t 5 -i input0.png \
-loop 1 -t 5 -i input1.png \
-loop 1 -t 5 -i input2.png \
-loop 1 -t 5 -i input3.png \
-loop 1 -t 5 -i input4.png \
-i audio.m4a \
-filter_complex \
"[0:v]fade=t=out:st=4:d=1[v0]; \
[1:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v1]; \
[2:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v2]; \
[3:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v3]; \
[4:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v4]; \
[v0][v1][v2][v3][v4]concat=n=5:v=1:a=0,format=yuv420p[v]" -map "[v]" -map 5:a -shortest out.mp4
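If the music should also fade out with the video, the audio input can be routed through `afade` inside the same `filter_complex` instead of being mapped directly. A sketch (my addition, using the timings from this example: total video length is 5 × 5 = 25 seconds, so a 1-second audio fade starts at 24):

```shell
# Sketch: audio fade-out matching the slideshow's end. Add this chain to the
# filter_complex above, then use -map "[a]" in place of -map 5:a
# (keeping -shortest as a safety net).
afilter='[5:a]afade=t=out:st=24:d=1[a]'
printf '%s\n' "$afilter"
```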
For input images with varying or arbitrary sizes
Like the first example, but with input images that vary in width and height. They will be scaled and padded to fit within a 1280x720 box:
ffmpeg \
-loop 1 -t 5 -i input0.png \
-loop 1 -t 5 -i input1.png \
-loop 1 -t 5 -i input2.png \
-loop 1 -t 5 -i input3.png \
-loop 1 -t 5 -i input4.png \
-filter_complex \
"[0:v]scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1,fade=t=out:st=4:d=1[v0]; \
[1:v]scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1,fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v1]; \
[2:v]scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1,fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v2]; \
[3:v]scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1,fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v3]; \
[4:v]scale=1280:720:force_original_aspect_ratio=decrease,pad=1280:720:(ow-iw)/2:(oh-ih)/2,setsar=1,fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v4]; \
[v0][v1][v2][v3][v4]concat=n=5:v=1:a=0,format=yuv420p[v]" -map "[v]" out.mp4
See the examples in Resizing videos to fit into static sized player if you want to crop (fill the screen) instead of pad (letterbox/pillarbox), or if you want to prevent upscaling.
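For the crop (fill the screen) variant mentioned above, a common chain (a sketch of the standard approach, not taken from the linked answer) swaps `force_original_aspect_ratio=decrease` + `pad` for `increase` + `crop`, so the image is scaled up until it covers the frame and the overflow is trimmed:

```shell
# Sketch: per-image chain that fills a 1280x720 frame by cropping instead of
# padding. Use it in place of the scale+pad chain in example #3.
chain='scale=1280:720:force_original_aspect_ratio=increase,crop=1280:720,setsar=1'
printf '[0:v]%s,fade=t=out:st=4:d=1[v0]\n' "$chain"
```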
GIF output
This adds the palette filters from How do I convert a video to GIF using ffmpeg, with reasonable quality?
ffmpeg \
-framerate 10 -loop 1 -t 5 -i input0.png \
-framerate 10 -loop 1 -t 5 -i input1.png \
-framerate 10 -loop 1 -t 5 -i input2.png \
-framerate 10 -loop 1 -t 5 -i input3.png \
-framerate 10 -loop 1 -t 5 -i input4.png \
-filter_complex \
"[0:v]fade=t=out:st=4:d=1[v0]; \
[1:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v1]; \
[2:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v2]; \
[3:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v3]; \
[4:v]fade=t=in:st=0:d=1,fade=t=out:st=4:d=1[v4]; \
[v0][v1][v2][v3][v4]concat=n=5:v=1:a=0,split[v0][v1]; \
[v0]palettegen[p];[v1][p]paletteuse[v]" -map "[v]" out.gif
Use the `-loop` output option to control how many times the GIF loops. The default is an infinite loop if this option is not used; a value of -1 means no loop.
Options and filters used:

- `-t` sets the duration in seconds of each input.
- `-loop 1` loops the image; otherwise it would have a duration of a single frame.
- `-framerate` sets the input image frame rate (the default is 25 when undeclared). Useful for making a GIF.
- `scale` with `pad` fits the input images into a specific, uniform size (used in example #3).
- `fade` fades in and out. `d` is the duration of the fade; `st` is when it starts.
- `concat` concatenates (or "joins") each image.
- `format` outputs a chroma subsampling scheme that is compatible with non-FFmpeg-based players if outputting MP4 and encoding with libx264 (the default encoder for MP4 output if it is supported by your build).
- `split` makes copies of a filter output. Needed by the palette* filters to do everything in one command.
- `palettegen` and `paletteuse` for making a nice-looking GIF.
Crossfade
Example where each image is displayed for 5 seconds and each crossfade lasts 1 second. Each image input has the same width, height, and sample aspect ratio; if they vary in size, adapt example #3 above.
MP4 output
ffmpeg \
-loop 1 -t 5 -i 1.png \
-loop 1 -t 5 -i 2.png \
-loop 1 -t 5 -i 3.png \
-loop 1 -t 5 -i 4.png \
-loop 1 -t 5 -i 5.png \
-filter_complex \
"[1]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+4/TB[f0]; \
[2]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+8/TB[f1]; \
[3]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+12/TB[f2]; \
[4]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+16/TB[f3]; \
[0][f0]overlay[bg1];[bg1][f1]overlay[bg2];[bg2][f2]overlay[bg3]; \
[bg3][f3]overlay,format=yuv420p[v]" -map "[v]" -movflags +faststart out.mp4
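The timing rule behind the `setpts` offsets above: image n (0-based, n ≥ 1) is delayed by n × (display − fade) seconds, so each crossfade overlaps the last second of the previous image. A sketch (my illustration, using this example's 5-second display and 1-second fade):

```shell
# Compute the setpts offsets for the crossfade example: step = display - fade.
DISPLAY=5; FADE=1; N=5
STEP=$((DISPLAY - FADE))    # 4
n=1
while [ "$n" -lt "$N" ]; do
  printf 'image %d: setpts=PTS-STARTPTS+%d/TB\n' "$n" $((n * STEP))
  n=$((n + 1))
done
# Total duration = DISPLAY + (N-1)*STEP
echo "total: $((DISPLAY + (N - 1) * STEP)) s"
# prints offsets 4, 8, 12 and 16 seconds, and "total: 21 s"
```

For the question's 15-second images with 1-second crossfades, the step becomes 14, giving offsets of 14, 28, 42 and 56 seconds and a 71-second video.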
With audio
ffmpeg \
-loop 1 -t 5 -i 1.png \
-loop 1 -t 5 -i 2.png \
-loop 1 -t 5 -i 3.png \
-loop 1 -t 5 -i 4.png \
-loop 1 -t 5 -i 5.png \
-i music.mp3 \
-filter_complex \
"[1]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+4/TB[f0]; \
[2]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+8/TB[f1]; \
[3]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+12/TB[f2]; \
[4]format=yuva444p,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+16/TB[f3]; \
[0][f0]overlay[bg1];[bg1][f1]overlay[bg2];[bg2][f2]overlay[bg3]; \
[bg3][f3]overlay,format=yuv420p[v]" -map "[v]" -map 5:a -shortest -movflags +faststart out.mp4
Crossfade between two videos with audio
Select a 5-second segment from each input and add a 1-second crossfade:
ffmpeg -i input0.mp4 -i input1.mp4 -filter_complex \
"[0:v]trim=start=5:end=10,setpts=PTS-STARTPTS[v0];
[1:v]trim=start=12:end=17,setpts=PTS-STARTPTS+4/TB,format=yuva444p,fade=st=4:d=1:t=in:alpha=1[v1];
[v0][v1]overlay,format=yuv420p[v];
[0:a]atrim=start=5:end=10,asetpts=PTS-STARTPTS[a0];
[1:a]atrim=start=12:end=17,asetpts=PTS-STARTPTS[a1];
[a0][a1]acrossfade=d=1[a]" \
-map "[v]" -map "[a]" output.mp4
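On newer ffmpeg builds (4.3 and later), the `xfade` filter can replace the trim/setpts/overlay construction for two video inputs. A sketch (an alternative I'm adding, not part of the original answer; filenames are placeholders, and both inputs must share resolution and pixel format): `duration` is the fade length and `offset` is where, on the first input's timeline, the fade begins (clip length − fade = 5 − 1 = 4 here).

```shell
# Sketch: equivalent two-video crossfade using xfade (ffmpeg >= 4.3) plus
# acrossfade for audio. The graph is built as a string so it can be inspected.
graph='[0:v][1:v]xfade=transition=fade:duration=1:offset=4[v];[0:a][1:a]acrossfade=d=1[a]'
printf '%s\n' "$graph"
# Usage (untested placeholder filenames):
#   ffmpeg -i input0.mp4 -i input1.mp4 -filter_complex "$graph" \
#     -map "[v]" -map "[a]" output.mp4
```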
GIF output
ffmpeg \
-framerate 10 -loop 1 -t 5 -i 1.png \
-framerate 10 -loop 1 -t 5 -i 2.png \
-framerate 10 -loop 1 -t 5 -i 3.png \
-framerate 10 -loop 1 -t 5 -i 4.png \
-framerate 10 -loop 1 -t 5 -i 5.png \
-filter_complex \
"[1]format=rgba,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+4/TB[f0]; \
[2]format=rgba,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+8/TB[f1]; \
[3]format=rgba,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+12/TB[f2]; \
[4]format=rgba,fade=d=1:t=in:alpha=1,setpts=PTS-STARTPTS+16/TB[f3]; \
[0][f0]overlay[bg1];[bg1][f1]overlay[bg2];[bg2][f2]overlay[bg3];[bg3][f3]overlay,split[v0][v1]; \
[v0]palettegen[p];[v1][p]paletteuse[v]" -map "[v]" out.gif
Use the `-loop` output option to control how many times the GIF loops. The default is an infinite loop if this option is not used; a value of -1 means no loop.
- your solution is not working in android it gives error `No such filter` can you please help?? – Janki Gadhiya May 18 '16 at 06:52
- @jankigadhiya Make a new question. Include your full `ffmpeg` command and the complete console/log output. – llogan May 18 '16 at 16:43
- @LordNeckbeard, thanks for this answer, I used it for another example. Have I understood well that the number following `-t` in `-loop 1 -t 1 -i 001.png` defines the duration of individual frames, and that numbers following `T/` within the `filter_complex` block define the transition's duration? And is the frame duration in this example counted including the transition's duration or not? – cincplug May 19 '16 at 08:27
- @LordNeckbeard It's taking too much time to generate video, can we reduce it? – Nisarg Aug 05 '16 at 12:42
- @LordNeckbeard Yea sure please look at [this](http://pastebin.com/CVZrf1E6) – Nisarg Aug 06 '16 at 04:30
- @LordNeckbeard Yes Please check [this](http://pastebin.com/dLtxZTgW) – Nisarg Aug 06 '16 at 08:51
- @Nisarg That does not appear to be the complete output, and I'm not sure which of the two commands you displayed earlier it is from. Anyway, try adding `-preset ultrafast`. – llogan Aug 06 '16 at 16:06
- @LordNeckbeard Yea that's what I did :) but it still takes 1 to 2 minutes, can we still reduce this, I mean by managing inputs? – Nisarg Aug 07 '16 at 13:12
- @LordNeckbeard I need to combine a set of images, video clips and an audio track to create a single video file (preferably ogg, but that is less relevant at this point). In addition, I need to create some transition effects between adjacent images. Is there any way to script this whole task using ffmpeg and/or other command line tools? The goal is to automate the task via a command line interface. – Web User Aug 17 '16 at 17:37
- @WebUser Transitions are probably going to be easier using `melt`. – llogan Aug 23 '16 at 05:31
- @LordNeckbeard sorry I am unfamiliar with melt. Can you point me to it? – Web User Aug 25 '16 at 22:59
- How would you implement the setting of the SAR and DAR within the filter_complex? For example I need to add the additional filter `setsar=sar=1/1,setdar=dar=16/9` to this line in the filter_complex: `[1:v][0:v]blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))'[b1v];`. Where would I place this additional filter? I have tried to place it after the `[0:v]` and I get the ffmpeg error, "Too many inputs specified for the 'setsar' filter." – jrkt Sep 22 '16 at 19:22
- @JonStevens `[0:v]setsar=1[sar0];[1:v][sar0]blend...` – llogan Sep 23 '16 at 02:17
- I tried adding that and it still won't work. I have opened a new question if you wouldn't mind looking at it @LordNeckbeard? http://superuser.com/questions/1127542/creating-video-from-images-with-different-sars-with-ffmpeg – jrkt Sep 23 '16 at 14:06
- Thanks for replying. I am trying to include a specific zoompan filter for each image so I can apply that filter at the same time. How would I also include that in the command above that uses the blend filter? Where would I put it? One example of a zoompan I'm trying to include is: `zoompan=z='min(zoom+0.0005,1.5)':x='if(gte(zoom,1.5),x,x-1)':y='y'` – jrkt Sep 26 '16 at 17:35
- @LordNeckbeard I've created a question asking the above about combining the blend and zoompan filters into one command http://superuser.com/questions/1128563/ffmpeg-create-mp4-from-images-with-blend-and-zoompan-filters. I could really use some help. – jrkt Sep 27 '16 at 14:10
- The ffmpeg approach is working nicely for me; thanks! One tip for newcomers to the page: in the `concat=n=9` part of the command, the `9` comes from the 5 images in the example + 4 transitions between the images. If you're handling a different number of images, you'll need to adjust that accordingly. – Jim Miller Mar 13 '17 at 20:58
- Spoke too soon on the ffmpeg crossfade approach. It works for the original 1 sec / 0.5 sec, but when I try to do 2 sec / 0.5, it gives me image times of 2, 3.5, and 3.5 sec for the following 3 image merge: `/usr/local/bin/ffmpeg -y -loop 1 -t 2 -i file1.png -loop 1 -t 2 -i file2.png -loop 1 -t 2 -i file3.png -filter_complex "[1:v][0:v]blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))'[b1v]; [2:v][1:v]blend=all_expr='A*(if(gte(T,0.5),1,T/0.5))+B*(1-(if(gte(T,0.5),1,T/0.5)))'[b2v]; [0:v][b1v][1:v][b2v][2:v]concat=n=5:v=1:a=0,format=yuv420p[v]" -map "[v]" out.mp4` ideas? – Jim Miller Mar 16 '17 at 00:33
- @LordNeckbeard How can I do it without the error that the scale is not the same or the SAR is not the same? Is there a parameter that permits creating a video from images with different scales and SARs? – 7sega7 Apr 24 '17 at 15:01
- @7sega7 I'd have to see your command and console output to see what exactly is going on. You can use a pastebin site and provide the link here. – llogan Apr 26 '17 at 01:37
- @LordNeckbeard Thanks for the great answer once again. We can add setdar=16/9 to prevent aspect-ratio related issues. Can you suggest a solution to map the audio stream as well in the same script? I am getting weird responses each time I map it via the complex filter. – Shubham AgaRwal Aug 02 '18 at 11:52
- @7sega7 add `scale=1280:720,setdar=16/9` for each image to scale and maintain aspect ratio – Shubham AgaRwal Aug 02 '18 at 11:58
- @Killer Added an example that has audio. – llogan Aug 02 '18 at 18:50
- Tear of joy ( ͡↑ ͜ʖ ͡↑) thanks man @LordNeckbeard I tried every possible convention, nearest was `-map [5:a]`, but you finally updated the answer. Master – Shubham AgaRwal Aug 02 '18 at 19:37
- getting error Invalid stream specifier: "[v]" Stream map '"[v]"' matches no streams. – Abhay Koradiya Sep 02 '20 at 15:40
- @AbhayKoradiya Need to see your command. – llogan Sep 02 '20 at 17:39
I wrote a general bash script that takes in a path to a folder of images, and outputs a crossfade video with ffmpeg:
https://gist.github.com/anguyen8/d0630b6aef6c1cd79b9a1341e88a573e
The script essentially looks at the images in a folder, prints out a command similar to the one in @LordNeckbeard's answer above, and executes it. It helps when you have many images in a folder and don't want to type out a depressingly long command by hand.
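The idea generalizes easily. A minimal sketch of such a generator (my own hypothetical version, not the linked gist): it builds the dip-to-black command from the accepted answer for whatever images are passed as arguments, printing it instead of executing it so the result can be inspected first.

```shell
# Sketch: generate the dip-to-black slideshow command for N images
# (5 s display, 1 s fades, hard-coded output name out.mp4).
build_slideshow_cmd() {
  dur=5; fade=1; st=$((dur - fade))
  inputs=''; filters=''; labels=''; n=0
  for img in "$@"; do
    inputs="$inputs -loop 1 -t $dur -i $img"
    if [ "$n" -eq 0 ]; then
      filters="[0:v]fade=t=out:st=$st:d=$fade[v0]"
    else
      filters="$filters;[$n:v]fade=t=in:st=0:d=$fade,fade=t=out:st=$st:d=$fade[v$n]"
    fi
    labels="$labels[v$n]"
    n=$((n + 1))
  done
  printf 'ffmpeg%s -filter_complex "%s;%sconcat=n=%d:v=1:a=0,format=yuv420p[v]" -map "[v]" out.mp4\n' \
    "$inputs" "$filters" "$labels" "$n"
}
build_slideshow_cmd a.png b.png c.png
```

Piping the output through `sh` (or dropping the `printf` for direct execution) would run it, at the cost of losing the dry-run safety.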
- Sorry, but your script fails with ffmpeg 3.0.1; with inputs #0 to #4 it returns: "Invalid file index 5 in filtergraph description" – Krzysztof Bociurko May 18 '16 at 12:30
- TobySpeight: good point, I've edited the answer to be clearer. Basically the main idea is already given by @LordNeckbeard above. This script just generalizes to many images. – anh_ng8 Sep 21 '16 at 12:54
