
I'm trying to blur a portion of a video using FFmpeg (specifically to blur a face).

I have been trying to use a combination of timeline editing and the various blurring filters, but I cannot find a way to blur only a section of the video.

I'm hoping for something like:

-vf boxblur=enable='between(t,10,100)':width=20:height=20:x=400:y=200

Where width/height set the size of the blurred box and x/y set its location.

Is something like this possible?

occvtech

4 Answers


It is possible to apply both temporal and spatial blurring to a segment of the video, assuming the area you want to blur stays in a fixed location.

Original black lab pup image.

Using a mask image

Grayscale PNG mask image and resulting blurred image.

You can make a grayscale mask image to indicate the area to blur. For ease of use it should be the same size as the image or video you want to blur.

Example using alphamerge, avgblur, and overlay:

ffmpeg -i video.mp4 -i mask.png -filter_complex "[0:v][1:v]alphamerge,avgblur=10[alf];[0:v][alf]overlay[v]" -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart maskedblur.mp4
  • The white area of the mask is where the blur will be applied, but this can easily be reversed with the negate filter, for instance: [1:v]negate[mask];[0:v][mask]alphamerge,avgblur=10[alf]... (a full command based on this is sketched after this list).

  • You could use the geq filter to generate a mask such as a gradient.
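For example, here is a sketch of the inverted-mask variant from the first bullet, reusing the same video.mp4 and mask.png as above (the output file name is arbitrary):

ffmpeg -i video.mp4 -i mask.png -filter_complex \
 "[1:v]negate[mask]; \
  [0:v][mask]alphamerge,avgblur=10[alf]; \
  [0:v][alf]overlay[v]" \
 -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart maskedblur_inverted.mp4

Here negate flips the grayscale mask, so the areas that were previously left sharp are now the ones that get blurred.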

Blur specific area (without a mask)

Black lab pup with blur effect

ffmpeg -i input.mp4 -filter_complex "[0:v]crop=200:200:60:30,avgblur=10[fg];[0:v][fg]overlay=60:30[v]" -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart derpdogblur.mp4

Note: The x and y offset numbers in overlay (60 and 30 in this example) must match the crop offsets.

What this example does:

  1. Crop a copy of the video to the size of the area to be blurred. In this example: a 200x200 pixel box whose top-left corner is 60 pixels to the right (x axis) and 30 pixels down (y axis) from the top-left corner of the frame.
  2. Blur the cropped area.
  3. Overlay the blurred area using the same x and y parameters from the crop filter.
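To keep the crop and overlay offsets in sync (see the note above), the coordinates can be parameterized, for instance in a small shell wrapper. A minimal sketch; the box size, position, and file names below are placeholders:

W=200; H=200; X=60; Y=30   # box size and top-left corner of the area to blur
ffmpeg -i input.mp4 -filter_complex \
 "[0:v]crop=${W}:${H}:${X}:${Y},avgblur=10[fg]; \
  [0:v][fg]overlay=${X}:${Y}[v]" \
 -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart output.mp4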

Multiple blurs over specific areas (without a mask)

Blurred areas in top left, near center, and bottom.

ffmpeg -i input.mp4 -filter_complex "[0:v]crop=50:50:20:10,avgblur=10[b0];[0:v]crop=iw:30:(iw-ow)/2:ih-oh,avgblur=10[b1];[0:v]crop=100:100:120:80,avgblur=10[b2];[0:v][b0]overlay=20:10[ovr0];[ovr0][b1]overlay=(W-w)/2:H-h[ovr1];[ovr1][b2]overlay=120:80" -c:a copy -movflags +faststart output.mp4

Specific area not blurred (without a mask)


ffmpeg -i input.mp4 -filter_complex "[0:v]avgblur=10[bg];[0:v]crop=200:200:60:30[fg];[bg][fg]overlay=60:30" -c:a copy -movflags +faststart output.mp4
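The same idea can be limited to a time range with the enable option. A sketch, assuming a hypothetical window from second 10 to second 20:

ffmpeg -i input.mp4 -filter_complex \
 "[0:v]avgblur=10:enable='between(t,10,20)'[bg]; \
  [0:v]crop=200:200:60:30[fg]; \
  [bg][fg]overlay=60:30:enable='between(t,10,20)'" \
 -c:a copy -movflags +faststart output.mp4

Outside that window avgblur is disabled, so the frame passes through effectively untouched.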


llogan
  • Thanks so much for your response. That all makes great sense. As a side note, it also made the split filter make sense finally! Also, could it be possible through arithmetic expressions to dynamically move the blurred box around the image? I.E. for the purpose of blurring someone's face as they move in a non-linear fashion? – occvtech Apr 15 '15 at 21:49
  • Thanks again! I'll take a crack at it. I know that a non-linear editor would be 1000 times easier here, but I'm hoping to batch process multiple files and don't want to wait through the import/key frame/export process. Thanks again! – occvtech Apr 16 '15 at 15:11
  • 1
    does FFMPEG offer other shapes besides boxes, such as circles? – Sun Sep 23 '15 at 16:09
  • @LordNeckbeard I'm using cmd and I want to use Example 1 but when I execute the code I get this error `Unrecognized option 'filter_complex[0:v]crop=200:200:60:30,boxblur=10[fg];[0:v][fg]overlay=60:30[v]-map [v] -map 0:a -c:v libx264 -c:a copy -movflags +faststart output.mp4'. Error splitting the argument list: Option not found` – Jim May 03 '17 at 18:10
  • @LordNeckbeard Sorry for such a stupid question, I'm new to all of this. how to type these 4 lines of command in 1 line? I removed the / and this is the result https://pastebin.com/ajWHzzsj and I let the / inline and this is the result (the error from my last message) https://pastebin.com/6nwNi7Y0 – Jim May 03 '17 at 19:40
  • 1
    @Jim I noticed that my example command was missing a quote. You command should look something like this: `ffmpeg -i input.mp4 -filter_complex "[0:v]crop=200:200:60:30,boxblur=10[fg]; [0:v][fg]overlay=60:30[v]" -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart output.mp4` – llogan May 03 '17 at 19:51
  • @LordNeckbeard Is it possible to use a mask image and get the blurred crop from another area in the same frame, I get this error "Input frame sizes do not match (246x60 vs 1280x720)" and of course if I make the input match it will get the same area and it will be like a normal mask https://pastebin.com/Mpde3Rx3 – Jim May 07 '17 at 10:51
  • @Jim I don't quite understand what you're trying to do, but try switching the alphamerge and crop positions (and note that alphamerge takes two inputs–the image to apply the transparency and the mask that provides it): `[0:v][1:v]alphamerge,crop=246:60:998:98,boxblur=1[alf]`. A link to an image example may be helpful to see what you want to achieve. – llogan May 07 '17 at 17:48
  • @LordNeckbeard let me show you this example I'm trying to hide the second sun from this frame http://i.imgur.com/iSyFdh5.png by getting part of the sky and blur it, BTW I need to use a mask http://i.imgur.com/6erW3HS.png because I have different shapes in different videos I can do the same with this code `ffmpeg -i input.mp4 -filter_complex "[0:v]crop=245:62:999:101,boxblur=10[fg]; [0:v][fg]overlay=main_w-overlay_w-38:39[v]" -map "[v]" -map 0:a -c:v -c:a -vcodec libx264 -movflags +faststart output.mp4 ` but its a rectangular or square I can't do shapes with it – Jim May 09 '17 at 10:30
  • @LordNeckbeard can you please answer the similar [question](https://superuser.com/questions/1342355/alphamerge-filter-only-works-on-first-frame)? – Zain Ali Jul 22 '18 at 17:14
  • @ZainAli Looks like I was too slow. – llogan Jul 22 '18 at 22:46
  • @LordNeckbeard I have observed for the above case (using a mask image to apply the blur effect) that as you increase the boxblur value the blur effect eventually decreases: from 1-16 it keeps increasing and after that it starts decreasing (no matter how high a value you choose). Do you have any idea why this could happen, or any better suggestion for using masks to blur with the correct blur intensity? – Zain Ali Aug 02 '18 at 01:41
  • We've masked out this dog's face to protect its identity. – deltaray Jul 26 '20 at 21:56
  • I'm trying to add per-frame control with `enable` but converting `avgblur=10[fg];` into `avgblur=enable='between(t,1,3)':10[fg];` did not work. Tips? *Edit:* Aha! Already solved: `avgblur=enable='between(t,1,3)':sizeX=10[fg];` – Steven Lu Jun 02 '22 at 23:07
  • @llogan I would like to thank you very much for your answer because it was instrumental to being able to understand and become familiar with the filter graph syntax and featureset. It's indeed very powerful and makes it possible to do many cool things. However! To the original problem: The typical case of blurring of a face in video surely involves that face moving around all over the place in the frame. With AI models we can automate extracting the coordinates and size of the area to crop, but there is no way to use timeline option on the `crop` filter! How can we work around this? – Steven Lu Jun 03 '22 at 07:31
  • I need to be able to specify a region in the frame to blur, independently for each frame of a video file. It looks certainly to be doable when using the `drawbox` filter by supplying a copy of those filters for each frame and using `enable=eq(n, $i)` but this cannot be done for `crop` even if it can be done for `avgblur` as well as `overlay`. – Steven Lu Jun 03 '22 at 07:34
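Regarding the moving-region question in the last comments, one possible approach (a rough, untested sketch) is to drive the crop and overlay coordinates at runtime with the sendcmd filter, which sends commands to named filter instances at given timestamps. This assumes an FFmpeg build where crop and overlay accept the x/y commands; the timestamps and coordinates in commands.txt below are made up and would normally come from a face tracker.

commands.txt (each line: a timestamp followed by the commands to send at that time):

0.0 crop@face x 60, crop@face y 30, overlay@face x 60, overlay@face y 30;
2.0 crop@face x 200, crop@face y 80, overlay@face x 200, overlay@face y 80;
4.0 crop@face x 350, crop@face y 120, overlay@face x 350, overlay@face y 120;

ffmpeg -i input.mp4 -filter_complex \
 "[0:v]sendcmd=f=commands.txt,split[base][src]; \
  [src]crop@face=200:200:60:30,avgblur=10[fg]; \
  [base][fg]overlay@face=60:30[v]" \
 -map "[v]" -map 0:a -c:v libx264 -c:a copy output.mp4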

For those who didn't get how to specify a duration, here's an example:

ffmpeg -i derpdog.mp4 -filter_complex \
 "[0:v]crop=200:100:60:30,boxblur=10:enable='between(t,60*2,60*2+10)'[fg]; \
  [0:v][fg]overlay=60:30[v]" \
-map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart derpdogblur.mp4

The blur is applied to a 200x100 pixel box at x=60, y=30, from second 120 to second 130 (60*2 to 60*2+10).
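If the same region needs to be blurred during several separate time ranges, the between() expressions can be added together inside enable (any non-zero result enables the filter). A sketch with a hypothetical second window from second 300 to 310:

ffmpeg -i derpdog.mp4 -filter_complex \
 "[0:v]crop=200:100:60:30,boxblur=10:enable='between(t,120,130)+between(t,300,310)'[fg]; \
  [0:v][fg]overlay=60:30[v]" \
 -map "[v]" -map 0:a -c:v libx264 -c:a copy -movflags +faststart derpdogblur.mp4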

deFreitas

For the case when one dislikes the sharp edge of the blurring, I made a script that layers several stages of blurring so that the edge is not sharp. It looks like this:
Softly blurred image.

Instead of this:
Original image.

It is a Python script:

#!/usr/bin/env python3
import os, stat

def blur_softly(matrix, video_in="video_to_be_blurred.mp4", video_out=""):
    # Each row of 'matrix' describes one blurred region:
    # w, h, x, y, timein, timeout, blur (number of layered boxes),
    # multiple (growth in pixels per layer), step (blur increase per layer),
    # margin (extra surrounding pixels considered when blurring).
    if video_out == "":
        video_out = video_in[:-4] + "_blurred" + video_in[-4:]
    s0 = "ffmpeg -i " + video_in + " -filter_complex \\\n\"[0:v]null[v_int0]; \\\n"
    s1 = ''
    a = 0
    for m in matrix:
        blur = m[6]
        multiple = m[7]
        # Enlarge the region so the outermost, weakest blur layers
        # extend beyond the area of maximum blur.
        width = m[0]+blur*multiple*2
        height = m[1]+blur*multiple*2
        x_cord = m[2]-blur*multiple
        y_cord = m[3]-blur*multiple
        timein = m[4]
        timeout = m[5]
        step = m[8]
        margin = m[9]
        for i in range(blur):
            ii = multiple*i
            # Crop (with margin), blur, then crop the margin away again.
            s0 = s0 + "[v_int0]crop="+str(width-2*ii+(margin//2)*2)+":"+str(height-2*ii+(margin//2)*2)+":"+str(x_cord+ii-margin//2)+":"+str(y_cord+ii-margin//2) + \
            ",boxblur="+str((i+1)*step)+":enable='between(t,"+str(timein)+","+str(timeout)+ \
            ")',crop="+str(width-2*ii)+ ":"+str(height-2*ii)+":"+str(margin//2)+":"+str(margin//2)+ "[blur_int" + str(i+1+a)+"]; \\\n"
            # Stack each progressively smaller, stronger blur layer on top of the previous result.
            s1 = s1 + "[v_int"+ str(i+a) +"][blur_int"+str(i+a+1)+"]overlay="+str(x_cord+ii)+":"+str(y_cord+ii)+":enable='between(t,"+str(timein)+","+str(timeout)+ ")'[v_int"+str(i+a+1)+"]; \\\n"
        # Advance the label counter past this region's layers.
        a += i+1
    s = s0 + s1 + "[v_int"+str(a)+"]null[with_subtitles]\" \\\n-map \"[with_subtitles]\" -map 0:a -c:v libx264 -c:a copy -crf 17 -preset slow -y "+video_out+"\n"
    print(s)
    # Write the generated command to blur.sh and make it executable.
    with open('blur.sh', 'w') as file_object:
        file_object.write(s)
    st = os.stat('blur.sh')
    os.chmod('blur.sh', st.st_mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

# w, h, x, y, timein, timeout, blur, multiple, step, margin
matrix = [[729,70,599,499,14.96,16.40,25,1,1,90],]
blur_softly(matrix, video_in="your_video.mp4", video_out="output_video.mp4")

You can change the parameters in the last and penultimate lines; the last two parameters in quotation marks are the paths to your input video and output video (assuming they are placed in the working directory). In the penultimate line:

  • the first two numbers indicate the size of the initial area to which maximum blur will be applied,
  • the second two indicate its x and y coordinates (top left corner),
  • the third two indicate the times in seconds between which the blurring should be applied,
  • "25" in this example indicates that there will be 25 boxes applied on top of each other,
  • the next "1" indicates that the bigger boxes with less blur should be just one pixel wider than their predecessors,
  • the second "1" indicates that the blur strength should increase by one per layer, up to the maximum of 25 (from above),
  • the final "90" indicates the margin that is taken into account when applying the blur, so increasing it makes the blur respect more of its surroundings. Increasing this value also avoids errors like Invalid chroma radius value 21, must be >= 0 and <= 20.

Running the script as given above (with the single-region matrix) prints the generated command and also writes it to an executable file, blur.sh, which can be run directly or copy-pasted. The output looks like the following:

ffmpeg -i video_to_be_blurred.mp4 -filter_complex \
"[0:v]null[v_int0]; \
[v_int0]crop=869:210:529:429,boxblur=1:enable='between(t,14.96,16.4)',crop=779:120:45:45[blur_int1]; \
[v_int0]crop=867:208:530:430,boxblur=2:enable='between(t,14.96,16.4)',crop=777:118:45:45[blur_int2]; \
[v_int0]crop=865:206:531:431,boxblur=3:enable='between(t,14.96,16.4)',crop=775:116:45:45[blur_int3]; \
[v_int0]crop=863:204:532:432,boxblur=4:enable='between(t,14.96,16.4)',crop=773:114:45:45[blur_int4]; \
[v_int0]crop=861:202:533:433,boxblur=5:enable='between(t,14.96,16.4)',crop=771:112:45:45[blur_int5]; \
[v_int0]crop=859:200:534:434,boxblur=6:enable='between(t,14.96,16.4)',crop=769:110:45:45[blur_int6]; \
[v_int0]crop=857:198:535:435,boxblur=7:enable='between(t,14.96,16.4)',crop=767:108:45:45[blur_int7]; \
[v_int0]crop=855:196:536:436,boxblur=8:enable='between(t,14.96,16.4)',crop=765:106:45:45[blur_int8]; \
[v_int0]crop=853:194:537:437,boxblur=9:enable='between(t,14.96,16.4)',crop=763:104:45:45[blur_int9]; \
[v_int0]crop=851:192:538:438,boxblur=10:enable='between(t,14.96,16.4)',crop=761:102:45:45[blur_int10]; \
[v_int0]crop=849:190:539:439,boxblur=11:enable='between(t,14.96,16.4)',crop=759:100:45:45[blur_int11]; \
[v_int0]crop=847:188:540:440,boxblur=12:enable='between(t,14.96,16.4)',crop=757:98:45:45[blur_int12]; \
[v_int0]crop=845:186:541:441,boxblur=13:enable='between(t,14.96,16.4)',crop=755:96:45:45[blur_int13]; \
[v_int0]crop=843:184:542:442,boxblur=14:enable='between(t,14.96,16.4)',crop=753:94:45:45[blur_int14]; \
[v_int0]crop=841:182:543:443,boxblur=15:enable='between(t,14.96,16.4)',crop=751:92:45:45[blur_int15]; \
[v_int0]crop=839:180:544:444,boxblur=16:enable='between(t,14.96,16.4)',crop=749:90:45:45[blur_int16]; \
[v_int0]crop=837:178:545:445,boxblur=17:enable='between(t,14.96,16.4)',crop=747:88:45:45[blur_int17]; \
[v_int0]crop=835:176:546:446,boxblur=18:enable='between(t,14.96,16.4)',crop=745:86:45:45[blur_int18]; \
[v_int0]crop=833:174:547:447,boxblur=19:enable='between(t,14.96,16.4)',crop=743:84:45:45[blur_int19]; \
[v_int0]crop=831:172:548:448,boxblur=20:enable='between(t,14.96,16.4)',crop=741:82:45:45[blur_int20]; \
[v_int0]crop=829:170:549:449,boxblur=21:enable='between(t,14.96,16.4)',crop=739:80:45:45[blur_int21]; \
[v_int0]crop=827:168:550:450,boxblur=22:enable='between(t,14.96,16.4)',crop=737:78:45:45[blur_int22]; \
[v_int0]crop=825:166:551:451,boxblur=23:enable='between(t,14.96,16.4)',crop=735:76:45:45[blur_int23]; \
[v_int0]crop=823:164:552:452,boxblur=24:enable='between(t,14.96,16.4)',crop=733:74:45:45[blur_int24]; \
[v_int0]crop=821:162:553:453,boxblur=25:enable='between(t,14.96,16.4)',crop=731:72:45:45[blur_int25]; \
[v_int0][blur_int1]overlay=574:474:enable='between(t,14.96,16.4)'[v_int1]; \
[v_int1][blur_int2]overlay=575:475:enable='between(t,14.96,16.4)'[v_int2]; \
[v_int2][blur_int3]overlay=576:476:enable='between(t,14.96,16.4)'[v_int3]; \
[v_int3][blur_int4]overlay=577:477:enable='between(t,14.96,16.4)'[v_int4]; \
[v_int4][blur_int5]overlay=578:478:enable='between(t,14.96,16.4)'[v_int5]; \
[v_int5][blur_int6]overlay=579:479:enable='between(t,14.96,16.4)'[v_int6]; \
[v_int6][blur_int7]overlay=580:480:enable='between(t,14.96,16.4)'[v_int7]; \
[v_int7][blur_int8]overlay=581:481:enable='between(t,14.96,16.4)'[v_int8]; \
[v_int8][blur_int9]overlay=582:482:enable='between(t,14.96,16.4)'[v_int9]; \
[v_int9][blur_int10]overlay=583:483:enable='between(t,14.96,16.4)'[v_int10]; \
[v_int10][blur_int11]overlay=584:484:enable='between(t,14.96,16.4)'[v_int11]; \
[v_int11][blur_int12]overlay=585:485:enable='between(t,14.96,16.4)'[v_int12]; \
[v_int12][blur_int13]overlay=586:486:enable='between(t,14.96,16.4)'[v_int13]; \
[v_int13][blur_int14]overlay=587:487:enable='between(t,14.96,16.4)'[v_int14]; \
[v_int14][blur_int15]overlay=588:488:enable='between(t,14.96,16.4)'[v_int15]; \
[v_int15][blur_int16]overlay=589:489:enable='between(t,14.96,16.4)'[v_int16]; \
[v_int16][blur_int17]overlay=590:490:enable='between(t,14.96,16.4)'[v_int17]; \
[v_int17][blur_int18]overlay=591:491:enable='between(t,14.96,16.4)'[v_int18]; \
[v_int18][blur_int19]overlay=592:492:enable='between(t,14.96,16.4)'[v_int19]; \
[v_int19][blur_int20]overlay=593:493:enable='between(t,14.96,16.4)'[v_int20]; \
[v_int20][blur_int21]overlay=594:494:enable='between(t,14.96,16.4)'[v_int21]; \
[v_int21][blur_int22]overlay=595:495:enable='between(t,14.96,16.4)'[v_int22]; \
[v_int22][blur_int23]overlay=596:496:enable='between(t,14.96,16.4)'[v_int23]; \
[v_int23][blur_int24]overlay=597:497:enable='between(t,14.96,16.4)'[v_int24]; \
[v_int24][blur_int25]overlay=598:498:enable='between(t,14.96,16.4)'[v_int25]; \
[v_int25]null[with_subtitles]" \
-map "[with_subtitles]" -map 0:a -c:v libx264 -c:a copy -crf 17 -preset slow -y video_to_be_blurred_blurred.mp4
sup
  • If anybody knows about an easier way to achieve blurry edges, I would be interested. Also, this is rather slow. – sup Feb 22 '19 at 09:46
  • Apply a box blur to the mask before merging it. – Gyan Feb 22 '19 at 10:45
  • @Gyan What do you mean? I think I am doing that already. – sup Feb 23 '19 at 11:26
  • Anyway, I improved the code further on, I am still not sure I am doing what you recommended. – sup Feb 24 '19 at 13:43
  • How about using an image mask, as in @llogan's answer? – Ondra Žižka Mar 03 '21 at 02:46
  • @OndraŽižka Not sure how that would help - I need the blurring to kind of "blend into" the picture. – sup Mar 03 '21 at 16:18
  • Then I guess you need to have a mask that spans the face with gray values? Not sure what kind of blending you need, but with alpha blending and an overlapping mask, I got pretty good results, except static. – Ondra Žižka Mar 03 '21 at 20:48
  • Well, I am aiming for a gradual blurring, so there is no sharp edge. I typically need this for texts, so rectangles work quite well. I guess a mask could be used in the same way if I needed to mask something non-rectangular, but that is not really my use case. I do not see what it would do better for this use case though. – sup Mar 04 '21 at 09:53
  • 1
    I want to give you a thumbs up! it was impressive! – Omid Ghayour Nov 21 '21 at 10:34

Since you are asking specifically about blurring a face in a video, not just blurring a fixed region with FFmpeg:

There's the Deface project which splits video into frames, detects faces using OpenCV and a trained neural network, and applies a blur to those places.

Blurred faces sample image

The results are so-so; it is not usable for serious anonymization where it really matters, because there are a lot of false negatives. But if there is a clearly visible, unobstructed face that is not at the edge of the video, it does the job.

A few simple improvements to Deface could fix the false negatives/positives; see the project's tickets. So if you happen to be a programmer, I encourage you to fork it and implement those :)
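For reference, a minimal invocation, assuming installation from PyPI (exact option names and the default output naming may differ between versions):

pip install deface
deface input.mp4    # writes an anonymized copy of the video next to the input file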

Ondra Žižka