
I'm trying to achieve simple camera streaming using FFmpeg and FFserver. I have two slightly different systems acting as the source, both running Debian:

  • the first one runs ffmpeg 3.4.8 (figure 1: first system FFmpeg version)
  • the second one runs ffmpeg 2.8.17 (figure 2: second system FFmpeg version)

The ffmpeg command used to send the stream to ffserver is the following, identical for both systems:

ffmpeg -re -f v4l2 -s 640x360 -thread_queue_size 20 -probesize 32 -i /dev/video0 -threads 4 -fflags nobuffer -tune zerolatency http://myserverIP:myserverPort/liveFeed.ffm

To view the resulting stream, I access the live feed from a third system using OpenCV, pointing it at the server URL:

VideoCapture videoCap = new VideoCapture("http://myserverIP:myserverPort/liveFeed.flv"); 
...
videoCap.read(imageInput);

and start grabbing the incoming frames from the stream.
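
For context, the acquisition on the third system is essentially the following (a minimal sketch assembled around the snippet above; the class name and loop structure are my own, only the VideoCapture URL and the read() call come from the original code):

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class LiveFeedReader {
    public static void main(String[] args) {
        // Load the native OpenCV library that backs the Java bindings
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Open the FLV stream published by ffserver
        VideoCapture videoCap = new VideoCapture("http://myserverIP:myserverPort/liveFeed.flv");

        Mat imageInput = new Mat();
        // Each successful read() returns the next decoded frame
        while (videoCap.read(imageInput)) {
            // frame processing / display of imageInput goes here
        }
        videoCap.release();
    }
}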

The weird thing happens here:

  • with the first system, the video stream visualized through OpenCV is pretty much real time, with 1-2 seconds of delay from the original source.
  • with the second system, the video stream is affected by a variable delay comparable to the elapsed time between the start of the source stream and the start of acquisition with OpenCV (for example: if I start the source stream at 12:00:00 and wait 30 seconds before accessing the stream with OpenCV, I see a delay of about 30 seconds on the third system).

The ffserver configuration is the following:

HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 6000
CustomLog -
#NoDaemon

<Feed liveFeed.ffm>
    File /tmp/SCT-0001_3.ffm
    FileMaxSize 5M
</Feed>

<Stream liveFeed.flv>
    Format flv
    Feed liveFeed.ffm

    VideoCodec libx264
    VideoFrameRate 20
    VideoBitRate 200
    VideoSize 640x360
    AVOptionVideo preset superfast
    AVOptionVideo tune zerolatency
    AVOptionVideo flags +global_header

    NoAudio
</Stream>

##################################################################
# Special streams
##################################################################
<Stream stat.html>
    Format status
    # Only allow local people to get the status
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255
</Stream>

# Redirect index.html to the appropriate site
<Redirect index.html>
    URL http://www.ffmpeg.org/
</Redirect> 

Any help spotting the problem would be great! Thanks

  • I don't think it's an OpenCV problem. The server appears to serve the entire file, so *any player* playing the URL will start from the beginning instead of from _now_ – Christoph Rackwitz May 02 '22 at 09:06
  • I ran some more tests to better understand the issue: for simplicity I used `ffplay` to receive the live video stream. I noticed that `ffplay -i http://myserverIP:myserverPort/liveFeed.flv` produces a similar result as when I use OpenCV with VideoCapture, whereas if I add `-fflags nobuffer` to the `ffplay` command the stream is real time (<1s latency). Is there any possibility to achieve the same result in OpenCV? – Sandro Pellizza May 02 '22 at 19:16
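
Regarding the last comment: in more recent OpenCV builds the FFmpeg backend can read extra demuxer options from the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable, which might allow reproducing the `-fflags nobuffer` behaviour from Java; whether the OpenCV version installed on the third system honours this variable is an assumption. A minimal sketch:

import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.videoio.VideoCapture;

public class NoBufferFeedReader {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // OpenCV's FFmpeg backend parses this variable as "key;value" pairs separated by '|',
        // so it has to be set before the JVM starts, e.g.:
        //   OPENCV_FFMPEG_CAPTURE_OPTIONS="fflags;nobuffer" java NoBufferFeedReader
        System.out.println("OPENCV_FFMPEG_CAPTURE_OPTIONS = "
                + System.getenv("OPENCV_FFMPEG_CAPTURE_OPTIONS"));

        VideoCapture videoCap = new VideoCapture("http://myserverIP:myserverPort/liveFeed.flv");
        Mat imageInput = new Mat();
        while (videoCap.read(imageInput)) {
            // process the (hopefully low-latency) frames here
        }
        videoCap.release();
    }
}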
