ffmpeg hangs and never finishes

I am running an ffmpeg command to copy raw H.264 video from an RTSP camera into an MP4 file, and the command never completes. The file isn't large, just 10 seconds' worth of data. Here's the command I'm running:

$   ffmpeg -rtsp_transport tcp -i rtsp://user:pass@71.185.124.195:554/c1/b1558830329/e1558830339/replay/ -vcodec copy -y test_clip.mp4 

I then get output like this:

ffmpeg version 3.4.6-0ubuntu0.18.04.1 Copyright (c) 2000-2019 the FFmpeg developers
  built with gcc 7 (Ubuntu 7.3.0-16ubuntu3)
  configuration: --prefix=/usr --extra-version=0ubuntu0.18.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libavresample   3.  7.  0 /  3.  7.  0
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://admin:strawberryfluff1@71.85.104.195:554/c1/b1558830329/e1558830339/replay/':
  Metadata:
    title           : ONVIF RTSP Server
  Duration: N/A, start: 0.000000, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuvj420p(pc, bt709, progressive), 1920x1080, 30 fps, 30 tbr, 90k tbn, 60 tbc
    Stream #0:1: Audio: pcm_mulaw, 8000 Hz, mono, s16, 64 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (pcm_mulaw (native) -> aac (native))
Press [q] to stop, [?] for help
[aac @ 0x55b71ce31900] Too many bits 8832.000000 > 6144 per frame requested, clamping to max
Output #0, mp4, to 'test_clip.mp4':
  Metadata:
    title           : ONVIF RTSP Server
    encoder         : Lavf57.83.100
    Stream #0:0: Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, bt709, progressive), 1920x1080, q=2-31, 30 fps, 30 tbr, 90k tbn, 90k tbc
    Stream #0:1: Audio: aac (LC) (mp4a / 0x6134706D), 8000 Hz, mono, fltp, 48 kb/s
    Metadata:
      encoder         : Lavc57.107.100 aac
[mp4 @ 0x55b71ce17e00] Non-monotonous DTS in output stream 0:0; previous: 18000, current: 3000; changing to 18001. This may result in incorrect timestamps in the output file.
[mp4 @ 0x55b71ce17e00] Non-monotonous DTS in output stream 0:0; previous: 18001, current: 6000; changing to 18002. This may result in incorrect timestamps in the output file.
[mp4 @ 0x55b71ce17e00] Non-monotonous DTS in output stream 0:0; previous: 18002, current: 9000; changing to 18003. This may result in incorrect timestamps in the output file.
[mp4 @ 0x55b71ce17e00] Non-monotonous DTS in output stream 0:0; previous: 18003, current: 12000; changing to 18004. This may result in incorrect timestamps in the output file.
[mp4 @ 0x55b71ce17e00] Non-monotonous DTS in output stream 0:0; previous: 18004, current: 15000; changing to 18005. This may result in incorrect timestamps in the output file.
[mp4 @ 0x55b71ce17e00] Non-monotonous DTS in output stream 0:0; previous: 18005, current: 18000; changing to 18006. This may result in incorrect timestamps in the output file.
frame=   44 fps=0.0 q=-1.0 size=     256kB time=00:00:01.43 bitrate=1463.4kbits/
frame=   60 fps= 57 q=-1.0 size=     512kB time=00:00:01.96 bitrate=2132.9kbits/
frame=   76 fps= 48 q=-1.0 size=     768kB time=00:00:02.50 bitrate=2516.7kbits/
frame=   92 fps= 44 q=-1.0 size=    1024kB time=00:00:03.03 bitrate=2765.6kbits/
frame=  108 fps= 41 q=-1.0 size=    1024kB time=00:00:03.56 bitrate=2352.0kbits/
[NULL @ 0x55b71cdfa540] SEI type 5 size 2208 truncated at 1944
frame=  123 fps= 39 q=-1.0 size=    1280kB time=00:00:04.06 bitrate=2578.6kbits/
frame=  139 fps= 38 q=-1.0 size=    1536kB time=00:00:04.60 bitrate=2735.5kbits/
frame=  155 fps= 37 q=-1.0 size=    1536kB time=00:00:05.13 bitrate=2451.3kbits/
frame=  171 fps= 36 q=-1.0 size=    1792kB time=00:00:05.66 bitrate=2590.7kbits/
frame=  187 fps= 36 q=-1.0 size=    2048kB time=00:00:06.20 bitrate=2706.1kbits/
frame=  203 fps= 35 q=-1.0 size=    2304kB time=00:00:06.73 bitrate=2803.2kbits/
frame=  219 fps= 35 q=-1.0 size=    2304kB time=00:00:07.26 bitrate=2597.4kbits/
frame=  235 fps= 34 q=-1.0 size=    2560kB time=00:00:07.80 bitrate=2688.7kbits/
frame=  246 fps= 33 q=-1.0 size=    2560kB time=00:00:08.16 bitrate=2568.0kbits/
frame=  267 fps= 34 q=-1.0 size=    3072kB time=00:00:08.86 bitrate=2838.3kbits/
frame=  282 fps= 34 q=-1.0 size=    3072kB time=00:00:09.36 bitrate=2686.8kbits/
frame=  298 fps= 33 q=-1.0 size=    3328kB time=00:00:09.90 bitrate=2753.9kbits/
frame=  314 fps= 33 q=-1.0 size=    3328kB time=00:00:10.43 bitrate=2613.1kbits/^Cspeed=1.11x

The command never completes and I need to kill it using Ctrl-C.

I have also tried adding the options -nostdin -loglevel error and appending > /dev/null 2>&1 < /dev/null to the end of the command, but to no avail.
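Since the replay is a known 10 seconds, one workaround I am considering (untested) is to cap the output duration with -t 10 so ffmpeg exits on its own once that much has been written, and to add an RTSP socket timeout with -stimeout (in microseconds, placed before -i) in case the camera simply never closes the session:

ffmpeg -rtsp_transport tcp -stimeout 10000000 -i rtsp://user:pass@71.185.124.195:554/c1/b1558830329/e1558830339/replay/ -vcodec copy -t 10 -y test_clip.mp4

I don't know yet whether that addresses the underlying hang or just works around it.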

This is on a vanilla Ubuntu 18.04.

record webrtc stream from mediasoup v3 with ffmpeg

I am using mediasoup v3 and want to record the WebRTC video stream on the server. I created a plain transport and then a consumer. The following is the debug log from my mediasoup server:

created plain transport with id: f1ec98ed-2a45-4584-b393-13ff491e4e23 tuple: {"localIp":"xxx.xxx.xxx.xxx","localPort":48343,"protocol":"udp"}

new consumer created: {"codecs":[{"mimeType":"video/VP8","clockRate":90000,"payloadType":101,"rtcpFeedback":[{"type":"nack"},{"type":"nack","parameter":"pli"},{"type":"ccm","parameter":"fir"},{"type":"goog-remb"}],"parameters":{}},{"mimeType":"video/rtx","clockRate":90000,"payloadType":102,"rtcpFeedback":[],"parameters":{"apt":101}}],"headerExtensions":[{"uri":"http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time","id":4},{"uri":"urn:3gpp:video-orientation","id":9},{"uri":"urn:ietf:params:rtp-hdrext:toffset","id":10}],"encodings":[{"ssrc":787151607,"rtx":{"ssrc":950796434},"scalabilityMode":"L3T3"}],"rtcp":{"cname":"n/02NZ5XH9xJT4HC","reducedSize":true,"mux":true}}

Then I tried different combinations of parameters in the ffmpeg command, with no luck. For example:

ffmpeg -protocol_whitelist udp,rtp -i rtp://@xxx.xxx.xxx.xxx:48343 -ssrc 787151607 -c copy mmmm.webm 
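I have also wondered whether ffmpeg needs the stream described in an SDP file rather than a bare rtp:// URL. The sketch below is untested; input.sdp is just a name I made up, the payload type 101 and clock rate 90000 for VP8 are taken from the consumer log above, and it assumes mediasoup is actually sending the RTP to the IP/port that ffmpeg ends up listening on:

v=0
o=- 0 0 IN IP4 127.0.0.1
s=mediasoup VP8 recording
c=IN IP4 xxx.xxx.xxx.xxx
t=0 0
m=video 48343 RTP/AVP 101
a=rtpmap:101 VP8/90000

ffmpeg -protocol_whitelist file,udp,rtp -i input.sdp -c:v copy mmmm.webm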

What should be the correct parameters passed to the ffmpeg command?

Thanks!

ffmpeg synchronizing rawvideo with audio

I am trying to add sound from a webcam microphone in real time to a rawvideo feed generated by OpenCV from the webcam. The ffmpeg command I am using is below; I get audio on the video, but the video lags the audio by 3 seconds.

ffmpeg -y -f rawvideo -vcodec rawvideo -s 640x480 -pix_fmt bgr24 -i - -f alsa -ac 1 -i hw:2 -vcodec libx264 -crf 18 -preset fast Output.mp4 
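If the offset really is a constant 3 seconds, one thing I have been meaning to try (untested) is shifting the audio input with -itsoffset so its timestamps start 3 seconds later, which should line it up with the late video; the explicit -map options just pin video to input 0 and audio to input 1:

ffmpeg -y -f rawvideo -vcodec rawvideo -s 640x480 -pix_fmt bgr24 -i - -itsoffset 3 -f alsa -ac 1 -i hw:2 -map 0:v -map 1:a -vcodec libx264 -crf 18 -preset fast Output.mp4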

What would be the best way to synchronise the video and sound within the ffmpeg command?

ffmpeg static build from johnvansickle.com requires libraries?

I'm trying to run ffmpeg on Ubuntu 18.04. I downloaded the latest git build from https://johnvansickle.com/ffmpeg/

But when I try to run any ffmpeg command, it complains about a lot of missing libraries, such as:

libfdk-aac.so.1 libva.so.2 libass.so.9 libSDL2-2.0.so.0 

Just to name a few. Isn't the point of a static build that it doesn't require anything else to run? Or am I doing something wrong?

I feel like there are endless libraries I have to install to make it work. Is there a way to just install everything it needs?

And I believe most libraries in the repos would be super old, right?
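For what it's worth, this is how I have been checking which binary actually runs (the ~/ffmpeg-git-amd64-static path is just where I happened to extract the archive):

type -a ffmpeg                              # which ffmpeg does the shell resolve first?
ldd ~/ffmpeg-git-amd64-static/ffmpeg        # a truly static binary should report "not a dynamic executable"
~/ffmpeg-git-amd64-static/ffmpeg -version   # run the downloaded build by its full path

Running some other ffmpeg on the PATH by mistake, rather than the static one, would explain the missing shared libraries, but I'm not sure that is what is happening here.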

How to convert raw PCM data to a valid WAV file with ffmpeg?

I run this command: ffmpeg -f f32le -i pipe:0 -f wav pipe:1

ffmpeg version 4.1-tessus  https://evermeet.cx/ffmpeg/  Copyright (c) 2000-2018 the FFmpeg developers
  built with Apple LLVM version 10.0.0 (clang-1000.11.45.5)
  configuration: --cc=/usr/bin/clang --prefix=/opt/ffmpeg --extra-version=tessus --enable-avisynth --enable-fontconfig --enable-gpl --enable-libaom --enable-libass --enable-libbluray --enable-libfreetype --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libmysofa --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-libzmq --enable-libzvbi --enable-version3 --pkg-config-flags=--static --disable-ffplay
  libavutil      56. 22.100 / 56. 22.100
  libavcodec     58. 35.100 / 58. 35.100
  libavformat    58. 20.100 / 58. 20.100
  libavdevice    58.  5.100 / 58.  5.100
  libavfilter     7. 40.101 /  7. 40.101
  libswscale      5.  3.100 /  5.  3.100
  libswresample   3.  3.100 /  3.  3.100
  libpostproc    55.  3.100 / 55.  3.100
Guessed Channel Layout for Input Stream #0.0 : mono
Input #0, f32le, from 'pipe:0':
  Duration: N/A, bitrate: 1411 kb/s
    Stream #0:0: Audio: pcm_f32le, 44100 Hz, mono, flt, 1411 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (pcm_f32le (native) -> pcm_s16le (native))
Output #0, wav, to 'pipe:1':
  Metadata:
    ISFT            : Lavf58.20.100
    Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 44100 Hz, mono, s16, 705 kb/s
    Metadata:
      encoder         : Lavc58.35.100 pcm_s16le
size=   15144kB time=00:02:55.82 bitrate= 705.6kbits/s speed= 352x
size=   30684kB time=00:05:56.24 bitrate= 705.6kbits/s speed= 356x
size=   32043kB time=00:06:12.01 bitrate= 705.6kbits/s speed= 354x
video:0kB audio:32042kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.000238%

Where pipe:0 is the raw PCM data of the left channel of this file. Here is the output wav.

The output seems to be only semi-valid. It won't play in QuickTime or iTunes, and when attempting to add it to an Ableton project, I get "output.wav could not be read. It may be corrupt or not licensed."

However, it does play fine if I simply drag the output wav in Chrome.

The ffprobe output confirms some sort of problem with the file. For this command:

ffprobe -loglevel verbose /Users/maximedupre/Desktop/Dropbox/Programming/api/gg.wav

ffprobe version 3.4.2 Copyright (c) 2007-2018 the FFmpeg developers
  built with Apple LLVM version 9.0.0 (clang-900.0.39.2)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/3.4.2 --enable-shared --enable-pthreads --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --disable-jack --enable-gpl --enable-libmp3lame --enable-libx264 --enable-libxvid --enable-opencl --enable-videotoolbox --disable-lzma
  libavutil      55. 78.100 / 55. 78.100
  libavcodec     57.107.100 / 57.107.100
  libavformat    57. 83.100 / 57. 83.100
  libavdevice    57. 10.100 / 57. 10.100
  libavfilter     6.107.100 /  6.107.100
  libavresample   3.  7.  0 /  3.  7.  0
  libswscale      4.  8.100 /  4.  8.100
  libswresample   2.  9.100 /  2.  9.100
  libpostproc    54.  7.100 / 54.  7.100
[wav @ 0x7f81e3004a00] Ignoring maximum wav data size, file may be invalid
[wav @ 0x7f81e3004a00] parser not found for codec pcm_s16le, packets or times may be invalid.
[wav @ 0x7f81e3004a00] Estimating duration from bitrate, this may be inaccurate
Input #0, wav, from '/Users/maximedupre/Desktop/Dropbox/Programming/api/gg.wav':
  Metadata:
    encoder         : Lavf58.20.100
  Duration: 00:06:12.01, bitrate: 705 kb/s
    Stream #0:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 44100 Hz, 1 channels, s16, 705 kb/s
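My current guess, which may be wrong, is that because pipe:1 is not seekable, the wav muxer cannot go back and patch the RIFF size fields when the stream ends, which would explain the "Ignoring maximum wav data size" warning and why strict players reject the file while Chrome tolerates it. The variant I plan to try next simply writes to a regular file instead of stdout (the -ar/-ac values just make my assumptions about the raw input explicit):

ffmpeg -f f32le -ar 44100 -ac 1 -i pipe:0 -c:a pcm_s16le output.wav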

How can I modify my ffmpeg command to produce a valid wav file?

Cheers!

FFMPEG: Two smaller IP cameras over the background of another camera, streamed to YouTube

I want to do something like this:

---------------
|  1 |    | 2 |
|----|    |---|
| Main Camera |
---------------

Two smaller cameras in the corners, with another camera as the background.

I have something like this:

ffmpeg -i "rtsp://............" -i "rtsp://............" -i "rtsp://............" -f lavfi -i anullsrc
    -filter_complex "
        [0:v] setpts=PTS-STARTPTS,scale=1920x1080,setsar=1[center];
        [1:v] setpts=PTS-STARTPTS, scale=640x480,setsar=1[upperright];
        [2:v] setpts=PTS-STARTPTS, scale=640x480,setsar=1[upperleft];
        [upperleft][upperright]hstack[base]"
    -map [base] -map 2 -f flv "rtmp://......................"

OR

ffmpeg -i 1.avi -i 2.avi -i 3.avi -i 4.avi
    -filter_complex "
        nullsrc=size=640x480 [base];
        [0:v] setpts=PTS-STARTPTS, scale=320x240 [upperleft];
        [1:v] setpts=PTS-STARTPTS, scale=320x240 [upperright];
        [2:v] setpts=PTS-STARTPTS, scale=320x240 [lowerleft];
        [3:v] setpts=PTS-STARTPTS, scale=320x240 [lowerright];
        [base][upperleft] overlay=shortest=1 [tmp1];
        [tmp1][upperright] overlay=shortest=1:x=320 [tmp2];
        [tmp2][lowerleft] overlay=shortest=1:y=240 [tmp3];
        [tmp3][lowerright] overlay=shortest=1:x=320:y=240
    "
    -c:v libx264 output.mkv

But obviously it won't work; something needs to be changed, and I'm not so experienced with ffmpeg, so could anyone help me?
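The direction I think I need is roughly the sketch below (untested; the RTSP and RTMP URLs are placeholders, and the 480x270 corner size and 10-pixel margins are guesses on my part). It scales the main camera to 1920x1080 as the background, overlays the two small cameras in the top corners, and maps the silent anullsrc track as the audio for YouTube:

ffmpeg -i "rtsp://MAIN_CAMERA" -i "rtsp://CAMERA_1" -i "rtsp://CAMERA_2" -f lavfi -i anullsrc \
    -filter_complex "
        [0:v] setpts=PTS-STARTPTS, scale=1920x1080, setsar=1 [bg];
        [1:v] setpts=PTS-STARTPTS, scale=480x270, setsar=1 [cam1];
        [2:v] setpts=PTS-STARTPTS, scale=480x270, setsar=1 [cam2];
        [bg][cam1] overlay=x=10:y=10 [tmp];
        [tmp][cam2] overlay=x=W-w-10:y=10 [out]" \
    -map "[out]" -map 3:a -c:v libx264 -preset veryfast -c:a aac -f flv "rtmp://YOUTUBE_INGEST"

Whether that mapping and layout are right is exactly what I am unsure about.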

How to extract specific audio track (track 2) from mp4 file using ffmpeg?

I am working with an mp4 file (36017P.mp4) from which I want to extract Track 2 - [English] using ffmpeg.


I tried the following command in the terminal, but it seems to extract Track 1 - [English]:

ffmpeg -i 36017P.mp4 filename.mp3 

Problem Statement:

I am wondering what changes I need to make to the ffmpeg command above so that it extracts Track 2 - [English] from the mp4 file.
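For reference, the variant I am about to try uses -map to select the second audio stream explicitly; 0:a:1 means "input 0, audio stream number 1", counting from zero, and I am assuming Track 2 corresponds to that second audio stream:

ffmpeg -i 36017P.mp4 -map 0:a:1 filename.mp3

Running ffmpeg -i 36017P.mp4 on its own should also list the streams, so I can confirm which index the English track I want actually has.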