I have a camera sending raw frames to my application, and I need to generate an H.264 stream from those frames and make it playable in a browser with low latency. My idea is to use a WebRTC stream in order to keep latency to a minimum.
So far my approach has been the following:
1. Use FFmpeg to generate an H.264/RTSP stream with the command
ffmpeg -fflags nobuffer -re -i "frames%05d.bmp" -pix_fmt yuv420p -c:v libx264 -crf 23 -f rtsp rtsp://localhost:8554/mystream
2. Use rtsp-simple-server to publish the RTSP stream.
3. Use RTSPtoWeb to generate a WebRTC stream playable by browsers.
Please note the input frames are 728x544 bitmaps.
So far I have had no luck: the RTSP stream published at step (2) plays fine in VLC, but it has problems when played via WebRTC, e.g. continuous freezes. Please note I can play the H.264/RTSP streams produced by AXIS IP cameras through RTSPtoWeb with no problem.
Furthermore, I'd need the frames to be passed to FFmpeg by a C# application instead of being read from disk.
Of course, if you know of a way to directly generate an H.264/WebRTC stream from an image sequence, that would be wonderful.
Has anyone ever tried something like this?
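To illustrate, here is a rough sketch of what I have in mind on the C# side (untested; the pixel format, frame rate and encoder flags are my guesses, and GetNextFrame stands in for the real camera callback):

using System;
using System.Diagnostics;

class FramePiper
{
    static void Main()
    {
        // Launch FFmpeg reading raw BGR24 frames from stdin and pushing RTSP.
        // The frame size matches my camera (728x544); 30 fps is an assumption.
        var psi = new ProcessStartInfo
        {
            FileName = "ffmpeg",
            Arguments = "-f rawvideo -pix_fmt bgr24 -s 728x544 -r 30 -i - " +
                        "-pix_fmt yuv420p -c:v libx264 -preset ultrafast -tune zerolatency " +
                        "-f rtsp rtsp://localhost:8554/mystream",
            UseShellExecute = false,
            RedirectStandardInput = true
        };

        using var ffmpeg = Process.Start(psi);
        using var stdin = ffmpeg.StandardInput.BaseStream;

        byte[] frame = new byte[728 * 544 * 3]; // one BGR24 frame
        while (GetNextFrame(frame))             // hypothetical camera callback
        {
            stdin.Write(frame, 0, frame.Length);
        }
        stdin.Close();
        ffmpeg.WaitForExit();
    }

    // Placeholder for the camera API; returns false when the stream ends.
    static bool GetNextFrame(byte[] buffer) => false;
}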
Related
I have a URL (<ip>/ipcam/mpeg4.cgi) that points to my IP camera, which is connected via Ethernet.
Accessing the URL results in an infinite stream of video (possibly with audio) data.
I would like to store this data into a video file and play it later with a video player (HTML5's video tag is preferred as the player).
However, the straightforward approach of simply saving the stream data into an .mp4 file didn't work.
I have looked into the file with a binary editing tool. It turned out there are some HTML-like headers, which I manually excluded, and yet no player could play the rest of the file.
The headers are:
--myboundary
Content-Type: image/mpeg4
Content-Length: 76241
X-Status: 0
X-Tag: 1693923
X-Flags: 0
X-Alarm: 0
X-Frametype: I
X-Framerate: 30
X-Resolution: 1920*1080
X-Audio: 1
X-Time: 2000-02-03 02:46:31
alarm: 0000
My question should be pretty clear now, and I would welcome any help or suggestion. I suspect I have to manually create some MP4 headers myself based on the values above; however, I fail to understand format descriptions such as these.
I have also looked at the video stream settings on my IP camera.
I could also use the ffmpeg tool, but no matter how I try to mix the arguments to the program, it keeps failing with an error.
It looks like your server is sending H.264-encoded raw video in Annex B byte-stream format.
It might be remuxed to .mp4 with something like the command line below:
ffmpeg -f h264 -i {input file} -vcodec copy out.mp4
Saving an audio/video stream into a file is not an easy job. If it's video only, using the MPEG-2 TS format is the easiest way to go.
For .mp4 streaming, consider -movflags faststart -> Recommendation on the best quality/performance H264 encoder for video encoding?
** Update: the -bsf h264_mp4toannexb option that is sometimes suggested here is not needed; it converts in the opposite direction (MP4-style length-prefixed NAL units to Annex B), and this stream is already Annex B.
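For what it's worth, here is a rough C# sketch of stripping those part headers and concatenating the payloads into a raw stream file (the boundary and header names come from the question; everything else is untested guesswork):

using System;
using System.IO;
using System.Text;

class MultipartDemuxer
{
    // Reads a multipart stream like the one above and writes the raw
    // H.264 payloads, back to back, into the output stream.
    static void Demux(Stream input, Stream output)
    {
        var reader = new BinaryReader(input);
        while (true)
        {
            // Skip blank lines and the "--myboundary" marker between parts.
            string line = ReadLine(reader);
            if (line == null) return;                      // end of stream
            if (line.Length == 0 || line.StartsWith("--")) continue;

            // We are inside the part headers; read until the blank line.
            int contentLength = -1;
            while (line != null && line.Length > 0)
            {
                if (line.StartsWith("Content-Length:", StringComparison.OrdinalIgnoreCase))
                    contentLength = int.Parse(line.Substring("Content-Length:".Length).Trim());
                line = ReadLine(reader);
            }
            if (contentLength < 0) continue;               // part without a length

            byte[] payload = reader.ReadBytes(contentLength);
            if (payload.Length < contentLength) return;    // truncated stream
            output.Write(payload, 0, payload.Length);
        }
    }

    // Reads one CRLF-terminated ASCII line; returns null at end of stream.
    static string ReadLine(BinaryReader reader)
    {
        var sb = new StringBuilder();
        try
        {
            char c;
            while ((c = (char)reader.ReadByte()) != '\n')
                if (c != '\r') sb.Append(c);
        }
        catch (EndOfStreamException)
        {
            return sb.Length > 0 ? sb.ToString() : null;
        }
        return sb.ToString();
    }
}

The resulting file should be the raw Annex B stream that the command above can then remux into .mp4.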
Well, this is not as straightforward as it seems:
1) The HTML5 <video> tag has some requirements for the MP4 stream: either the moov atom (the index atom that describes the stream's length and layout) must be at the beginning of the file, or the file must be a fragmented MP4. Most MP4 video files do not have this property, so your option is to reformat them with FFmpeg or other tools (see this, and the commands after this list) and then you can serve the file as-is.
2) Nginx has a module that allows streaming MP4 files. I haven't used it, but it could be useful to you, since I guess it takes care of the internal details.
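For the reformatting in (1), something along these lines should work (untested here; file names are placeholders):
ffmpeg -i in.mp4 -c copy -movflags +faststart out.mp4
moves the moov atom to the front without re-encoding, and
ffmpeg -i in.mp4 -c copy -movflags frag_keyframe+empty_moov out_frag.mp4
produces a fragmented MP4.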
I'm developing a C# video streaming application.
So far, I have been able to capture video frames using OpenCV and encode them using FFmpeg, and also to capture and encode audio using FFmpeg and save the data into a queue.
The problem I am facing is that when I send both streams simultaneously, I lose the video stream. That is, I stream by first sending a video packet and then an audio packet, but the player identifies the video stream first and starts to play it, and never plays the audio packets.
Can anyone suggest a method for doing this task?
It will be greatly appreciated.
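One common approach is to interleave the packets by timestamp instead of strictly alternating. A minimal C# sketch of the idea (Packet and the send callback are hypothetical placeholders for your own types):

using System;
using System.Collections.Generic;

// A packet with a presentation timestamp in a timebase shared by both streams.
class Packet
{
    public long Pts;
    public byte[] Data = Array.Empty<byte>();
}

class Interleaver
{
    readonly Queue<Packet> video = new Queue<Packet>();
    readonly Queue<Packet> audio = new Queue<Packet>();

    public void PushVideo(Packet p) => video.Enqueue(p);
    public void PushAudio(Packet p) => audio.Enqueue(p);

    // Send whichever queued packet is earliest, so the player receives
    // audio and video roughly in sync instead of one long video burst.
    public void Flush(Action<Packet> send)
    {
        while (video.Count > 0 && audio.Count > 0)
        {
            if (video.Peek().Pts <= audio.Peek().Pts)
                send(video.Dequeue());
            else
                send(audio.Dequeue());
        }
        // Whatever remains is sent once the other stream has ended (omitted here).
    }
}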
I have read through several posts about H264 recording, but none of them really answer my question, so here is what I am trying to do.
A server is sending H264 encoded video packets to me, and I would like to capture and turn the packets into a video file (.mpeg or .avi)
Here is how I envision the setup:
I need to set up a UDP listener to capture the video packets, then send the packet payloads to a DirectShow graph for processing.
The DirectShow graph should consist of an H.264 decoder and an MPEG encoder.
Am I on the right track?
Thank you
If all you want to do is capture the H.264 stream and stick it into a container, I would use FFmpeg. I don't know the exact command line, so this is untested, but try something like...
ffmpeg -i - -c copy -f mp4 output.mp4
Then, write to it over STDIN. It should detect your stream type after several packets, and begin writing to the MP4 file.
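To tie this to the UDP part of the question, here is a minimal C# sketch of feeding the datagrams into that FFmpeg process (this assumes the payload is a plain Annex B elementary stream rather than RTP, and the port number is arbitrary):

using System.Diagnostics;
using System.Net;
using System.Net.Sockets;

class UdpToFFmpeg
{
    static void Main()
    {
        // FFmpeg reads the raw H.264 elementary stream from stdin and
        // remuxes it into MP4 without re-encoding.
        var psi = new ProcessStartInfo
        {
            FileName = "ffmpeg",
            Arguments = "-f h264 -i - -c copy output.mp4",
            UseShellExecute = false,
            RedirectStandardInput = true
        };
        using var ffmpeg = Process.Start(psi);
        using var stdin = ffmpeg.StandardInput.BaseStream;

        // If the camera actually sends RTP, the 12-byte RTP header and any
        // NAL fragmentation must be handled first; this sketch skips that.
        using var udp = new UdpClient(5004);
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] datagram = udp.Receive(ref remote);
            stdin.Write(datagram, 0, datagram.Length);
        }
    }
}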
I'm quite new to Windows programming, and I'm trying to set up a DirectShow graph to stream a webcam feed over a network so I can monitor the output on an iPad.
I've set up a graph in C# using DirectShowLib and FFDShow that compresses the raw output from a capture device and writes it to an AVI file.
Now I'm trying to work out how to broadcast the stream over the network.
The only sample code I can find for network streaming with DirectShow relates to the Windows Media library, which only seems to output ASF-formatted streams.
How can I broadcast a stream in a format other than ASF using DirectShow? Can I configure the ASF Writer to output an AVI/MPEG-formatted stream, or do I need to write my own DirectShow filter?
Are there any examples of streaming AVI over a network using DirectShow?
Thanks for reading,
Josh
Well, I ended up using VLC to create an MJPEG stream. I did try using VLC's HLS plugin, but I found that iOS will only play one video at a time, which was no good as I want to display several webcams. MJPEG gets around this.
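From memory, the VLC invocation was something along these lines (untested as written; the capture device, port and transcode settings will need adjusting):
vlc dshow:// --sout "#transcode{vcodec=MJPG}:standard{access=http,mux=mpjpeg,dst=:8080/}"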
I'm using Flowplayer 3.1.1 for streaming videos to my browser. The videos are uploaded by users, and they may upload different formats. What would be a solution for streaming the videos as MP4, whatever format they upload? I'm currently using FFmpeg commands:
ffmpeg -i "InputFile.mp4" -sameq -vcodec libx264 -r 35 -acodec libfaac -y "OutputFile.mp4"
But larger video files (say 100 MB) take a minute or more to load into Flowplayer and buffer. I think the problem is with my encoding.
Your valuable suggestions are welcome!
The problem comes from the metadata. FFmpeg puts this data (the moov atom) at the end of the file; for progressive download, you must move it to the beginning. You can use MP4Box or qt-faststart after the FFmpeg pass:
MP4Box -inter 1000 file.mp4
or
qt-faststart in.mp4 out.mp4
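Newer FFmpeg builds can also do this in the same pass, so a separate tool may not be needed (untested with this particular input):
ffmpeg -i "InputFile.mp4" -c:v libx264 -c:a aac -movflags +faststart -y "OutputFile.mp4"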