I'm quite new to Windows programming and I'm trying to set up a DirectShow graph to stream a webcam feed over a network so I can monitor the output on an iPad.
I've set up a graph in C# using DirectShowLib and FFDShow that compresses the raw output from a capture device and writes it to an AVI file.
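For reference, the graph construction looks roughly like this (a simplified sketch using DirectShowLib; the ffdshow name match and the output path are just placeholders):

```csharp
using System;
using System.Linq;
using DirectShowLib;

class CaptureToAvi
{
    static void Main()
    {
        // Graph: capture device -> ffdshow video encoder -> AVI mux -> file writer.
        var graph = (IFilterGraph2)new FilterGraph();
        var builder = (ICaptureGraphBuilder2)new CaptureGraphBuilder2();
        DsError.ThrowExceptionForHR(builder.SetFiltergraph(graph));

        // First video capture device on the system.
        DsDevice camera = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice)[0];
        IBaseFilter source;
        DsError.ThrowExceptionForHR(graph.AddSourceFilterForMoniker(
            camera.Mon, null, camera.Name, out source));

        // Look for the ffdshow video encoder among the installed compressors.
        IBaseFilter compressor = null;
        DsDevice ffdshow = DsDevice.GetDevicesOfCat(FilterCategory.VideoCompressorCategory)
            .FirstOrDefault(d => d.Name.IndexOf("ffdshow", StringComparison.OrdinalIgnoreCase) >= 0);
        if (ffdshow != null)
        {
            Guid iid = typeof(IBaseFilter).GUID;
            object obj;
            ffdshow.Mon.BindToObject(null, null, ref iid, out obj);
            compressor = (IBaseFilter)obj;
            graph.AddFilter(compressor, ffdshow.Name);
        }

        // AVI mux + file writer, then render the capture pin through the compressor into the mux.
        IBaseFilter mux;
        IFileSinkFilter sink;
        DsError.ThrowExceptionForHR(builder.SetOutputFileName(
            MediaSubType.Avi, @"C:\temp\capture.avi", out mux, out sink));
        DsError.ThrowExceptionForHR(builder.RenderStream(
            PinCategory.Capture, MediaType.Video, source, compressor, mux));

        var control = (IMediaControl)graph;
        control.Run();
        Console.WriteLine("Recording... press Enter to stop.");
        Console.ReadLine();
        control.Stop();
    }
}
```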
Now I'm trying to work out how to broadcast the stream over the network.
The only sample code I can find for network streaming with DirectShow relates to the Windows Media library, which only seems to output ASF-formatted streams.
How can I broadcast a stream in a format other than ASF using DirectShow? Can I configure the ASF Writer to output an AVI/MPEG-formatted stream, or do I need to write my own DirectShow filter?
Are there any examples of streaming AVI over a network using DirectShow?
Thanks for reading,
Josh
Well, I ended up using VLC to create an MJPEG stream. I did try using VLC's HLS plugin, but I found that iOS will only play one video at a time, which was no good as I want to display several webcams. MJPEG gets around this.
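For the record, serving a DirectShow capture as MJPEG over HTTP with VLC looks something like the following (shown here launched from C#; the exact --sout options can vary between VLC versions, and the bitrate, frame rate, and port are placeholders):

```csharp
using System;
using System.Diagnostics;

class VlcMjpegServer
{
    static void Main()
    {
        // Transcode the DirectShow capture to motion JPEG and serve it over HTTP.
        string sout = "#transcode{vcodec=MJPG,vb=2500,fps=15}" +
                      ":standard{access=http,mux=mpjpeg,dst=:8080/}";

        Process.Start(new ProcessStartInfo
        {
            FileName = "vlc",
            Arguments = $"dshow:// --sout \"{sout}\"",
            UseShellExecute = false
        });

        Console.WriteLine("MJPEG should now be available on port 8080; press Enter to exit.");
        Console.ReadLine();
    }
}
```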
I"m trying to consume an mp4 stream in C# (WPF) direct from sources without any server/mid software to convert. The stream is continuous. No control or audio are necessary, just the constant video stream.
media element & embedded web browser controls with some added codecs tried but no luck.
just want a simple, continuous mp4 stream. Once working, application will read many streams simultaneously for multiple source monitoring without any control or audio of media so a rolled class is an option if a simple stream converter/consumer isn't available.
any assistance is appreciated - thanks
I'm developing a C# video streaming application.
So far I have been able to capture video frames using OpenCV and encode them using ffmpeg, and also capture and encode audio using ffmpeg, saving the data into a queue.
The problem I am facing is that when I send both streams simultaneously, I lose the video stream. That is, I stream by first sending a video packet and then an audio packet, but the player identifies the video stream first and starts playing it, and never plays the audio packets.
Can anyone suggest a method to do this?
It will be greatly appreciated.
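One generic way to avoid one stream starving the other is to mux both elementary streams into a single container or session with distinct stream identifiers, and to interleave packets by presentation timestamp instead of strictly alternating them. A rough, library-agnostic sketch of just the interleaving step, where MediaPacket is a hypothetical wrapper around the encoded data already sitting in your queues:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical wrapper around one encoded packet already sitting in the queues.
record MediaPacket(string Kind, double PtsSeconds, byte[] Payload);

class Interleaver
{
    // Pops packets from both queues in presentation-timestamp order so that
    // audio and video stay interleaved on the wire instead of video-then-audio.
    public static IEnumerable<MediaPacket> Interleave(
        Queue<MediaPacket> video, Queue<MediaPacket> audio)
    {
        while (video.Count > 0 || audio.Count > 0)
        {
            if (audio.Count == 0) yield return video.Dequeue();
            else if (video.Count == 0) yield return audio.Dequeue();
            else if (video.Peek().PtsSeconds <= audio.Peek().PtsSeconds)
                yield return video.Dequeue();
            else
                yield return audio.Dequeue();
        }
    }

    static void Main()
    {
        var video = new Queue<MediaPacket>(new[]
        {
            new MediaPacket("video", 0.00, new byte[] { 1 }),
            new MediaPacket("video", 0.04, new byte[] { 2 }),
        });
        var audio = new Queue<MediaPacket>(new[]
        {
            new MediaPacket("audio", 0.00, new byte[] { 3 }),
            new MediaPacket("audio", 0.02, new byte[] { 4 }),
        });

        foreach (var packet in Interleave(video, audio))
            Console.WriteLine($"{packet.Kind} @ {packet.PtsSeconds:0.00}s");
    }
}
```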
I'm creating a video chat application but I'm having issues streaming microphone audio. I have video streams working already, but I was hoping to find out the best method of capturing a laptop's built-in microphone and streaming it. At the moment I'm using a NetworkStream for sending the video. I have limited experience with NAudio (http://naudio.codeplex.com/), but none of its microphone capture examples seem to include a new-audio-frame event (which is how I send video frames).
I've been looking at http://voicerecorder.codeplex.com/ but it seems to be more than I need and doesn't cover streaming.
How do I capture microphone audio and stream it, if possible without the overhead of saving the audio to a file? I'd appreciate a simple example.
Create a new WaveIn object.
Call StartRecording.
In the DataAvailable event handler, transmit args.BytesRecorded bytes from args.Buffer across the network.
Note that this will mean you are transmitting PCM, which is not very efficient. Normally for network streaming you would use a codec. In the NAudioDemo source code there is a Network Chat demo showing this in action.
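A minimal sketch of those steps, assuming NAudio's WaveInEvent (which does not need a window message loop) and a plain UdpClient as the transport; the receiver endpoint and wave format are placeholders:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using NAudio.Wave;

class MicStreamer
{
    static void Main()
    {
        // Placeholder receiver endpoint; point this at whatever is listening for the audio.
        var udp = new UdpClient();
        var receiver = new IPEndPoint(IPAddress.Parse("192.168.1.50"), 5560);

        var waveIn = new WaveInEvent
        {
            // 8 kHz, 16-bit mono PCM keeps packets small; pick whatever your receiver expects.
            WaveFormat = new WaveFormat(8000, 16, 1)
        };

        waveIn.DataAvailable += (sender, args) =>
        {
            // Transmit the bytes recorded so far straight across the network.
            udp.Send(args.Buffer, args.BytesRecorded, receiver);
        };

        waveIn.StartRecording();
        Console.WriteLine("Streaming microphone... press Enter to stop.");
        Console.ReadLine();
        waveIn.StopRecording();
    }
}
```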
Is there an embeddable video player that can do the following:
Stream video from a server (it's saved to disk)
Run across browsers (or at least give the option for a fallback)
Run across mobile devices (Android and iOS)?
Have you looked into the HTML5 video embed? You should be able to stream with that. In terms of video format, you might have to make a converter, but AVI might work. Give it a shot; I'd be interested to know what you find.
EDIT:
Links:
Streaming via RTSP or RTP in HTML5
http://www.pcworld.com/article/191030/how_to_ditch_flash_and_stream_video_with_html5.html
I have a video server with an IP: 192.168.1.XX.
It has 3 possible formats: JPEG, MPEG-4, or H.264.
The video server is broadcasting video (with audio) in real time.
I have no problems streaming the video with the AForge lib,
but I also need to stream the audio.
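The video side currently works along these lines with AForge.Video (a rough sketch; the MJPEG path is a placeholder for whatever the camera actually exposes):

```csharp
using System;
using System.Drawing;
using AForge.Video;

class VideoOnly
{
    static void Main()
    {
        // MJPEG-over-HTTP stream from the video server; the URL path is a placeholder.
        var stream = new MJPEGStream("http://192.168.1.XX/video.mjpg");

        stream.NewFrame += (sender, eventArgs) =>
        {
            // Each decoded frame arrives as a Bitmap; clone it before the event returns.
            using (Bitmap frame = (Bitmap)eventArgs.Frame.Clone())
            {
                // display or process the frame here
            }
        };

        stream.Start();
        Console.ReadLine();
        stream.Stop();
    }
}
```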
The video server supports several protocols: HTTP, RTSP, RTP, and RTCP.
According to the user's manual, RTSP is the protocol I should use to get MPEG-4 (audio and video), but I haven't found anything to stream over RTSP in C#, so I'm trying to stream the audio and video separately.
The ports are:
RTSP: 554
RTP (Video): 5556
RTP (Audio): 5558
RTCP (Video): 5557
RTCP (Audio): 5559
Does anybody know how RTP works, or how I can get the sound from the video server?
I would learn GStreamer. I am assuming that you are using Windows since you are doing this in C#; it has a fairly stable Windows port with a nice .NET wrapper. If you aren't using Windows, then GStreamer is most certainly your best bet.
In GStreamer you would most likely use a pipeline like:
your video src -> x264enc or ffenc_mpeg4 -> rtph264pay or rtpmp4vpay -> udpsink
your audio src -> ffenc_aac or preferably a lower-latency codec like mu-law -> the matching RTP payloader -> udpsink
and so on. It is very easy to use. They even have a nice rtpbin for you to use if you want to actually manage an RTP session.
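As a rough sketch of driving such a pipeline from C# through the gstreamer-sharp bindings (the test source, encoder settings, host, and port are all placeholders):

```csharp
using System;
using Gst;

class RtpVideoSender
{
    static void Main()
    {
        // Initialize GStreamer via the gstreamer-sharp bindings.
        Application.Init();

        // Roughly the video pipeline described above; swap videotestsrc for your
        // real capture source and point host/port at your receiver.
        Element pipeline = Parse.Launch(
            "videotestsrc ! x264enc tune=zerolatency ! rtph264pay ! " +
            "udpsink host=192.168.1.50 port=5556");

        pipeline.SetState(State.Playing);
        Console.WriteLine("Streaming... press Enter to stop.");
        Console.ReadLine();
        pipeline.SetState(State.Null);
    }
}
```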
More information can be found here:
http://gstreamer.freedesktop.org/
Here is a nice sample of how to do RTP:
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good-plugins/html/gst-plugins-good-plugins-gstrtpbin.html
I have done this sort of thing with the DirectShow filters, but it is much more involved. You usually have to handle the RTP payloading and the transport manually, not to mention deal with COM, whereas GStreamer provides those mechanisms for you out of the box.
You can use https://net7mma.codeplex.com/
It is a C# media server; it will hand you each RtpPacket, and from there you can pass the packets to a decoder of your choice, among other things, all without bogging down the source stream.