Audio Streaming from Microphone - C#

I'm creating a video chat application, but I'm having issues streaming microphone audio. I already have the video streams working, and I'm hoping to find the best way to capture a laptop's built-in microphone and stream it. At the moment I'm using a NetworkStream for sending the video. I have limited experience with NAudio (http://naudio.codeplex.com/), but none of the microphone-capture examples seem to include a "new audio frame" event (which is how I send video frames).
I've been looking at http://voicerecorder.codeplex.com/ but it seems to be more than I need and doesn't cover streaming.
How do I capture microphone audio and stream it, if possible without the overhead of saving the audio to a file? I'd appreciate a simple example.

Create a new WaveIn object.
Call StartRecording.
In the DataAvailable event handler, transmit args.BytesRecorded bytes from args.Buffer across the network.
Note that this will mean you are transmitting PCM, which is not very efficient. Normally for network streaming you would use a codec. In the NAudioDemo source code there is a Network Chat demo showing this in action.
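To illustrate those steps, here is a minimal sketch. It uses WaveInEvent (the event-driven variant of WaveIn that doesn't need a GUI message pump); the audio format and the already-connected TcpClient are assumptions, so adjust them to your setup:

using System.Net.Sockets;
using NAudio.Wave;

public class MicStreamer
{
    private readonly WaveInEvent waveIn;
    private readonly NetworkStream stream;

    public MicStreamer(TcpClient connectedClient)
    {
        stream = connectedClient.GetStream();

        // 16 kHz, 16-bit, mono PCM keeps the bandwidth down; pick whatever format you need.
        waveIn = new WaveInEvent { WaveFormat = new WaveFormat(16000, 16, 1) };
        waveIn.DataAvailable += OnDataAvailable;
    }

    public void Start() { waveIn.StartRecording(); }
    public void Stop() { waveIn.StopRecording(); }

    private void OnDataAvailable(object sender, WaveInEventArgs args)
    {
        // Send only the bytes actually recorded, not the whole buffer.
        stream.Write(args.Buffer, 0, args.BytesRecorded);
    }
}

On the receiving side you can feed the raw PCM into a BufferedWaveProvider initialized with the same WaveFormat and play it with WaveOutEvent; swapping raw PCM for a codec (as the Network Chat demo does) is just a matter of encoding before the Write and decoding before the playback buffer.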

Related

Grab Audio Sessions as they appear in the Windows audio mixer (C# or C++)

I'm trying to figure out how to grab the individual audio streams as they appear in the audio mixer so I can reroute them to an aggregate audio device. I'm specifically looking to keep them as discrete streams for the purposes of the program I'm making (if they're muxed down to a 2-channel mix, that defeats the purpose of what I'm trying to achieve).
For example (as I've just made this account, I apparently am not able to post images, so here's a link to the image):
windows audio mixer
In this, I'm hoping to grab "System Sounds" and "Steam Client Bootstrapper" as discrete audio streams to route elsewhere, while maintaining their original destination as well (essentially copying the audio going to the original audio device to another device simultaneously).
I'm looking to do this in either C# or C++. I've perused the audio APIs Microsoft has published, and while some things look close to what I'm trying to do, nothing has hit the nail on the head. I appreciate any help. Thanks.
The sessions can be enumerated using IAudioSessionManager2::GetSessionEnumerator and friends (sample C++ code is here and there). The standard Windows volume mixer application uses this API as well.
The API, however, gives no access to the data streams, so you won't have it either (you don't get the data at all, downmixed or not). Nor can you reroute streams to another device; applications are not allowed to interfere that deeply. The best you can do is create your own device, interactively select it as the default output device, and then accept data from the applications playing audio through that device.
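For the enumeration part, a rough C# sketch using NAudio's CoreAudioApi wrappers (which sit on top of IAudioSessionManager2) might look like the following. The property names assume a recent NAudio build, and this only lists the sessions; it never touches any audio data:

using System;
using System.Diagnostics;
using NAudio.CoreAudioApi;

class SessionLister
{
    static void Main()
    {
        var enumerator = new MMDeviceEnumerator();
        // Sessions live on a render endpoint; here we look at the default output device.
        var device = enumerator.GetDefaultAudioEndpoint(DataFlow.Render, Role.Multimedia);
        var sessions = device.AudioSessionManager.Sessions;

        for (int i = 0; i < sessions.Count; i++)
        {
            var session = sessions[i];
            uint pid = session.GetProcessID;
            // "System Sounds" and similar sessions often have an empty DisplayName,
            // so fall back to the owning process name.
            string name = string.IsNullOrEmpty(session.DisplayName)
                ? Process.GetProcessById((int)pid).ProcessName
                : session.DisplayName;
            Console.WriteLine("{0} (PID {1}, state {2})", name, pid, session.State);
        }
    }
}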

stream audio and video data separately

I'm developing a C# video streaming application.
At the moment I am able to capture video frames using OpenCV and encode them using FFmpeg, and also capture and encode audio using FFmpeg and save the data into a queue.
The problem I'm facing is that when I send both streams simultaneously I lose the video stream. That is, I stream by sending a video packet first and then an audio packet, but the player picks up the video stream first and starts playing it, and never plays the audio packets.
Can anyone suggest a method for doing this?
It would be greatly appreciated.

Send synchronized audio and video data over the network

I have sent audio and video over the network separately, but I want to send synchronized audio and video data over the network. If anyone could point me to a solution it would be greatly appreciated.
P.S.: I am using NAudio for audio and DirectShow for video.
The first thing to make sure of is that you are generating RTCP packets at some fixed interval. RTCP maps the RTP timestamps of each stream to NTP wall-clock time, and this is what the client needs at its end for syncing audio and video.
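For reference, the packet the client uses for that mapping is the RTCP sender report (RFC 3550). A hand-rolled sketch of one with no report blocks looks roughly like this; the Write helpers are just big-endian writers, and how you obtain the NTP and RTP timestamps depends on your capture pipeline:

// Builds a minimal RTCP Sender Report (RFC 3550, PT = 200) with zero report blocks.
static byte[] BuildSenderReport(uint ssrc, ulong ntpTimestamp, uint rtpTimestamp,
                                uint packetCount, uint octetCount)
{
    var sr = new byte[28];
    sr[0] = 0x80;              // version 2, no padding, zero report blocks
    sr[1] = 200;               // packet type 200 = Sender Report
    WriteUInt16(sr, 2, 6);     // length in 32-bit words minus one (28 bytes -> 6)
    WriteUInt32(sr, 4, ssrc);
    WriteUInt32(sr, 8, (uint)(ntpTimestamp >> 32));   // NTP seconds
    WriteUInt32(sr, 12, (uint)ntpTimestamp);          // NTP fraction
    WriteUInt32(sr, 16, rtpTimestamp);                // same clock as the RTP stream
    WriteUInt32(sr, 20, packetCount);                 // sender's packet count
    WriteUInt32(sr, 24, octetCount);                  // sender's octet count
    return sr;
}

static void WriteUInt16(byte[] b, int o, ushort v)
{
    b[o] = (byte)(v >> 8); b[o + 1] = (byte)v;
}

static void WriteUInt32(byte[] b, int o, uint v)
{
    b[o] = (byte)(v >> 24); b[o + 1] = (byte)(v >> 16);
    b[o + 2] = (byte)(v >> 8); b[o + 3] = (byte)v;
}

Send one of these per stream (audio and video each have their own SSRC and RTP clock), and the receiver can line both streams up against the common NTP time.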

streaming video over LAN C#

I've recently learned about data streaming and C#. I've practiced by building a simple server-client login program like so:
public static TcpClient SocketCLiente = new TcpClient();
public static string IPServer = "127.0.0.1";
NetworkStream _serverStream;

// Connect first, then take the stream from the connected client (the port here is just an example).
SocketCLiente.Connect(IPServer, 8888);
_serverStream = SocketCLiente.GetStream();

byte[] outStream = System.Text.Encoding.ASCII.GetBytes(this._txtUserName.Text + "%" + this.txtPassword.Text + "$");
_serverStream.Write(outStream, 0, outStream.Length);
This works great with text, but what if I want to stream a video to a client? How can I stream live video from my webcam to a C# application over LAN?
As the comment mentioned, you should check out "How can I stream webcam video with C#?". In general, all you stream out are bytes. As long as those bytes are formatted in a way that a player, such as Flash, Silverlight, or whatever, can understand, it can play the video back.
Now, I don't think that post really covers the internet streaming aspect; it touches more on capturing and playing back locally. That's what DirectShow is good at. It abstracts away all the driver nonsense you used to have to deal with to capture video and audio frame data.
Streaming to the internet, short of rolling your own streaming server, is usually accomplished by hooking into something like Red5, Wowza, Adobe FMS, RTSP, Silverlight streaming, etc.
I would use one of those solutions to hook into wherever you are getting your video data from. The gist is that you capture video/audio packets from DirectShow and then funnel those packets (formatted however the solution wants) to one of these streaming platforms.
Another option, though it is a little more "hackish", would be to stream your video as Motion JPEG. This wouldn't let you send audio, though. Motion JPEG basically tells the client to keep a connection open and constantly download a new image to replace the old one. It's like a flipbook for the web. The advantage here is that you can roll your own Motion JPEG code pretty easily, and it'd be a fun little project.
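If you want to try the Motion JPEG route, a bare-bones sketch with HttpListener could look like the following. GetNextJpegFrame is a stand-in for whatever hands you JPEG-encoded webcam frames, the URL prefix is arbitrary, and it serves one client at a time:

using System;
using System.Net;
using System.Text;

class MjpegServer
{
    const string Boundary = "frame";

    public static void Serve(Func<byte[]> getNextJpegFrame)
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/mjpeg/");
        listener.Start();

        while (true)
        {
            var ctx = listener.GetContext();
            // multipart/x-mixed-replace tells the client each part replaces the previous one.
            ctx.Response.ContentType = "multipart/x-mixed-replace; boundary=" + Boundary;
            var output = ctx.Response.OutputStream;

            try
            {
                while (true)
                {
                    byte[] jpeg = getNextJpegFrame();
                    byte[] header = Encoding.ASCII.GetBytes(
                        "--" + Boundary + "\r\n" +
                        "Content-Type: image/jpeg\r\n" +
                        "Content-Length: " + jpeg.Length + "\r\n\r\n");
                    output.Write(header, 0, header.Length);
                    output.Write(jpeg, 0, jpeg.Length);
                    output.Write(Encoding.ASCII.GetBytes("\r\n"), 0, 2);
                    output.Flush();
                }
            }
            catch (Exception)
            {
                // Client disconnected; drop the response and wait for the next one.
                ctx.Response.Abort();
            }
        }
    }
}

Most browsers (or an img tag pointed at the URL) will keep the connection open and repaint each frame as it arrives.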

IStreamBufferMediaSeeking setPosition doesn't seem to render

I have created two DirectShow graphs. One captures from a Hauppauge HD-PVR and stores the stream in a StreamBufferSink. The second one uses a StreamBufferSource, sends its output to an MPEG-2 Demultiplexer, and routes the video to the ArcSoft Video Decoder and on to a Video Mixing Renderer 9 set up in windowless mode.
This all works fine for previewing the data. But when I use the IStreamBufferMediaSeeking.SetPositions method (getting the interface from the StreamBufferSource) to change the playback position, setting it anywhere but the beginning of the stream makes the video freeze and stop updating. Calling GetCurrentPosition on IStreamBufferMediaSeeking shows the position moving through the stream, but the video just doesn't follow along.
I am using C# and DirectShowLib-2005 for programming things.
Any ideas on what is wrong or how to figure out what is going wrong?
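For reference, the seek call looks roughly like this (a simplified sketch: the graph-building code is omitted, the variable names are placeholders, and it assumes IStreamBufferMediaSeeking in DirectShowLib takes the same parameters as IMediaSeeking):

// streamBufferSource is the IBaseFilter for the Stream Buffer Source in the playback graph.
var seeking = (IStreamBufferMediaSeeking)streamBufferSource;

// Jump 30 seconds into the buffer; positions are in 100-nanosecond units.
long target = 30L * 10000000L;
int hr = seeking.SetPositions(
    new DsLong(target), AMSeekingSeekingFlags.AbsolutePositioning,
    null, AMSeekingSeekingFlags.NoPositioning);
DsError.ThrowExceptionForHR(hr);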
What I have discovered is that StreamBufferSink/StreamBufferSource only understand MPEG-2 or DV video. H.264 is not supported, so the engine doesn't know how to seek within the stream, and thus I cannot use this component for what I want unless I transcode my stream to MPEG-2, which defeats the purpose of having an H.264 stream in the first place.
Further information: this actually does work under Windows 7 with the updates to the Stream Buffer Engine. To get rewind working, I had to demux the stream and add the MPEG-2 Video Stream Analyzer filter before putting the data into the Stream Buffer Sink.