I'm new to audio streaming.
I have a stream of audio data encoded in G.711 μ-law format (from a third-party device), and I need to provide access to it via the RTP protocol.
Could you recommend a free library that would allow me to create an RTP server and feed it with my data?
Everything I found seems to be an RTP client, not a server.
I would use libVLC with a C# wrapper (like https://github.com/videolan/libvlcsharp).
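A minimal sketch of what that could look like with LibVLCSharp, assuming the input is a playable source (the file name, destination address and port below are placeholders); the sout chain asks VLC to packetize and send the stream over RTP:

using System;
using LibVLCSharp.Shared;

class RtpOutSketch
{
    static void Main()
    {
        Core.Initialize();
        using var libvlc = new LibVLC();

        // Placeholder source; replace with your own input (file, capture device, network stream...)
        using var media = new Media(libvlc, "audio.wav", FromType.FromPath);
        // Ask VLC to re-stream the input over RTP to the given destination
        media.AddOption(":sout=#rtp{dst=127.0.0.1,port=5004,mux=ts}");
        media.AddOption(":sout-keep");

        using var player = new MediaPlayer(media);
        player.Play();

        Console.WriteLine("Streaming... press Enter to stop.");
        Console.ReadLine();
    }
}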
Depending on your source audio format, you may need a μ-law audio encoder, which should be pretty straightforward to implement or find.
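If you do need to encode 16-bit PCM samples to μ-law yourself, a small, self-contained routine (the usual bias/segment approach) looks roughly like this:

// Convert one 16-bit linear PCM sample to a G.711 μ-law byte.
static byte LinearToMuLaw(short pcm)
{
    const int Bias = 0x84;        // 132, standard μ-law bias
    const int Clip = 32635;       // clip so the bias never overflows

    int sign = pcm < 0 ? 0x80 : 0x00;
    int magnitude = pcm < 0 ? -(int)pcm : pcm;   // int math avoids overflow at short.MinValue
    if (magnitude > Clip) magnitude = Clip;
    magnitude += Bias;

    // Find the segment (exponent): position of the highest set bit above bit 7.
    int exponent = 7;
    for (int mask = 0x4000; (magnitude & mask) == 0 && exponent > 0; exponent--, mask >>= 1) { }

    int mantissa = (magnitude >> (exponent + 3)) & 0x0F;
    return (byte)~(sign | (exponent << 4) | mantissa);   // μ-law stores the complement
}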
I have developed a WinForms application in C#.NET. Essentially, I:
capture audio from microphone
capture video from webcam
I want to send these two buffers from one client to another, like a video call application.
I read that I should use the RTP protocol for this, but I don't know how to get started.
Can you advise me on how to merge the audio and video data and send them over a socket from client to client? Or should I simply send audio and video separately over RTP?
I use this library to capture audio and video: the Accord.NET framework.
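For the "send audio and video separately over RTP" route, each stream normally gets its own RTP session over UDP. A rough, illustrative packetizer is sketched below; the host, port and payload type are placeholders, and real video payloads additionally need a codec-specific payload format (e.g. RFC 6184 for H.264):

using System;
using System.Net;
using System.Net.Sockets;

// Minimal RTP packetizer sketch: wrap a media buffer in a 12-byte RTP header and send it over UDP.
class RtpSender
{
    private readonly UdpClient _udp = new UdpClient();
    private readonly IPEndPoint _remote;
    private readonly uint _ssrc = (uint)new Random().Next();
    private ushort _sequence;
    private uint _timestamp;

    public RtpSender(string host, int port)
    {
        _remote = new IPEndPoint(IPAddress.Parse(host), port);
    }

    public void Send(byte[] payload, byte payloadType, uint timestampIncrement)
    {
        var packet = new byte[12 + payload.Length];
        packet[0] = 0x80;                       // V=2, no padding, no extension, no CSRCs
        packet[1] = payloadType;                // marker bit 0 + payload type
        packet[2] = (byte)(_sequence >> 8);     // sequence number, big-endian
        packet[3] = (byte)_sequence;
        packet[4] = (byte)(_timestamp >> 24);   // timestamp
        packet[5] = (byte)(_timestamp >> 16);
        packet[6] = (byte)(_timestamp >> 8);
        packet[7] = (byte)_timestamp;
        packet[8] = (byte)(_ssrc >> 24);        // SSRC (stream identifier)
        packet[9] = (byte)(_ssrc >> 16);
        packet[10] = (byte)(_ssrc >> 8);
        packet[11] = (byte)_ssrc;
        Buffer.BlockCopy(payload, 0, packet, 12, payload.Length);

        _udp.Send(packet, packet.Length, _remote);
        _sequence++;
        _timestamp += timestampIncrement;       // e.g. samples per packet for audio
    }
}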
I have managed to establish TCP host/client sockets between multiple Raspberry Pi 3 boards. I would like to find out how to stream audio via RTSP between the host and the clients. I have seen a lot of discussion about video streaming, but I haven't run into any audio streaming thread I could reference. Can anyone help?
Thanks.
RTSP is a real-time streaming protocol: it lets you set up and control streams of whatever you want in real time, such as video, audio, text and so on. RTP is the transport protocol that actually carries the media data negotiated over RTSP, so RTSP handles the control while the media travels over RTP. The SharpRTSP library mentioned in my comment works on UWP. If you want to stream audio only, you can use the G711Payload; G.711 is a common audio codec. Of course, you can also find other libraries online. You can refer to the RTSPClient.cs sample in the repo.
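To give an idea of what is inside such a stream once the RTSP session is set up, here is a rough receiving sketch that strips the RTP header and decodes a G.711 μ-law payload to 16-bit PCM; the port number is a placeholder and RTP extension headers are ignored for brevity:

using System;
using System.Net;
using System.Net.Sockets;

// Minimal sketch: receive RTP/G.711 μ-law packets on a UDP port and convert the payload to PCM.
class G711RtpReceiver
{
    public static void Listen(int port)
    {
        using var udp = new UdpClient(port);
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] packet = udp.Receive(ref remote);
            if (packet.Length < 12) continue;          // too short to hold an RTP header
            int csrcCount = packet[0] & 0x0F;          // CC field
            int headerLength = 12 + 4 * csrcCount;
            if (packet.Length <= headerLength) continue;

            short[] pcm = new short[packet.Length - headerLength];
            for (int i = 0; i < pcm.Length; i++)
                pcm[i] = MuLawToLinear(packet[headerLength + i]);
            // hand `pcm` to your playback code (e.g. ALSA on the Pi, NAudio on Windows)
        }
    }

    // Standard G.711 μ-law expansion back to 16-bit linear PCM.
    static short MuLawToLinear(byte mulaw)
    {
        mulaw = (byte)~mulaw;
        int sign = mulaw & 0x80;
        int exponent = (mulaw >> 4) & 0x07;
        int mantissa = mulaw & 0x0F;
        int magnitude = (((mantissa << 3) + 0x84) << exponent) - 0x84;
        return (short)(sign != 0 ? -magnitude : magnitude);
    }
}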
You can also search for a sample titled "TCP Audio Streamer and Player (Voice Chat over IP)"; it works fine. It is a VoIP project with its own protocol for sending and receiving audio data over TCP. You could port the client to UWP.
I'm developing a C# application with the ability to view an MJPEG-encoded video stream from an encoder (Teleste MPX-E8).
I can see a stream through the encoder's Video Encoding Settings page, and I've read that the stream is described by an SDP file.
So, in my C# application, I've tried to implement the SDP protocol, but the only documentation/examples I found all relate to the SIP protocol.
With Wireshark, I can see that when I make an HTTP request to the encoder, it responds with a 200 OK and an SDP body.
I suppose that I need to send some acknowledgement to the encoder to receive the stream, but I can't figure out how to send it.
Can anyone help me?
Thanks in advance for your response!
You can use SDPLib (a Session Description Protocol library) to read the SDP file.
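If a full library turns out to be overkill, the fields this question needs can also be pulled straight out of the SDP text with a few lines of manual parsing; a sketch that only looks at the m= (media) and a=rtpmap: (codec) lines:

using System;

static class SdpReader
{
    public static void Dump(string sdp)
    {
        foreach (var line in sdp.Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries))
        {
            if (line.StartsWith("m="))
            {
                // e.g. "m=video 0 RTP/AVP 26" -> media type, port, profile, payload type
                var parts = line.Substring(2).Split(' ');
                Console.WriteLine($"media={parts[0]} port={parts[1]} payloadType={parts[3]}");
            }
            else if (line.StartsWith("a=rtpmap:"))
            {
                // e.g. "a=rtpmap:26 JPEG/90000" -> payload type mapped to codec/clock rate
                Console.WriteLine("codec mapping: " + line.Substring("a=rtpmap:".Length));
            }
        }
    }
}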
The video encoder implements the RTSP protocol; SDP is the part of it that describes the details of the video feed. The RTSP configuration is set up on the device (see P6I_MPX_E.pdf, page 45), and remotely you connect with RTSP client software such as VLC.
In C# you need a library that implements an RTSP client - it will do all the networking for you and deliver the JPEG frames from the device, either compressed or already decoded. You then present the frames in the application UI.
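For reference, the "acknowledge" the question is looking for is really the RTSP control exchange (DESCRIBE, SETUP, PLAY). A very rough sketch of the requests an RTSP client sends is below; the host, track path, ports and session id are placeholders, and a real client library parses the replies and echoes the actual Session value instead of hard-coding it:

using System;
using System.Net.Sockets;
using System.Text;

class RtspHandshakeSketch
{
    static void Main()
    {
        using var tcp = new TcpClient("192.168.0.10", 554);   // placeholder encoder address
        using var stream = tcp.GetStream();
        var url = "rtsp://192.168.0.10/stream1";              // placeholder stream URL

        // DESCRIBE: the reply body is the SDP you already see in Wireshark.
        SendAndPrint(stream, $"DESCRIBE {url} RTSP/1.0\r\nCSeq: 1\r\nAccept: application/sdp\r\n\r\n");
        // SETUP: the reply contains "Session: <id>", which must be echoed in later requests.
        SendAndPrint(stream, $"SETUP {url}/trackID=1 RTSP/1.0\r\nCSeq: 2\r\nTransport: RTP/AVP;unicast;client_port=5000-5001\r\n\r\n");
        // PLAY: after this, RTP packets with the JPEG frames arrive on UDP port 5000.
        SendAndPrint(stream, $"PLAY {url} RTSP/1.0\r\nCSeq: 3\r\nSession: 12345678\r\n\r\n");
    }

    static void SendAndPrint(NetworkStream stream, string request)
    {
        var bytes = Encoding.ASCII.GetBytes(request);
        stream.Write(bytes, 0, bytes.Length);
        var buffer = new byte[8192];
        int read = stream.Read(buffer, 0, buffer.Length);
        Console.WriteLine(Encoding.ASCII.GetString(buffer, 0, read));
    }
}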
https://net7mma.codeplex.com/ does SDP.
It also contains an RtspClient and RtspServer, all in pure C#.
There is an article here
http://www.codeproject.com/Articles/507218/Managed-Media-Aggregation-using-Rtsp-and-Rtp
Let me know if you need anything else!
I am writing a C#.NET-based application which requires publishing video and audio streams to a Red5 Media Server and retrieving the same published stream in another application, first on a local network and later over the Internet.
In short, I intend to use Red5 as an intermediary or proxy for the transmission of video and audio data between two clients.
[Client 1] <- Video and Audio -> [Red5 Server] <- Video and Audio -> [Client 2]
I am looking for a .NET implementation (library) of the RTMP protocol that can help me publish the stream to the media server, then retrieve the published stream and play it back in my application.
I searched on Stack Overflow and came across these options:
Borrocoli RTMP Client Library
FluorineFx.NET
WebORB.NET
Each has some limitations.
The Borrocoli RTMP library has only playback support for audio/video streams; there is no option to publish a video/audio stream to the media server. I have played with the library and looked through its examples, but to no avail. If I am wrong, please correct me.
FluorineFx.NET says that it supports the NetStream.Publish(), NetStream.AttachAudio() and NetStream.AttachVideo() methods, but in the latest snapshot of the code there is nothing like this. The same is true for their production release: the NetStream class doesn't have the stated methods and does not have any other methods that would help publish streaming content TO the media server.
WebORB.NET: I have not explored it, but as is evident from their licensing page, the free version works with IIS only. The enterprise version seems to have all the support for publishing streaming video...
Questions:
Is it possible to use the WebORB.NET library in conjunction with the Red5 Media Server?
Will the free version allow me to publish audio and video to the Red5 media server?
Is there any other free alternative I can use if the answers to the above are no?
You can use ffmpeg to send the stream to the Red5 media server.
Set the source video as ffmpeg's input and the output to Red5's RTMP endpoint, something like this:
ffmpeg -re -i file.mp4 -c copy -f flv rtmp://server/live/streamName
See this answer for an example of how to integrate ffmpeg in C#.
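Calling that command from C# can be as simple as launching an ffmpeg process. A sketch, assuming ffmpeg is on the PATH and using the placeholder RTMP URL from the command above:

using System;
using System.Diagnostics;

class FfmpegPublishSketch
{
    static void Main()
    {
        var ffmpeg = new Process
        {
            StartInfo = new ProcessStartInfo
            {
                FileName = "ffmpeg",
                Arguments = "-re -i file.mp4 -c copy -f flv rtmp://server/live/streamName",
                UseShellExecute = false,
                RedirectStandardError = true,   // ffmpeg writes its progress to stderr
                CreateNoWindow = true
            }
        };
        ffmpeg.ErrorDataReceived += (s, e) => { if (e.Data != null) Console.WriteLine(e.Data); };
        ffmpeg.Start();
        ffmpeg.BeginErrorReadLine();
        ffmpeg.WaitForExit();
    }
}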
You can use the WebORB library for peer-to-peer video streaming between Flex and .NET over the RTMP protocol. Your process can be done as follows:
1. Develop the Flex client code that connects to the server and subscribes to receive server-side updates with the CPU readings. The code also includes a chart to visualize the received data.
2. Build the server-side 'application handler', which plugs into WebORB, polls the CPU and delivers the data to the client.
It works; try it.
Use RTMPdump's librtmp. It's written in C, but you should be able to write C# wrappers for it easily enough.
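A hedged sketch of what such a wrapper could start from: P/Invoke declarations for a few librtmp entry points. The native library name, calling details and publish-specific setup will vary per platform and build, so treat this as an outline rather than a working binding:

using System;
using System.Runtime.InteropServices;

static class LibRtmp
{
    const string Lib = "librtmp";   // librtmp.dll / librtmp.so depending on platform

    [DllImport(Lib)] public static extern IntPtr RTMP_Alloc();
    [DllImport(Lib)] public static extern void RTMP_Init(IntPtr rtmp);
    [DllImport(Lib)] public static extern int RTMP_SetupURL(IntPtr rtmp, IntPtr url);
    [DllImport(Lib)] public static extern int RTMP_Connect(IntPtr rtmp, IntPtr connectPacket);
    [DllImport(Lib)] public static extern int RTMP_ConnectStream(IntPtr rtmp, int seekTime);
    [DllImport(Lib)] public static extern void RTMP_Close(IntPtr rtmp);
    [DllImport(Lib)] public static extern void RTMP_Free(IntPtr rtmp);
}

class LibRtmpUsageSketch
{
    static void Main()
    {
        IntPtr rtmp = LibRtmp.RTMP_Alloc();
        LibRtmp.RTMP_Init(rtmp);

        // librtmp keeps pointers into the URL buffer, so keep it allocated while the handle lives.
        IntPtr url = Marshal.StringToHGlobalAnsi("rtmp://server/live/streamName");   // placeholder URL
        LibRtmp.RTMP_SetupURL(rtmp, url);
        LibRtmp.RTMP_Connect(rtmp, IntPtr.Zero);
        LibRtmp.RTMP_ConnectStream(rtmp, 0);

        // ... publish or read packets here with the remaining librtmp calls ...

        LibRtmp.RTMP_Close(rtmp);
        LibRtmp.RTMP_Free(rtmp);
        Marshal.FreeHGlobal(url);
    }
}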
I am trying to develop a Windows application using C# that can play streamed audio data. Basically, I will have a client application that is responsible for playing different audio files. Currently, from the client application, I extract the hardware configuration parameters from the file header and then stream the file data (a PCM stream) over the network.
So, is it possible to use the hardware configuration parameters sent from the client to configure the actual hardware (on the server end) and then feed it the file data stream so that it can play the audio data?
While searching, I came across NAudio. Is NAudio capable of doing this, or would the better option be to switch to native C/C++ code using the DirectSound APIs?
Update:
By configuring hardware, I mean setting the parameters related to audio playback. These parameters would include the sample rate (e.g. 44100 Hz), the number of channels (e.g. stereo), the storage format (e.g. 16-bit little endian), etc.
My client application is on Linux, and I have an ALSA driver in place that intercepts the PCM stream and the hw_params configuration and then sends them to the server.
End of update.
Thanks.
If you look at the latest NAudio code, you will see there are two examples in the NAudioDemo sample app that play streaming audio. One is a rudimentary chat application that sends compressed voice via UDP; the other plays streaming MP3 internet radio. I'd suggest you have a look at those and try the sample app to see if it meets your needs.
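On the playback side, the pattern in those samples boils down to roughly this sketch: build a WaveFormat from the parameters the client sends (the sample rate, bit depth and channel count below are placeholders assuming CD-quality stereo) and push the received PCM bytes into a BufferedWaveProvider that WaveOutEvent plays:

using System;
using NAudio.Wave;

class StreamedPcmPlaybackSketch
{
    static void Main()
    {
        // Build the format from the hw_params received from the client (placeholders here).
        var format = new WaveFormat(44100, 16, 2);          // sample rate, bits, channels

        var buffer = new BufferedWaveProvider(format)
        {
            BufferDuration = TimeSpan.FromSeconds(5),       // headroom for network jitter
            DiscardOnBufferOverflow = true
        };

        using var output = new WaveOutEvent();
        output.Init(buffer);
        output.Play();

        // In the network receive loop, append each chunk of PCM bytes as it arrives:
        // buffer.AddSamples(receivedBytes, 0, receivedBytes.Length);

        Console.WriteLine("Playing... press Enter to stop.");
        Console.ReadLine();
    }
}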