I am building an ASP.NET MVC application that should access the client's webcam in the browser and send the video stream to my web server, where screenshots must be taken from the stream at specific moments. I have reviewed a lot of technologies but cannot figure out which one fits.
Flash is not suitable because I do not want a separate Flash media server. WebRTC seems to establish a peer-to-peer connection between clients, bypassing the web server (I do not see how to capture a WebRTC video stream on a web server). IIS Media Services requires streaming encoders and is not about working directly with the client's webcam.
And for Silverlight I have only found examples of using it as a video player and screenshot maker, not as a broadcaster. But I must take the screenshots on the server, not on the client.
So I have lost my way.
Can anybody point me in the right direction?
Short answer: you can capture WebRTC video/audio on the server. Just provide a WebRTC client (running on the server) that communicates with the other WebRTC clients. You may want to check the open-source native client by Google (https://code.google.com/p/webrtc/). However, this means that you do not need ASP.NET for the capture at all: you can adapt the aforementioned client to write the captured screenshots to a specific directory and process them with ASP.NET from there.
Related
I have a web page for data processing. The web page waits for data to process.
I also have a C# application that produces the data. I want to send the data to the open web page, but I don't want to use sockets, POST/GET methods, or any other web request.
The web page and the C# application are both client-side. They run on the same Windows machine at the same time.
I want to send data to the web page from the C# app. This operation needs to be done through the Windows OS or some command-line-based trigger mechanism.
The web page (a Chrome tab or a Firefox tab; it doesn't matter) would have a tab id to work with. Using this id, I might be able to send data to the web page from the C# app.
But I couldn't really find anything useful.
Is there any way to do this? Is it even possible?
Any advice appreciated.
Here is how I would approach this, assuming you are using either WinForms or WPF:
In your application, embed a web browser:
WPF: https://www.wpf-tutorial.com/misc-controls/the-webbrowser-control/
WinForms: https://learn.microsoft.com/en-us/dotnet/desktop/winforms/controls/webbrowser-control-windows-forms?view=netframeworkdesktop-4.8
Load the web page in that browser
Establish two-way communication between your client application and the web page:
https://learn.microsoft.com/en-us/dotnet/desktop/winforms/controls/implement-two-way-com-between-dhtml-and-client?view=netframeworkdesktop-4.8
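For the WinForms case, the page-side half of that bridge could look roughly like this. All method names here are illustrative assumptions; the C# host would expose a [ComVisible(true)] object via webBrowser.ObjectForScripting, which the page reaches as window.external:

```javascript
// Page-side half of a WinForms two-way bridge (method names illustrative).
// The host exposes a [ComVisible(true)] object via
// webBrowser.ObjectForScripting; the page reaches it as window.external.

// Pure helper so the parsing logic is testable outside the browser.
function parsePayload(json) {
  var data = JSON.parse(json);
  return data.items.length;
}

// Called BY the C# host, e.g. via Document.InvokeScript("receiveData", ...).
function receiveData(json) {
  var count = parsePayload(json);
  document.getElementById("status").textContent =
    "Got " + count + " item(s) from host";
  return count; // the return value flows back to the C# caller
}

// Calling INTO the C# host from the page.
function notifyHost(message) {
  if (window.external && typeof window.external.OnPageEvent !== "undefined") {
    window.external.OnPageEvent(message); // hypothetical host-exposed method
  }
}
```

On the C# side the host would call Document.InvokeScript to push data in, and receive calls back on the scripting object; the JSON framing is just one convenient choice.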
If you are using other technologies, let me know and I'll update the answer.
One solution is a raw TCP/IP connection in the browser, but it is not yet fully supported and implemented. If by "web requests" you mean online requests, then what @DevRacker said is the best I know of, too.
However, consider that TCP/IP, WebSockets, and even REST APIs are frequently used for inter-process communication (IPC) as well, when there is no online transmission of data and the data/command is only transmitted on the local machine.
If I were you, I would use WebSockets, or maybe a simpler solution such as Socket.IO.
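A sketch of the browser side of such a local WebSocket bridge. The port and message shape are assumptions; on the .NET side, a library such as Fleck or SignalR could host the server inside the C# app:

```javascript
// Browser side of a local-machine WebSocket bridge. The C# app is assumed
// to host a WebSocket server on localhost:8181 (port and message shape are
// illustrative; a library such as Fleck or SignalR can provide the server).

// Pure helper: decode one { type, payload } frame sent by the host.
function decodeFrame(text) {
  var msg = JSON.parse(text);
  if (!msg.type) throw new Error("frame missing 'type'");
  return msg;
}

function connectToHost(onData) {
  var ws = new WebSocket("ws://localhost:8181");
  ws.onmessage = function (evt) {
    var msg = decodeFrame(evt.data);
    if (msg.type === "data") onData(msg.payload); // hand the data to the page
  };
  ws.onclose = function () {
    // the desktop app went away; retry after a short delay
    setTimeout(function () { connectToHost(onData); }, 1000);
  };
  return ws;
}
```

Because the connection is initiated by the page, this needs no tab id: any open tab that runs this script will receive whatever the desktop app pushes.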
I am creating a "tracking system" for my Android phone. Basically, the phone transmits its latitude/longitude to a C# program, which receives the coordinates and displays them on a map. I have managed to get the phone's latitude/longitude coordinates, but I need a way to transmit that data to my C# application running on my PC. I know my way around C# and Java but never really got into network programming. Is there a way to do this?
I'd look into C# web services; they are very powerful. The communication protocol is SOAP (Simple Object Access Protocol), which is a popular and well-supported standard.
Good starting reference: Your first C# Web Service
I can't speak to the C# side of it, but if you write a C# server to receive POST requests, the Apache HttpClient makes it relatively simple to send requests from the phone. There are a bunch of other questions on SO that will help with setting up the client side on your phone if you poke around a bit.
You have different ways to do it.
The first is to send the data over the cloud and interact through a distant server, so both devices have to be connected to the Internet.
The second is to send the data over a local network, so both devices have to be on the same network and know each other's IPs.
The third is to use adb forward, which lets you build a socket connection between the two devices over a USB cable. The requirements are that the phone has to be in debug mode (I think) and that the drivers are installed on the PC so that the phone is recognized by ADB.
I can add one more: using a Bluetooth connection. Find information here.
The correct way to do it would be to build a WCF service, which can receive data over many different protocols. This could be a very powerful solution that can be extended in the future.
The fastest and easiest, IMO, would be to create an ASP.NET MVC application, define a POCO model for your longitude/latitude, and create an action that takes your model as a parameter. Then use HttpClient and HttpPost on Android to send the data. The trick here is to name the variables in your POST request the same as the ones you define in your C# model.
I am writing a C#/.NET-based application that requires publishing video and audio streams to a Red5 Media Server and retrieving the same published stream in another application, first on a local network and later over the Internet.
In short, I intend to use Red5 as an intermediary or proxy for the transmission of video and audio data between two clients.
[Client 1] <- Video and Audio -> [Red5] <- Video and Audio -> [Client 2]
I am looking for a .NET implementation (library) of the RTMP protocol that can help me publish a stream to the media server, retrieve the published stream, and play it back in my application.
I searched on SO and came across these options:
Borrocoli RTMP Client Library
FluorineFx.NET
WebORB.NET
Each has some limitations.
The Borrocoli RTMP library has only playback support for audio/video streams; there is no option to publish a video/audio stream to the media server. I have played with the library and gone through its examples, but to no avail. If I am wrong, please correct me.
FluorineFx.NET says that it supports the NetStream.Publish(), NetStream.AttachAudio() and NetStream.AttachVideo() methods, but in the latest snapshot of the code there is nothing like this. The same is true for their production release: the NetStream class doesn't have the stated methods, nor any other methods that could help publish streaming content TO the media server.
WebORB.NET: I have not explored it, but as is evident from their licensing page, the free version works with IIS only. The enterprise version seems to have full support for publishing streaming video...
Questions:
Is it possible that I can use WebOrb.Net library in conjunction with Red5 Media Server?
Will the free version allow me to publish audio and video to Red5 media server?
Is there any other free alternative I can use if the above options do not work out?
You can use ffmpeg to send a stream to the Red5 media server.
Set the source video as ffmpeg's input and the output to Red5's RTMP endpoint, something like this:
ffmpeg -re -i file.mp4 -c copy -f flv rtmp://server/live/streamName
See this answer for an example of integrating ffmpeg in C#.
You can use the WebORB library for peer-to-peer video streaming between Flex and .NET clients via the RTMP protocol. Your process can be done as follows:
1. Develop Flex client code that connects to the server and subscribes to receive server-side updates with the CPU readings.
The code also includes a chart to visualize the received data.
2. The server-side 'application handler', which plugs into WebORB, polls the CPU and delivers the data to the client. It will work; try it.
Use RTMPdump's librtmp. It's written in C, but you should be able to write C# wrappers for it easily enough.
We are starting to develop a new application and I'm searching for information/tips/guides on application architecture.
Application should:
read the data from an external (USB) device
send the data to the remote server (through internet)
receive the data from the remote server
make a video call to the call (support) center
receive a video call from the call (support) center
support touch screens
In addition, some of the data should also be visible through a web page.
So I was thinking about:
On the server side:
use a database (probably MS SQL)
use an ORM (NHibernate) to map data from the DB to domain objects
create a business-logic layer in C#
create web (WCF) services (for the client application)
create an ASP.NET MVC application (for the additional requirement above) to enable viewing the data through the browser
On the client side I would use a WPF 4 application which will communicate with the external device and the WCF services on the server.
So far so good. Now the problem begins: I have no idea how to create the video-call (outgoing or incoming) part of the application.
I believe there is no problem communicating with the microphone, speakers, and camera from WPF/C#. But how do I communicate with the call center? What protocol and encoding should be used?
I think that I will need to create some kind of server which will:
keep a list of operators in the call center and track which operators are occupied and which are free
keep a list of connected end users
receive incoming calls from end users and delegate each call to a free operator
delegate calls from the call center to the end user
Any info, link, anything on where to start would be much appreciated.
Many thanks!
It sounds like you are in the support business, not in the business of making video-conferencing software. Maybe look at licensing third-party SDKs to fill in the video-conferencing component of the application; it should end up being a lot faster to get your application live than reinventing the wheel. I did a quick Google search for "video conferencing SDK" and several popped up immediately. I haven't used any of them, so I am not about to start recommending one, but I would think that is a good place to start.
I suggest using an existing service (Skype, Google Chat, ...) and controlling it from your WPF application(s) for both the operator and the client.
Use a SIP solution with a video codec. SIP is an open standard, and there are many SIP SDKs out there.
We need to capture a live video stream from WebRTC (or any other mechanism that captures from the client's webcam, even if it is not supported on all browsers, as a PoC).
This live video needs to be handled by a server component (ASP.NET MVC / Web API). I imagine the code on the server will look like:
[HttpPost]
public ActionResult HandleVideoStream(Stream videoStream)
{
    // Handle the live stream
}
I am looking for any keyword or helpful link.
We have already implemented a way to send individual frames as base64 JPEGs, but this is not useful at all: the base64 encoding adds huge overhead, and a proper video encoding could send the video far more efficiently (sending only the differences between frames, using VPx (VP8) for example). The required solution needs to capture video from the client's webcam and send it live (not recorded) to the server (ASP.NET) as a stream, or as chunks of data representing the new video data.
Your question is too broad, and asking for off-site resources is considered off-topic on Stack Overflow. To avoid opinion-prone statements I will restrict the answer to general concepts.
Flash/RTMP
WebRTC is not yet available in all browsers, so the most widely used way of capturing webcam input from a browser is currently via a plugin. The most common solution uses the Adobe Flash Player, whether people like it or not. This is due to the H.264 encoding support in recent versions, along with AAC, MP3, etc. for audio.
Streaming is accomplished using the RTMP protocol, which was initially designed for Flash communication. The protocol works over TCP and has multiple flavors, such as RTMPS (RTMP over TLS/SSL for encryption) and RTMPT (RTMP encapsulated in HTTP for firewall traversal).
The stream usually uses the FLV container format.
You can easily find open-source projects that use Flash to capture webcam input and stream it to an RTMP server.
On the server side you have two options:
implement a basic RTMP server to talk directly to the sending library and read the stream
use one of the open-source RTMP servers and implement just a client in ASP.NET (you can also transcode the incoming stream on the fly, depending on what you're trying to do with your app)
WebRTC
With WebRTC you can either:
record small media chunks on a timer and upload them to the server, where the stream is reconstructed (this requires concatenating and re-stamping the chunks to avoid discontinuities). See this answer for links.
use the peer-to-peer communication features of WebRTC with the server being one of the peers.
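A minimal browser-side sketch of the first option, using the MediaRecorder API (not supported in every browser; the upload URL and chunk naming are illustrative assumptions):

```javascript
// Sketch of the "record small chunks" approach: record with MediaRecorder
// and POST each chunk to the server, which re-stamps and concatenates
// them. The upload URL and naming scheme are illustrative.
function startChunkedUpload(stream, uploadUrl, chunkMs) {
  var recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
  var seq = 0;
  recorder.ondataavailable = function (evt) {
    if (evt.data && evt.data.size > 0) {
      var form = new FormData();
      form.append("chunk", evt.data, chunkName(seq++));
      fetch(uploadUrl, { method: "POST", body: form }); // server reassembles
    }
  };
  recorder.start(chunkMs); // emit one blob every chunkMs milliseconds
  return recorder;
}

// Pure helper: stable, ordered names so the server can concatenate chunks
// in the right order even if uploads arrive out of sequence.
function chunkName(seq) {
  return "chunk-" + String(seq).padStart(6, "0") + ".webm";
}
```

The stream argument would come from getUserMedia; on the server, each POST arrives as an ordinary multipart upload that an MVC or Web API action can read.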
A possible solution for the second scenario, which I haven't personally tested yet, is offered by Adam Roach:
The browser retrieves a webpage with JavaScript in it.
The browser executes the JavaScript, which:
gets a handle to the camera using getUserMedia,
creates an RTCPeerConnection,
calls createOffer and setLocalDescription on the RTCPeerConnection,
sends a request to the server containing the offer (in SDP format).
The server processes the offer SDP and generates its own answer SDP, which it returns to the browser in its response.
The JavaScript calls setRemoteDescription on the RTCPeerConnection to start the media flowing.
The server starts receiving DTLS/SRTP packets from the browser, which it can then do whatever it wants with, up to and including storing them in an easily readable format on a local hard drive.
Source
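The browser half of the steps above can be sketched as follows. The /webrtc/offer endpoint is an assumption; the server-side peer must parse the offer SDP and return its answer SDP:

```javascript
// Browser half of the offer/answer exchange described in the steps above.
// The /webrtc/offer endpoint is illustrative; the server peer must parse
// the offer SDP and return its answer SDP.

// Pure helper: sanity-check the server's answer before applying it.
function isValidAnswer(desc) {
  return !!desc && desc.type === "answer" && typeof desc.sdp === "string";
}

async function startSendingToServer() {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const pc = new RTCPeerConnection();
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Send the offer SDP to the server and apply the answer it generates.
  const resp = await fetch("/webrtc/offer", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ type: offer.type, sdp: offer.sdp })
  });
  const answer = await resp.json();
  if (!isValidAnswer(answer)) throw new Error("unexpected answer SDP");
  await pc.setRemoteDescription(new RTCSessionDescription(answer));
  return pc; // DTLS/SRTP media now flows to the server peer
}
```

The hard part is entirely on the server: something has to terminate ICE/DTLS/SRTP and decode the media, which is what the native WebRTC stack or a gateway provides.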
This will use VP8 and Vorbis inside WebM over SRTP (UDP; it can also use TCP).
Unless you can implement RTCPeerConnection directly in ASP.NET via a wrapper, you'll need a way to forward the stream to your server app.
The PeerConnection API is a powerful feature of WebRTC. It is currently used by the WebRTC version of Google Hangouts. You can read: How does Hangouts use WebRTC.
Agreed that this is an off-topic question, but I recently bumped into the same issue/requirement, and my solution was to use MultiStreamRecorder from WebRTC Experiments. This basically gives you a "blob" of the audio/video stream every X seconds, and you can upload it to your ASP.NET MVC or Web API controller as demonstrated here. You can either live-process the blobs on the server part by part, or concatenate them into a file and process it once the stream stops. Note that the APIs used in this library are not fully supported in all browsers; for example, there is no iOS support as of yet.
My server-side analysis required the user to speak full sentences, so in addition I used PitchDetect.js to detect silences in the audio stream before sending the partial blob to the server. With this type of setup, you can configure your client to send a partial blob to the server after the user finishes talking, rather than every X seconds.
As for achieving a 1-2 second delay, I would suggest looking into WebSockets for delivery rather than HTTP POST; but you should play with these options and choose the best channel for your requirements.
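A sketch of pushing recorded blobs over a WebSocket rather than HTTP POST: one persistent connection avoids per-request overhead, which helps when chasing a 1-2 second delay. The buffering threshold is an illustrative assumption:

```javascript
// Push recorded chunks over a WebSocket instead of HTTP POST. The
// threshold below is illustrative: if the socket's send buffer grows,
// we skip chunks rather than let the live stream fall further behind.
var MAX_BUFFERED = 1024 * 1024; // skip chunks if ~1 MB is still queued

// Pure helper: send now, or skip this chunk so the stream doesn't lag?
function shouldSend(bufferedAmount, maxBuffered) {
  return bufferedAmount < maxBuffered;
}

function pushChunk(ws, blob) {
  if (ws.readyState === WebSocket.OPEN &&
      shouldSend(ws.bufferedAmount, MAX_BUFFERED)) {
    ws.send(blob); // binary frame; the server reassembles the stream
    return true;
  }
  return false; // skipped: socket not open, or sender too far behind
}
```

pushChunk would be called from the recorder's dataavailable handler; dropping late chunks trades completeness for latency, which is usually the right trade for a live view.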
Most IP cameras these days use H.264 encoding or MJPEG. You aren't clear about what sort of cameras are being used.
I think the real question is: what components are out there for authoring/editing video, and which video format do they require? Only once you know what format you need to be in can you transcode/transform your video as necessary so you can handle it on the server side.
There are any number of media servers that transform/transcode, and something like FFmpeg or Unreal Media Server can transform, decode, etc. on the server side to get the stream into a format you can work with. Most of the IP cameras I have seen just use an H.264 web-based browser player.
EDIT: Your biggest enemy is going to be delay. 1-2 seconds of delay is going to be difficult to achieve.