I have a C# library that interacts with web servers. (Specifically, it implements various means of binding local data to a remote source.) All the web server needs to do is return simple strings in response to simple HTTP requests. (Later, I'll add handling for web sockets.)
I would like to have a self-sufficient test suite for this library; so it seemed to me that a reasonable way to do this was to implement a simple C# web server (which so far seems delightfully simple). Tests could then start up a thread with the server, then run against it, and then shut down the server thread.
However, I'm running into difficulty. Is there a better or more canonical way to do this?
Note: Complex solutions are not going to be better.
Note2: Wrapping Web services in wrappers and stubbing those would defeat much of the purpose of the library in the first place.
There are other ways of creating simple web servers.
I use Node.js to create mock web servers for testing things like C# web clients. Python is another option.
If you'd like to stay with C#, here's another simple solution that can be adapted:
https://codehosting.net/blog/BlogEngine/post/Simple-C-Web-Server
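To give an idea of what the in-process C# route can look like, here is a minimal sketch of a mock server built on HttpListener; the port, prefix and canned responses are placeholders, so adapt them to whatever your library expects.

    using System;
    using System.Net;
    using System.Text;
    using System.Threading.Tasks;

    public sealed class MockWebServer : IDisposable
    {
        private readonly HttpListener _listener = new HttpListener();

        // e.g. new MockWebServer("http://localhost:8081/")
        public MockWebServer(string prefix)
        {
            _listener.Prefixes.Add(prefix);
        }

        public void Start()
        {
            _listener.Start();
            Task.Run(async () =>
            {
                while (_listener.IsListening)
                {
                    HttpListenerContext ctx;
                    try { ctx = await _listener.GetContextAsync(); }
                    catch (HttpListenerException) { break; }   // listener was stopped
                    catch (ObjectDisposedException) { break; }

                    // Return a simple string keyed off the requested path.
                    byte[] body = Encoding.UTF8.GetBytes("hello from " + ctx.Request.Url.AbsolutePath);
                    ctx.Response.ContentType = "text/plain";
                    ctx.Response.OutputStream.Write(body, 0, body.Length);
                    ctx.Response.Close();
                }
            });
        }

        public void Dispose()
        {
            if (_listener.IsListening) _listener.Stop();
            _listener.Close();
        }
    }

A test fixture can then construct the server against something like http://localhost:8081/, exercise the library against that address, and dispose the server in teardown, which keeps the suite self-sufficient.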
I am currently involved in a simple-to-medium-complexity IoT project. The main purpose of our application is gathering data from our devices, analyzing that data, and calculating statistics.
On the server side we run an MVC application. Up until now we have used Hangfire to schedule the calculations. Hangfire is an amazing tool for scheduling emails and other simple tasks, but for more advanced things it's too slow. The calculations can take up a lot of time and are processor-intensive (we are trying to optimize them, though), so we need to run them in a background task; a simple API call won't be enough.
I thought about splitting the application into multiple parts: the website, the core, and a Windows service.
The problem is, I've never tried that before and I have no idea what the best practice is for achieving that kind of thing. I searched for examples and articles, but all I found were suggestions to use Hangfire and/or Quartz.NET.
Does anyone have any resources on the best practice for building an MVC application and a Windows service, and on how they could communicate (probably through a queue)? What is the best practice in such a situation?
Although there may be many different possible ways to connect a site with a Windows service, I'd probably choose one of the following two, based on your statements:
Direct communication
One way of letting your site send data to your backend windows service would be to use WCF. The service would expose an endpoint. For simplicity's sake this could be a basicHttpBinding or a netTcpBinding. The choice should be made based on your specific requirements; if the data is small then basicHttp may be "sufficient".
The advantage of this approach is that there's relatively little overhead needed: you'll just have to set up the Windows service (which you'll have to do anyway) and open a port for the WCF binding. The site acts as the client, the service as the server. There's nothing special about it just because the client is an MVC site. You can take almost any WCF tutorial as a starting point.
Note that instead of WCF you could use another technology like .NET Remoting or even sockets just as well. Personally, I often use WCF because I'm quite used to it, but this choice is pretty opinion based.
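To sketch what the direct WCF route could look like: the Windows service self-hosts an endpoint and the MVC site calls it like any other WCF service. The contract name, operation and address below are made up purely for illustration.

    using System.ServiceModel;

    // Hypothetical contract shared between the MVC site and the Windows service.
    [ServiceContract]
    public interface ICalculationService
    {
        [OperationContract]
        void StartCalculation(int deviceId);
    }

    // Implementation living inside the Windows service.
    public class CalculationService : ICalculationService
    {
        public void StartCalculation(int deviceId)
        {
            // hand the heavy, processor-intensive work off to a background worker here
        }
    }

    public static class CalculationHost
    {
        // Called from the Windows service's OnStart; the address is a placeholder.
        public static ServiceHost Open()
        {
            var host = new ServiceHost(typeof(CalculationService));
            host.AddServiceEndpoint(typeof(ICalculationService),
                                    new NetTcpBinding(),
                                    "net.tcp://localhost:9000/calculations");
            host.Open();
            return host;
        }
    }

The site would then call the operation through a generated proxy or a ChannelFactory<ICalculationService>, exactly as in any basic WCF tutorial.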
Queued communication
If reliability and integrity are crucial for your project, then using a queue might be a good idea. Again, depending on your needs, different products may come into consideration. If you don't need much monitoring and out-of-the-box management goodies, then even a very simplistic technology like MSMQ may be sufficient.
If those requirements matter more to you, then maybe you should look for something else. Just recently I got in touch with Service Bus for Windows Server (SBWS). It's the Azure Service Bus's little brother, which can be used on-premises on your own Windows Server. The nice thing about it is that it comes at no extra charge, as it's already covered by your Windows Server license.
As with the first point: MSMQ and SBWS are just two examples. There are plenty of other products that may be usable, like NServiceBus, ZeroMQ or others; you name it.
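As a rough illustration of the MSMQ variant: the queue path and message shape below are placeholders, and this needs a reference to System.Messaging.dll plus the MSMQ Windows feature installed.

    using System.Messaging;

    public static class CalculationQueue
    {
        private const string Path = @".\private$\calculations";

        // Called from the MVC site: enqueue a request and return immediately.
        public static void Enqueue(int deviceId)
        {
            if (!MessageQueue.Exists(Path))
                MessageQueue.Create(Path);

            using (var queue = new MessageQueue(Path))
            {
                queue.Send(deviceId, "calculation request");
            }
        }

        // Called from the Windows service: block until a request arrives.
        public static int Dequeue()
        {
            using (var queue = new MessageQueue(Path))
            {
                queue.Formatter = new XmlMessageFormatter(new[] { typeof(int) });
                var message = queue.Receive();   // blocks; consider a timeout in real code
                return (int)message.Body;
            }
        }
    }

The site only enqueues and returns immediately; the Windows service pulls requests off the queue at its own pace, which is what gives you the reliability mentioned above.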
We're building a C# application that allows loading custom plugin DLLs and executing them.
Each DLL contains some task, and we'd like that task to be transparently executed either locally or on some remote server.
I have examined various solutions for this, and so far the best solution that was proposed was to use WCF.
I'd like to understand, since I'm currently only partway through basic WCF tutorials, whether it is at all possible to dynamically deploy new code using WCF and have it executed remotely.
The way I see it, I have two different scenarios:
Remote machine has a base "execution" library deployed.
Remote machine has no WCF service installed on it currently.
With option #1, I guess I could have some functionality to send the DLL across, and execute it remotely, since the execution library already knows how to do that.
With option #2, I would basically need to deploy everything (somehow) from scratch, and then send a command to run it.
Is this scenario possible at all? Do you have any tips for performing this kind of task?
Also, if you have any good WCF tutorials, please share them (I'm currently reading up on MSDN).
Thanks!
The important thing to remember here is that WCF can be used to transfer data, but not to transfer execution logic. You can send the result of an addition from one end to the other, but you cannot send some arbitrary instruction (like adding or whatever) and let the other end magically execute it.
In other words, if you have a WCF client on the remote end, you could send it your DLL file (as binary data), and the client could then dynamically load and execute it using reflection (and that's without mentioning all the obvious security and compatibility concerns this would raise).
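To make that concrete, here is a rough sketch of the "ship the DLL as bytes and load it via reflection" idea. The contract, the type/method convention and the complete lack of sandboxing are all assumptions for illustration, not a recommended design.

    using System;
    using System.Reflection;
    using System.ServiceModel;

    // Hypothetical contract: the caller pushes a plugin assembly as raw bytes.
    [ServiceContract]
    public interface IPluginRunner
    {
        [OperationContract]
        string Run(byte[] assemblyBytes, string typeName, string methodName);
    }

    public class PluginRunner : IPluginRunner
    {
        public string Run(byte[] assemblyBytes, string typeName, string methodName)
        {
            // Load the uploaded assembly into the current AppDomain and invoke the
            // requested method via reflection. In real code you would isolate this
            // (separate AppDomain or process) and validate the caller.
            Assembly assembly = Assembly.Load(assemblyBytes);
            Type type = assembly.GetType(typeName, throwOnError: true);
            object instance = Activator.CreateInstance(type);
            object result = type.GetMethod(methodName).Invoke(instance, null);
            return result?.ToString();
        }
    }

Note that the binding's maxReceivedMessageSize would probably need to be raised for anything but tiny assemblies, and in practice you'd want to load the bytes somewhere other than the host's own AppDomain.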
Another, maybe easier, option would be to send scripts instead of compiled code and execute them with some interpreter on the server side. But whatever trick you use, you'll need to do a lot of work outside of WCF, as sending instructions is not what WCF is designed for.
Yesterday I asked what technology I should use to create dynamic web content here:
PHP, AJAX and Java
The suggested methods were JSP, jQuery, etc. But I thought that since I'm a .NET developer with no experience in web development, though I do have experience in WPF and C#, maybe I should go with Silverlight. The main problem then would be how to communicate with the core part of my system, which is implemented in Java.
So the main question would be: what is the best [and easiest to learn] method to send a piece of data to the Java part, get the result, and use it in Silverlight? A tutorial or simple example would be nice.
Thanks a lot in advance.
You should use Java web services, as stated. Use WCF to invoke the Java WS by adding a Service Reference in Visual Studio by its URL, then use the automatically generated proxy classes (located in Reference.cs) to invoke the WS. This is easy, but remember that Silverlight WS invocations are always asynchronous, so you must catch the Completed event to get the results of the invocation. Web services are slow, but if the machines are on the same LAN, an invocation may take only a few milliseconds.
I don't think pipes are your solution, as Silverlight executes in a sandbox and has many restrictions on what you can do.
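For illustration, the asynchronous call from Silverlight typically looks something like the following; CalculatorServiceClient, AddAsync and the AddCompleted event are placeholders for whatever "Add Service Reference" generates from your Java WSDL (see Reference.cs).

    // All proxy names below are placeholders generated by "Add Service Reference".
    public class JavaServiceCaller
    {
        public void CallJavaService()
        {
            var client = new CalculatorServiceClient();

            client.AddCompleted += (sender, e) =>
            {
                if (e.Error != null)
                {
                    // surface the fault or connectivity problem to the UI
                    return;
                }
                int sum = e.Result;   // value returned by the Java web service
            };

            client.AddAsync(2, 3);    // returns immediately; the Completed event fires later
        }
    }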
This will depend on many factors; however, a relatively easy approach would be to use Java web services. On the .NET side, the WSDL will be picked up and transformed into a proxy class by WSDL.exe from the Windows SDK. If, however, these two systems are on the same server (and intend to stay that way), you may decide to use pipes.
I am starting on a fairly basic Server/Client application (logic-wise), but I am a bit confused as to what I should use for my needs. It looks like there are a few options, but basically I am going to have a Master Server and X amount of client applications (one per dedicated machine). The main purpose of this setup is so that I can basically do the following...
- Issue a command to the server (console app) via an ASP front end to install software on one of the remote clients.
- Server tells the client to download a zip package (from one of various FTP sites) to a location and extract it to a specific path.
I am not positive, but it looks like C# has sockets and then some sort of WebClient type of deal. I am assuming sockets would be the best route to take, used asynchronously (each remote client connected on its own thread, dealing with the server independently of the others).
Any information on this would be great!
Without going into too much detail for your specific requirements, I would definitely look at WCF.
It encompasses a lot of the existing remoting, client / server, web services scenarios in a very complete and secure framework.
Client Server Programming with WCF
WebClient allows you to make HTTP requests, so I don't think it's very relevant here.
There are many approaches you can take for this app.
One is of course going with WCF, which provides about a million times more options than you will need. However, WCF does have a learning curve, and in particular it's hard to understand what exactly is hidden behind all the abstractions without prior experience. Furthermore, this solution is not available if you are targeting .NET 2.0.
You can also implement a simple TCP client/server model using sockets. While you can program against raw sockets, .NET also offers the convenience classes System.Net.Sockets.TcpListener for the server and System.Net.Sockets.TcpClient for the clients. This approach is much closer to the metal, but this is a tradeoff: it's much easier to understand what exactly you are doing, but you will have to implement a fair bit of functionality yourself.
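A minimal sketch of that TcpListener/TcpClient approach, assuming a made-up port and a simple one-line-per-command text protocol, might look like this:

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;
    using System.Threading.Tasks;

    public static class TcpDemo
    {
        // Server: accept clients and handle each one on its own task.
        public static async Task RunServerAsync()
        {
            var listener = new TcpListener(IPAddress.Any, 9500);   // port is a placeholder
            listener.Start();
            while (true)
            {
                TcpClient client = await listener.AcceptTcpClientAsync();
                _ = Task.Run(() => HandleClient(client));
            }
        }

        private static void HandleClient(TcpClient client)
        {
            using (client)
            using (var reader = new StreamReader(client.GetStream()))
            using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
            {
                string command = reader.ReadLine();     // e.g. "INSTALL <package>"
                writer.WriteLine("OK " + command);      // acknowledge the command
            }
        }

        // Client: connect, send one command, read the reply.
        public static void SendCommand(string host, string command)
        {
            using (var client = new TcpClient(host, 9500))
            using (var reader = new StreamReader(client.GetStream()))
            using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
            {
                writer.WriteLine(command);
                Console.WriteLine(reader.ReadLine());
            }
        }
    }

Each accepted client gets its own task, which matches the one-connection-per-client model you described.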
I'm writing a windows service to do some daily processing, and I want to have a user-friendly way to interact with it. I'll just be doing basic things like checking its status and viewing logs, though I may decide I want to throw in a function call or two as well. After doing some research, it sounds like I need a separate application to perform these functions, since the service will run independently of any user that's logged into the host machine. My idea is to have this application interact with the service through some kind of interface, but I'm not sure where to begin.
What would be the simplest way to have an application communicate with a separate service? Would I use COM, WCF, a message queue, or something else entirely? I know there are probably a few ways to do this, so I would love to hear some pros and cons if possible.
Edit: The service and the application will both be running on the same machine.
Use WCF with NetNamedPipeBinding (which allows only IPC on the same machine) or .NET Remoting. If you want to do it quickly, choose the technology you are more familiar with. If you are not familiar with either of these technologies, choose WCF, because it is the newer one and you are more likely to use it again in the future, so experience with it will be useful.
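If you go the WCF route, a minimal sketch of the named-pipe client side could look like the following. The IServiceStatus contract and the net.pipe address are made up for this example; your Windows service would host an implementation of the same contract at that address using a ServiceHost with a NetNamedPipeBinding.

    using System.ServiceModel;

    // Hypothetical contract for the status/log queries mentioned in the question.
    [ServiceContract]
    public interface IServiceStatus
    {
        [OperationContract]
        string GetStatus();
    }

    public static class StatusClient
    {
        // Called from the management application; the pipe address is a placeholder.
        public static string QueryStatus()
        {
            var factory = new ChannelFactory<IServiceStatus>(
                new NetNamedPipeBinding(),
                new EndpointAddress("net.pipe://localhost/MyService/status"));

            IServiceStatus proxy = factory.CreateChannel();
            try
            {
                return proxy.GetStatus();
            }
            finally
            {
                ((IClientChannel)proxy).Close();
                factory.Close();
            }
        }
    }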
Ideally you would create a separate application and use WCF to communicate between your service and this application.
But there is a 'cheaper' way, which is to implement your own simple web server using HttpListener. See http://msdn.microsoft.com/en-us/magazine/cc163879.aspx
This makes it easy to accept a few simple commands, and you can send them using any web browser.
For viewing logs, why not just tail the log files (using e.g. BareTail)?
Skip WCF, and just use plain .NET Remoting. So much easier. Why they call it deprecated, God knows.
Edit: Seeing as it runs on the same PC, the transport would be named pipes; IIRC WCF supports this too.