I'm wondering what the preferred architecture would be for the following situation:
I'm required to have a .NET application that will perform batch upload of multiple data files concurrently to a SQL Server. This will be invoked from a WPF application which will allow the user to select the files and the destination tables, as well as reporting on the individual progress for each upload (including error messages). I have absolutely no problem writing the code for any of this. However, there is a requirement that the user is able to close the WPF application altogether and for the upload process to continue. Further, if the user restarts the WPF application from the same machine, it should be able to get a handle on the existing uploads and report on the status as if the program were never closed.
My question is what are the ways of achieving this and which would seem the most standard/suitable?
I've considered simply not actually closing the WPF application but hiding all the windows, but this seems a cheat. Would it be best to create a WCF service on the server where the upload is taking place and simply upload the file? I don't think I can do that and report progress % etc though. What about a locally-running Windows Service, can I achieve a similar effect? Should I be thinking of MemoryMappedFiles?
Appreciate all your thoughts.
Because you are talking about a long-running task, I would use a local Windows Service communicating with your WPF application through MSMQ. For example, each file to upload can be represented by one MSMQ message. Your WPF application puts messages into the queue, and the Windows Service, periodically and regardless of whether the WPF application is running, takes them from the queue and processes them. This provides a simple and reliable channel for handing off tasks (uploads).
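Roughly, the handoff could look like the sketch below; the queue path and the UploadRequest shape are assumptions for illustration:

    using System.Messaging; // add a reference to System.Messaging.dll

    // Hypothetical message body: one upload task per MSMQ message.
    public class UploadRequest
    {
        public string FilePath { get; set; }
        public string DestinationTable { get; set; }
    }

    public static class UploadQueue
    {
        private const string Path = @".\Private$\FileUploads"; // assumed queue path

        // WPF side: enqueue one message per file to upload.
        public static void Enqueue(UploadRequest request)
        {
            if (!MessageQueue.Exists(Path))
                MessageQueue.Create(Path);
            using (var queue = new MessageQueue(Path))
                queue.Send(request, request.FilePath);
        }

        // Windows Service side: blocks until a message arrives, even if the
        // WPF app that sent it has long since closed.
        public static UploadRequest Dequeue()
        {
            using (var queue = new MessageQueue(Path))
            {
                queue.Formatter = new XmlMessageFormatter(new[] { typeof(UploadRequest) });
                return (UploadRequest)queue.Receive().Body;
            }
        }
    }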
To provide the internal status of the Windows Service to its clients (your WPF application), I would host a WCF endpoint inside it with a simple service that reports, for example, on progress.
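A minimal sketch of such a status contract; the contract name, operation, and named-pipe address are all illustrative:

    using System.ServiceModel;

    [ServiceContract]
    public interface IUploadStatus
    {
        // Returns 0-100 for a file currently being uploaded (illustrative).
        [OperationContract]
        int GetProgressPercent(string filePath);
    }

    // Hosted from the Windows Service's OnStart, for example:
    //   var host = new ServiceHost(typeof(UploadStatusService),
    //       new Uri("net.pipe://localhost/Uploader"));
    //   host.AddServiceEndpoint(typeof(IUploadStatus),
    //       new NetNamedPipeBinding(), "status");
    //   host.Open();

When the WPF application is restarted on the same machine, it can reconnect to this endpoint and resume reporting as if it had never been closed.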
I have two separate programs. One is a winform that hosts a running process with output/input being redirected through the form. The process is a canned product with little chance of being able to modify it myself.
The second program is a service that can poll data from the running process, but it doesn't interact with the winform directly.
My goal is to send a string from the service to the winform to then interact with the running process. OR, I wouldn't mind being able to send commands directly to the process but I'm betting that can't happen without modifying the running process source.
I've looked at Named Pipes, but I'm not sure that's the best means. Both programs are hosted on the same machine. I'm just not sure what my options are. Any ideas? =)
The full range of IPC (Inter-Process Communication) options available in Windows is outlined on MSDN:
Clipboard
COM
Data Copy
DDE
File Mapping
Mailslots
Pipes
RPC
Windows Sockets
Then there are mechanisms outside of Windows:
MSMQ or similar queue systems
Record commands/state in a database or file
Of all these, Named Pipes is probably the best fit for your particular application. I have used them for similar things in the past. They are easy to set up and use.
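As a rough sketch of how the two sides could talk, assuming a pipe name of "ProcessCommands" (everything here is illustrative; in the winform you would call Receive on a background thread so the UI stays responsive):

    using System.IO;
    using System.IO.Pipes;

    public static class CommandPipe
    {
        // Service side: wait for the winform to connect, then send one command.
        public static void Send(string command)
        {
            using (var server = new NamedPipeServerStream("ProcessCommands"))
            {
                server.WaitForConnection();
                using (var writer = new StreamWriter(server) { AutoFlush = true })
                    writer.WriteLine(command);
            }
        }

        // Winform side: block until a command arrives, then forward it to the
        // hosted process's redirected standard input.
        public static string Receive()
        {
            using (var client = new NamedPipeClientStream(".", "ProcessCommands"))
            {
                client.Connect();
                using (var reader = new StreamReader(client))
                    return reader.ReadLine();
            }
        }
    }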
There are multiple options for achieving this communication. Mainly, you can use Named Pipes (via WCF) or MSMQ.
I'm trying to get started with an application that definitely requires some GUI for configuration management, and the application has to poll a web service about every hour or so (to check for updates/messages). Also, the application is expected to run constantly in the background/system tray.
I'm looking for some guidance on the overall architecture for this application design. Can this be a straight up WPF app or would it be better it is a windows service because of the polling and because it is expected for the application to be running all the time? Do you guys have any suggestions?
Firstly, services tend not to have a GUI. They can, but it's not advised.
What I would do is have two applications: the service itself, which performs the monitoring in question, and a user-interface application (that runs on startup) which provides an interface to the service. Communication between the two can be handled in a variety of ways.
The advantage to this is that your service will run even if there isn't a user logged on, and the UI part is present only when a user is.
To allow your GUI to communicate with the Windows Service, you can expose WCF services on the Windows Service that provide the operations you need (e.g. Start, Stop, GetStatus, etc.).
See this article on MSDN for a starting point: http://msdn.microsoft.com/en-us/library/ms733069.aspx
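As a rough sketch (all type names are illustrative), the hosting could look like this inside the Windows Service, using a named pipe since both sides are on the same machine:

    using System;
    using System.ServiceModel;
    using System.ServiceProcess;

    [ServiceContract]
    public interface IMonitorControl
    {
        [OperationContract] void Start();
        [OperationContract] void Stop();
        [OperationContract] string GetStatus();
    }

    // Illustrative implementation; the real one would control the monitoring.
    public class MonitorControl : IMonitorControl
    {
        public void Start() { /* begin polling */ }
        public void Stop() { /* stop polling */ }
        public string GetStatus() { return "Idle"; }
    }

    public class MonitorHostService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            _host = new ServiceHost(typeof(MonitorControl),
                new Uri("net.pipe://localhost/MonitorHost"));
            _host.AddServiceEndpoint(typeof(IMonitorControl),
                new NetNamedPipeBinding(), "control");
            _host.Open();
        }

        protected override void OnStop()
        {
            if (_host != null) _host.Close();
        }
    }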
We are developing a web application in ASP.NET and C#. The requirement here is to interact with a third-party exe developed in Fortran 77. This third-party exe produces an output file after being provided with some inputs and then shuts down. In a Windows desktop single-user application this is easily possible using System.Diagnostics.Process and the events provided therein. But on the web there will be a multi-user environment, and many calls will be made to this exe. What are the best possible ways to handle such an exe in a web application?
Is it fine if we invoke the exe on each user request, since the exe shuts down after generating the output file? Or
Is it possible to use a Windows service? Or
Any other approach?
Thanks in advance.
-Prasad
Typically, invoking a separate process to do some job (for a request) does not scale well once your number of requests starts growing. That said, if the process invocation is not going to happen frequently, you should be OK. The number of concurrent requests, throughput, etc. will really depend on your server hardware, and the best bet would be to load test the server. As such, you should use the Process class to launch the process and get the work done.
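A minimal sketch of launching it per request; the exe path and argument format are assumptions:

    using System;
    using System.Diagnostics;

    public static class SolverRunner
    {
        public static void Run(string inputFile, string outputFile)
        {
            var startInfo = new ProcessStartInfo
            {
                FileName = @"C:\legacy\solver.exe", // hypothetical path
                Arguments = inputFile + " " + outputFile,
                UseShellExecute = false,
                CreateNoWindow = true // no window on the web server
            };

            using (var process = Process.Start(startInfo))
            {
                process.WaitForExit(); // the exe exits after writing its output file
                if (process.ExitCode != 0)
                    throw new InvalidOperationException(
                        "Solver exited with code " + process.ExitCode);
            }
        }
    }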
Another possible issue is that your legacy executable may not support multiple instances. It's unlikely, but there are quite a few desktop Windows applications that check for an existing instance. In that case, you cannot launch the process concurrently, and the only way would be to create queuing logic - you can create an in-process queue (in your web application) or create an external application (such as a Windows service) that will do the queuing.
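For the in-process variant, one simple sketch is a BlockingCollection with a single consumer thread, which guarantees only one instance of the exe runs at a time (names are illustrative):

    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    public static class ExeQueue
    {
        private static readonly BlockingCollection<string> Inputs =
            new BlockingCollection<string>();

        static ExeQueue()
        {
            // A single long-running consumer serializes all launches.
            Task.Factory.StartNew(() =>
            {
                foreach (string inputFile in Inputs.GetConsumingEnumerable())
                    SolverRunner.Run(inputFile, inputFile + ".out"); // see sketch above
            }, TaskCreationOptions.LongRunning);
        }

        // Called from the web request: returns immediately.
        public static void Enqueue(string inputFile)
        {
            Inputs.Add(inputFile);
        }
    }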
There is an alternate approach that is useful when the time taken for the process to complete is large (so that you cannot block your web requests until the job is complete) and/or you need to scale your app to support more load. Essentially the idea is to use the producer-consumer pattern, where your web server adds requests to a persisted queue (e.g. a table in a database) and multiple machines/servers run a job/Windows service that reads from this queue and runs the process to generate the file.
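The consumer side of such a persisted queue could be sketched like this; the JobQueue table and its columns are assumptions:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    public class JobConsumer
    {
        private readonly string _connectionString;

        public JobConsumer(string connectionString)
        {
            _connectionString = connectionString;
        }

        public void Run(CancellationToken token)
        {
            while (!token.IsCancellationRequested)
            {
                string inputFile = TryClaimJob();
                if (inputFile == null)
                {
                    Thread.Sleep(TimeSpan.FromSeconds(5)); // nothing pending
                    continue;
                }
                // Run the exe for this job (see the earlier Process sketch),
                // then update the row to 'Done' or 'Failed'.
            }
        }

        // Atomically claim one pending job so that consumers on different
        // machines never pick up the same row.
        private string TryClaimJob()
        {
            using (var conn = new SqlConnection(_connectionString))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    @"UPDATE TOP (1) JobQueue
                      SET Status = 'Running'
                      OUTPUT inserted.InputFile
                      WHERE Status = 'Pending'", conn);
                return cmd.ExecuteScalar() as string;
            }
        }
    }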
I need to execute a Process where a file is downloaded using ftp and then parsed and the results used to update some tables in a database.
The system is built using a WPF client using WCF services to talk with the database.
I need to start the process from the WPF application. Now my question is:
Should I download the file on the client and then use the WPF application to parse the data and do the update using the services?
Or should I download the file to the server where the services are hosted and proceed to update the database there? And if so, how do I provide feedback to the client that the process is running/finished, etc.?
I prefer the second alternative, but I am not sure how to implement the feedback on the background process...
Thanks
There could be a number of solutions to your problem, but I can think of two at the moment.
You could maintain a flag in a database table, which a background thread in the WPF application polls at set intervals, reading the flag to update the status.
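For the first option, a DispatcherTimer could do the polling from the WPF side; the table, column, and control names are assumptions:

    using System;
    using System.Data.SqlClient;
    using System.Windows.Threading;

    // Inside the WPF window's code-behind:
    private void StartStatusPolling(string connString, int jobId)
    {
        var timer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(10) };
        timer.Tick += (s, e) =>
        {
            using (var conn = new SqlConnection(connString))
            {
                conn.Open();
                var cmd = new SqlCommand(
                    "SELECT Status FROM ImportJobs WHERE JobId = @id", conn);
                cmd.Parameters.AddWithValue("@id", jobId);
                // Tick fires on the UI thread, so updating the control is safe.
                StatusText.Text = (string)cmd.ExecuteScalar();
            }
        };
        timer.Start();
    }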
You can make use of a FileSystemWatcher, if you are on the intranet and the server process can write to a file location that the client machine can see. The FileSystemWatcher can then raise events in the WPF application.
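And a sketch of the second option; the share path, filter, and control name are assumptions:

    using System;
    using System.IO;

    // Inside the WPF window's code-behind:
    private void WatchForCompletion()
    {
        var watcher = new FileSystemWatcher(@"\\server\share\jobs", "*.done");
        watcher.Created += (s, e) =>
        {
            // FileSystemWatcher raises events on a thread-pool thread, so
            // marshal back to the UI thread before touching controls.
            Dispatcher.Invoke(new Action(() =>
                StatusText.Text = "Finished: " + e.Name));
        };
        watcher.EnableRaisingEvents = true;
    }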
I just wanted some input on a project that I'm working on. Basically, I'm creating a service that monitors and processes new files in a directory specified by a configuration file and other parameters through the command line. It should also output text via the command line, i.e. when a user types in '-help' it will display its usage. A goal is to also make it so that the user can change the configuration file on the fly, so the service will constantly monitor the config file for changes and adjust accordingly.
The challenge I'm seeing is trying to consolidate the ability to enter commands through the command line and display output, as if it were a console app, with having the service able to process these commands while it is running in the Service Control Manager. So far in my research, the solutions I've stumbled upon show how to create a Windows Service app that can also run as a console app, but it operates as one or the other, not both. Any suggestions or input is appreciated.
UPDATE:
Thank you all for your suggestions, I did some reading on various Windows supported IPC mechanisms. I boiled my options down to File Mapping, Named Pipes and RPC. I'm assuming for now that the Windows Service app and Helper Console app will be on the same computer and will not need to communicate over a network. I'll be looking at Named Pipes first.
The service needs to offer a way to communicate with it, but this can't be done directly; it has to be done through some sort of IPC (inter-process communication), which could be .NET Remoting, WCF, or TCP/IP. You will have to write a helper program that parses the command line and uses this IPC to send the commands to the service, which can act appropriately on receiving them.
I would write a console app that sends requests to the service and displays its responses. If you wanted to be clever, you could include both parts in the same executable. I can't think of any way to make the service write to and read from the console while it's running as a service.
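A rough sketch of that "same executable" idea, using Environment.UserInteractive to decide which role to play (MonitorService and the client stub are illustrative):

    using System;
    using System.ServiceProcess;

    static class Program
    {
        static void Main(string[] args)
        {
            if (Environment.UserInteractive)
            {
                // Started from a console: act as the client. Parse args
                // (e.g. -help) and forward commands to the running service
                // over the chosen IPC channel, printing its responses.
                RunConsoleClient(args);
            }
            else
            {
                // Started by the Service Control Manager: run as the service.
                ServiceBase.Run(new MonitorService());
            }
        }

        private static void RunConsoleClient(string[] args)
        {
            // Named-pipe or WCF client code goes here.
            Console.WriteLine("usage: monitor [-help] ...");
        }
    }

    public class MonitorService : ServiceBase
    {
        protected override void OnStart(string[] args) { /* begin monitoring */ }
        protected override void OnStop() { /* stop monitoring */ }
    }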