I've created an exe file that does some maintenance work on my server.
I want to be able to launch it from the website that sits on the server.
The exe has to be launched on the server itself and not on the client.
My instincts tell me it's not possible, but I had to check with you guys.
If I need to set certain permissions / security - I can.
Yes, it can be done, but it's not recommended.
An ideal solution for running maintenance scripts/executables is to schedule them with cron on Unix/Linux systems or with Scheduled Tasks on Windows. Some advantages of the automated approach vs. a manual remote launch (a registration example follows the list):
The server computer is self-maintaining. Clients can fail and people can forget. As long as the server is running the server will be keeping itself up to date, regardless of the status of client machines or persons.
When will the executable be launched? Every time a certain page is visited? What if the page is refreshed? For a resource-intensive script/executable this can severely degrade server performance. You'll need to code rules to handle multiple requests and monitor running maintenance processes. Cron & scheduled tasks handle these possibilities already.
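As a concrete illustration of the scheduled-task route, a small one-off C# setup program could register the exe via schtasks (the task name, exe path and time below are placeholders, not details from the question); on Unix/Linux the equivalent would be a single crontab entry pointing at the script:

using System.Diagnostics;

class RegisterMaintenanceTask
{
    static void Main()
    {
        // One-off setup: register the maintenance exe as a daily 3 AM scheduled task.
        // Task name, exe path and time are placeholders, not details from the question.
        var schtasks = Process.Start("schtasks.exe",
            "/Create /TN \"ServerMaintenance\" /TR \"C:\\Tools\\maintenance.exe\" /SC DAILY /ST 03:00");
        schtasks.WaitForExit();
    }
}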
A very crude option, assuming IIS: change Execute Access from "Scripts Only" or "None" to "Scripts and Executables".
To make this less crude, you should have the executable implement a CGI interface (if that is under your control).
And, if you want to use ASP.NET to add authorization/authentication, the C# code to do this would be:
var process = new System.Diagnostics.Process();
var startInfo = new System.Diagnostics.ProcessStartInfo(@"C:\file.exe");
process.StartInfo = startInfo;
process.Start();
process.WaitForExit();
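If you do expose it this way, it's worth gating the call behind an explicit authorization check so only trusted users can trigger it. A minimal sketch of that idea (the page class and the "Admins" role name are assumptions for illustration, not from the original answer):

using System;
using System.Web.UI;

// Hypothetical code-behind for the page that triggers the exe; "Admins" is a placeholder role.
public partial class RunMaintenance : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!User.Identity.IsAuthenticated || !User.IsInRole("Admins"))
        {
            Response.StatusCode = 403; // refuse anyone who isn't an authenticated admin
            Response.End();
            return;
        }

        // ... launch the process exactly as shown above ...
    }
}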
It's possible, but almost certainly it's a bad idea. What environment/webserver? You should just need to set the relevant 'execute' permissions for the file.
I really suggest that you don't do this, however, and instead configure the task to run automatically in the background. The reasoning is that, configured badly, you could end up letting people run any executable and, depending on other factors, completely take over your machine.
Depends what language you're using; most server-side scripting languages give you a way to execute shell commands. For example:
$result=`wc -l /etc/passwd`;
executes a Unix command from Perl.
Most web languages (I know at least Java and PHP) allow you to execute a command line from within a program.
Related
I'd like to write some code that uses BITS for copying very large files between disks on the same server (the second disk being a SAN-level clone/snapshot).
I looked into BITS as I thought it would be a good way to get progress/percentage-complete data on the transfers, as well as resume functionality, etc.
I have a lot of hosted PowerShell and I thought I'd have a stab at using the inbuilt BITS cmdlets, as this would be a super quick way of doing it; I could write wrappers to get the stuff I need, etc. Unfortunately I ran into this:
When you use *-BitsTransfer cmdlets from within a process that runs in a noninteractive context, such as a Windows service, you may not be able to add files to BITS jobs, which can result in a suspended state. For the job to proceed, the identity that was used to create a transfer job must be logged on. For example, when creating a BITS job in a PowerShell script that was executed as a Task Scheduler job, the BITS transfer will never complete unless the Task Scheduler's task setting "Run only when user is logged on" is enabled.
Doing anything through an impersonated Powershell runspace throws up the following error:
The operation being requested was not performed because the user has not logged on to the network. The specified service does not exist. (Exception from HRESULT: 0x800704DD)
My web service is running as the AppPoolIdentity; I impersonate when I need to do things. It makes sense that this doesn't work via hosted PowerShell, but can anyone think of a workaround? If this isn't possible, do I have any alternatives?
I was thinking of using BITS Compact Server as an alternative, but the documentation is ancient.
Have you thought of using robocopy? Running with the /ZB switch has always worked well for me, and you can improve performance with /J, which enables unbuffered I/O. For output, run with /ETA and /TEE as well.
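A rough sketch of driving robocopy from C# with those switches and reading its output for progress (the source, destination and log paths are placeholders):

using System;
using System.Diagnostics;

class RobocopyRunner
{
    static void Main()
    {
        // Source, destination and log paths are placeholders for illustration.
        var psi = new ProcessStartInfo
        {
            FileName = "robocopy.exe",
            Arguments = @"D:\SnapshotVolume\Data E:\Restore\Data /ZB /J /ETA /TEE /LOG:C:\Logs\robocopy.log",
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };

        using (var proc = Process.Start(psi))
        {
            string line;
            while ((line = proc.StandardOutput.ReadLine()) != null)
            {
                // Robocopy prints per-file percentage lines; surface them as progress.
                Console.WriteLine(line);
            }
            proc.WaitForExit();

            // Robocopy exit codes below 8 mean the copy succeeded (possibly with skips/extras).
            Console.WriteLine("Robocopy exit code: " + proc.ExitCode);
        }
    }
}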
I'm currently running some computationally intensive simulations, but they are taking a long time to complete. I've already split the workload across all the available physical cores in my processor. What I'm wondering is how to go about splitting the workload further and assigning it to other computers. I'm contemplating buying a couple Xeon servers and using them for the number crunching.
The one big issue I have is that I'm currently running the program within Visual Studio (Ctrl F5) as there are two methods which I'm constantly making small changes to.
Any suggestions on how/if it's possible to assign the workload to other computers / if it's possible to still run the program with VS or would I need to create an *.exe each time I wanted to run it?
It depends on the problem you're solving.
You can use map/reduce and Hadoop if it's easily parallelizable, like SETI@home.
You can use something like MPI if it's not, like linear algebra.
Isn't the crux of your problem in this statement: "The one big issue I have is that I'm currently running the program within Visual Studio (Ctrl F5) as there are two methods which I'm constantly making small changes to"?
Is it the "one big issue" because if you distribute then you can't afford modifying the code on all of the nodes when doing the job so you think about something distributing it for you? If this is the case then I assume that you already know how to split the algo or data in a way that nodes can take take of small parts of the job.
If that's the case - sorry if I misunderstood - then externalise the part that you are "constantly making small changes to" into a file or a database, encoded in some simple or more elaborate form depending on what you are changing, so that your nodes don't need to change constantly. Deploy the code to all nodes, point them at the DB or file that contains the varying bit, and enjoy your new Ferrari!
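A minimal sketch of that idea, assuming the "varying bit" can be expressed as key=value pairs in a shared text file (the UNC path and setting names are made up for illustration):

using System;
using System.Collections.Generic;
using System.IO;

class NodeConfig
{
    // Reads key=value pairs from a shared settings file so the worker code
    // on each node never has to change; only the file does.
    static Dictionary<string, string> Load(string path)
    {
        var settings = new Dictionary<string, string>();
        foreach (var line in File.ReadAllLines(path))
        {
            var parts = line.Split(new[] { '=' }, 2);
            if (parts.Length == 2)
                settings[parts[0].Trim()] = parts[1].Trim();
        }
        return settings;
    }

    static void Main()
    {
        // \\fileserver\share\sim-settings.txt is a placeholder UNC path.
        var settings = Load(@"\\fileserver\share\sim-settings.txt");
        double stepSize = double.Parse(settings["StepSize"]);
        int iterations = int.Parse(settings["Iterations"]);
        Console.WriteLine("Running with StepSize={0}, Iterations={1}", stepSize, iterations);
        // ... run the simulation with these parameters ...
    }
}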
You could use the WMI service to start your process on the remote computers. You would build your exe to a shared directory that is visible to the remote computer, then use WMI on the remote computer to launch the exe.
There are plenty of examples out there to do this, but to illustrate, a simple method which assumes no authentication complications is to use a .VBS script file:
strComputer = "acomputer"
strCommandLine = "calc.exe"
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
Set objProcess = objWMIService.Get("Win32_Process")
intReturnValue = objProcess.Create(strCommandLine, , , intPID)
WScript.Echo "Process ID: " & intPID
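Since the rest of this thread is C#, roughly the same call can be made from .NET with System.Management; this sketch mirrors the script above ("acomputer" and calc.exe are the same placeholders, and it assumes your account has administrative rights on the remote machine):

using System;
using System.Management; // add a reference to System.Management.dll

class RemoteLauncher
{
    static void Main()
    {
        var scope = new ManagementScope(@"\\acomputer\root\cimv2");
        scope.Connect();

        using (var processClass = new ManagementClass(scope, new ManagementPath("Win32_Process"), null))
        {
            var inParams = processClass.GetMethodParameters("Create");
            inParams["CommandLine"] = "calc.exe";

            var outParams = processClass.InvokeMethod("Create", inParams, null);
            Console.WriteLine("Return value: " + outParams["ReturnValue"]);
            Console.WriteLine("Process ID: " + outParams["ProcessId"]);
        }
    }
}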
You can also use PsExec from SysInternals to handle all the details of making this work.
After building the exe in Visual Studio, you could run it on your local machine to ensure it does what you want, then when you are ready to launch it on the remote systems, you can execute a batch script similar to the above VBS to launch the exe on the remote systems.
You will still need to provide some mechanism to divide up the workload so that each client knows what part of the problem it is supposed to work on. You could provide this information in the command line used to start the remote apps, in a config file in the directory with the exe, in a database table, or use a separate command-and-control type server that the clients connect back to (although with that approach you'll soon get to the stage where you would have been better off with learning to use an existing solution rather than rolling your own).
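As a sketch of the command-line option, each client could be started with its node index and the total node count and derive its slice of the work from those two numbers (the argument layout and the interleaved split are illustrative, not prescriptive):

using System;

class Worker
{
    static void Main(string[] args)
    {
        // Example invocation: Worker.exe 2 8  -> this instance is node 2 of 8.
        int nodeIndex = int.Parse(args[0]);
        int nodeCount = int.Parse(args[1]);
        int totalItems = 1000000; // total number of work items; illustrative only

        // Interleaved split: each node takes every nodeCount-th item from its own offset.
        for (int i = nodeIndex; i < totalItems; i += nodeCount)
        {
            ProcessItem(i);
        }
    }

    static void ProcessItem(int item)
    {
        // ... the actual number crunching for this item ...
    }
}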
You may also want to include a remote 'kill switch' of some sort. You could use PsKill from SysInternals, or if you want a more graceful shutdown, something simple like the existence of a particular file in the same directory as the exe can serve as a flag for the remote processes to shut themselves down.
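The flag-file variant of that kill switch can be as small as a helper the worker loop polls between items (the file name "stop.flag" is just a placeholder convention):

using System;
using System.IO;

static class KillSwitch
{
    // Returns true when a "stop.flag" file (placeholder name) appears next to the exe,
    // telling the worker loop to finish the current item and exit gracefully.
    public static bool Requested()
    {
        return File.Exists(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "stop.flag"));
    }
}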
You could also consider adding CSScript support to the client so that the remote client programs are static and load and compile a CSScript file to do the work. This might be useful if you encounter some kind of difficulty in frequently redeploying and restarting the client programs, or if you need them to all be slightly different (you might write a program to generate separate script files for each client for example).
I have a console application that writes information retrieved from a database to a txt file. Until now I have manually executed the executable generated by the console application.
Now I need to automate the invocation of the .exe from my web application, so that each time a specific condition happens in my code-behind I can run the .exe with a "fire and forget" logic.
My goals are:
1) Users must not be affected in any way by the console application's execution (the SQL queries and txt file generation might take around 3 to 5 minutes), hence the "fire and forget" logic delegated to a separate process.
2) Since the executable will still be run manually in some cases, I would prefer having all the logic in one place, in order to avoid the risk of different behaviour.
Can I safely use System.Diagnostics.Process to achieve this?
var cmd = new System.Diagnostics.Process();
cmd.StartInfo.FileName = "Logger.exe";
cmd.Start();
Does the process end automatically, or do I have to set a timeout and explicitly close it? Is it "safe" in a web application environment, with different users accessing the web application, to let them call the executable without the risk of concurrent accesses?
Thanks.
EDIT:
Changed to use the built in class for more clarity, thanks for the hint.
As far as the mechanics go, I assume CommandLineProcess wraps Process? If so, I don't see anything necessarily wrong with it at first glance. I just have some issue with running this as an executable from a web application, as you are more likely to reduce security to get it working than to rearchitect (if you follow the normal path I see in development).
If you encapsulate the actual business code in a class library, you can run the code in the web application. The main rule is that the folder it saves to should be under webroot (physically or logically) so you don't have to reduce security. But, if the logic is encapsulated, you can run the "file creator" in the web process without spinning up a Process.
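A sketch of that in-process route, assuming the export logic has been moved into a class library (Logger.Core.Exporter.Run is a made-up name) and the site targets .NET 4.5.2 or later:

using System.Threading;
using System.Web.Hosting;

public static class ExportTrigger
{
    // Called from the code-behind when the triggering condition is met.
    // QueueBackgroundWorkItem (ASP.NET 4.5.2+) registers the work with the runtime so the
    // app domain is not recycled mid-export. Logger.Core.Exporter.Run is a hypothetical
    // method holding the shared export logic from the class library.
    public static void Fire()
    {
        HostingEnvironment.QueueBackgroundWorkItem((CancellationToken token) =>
        {
            Logger.Core.Exporter.Run();
        });
    }
}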
Your other option is to wrap the process in a service (I like a non-HTTP WCF service, but you can go with a Windows service if you want). I would only go this direction if it makes sense to follow a SOA path with a service endpoint. As this is likely to be isolated to a single application, in-process makes more sense (unless you are saving to a directory outside of webroot).
Hope this makes sense.
Yes, it will die on its own - provided that the .exe file terminates on its own. It will run with the same credentials as the web server.
Keep in mind this is considered unsafe, since you are executing code based on whatever your web app is doing. However, the problem is with .exe files being executed this way in general, and not with the actual users accessing the app.
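For the fire-and-forget part specifically, a minimal sketch along these lines (the exe path is a placeholder):

using System.Diagnostics;

public static class LoggerLauncher
{
    // Fire and forget: start Logger.exe and return immediately. Not calling WaitForExit()
    // keeps the web request from blocking for the 3-5 minutes the export takes.
    // The path is a placeholder.
    public static void Launch()
    {
        var startInfo = new ProcessStartInfo(@"C:\Tools\Logger.exe")
        {
            UseShellExecute = false,
            CreateNoWindow = true
        };

        // Disposing the Process handle does not kill the child process;
        // Logger.exe still exits on its own when it is done.
        using (Process.Start(startInfo)) { }
    }
}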
Similar question here How do I run a command line process from a web application?
I am writing software for a company in C# which is intended to run on the Windows platform.
One of my requirements is to allow the user to schedule back ups.
That is, the user will set a time where the database will be backed up automatically by the computer.
On the Linux platform I would have used cron, but I am a bit lost on the Windows platform. I do not want the software itself to have to be open for the backup to run; I want the backup to be carried out even if the software itself is not running.
My best bet is to use the Windows scheduler or create a custom service which will run at start-up.
Can anyone point me to how to actually achieve this? Any constructive suggestions are welcomed.
Thanks.
For info the Windows "AT" command is somewhat similar to cron. You can get help from the command line thus:
AT /?
I wouldn't necessarily recommend it for a DB backup. Either create a Windows scheduled task, or, to back up a SQL Server database, use SQL Server's built-in scheduler.
Another alternative would be to create a Windows service to handle the task. Then you could write any code needed (e.g. backup / email logs, etc.) quickly and easily, and it would work without your application running.
There are ways to accomplish the same task with the Task Scheduler built into Windows, but this is just the alternative that I would prefer.
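If you do go the Windows service route, a minimal skeleton might look like this (BackupDatabase and the fixed 24-hour interval are placeholders; a real version would read the schedule the user picked from shared configuration):

using System.ServiceProcess;
using System.Timers;

// Minimal backup service skeleton. BackupDatabase() and the fixed 24-hour interval
// are placeholders; a real version would read the user's chosen schedule from
// configuration shared with your application.
public class BackupService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(24 * 60 * 60 * 1000); // once a day
        _timer.Elapsed += (s, e) => BackupDatabase();
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    private void BackupDatabase()
    {
        // ... run the backup (e.g. call your existing backup code) ...
    }

    public static void Main()
    {
        ServiceBase.Run(new BackupService());
    }
}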
Is it possible to run a Windows Forms application or a console application under the system account? For example, an ASP.NET application can run under the system account by changing the machine.config file:
<processModel userName="machine" password="AutoGenerate" />
This is to give more privileges to the program ...
It sounds like you're attacking the symptom rather than the problem. What exactly does your program need to do that requires additional permissions? Maybe there's a different way of accomplishing that task without requiring any kind of elevation.
Yes. You can run any app under the system account. One technique is to launch it as a scheduled task, or by using the "at" command line utility.
Unfortunately, however, since Windows Vista, applications run in this way can't interact with the user, since they run in a different session.
This means that running a WinForms (or any kind of GUI, really) application in this way is kinda pointless. Similarly for a console app, if you want to see the output.
If it's for a one-off, you can probably live with it. Otherwise, you should be looking at creating a Windows Service, which can be configured to run under any user account (including SYSTEM). If you want to interact with it, you'll need to implement a separate app that talks to it through (e.g.) .NET remoting.
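For completeness, the installer class that makes such a service run under the SYSTEM account could look roughly like this (the service name is a placeholder):

using System.ComponentModel;
using System.Configuration.Install;
using System.ServiceProcess;

// Installer picked up by InstallUtil.exe; Account = LocalSystem makes the service
// run under the SYSTEM account. "MyWorkerService" is a placeholder name.
[RunInstaller(true)]
public class WorkerServiceInstaller : Installer
{
    public WorkerServiceInstaller()
    {
        var processInstaller = new ServiceProcessInstaller
        {
            Account = ServiceAccount.LocalSystem
        };
        var serviceInstaller = new ServiceInstaller
        {
            ServiceName = "MyWorkerService",
            StartType = ServiceStartMode.Automatic
        };
        Installers.Add(processInstaller);
        Installers.Add(serviceInstaller);
    }
}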
Can't you do that by launching it from a scheduled task in Windows?
That depends on what your goal is. If you want it to run under the system account and let a user interact with it, you can't do that. If you absolutely need to do this, your best bet is to create a service that handles the operations that require additional privileges and runs as System, and then connect to that service from a GUI running as the user. However, if you go this route, realize that you're creating a hole in the security boundary between what a standard user can do and what System can do, so be sure you protect the connection between the GUI and the service and limit the scope of the service to only what you absolutely need it to do.
As lassevk mentions, if you just need to do this once or occasionally, you can use runas to run in another security context but still have an interactive GUI / console.
On the other hand, if you just want it to run unattended at a certain time, you should be able to use the task scheduler like Martin suggests.