I am trying to run a few commands on a remote machine using Win32_Process, but I can't get it to work.
This is what I tried first:
var processClass = new ManagementClass(@"\\server.domain.co.uk\root\cimv2:Win32_Process");
var inParams = processClass.GetMethodParameters("Create");
inParams["CommandLine"] = @"echo. 2>C:\users\user.name\desktop\EmptyFile.txt";
inParams["CurrentDirectory"] = @"C:\windows\system32";
var outParams = processClass.InvokeMethod("Create", inParams, null);
But nothing happens. I also tried running this locally at root\cimv2:Win32_Process, but again there was no effect. I was able to get it working locally when calling notepad.exe instead of the command line, but this does not work on the remote computer.
How can I work out what is going wrong with this?
In outParams, which is a System.Management.ManagementBaseObject, I can see that ClassPath contains the value Evaluation timed out - could this be a clue as to why it isn't working?
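One way to start diagnosing this is to check the ReturnValue field of outParams, since Win32_Process.Create reports a documented result code there. As a minimal sketch (the helper class and its name are mine, not part of the original code), the documented codes can be mapped to readable messages:

```csharp
using System;
using System.Collections.Generic;

static class Win32ProcessCreate
{
    // Return codes documented for Win32_Process.Create.
    static readonly Dictionary<uint, string> Results = new Dictionary<uint, string>
    {
        { 0,  "Successful completion" },
        { 2,  "Access denied" },
        { 3,  "Insufficient privilege" },
        { 8,  "Unknown failure" },
        { 9,  "Path not found" },
        { 21, "Invalid parameter" },
    };

    public static string Describe(uint returnValue) =>
        Results.TryGetValue(returnValue, out var text)
            ? text
            : "Undocumented return code " + returnValue;
}
```

After calling InvokeMethod, something like `Console.WriteLine(Win32ProcessCreate.Describe((uint)outParams["ReturnValue"]));` would show whether the Create call itself thinks it succeeded.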
After reading around this and running some tests, it seems that, like processes started from CScript, these processes cannot be made interactive:
You can use Win32_Process.Create to execute a script or application on a remote computer. However, for security reasons, the process cannot be interactive. When Win32_Process.Create is called on the local computer, the process can be interactive.
I am still a bit confused, because the code I ran above did not appear in the task manager, yet the article goes on to say:
The remote process has no user interface but it is listed in the Task Manager
Either way, this suits my needs as I am running a shell command. However, it is worth noting that executing cmd commands in this way requires that they be formatted as:
cmd /c <command-goes-here>
In my case, the command needs to be run from a specific location, so I need to pass:
cmd /c cd <path-to-executable> && <command-goes-here>
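A small helper for building that command line might look like this (a sketch only; the helper name is mine, and I've added cd's /d switch, which also lets the cd cross drive letters):

```csharp
using System;

static class RemoteCmd
{
    // Wraps a command so Win32_Process.Create runs it through cmd.exe
    // from a specific working directory. /d lets cd change drives too.
    public static string Build(string workingDir, string command) =>
        $"cmd /c cd /d \"{workingDir}\" && {command}";
}
```

For example, `RemoteCmd.Build(@"C:\tools", "dir")` yields `cmd /c cd /d "C:\tools" && dir`, which can be assigned to inParams["CommandLine"].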
I'm running a script that starts and kills/cleans two jobs via a batch file, before I package it as an EXE. It works great in an elevated ISE but immediately fails in a console or command prompt.
The code I have put together is here: https://pastebin.com/FWaZD249
I tested it with:
PS1 to EXE, get the same results
Non-elevated ISE same results
Elevated console same results
Elevated CMD same results
Elevated ISE works (only after saving?)
It's really close to being done. Basically, it's just a little script that checks port 5900 for established connections, updates a form with a list of connections, and sends a little notification if someone new has connected. It runs on our print server, which is screen-shared remotely via TightVNC, so operators don't get surprised when their mouse starts moving on them.
The cmd batch code looks like this:
powershell.exe -NoExit ". C:\Users\VS-Print-Server\Desktop\Userchecker.PS1"
As I mentioned in the comment: you initialize the jobs $job1 and $job2 with $FormLib and $LisLib respectively before those variables are set/initialized. Moving lines 1 through 30 (everything before Write-Verbose -Verbose 'Before:') to the very end should make it work.
I am using a .NET library Renci.SshNet to connect to remote Solaris machine (its a VM on ESXi). It connects fine.
I use the following method to execute commands and get the standard output. This works fine on any Linux machine and with almost all commands on Solaris (except a few, which is where the issue is):
outstring = sshClient.RunCommand(command).Execute();
For example when command = "cat /etc/release | grep Solaris" -it works fine.
However, when command = "smbios -t SMB_TYPE_SYSTEM", it doesn't return anything. I tried redirecting the output to a file; the file gets created, but it is empty.
I connect to the system using PuTTY and run the command - it runs perfectly and gives the desired output.
I am perplexed by this behavior. I am logging in with a username that has root privileges, so privileges are ruled out (anyway, the same user gets the output in PuTTY).
I am wondering if there is any setting or restriction on Solaris (I am running ver 11.3) which does not allow the smbios command to run like this over a remote connection? Or it is something else? Any guidance will be extremely helpful. If any further info is required, please let me know.
Well, it turns out that it was to do with $PATH settings. When you log in through PuTTY you get an interactive login shell, so the profile scripts run and $PATH gets set as defined; hence smbios runs from PuTTY.
But a command executed over an SSH exec channel is non-interactive, so the $PATH environment variable does not get applied and the shell can't find smbios to run. If you give the full path of smbios, like /usr/sbin/smbios, it executes fine over Renci.SshNet.
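To illustrate the fix, here is a tiny helper (a sketch; the helper name is mine, and /usr/sbin as the default tool directory is taken from the smbios case above) that qualifies a bare command name with an absolute path before handing it to RunCommand:

```csharp
using System;

static class SshCommands
{
    // Non-interactive SSH exec channels skip the login profile, so $PATH
    // may not include directories like /usr/sbin. Prefix bare command
    // names with an absolute path; leave already-qualified ones alone.
    public static string Qualify(string command, string toolDir = "/usr/sbin")
    {
        var parts = command.Split(new[] { ' ' }, 2);
        if (parts[0].Contains("/"))
            return command; // already has a path component
        return parts.Length == 1
            ? $"{toolDir}/{parts[0]}"
            : $"{toolDir}/{parts[0]} {parts[1]}";
    }
}
```

Usage would then be `sshClient.RunCommand(SshCommands.Qualify("smbios -t SMB_TYPE_SYSTEM")).Execute();`.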
I have a windows service that I would like to be automatically and silently updated. I started using wyBuild to implement this, but have had some issues with it, and decided to try to build my own. I've written a standalone exe that can be called to do the update procedure: checks for a new zip file with the update, downloads it, unzips, stop the windows service, copy files from the zip, then restart the service. This exe works fine when I run it from the commandline and wasn't really difficult to write.
However, now I would like the service (the same one being updated) to shell out to the updater exe to update itself. I first tried Process.Start:
var proc = Process.Start(pathToUpdaterExe);
proc.WaitForExit(60000);
This called the updater, but when the updater stops the service, the process is killed and the update stops. I did some searching and it sounds like the solution is to use a separate AppDomain. This is what I have now:
Evidence baseEvidence = AppDomain.CurrentDomain.Evidence;
Evidence objEvidence = new System.Security.Policy.Evidence(baseEvidence);
AppDomainSetup setup = new AppDomainSetup();
var updateDomain = AppDomain.CreateDomain("updateDomain", objEvidence, setup);
updateDomain.ExecuteAssembly(updater);
AppDomain.Unload(updateDomain);
However, now I get the error System.IO.IOException: "The process cannot access the file 'C:\Program Files (x86)\Company\Service\Service.dll' because it is being used by another process" when attempting to copy over the new Service.dll
Again, I've stopped the service at this point. I've confirmed this with logging. I can't imagine what would have Service.dll still locked, so I added code to check to see what is locking it:
public static IEnumerable<Process> GetProcessesLocking(string filePath)
{
    var result = new List<Process>();
    foreach (Process proc in Process.GetProcesses())
    {
        try
        {
            if (proc.HasExited) continue;
            // A process holds the dll if it has it loaded as a module.
            foreach (ProcessModule module in proc.Modules)
            {
                if (string.Equals(module.FileName, filePath, StringComparison.OrdinalIgnoreCase))
                {
                    result.Add(proc);
                    break;
                }
            }
        }
        catch (Exception ex)
        {
            // Access to the modules of some system processes is denied.
            Log(ex.ToString());
            Log("There was an error checking " + proc.ProcessName);
        }
    }
    return result;
}
However this code indicates that nothing has a lock on the dll (result is empty and nothing is logged indicating an error).
I suspect I'm running afoul of some UAC issue that is the real cause of the IOException. The windows service runs as LocalSystem. All that to ask: How should I be running the update exe from the windows service such that it has rights to copy files in c:\Program Files?
Update
As the comments and answer suggest, Process.Start can work, but there is some nuance. You have to start cmd.exe and use it to start the updater. I also found I could not use a full path for the updater exe and that I needed to set UseShellExecute=false. This is my final working code that launches the updater from the .NET service:
var cmd = "/c start updater.exe";
var startInfo = new ProcessStartInfo("cmd.exe");
startInfo.Arguments = cmd;
startInfo.WorkingDirectory = AssemblyDirectory;
startInfo.UseShellExecute = false;
var proc = Process.Start(startInfo);
I did this exact thing - using a simpler (some might say kludgy) approach. The service:
Produces a batch command,
Downloads the new executables to a staging location,
Starts a process: cmd.exe which, in turn, runs the batch script w/o waiting for it to complete, and then
Immediately terminates itself.
The batch command:
Pings 127.0.0.1 five times,
Copies the executables to the final location and,
Restarts the service.
Works like clockwork. The ping is a reliable 5-second delay that lets the service shut down before the files are copied.
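The service side of those steps could be sketched roughly like this (a sketch only; the class, service name, and paths are placeholders, not from the original answer):

```csharp
using System;
using System.Diagnostics;
using System.IO;

static class SelfUpdater
{
    // Builds the batch script: a ~5 second ping delay so the service can
    // finish stopping, then copy the staged files and restart the service.
    public static string BuildBatch(string serviceName, string stagingDir, string installDir) =>
        "ping 127.0.0.1 -n 5 > nul\r\n" +
        $"xcopy /y /e \"{stagingDir}\\*\" \"{installDir}\\\"\r\n" +
        $"net start {serviceName}\r\n";

    public static void LaunchUpdate(string serviceName, string stagingDir, string installDir)
    {
        var batchPath = Path.Combine(Path.GetTempPath(), "update.cmd");
        File.WriteAllText(batchPath, BuildBatch(serviceName, stagingDir, installDir));
        // cmd.exe runs the script in its own process, so it survives the
        // service terminating itself immediately afterwards.
        Process.Start(new ProcessStartInfo("cmd.exe", "/c \"" + batchPath + "\"")
        {
            UseShellExecute = false,
            CreateNoWindow = true,
        });
    }
}
```

After LaunchUpdate returns, the service would stop itself (step 4 above), and the batch script takes over.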
Edit:
Just for completeness: I realized that my batch cmd is pinging 127.0.0.1, not 128.0.0.1, and so I edited this answer to reflect that. I suppose either works, but pings to 128.0.0.1 time out, whereas 127.0.0.1 resolves to "me". Since I'm only using it as a poor man's delay, it serves the purpose either way.
I have a small utility I am working on that deletes old user profiles from domain machines.
Basically, where I am stuck is looking for a better process to delete remote directories.
I know I can use System.IO and delete it via the UNC path, but I am not happy with the performance of deleting over the network. It can take hours to delete medium-sized profiles, and with dozens or hundreds of profiles or machines this is not feasible as a solution.
So this appears to be out of the question.
The best I can find appears to be PSExec calls, but I want something managed.
Are there any .NET classes that can invoke the remote machine to complete the deletion of the directory instead of relying on the calling machine?
If your client computers don't detect this as a virus, you can use it to execute remote commands on your network computers, including folder deletions:
PsExec v1.98
Introduction
Utilities like Telnet and remote control programs like Symantec's PC
Anywhere let you execute programs on remote systems, but they can be a
pain to set up and require that you install client software on the
remote systems that you wish to access. PsExec is a light-weight
telnet-replacement that lets you execute processes on other systems,
complete with full interactivity for console applications, without
having to manually install client software. PsExec's most powerful
uses include launching interactive command-prompts on remote systems
and remote-enabling tools like IpConfig that otherwise do not have the
ability to show information about remote systems.
I have no idea how it works. In .NET code the idea would be to send RPC calls to a remote application that you control, which is easy enough provided you already have said application running on the target computers. The mechanism used would be .NET Remoting or WCF.
Inspired by this answer, I made some minor modifications. I can't get it to run 100% managed, as I get error code 9 (The storage control block address is invalid) when I try to run the rd command from within the code itself.
The base functionality is blindingly fast on my small test setup, but given that it overrides the "Are you sure?" prompt, it is also fairly dangerous if you specify the wrong path, so wear your hard hat as you proceed:
If you execute echo Y | rd /S c:\Temp\test in any command shell, you'll remove C:\Temp\Test and anything below it very quickly and without warning.
But executing this solution directly in the code doesn't work. So my quick fix is to place a bat-file (called DeleteTest.bat) on the machine, containing only this line and then execute the bat file by WMI.
In my small test, it deletes ~900 files totalling ~200 MB in a second or so.
Also, in addition to the answer cited, I get the return code, so my full code becomes:
var processToRun = new[] { "c:\\Temp\\DeleteTest.bat" };
var connection = new ConnectionOptions();
connection.Username = "me";
connection.Password = "password";
var wmiScope = new ManagementScope(String.Format("\\\\{0}\\root\\cimv2", "MyRemoteMachine"), connection);
var wmiProcess = new ManagementClass(wmiScope, new ManagementPath("Win32_Process"), new ObjectGetOptions());
var result = wmiProcess.InvokeMethod("Create", processToRun);
Console.WriteLine("Creation of process returned: " + result);
You will obviously also need the bat file to be generated (by code or pre-generated) and copied to the destination, but that should be trivial.
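Generating the script and copying it over could be done via the machine's administrative share, assuming you have admin rights on the target. A sketch (the helper class and its names are mine):

```csharp
using System.IO;

static class BatDeploy
{
    // echo Y pre-answers rd's "Are you sure?" confirmation prompt.
    public static string BuildDeleteScript(string directoryToDelete) =>
        $"echo Y | rd /S {directoryToDelete}";

    // Translates a local path on the remote machine (e.g. c:\Temp\x.bat)
    // into its administrative-share UNC form (\\machine\c$\Temp\x.bat).
    public static string AdminSharePath(string machine, string localPath) =>
        $@"\\{machine}\{localPath[0]}$\{localPath.Substring(3)}";

    public static void Deploy(string machine, string batLocalPath, string script) =>
        File.WriteAllText(AdminSharePath(machine, batLocalPath), script);
}
```

With that in place, `BatDeploy.Deploy("MyRemoteMachine", @"c:\Temp\DeleteTest.bat", BatDeploy.BuildDeleteScript(@"c:\Temp\test"))` would stage the bat file before the WMI Create call above runs it.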
I am trying to launch a ClickOnce application via an .appref-ms shortcut on a remote machine using WMI, but cannot succeed. The below code works fine if I try to run notepad.exe.
ManagementPath pm = new ManagementPath(@"\\server\root\cimv2:Win32_Process");
ManagementClass processClass = new ManagementClass(pm);
//Get an input parameters object for this method
ManagementBaseObject inParams = processClass.GetMethodParameters("Create");
//Fill in input parameter values
inParams["CommandLine"] = @"C:\Documents and Settings\Start Menu\Programs\New\New App.appref-ms";
//Execute the method
ManagementBaseObject outParams = processClass.InvokeMethod("Create", inParams, null);
Try launching the .appref-ms shortcut via rundll32:
inParams["CommandLine"] = @"rundll32.exe dfshim.dll,ShOpenVerbShortcut ""C:\New App.appref-ms""";
Alternatively, instead of relying on the shortcut path, you could use the application deployment URL (you can see it by opening the .appref-ms file in a text editor):
inParams["CommandLine"] = @"rundll32.exe dfshim.dll,ShOpenVerbApplication http://github-windows.s3.amazonaws.com/GitHub.application";
Keep in mind that Win32_Process.Create cannot create interactive processes remotely.
What is the error that you are getting?
BTW, there seem to be two problems with the command-line path that you are using.
The Start Menu folder is per-user: C:\Documents and Settings\<username>\Start Menu, so the path you are using is missing the username.
I am not sure, but I think you can't run a program remotely that lies inside a user's profile. Try installing the program to some other location, like C:\Program Files, and then try to call it.
Apart from that, if you mention the exact error you are getting, then it would be helpful to diagnose the problem.