Single instance dotnetcore cli app on linux - c#

I am interested in how to enforce a single-instance policy for .NET Core console apps. To my surprise, there doesn't seem to be much out there on the topic. I found this Stack Overflow question, How to restrict a program to a single instance, but it doesn't seem to work for me on .NET Core with Ubuntu. Has anyone here done this before?

Variation of @MusuNaji's solution at: How to restrict a program to a single instance
private static bool AlreadyRunning()
{
Process[] processes = Process.GetProcesses();
Process currentProc = Process.GetCurrentProcess();
logger.LogDebug("Current proccess: {0}", currentProc.ProcessName);
foreach (Process process in processes)
{
if (currentProc.ProcessName == process.ProcessName && currentProc.Id != process.Id)
{
logger.LogInformation("Another instance of this process is already running: {pid}", process.Id);
return true;
}
}
return false;
}
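Called from Main, the check is just an early return (a small usage sketch, assuming AlreadyRunning lives in the same class):
static void Main(string[] args)
{
    if (AlreadyRunning())
    {
        // Another instance owns the work; bail out early.
        return;
    }
    // ... normal start-up ...
}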

This is a little more difficult on .NET Core than it should be, due to the problem with mutex checking on Linux/MacOS (as reported above). Theyouthis's solution also isn't helpful: all .NET Core apps are run via the CLI, which has a process name of 'dotnet', so if you are running multiple .NET Core apps on the same machine the duplicate-instance check will trigger incorrectly.
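For completeness: on newer .NET Core/.NET runtimes, named mutexes do appear to work across processes on Linux as well, so if your target runtime supports them the classic approach still applies. A minimal sketch, with the caveat that this is exactly the mechanism reported as unreliable in earlier releases, so test it on your platform first:
using System;
using System.Threading;

class Program
{
    // Any stable, app-specific string will do; this name is just a placeholder.
    private const string MutexName = "MyApp-single-instance";

    static void Main(string[] args)
    {
        // createdNew is true only in the first process that creates the mutex.
        using var mutex = new Mutex(initiallyOwned: true, MutexName, out bool createdNew);
        if (!createdNew)
        {
            Console.WriteLine("Another instance is already running. Exiting.");
            return;
        }
        // ... run the application; keep the mutex referenced for its lifetime ...
    }
}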
A simple way to do this that is also robust across platforms is to open a file for writing when the application starts and close it at the end. If the file fails to open, another instance is already running, and you can handle that in the try/catch. Opening the file with FileStream will also create it if it doesn't already exist.
try
{
lockFile = File.OpenWrite("SingleInstance.lck");
}
catch (Exception)
{
Console.WriteLine("ERROR - Server is already running. End that instance before re-running. Exiting in 5 seconds...");
System.Threading.Thread.Sleep(5000);
return;
}
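If you'd rather the lock file not linger after the process exits, one variation (my own sketch, not part of the original answer) is to ask the OS to delete it when the handle is closed:
FileStream lockFile;
try
{
    // FileShare.None makes a second open attempt fail; DeleteOnClose removes the
    // file automatically when this stream is disposed or the process exits.
    lockFile = new FileStream("SingleInstance.lck", FileMode.OpenOrCreate,
        FileAccess.ReadWrite, FileShare.None, 4096, FileOptions.DeleteOnClose);
}
catch (IOException)
{
    Console.WriteLine("ERROR - Server is already running. End that instance before re-running.");
    return;
}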

Here is my implementation using Named pipes. It supports passing arguments from the second instance.
Note: I did not test on Linux or Mac but it should work in theory.
Usage
public static int Main(string[] args)
{
instanceManager = new SingleInstanceManager("8A3B7DE2-6AB4-4983-BBC0-DF985AB56703");
if (!instanceManager.Start())
{
return 0; // exit, if same app is running
}
instanceManager.SecondInstanceLaunched += InstanceManager_SecondInstanceLaunched;
// Initialize app. Below is an example in WPF.
app = new App();
app.InitializeComponent();
return app.Run();
}
private static void InstanceManager_SecondInstanceLaunched(object sender, SecondInstanceLaunchedEventArgs e)
{
app.Dispatcher.Invoke(() => new MainWindow().Show());
}
Your Copy-and-paste code
public class SingleInstanceManager
{
private readonly string applicationId;
public SingleInstanceManager(string applicationId)
{
this.applicationId = applicationId;
}
/// <summary>
/// Detect if this is the first instance. If it is, start a named pipe server to listen for subsequent instances. Otherwise, send <see cref="Environment.GetCommandLineArgs()"/> to the first instance.
/// </summary>
/// <returns>True if this is the first instance. Otherwise, false.</returns>
public bool Start()
{
using var client = new NamedPipeClientStream(applicationId);
try
{
client.Connect(0);
}
catch (TimeoutException)
{
Task.Run(() => StartListeningServer());
return true;
}
var args = Environment.GetCommandLineArgs();
using (var writer = new BinaryWriter(client, Encoding.UTF8))
{
writer.Write(args.Length);
for (int i = 0; i < args.Length; i++)
{
writer.Write(args[i]);
}
}
return false;
}
private void StartListeningServer()
{
var server = new NamedPipeServerStream(applicationId);
server.WaitForConnection();
using (var reader = new BinaryReader(server, Encoding.UTF8))
{
var argc = reader.ReadInt32();
var args = new string[argc];
for (int i = 0; i < argc; i++)
{
args[i] = reader.ReadString();
}
SecondInstanceLaunched?.Invoke(this, new SecondInstanceLaunchedEventArgs { Arguments = args });
}
StartListeningServer();
}
public event EventHandler<SecondInstanceLaunchedEventArgs> SecondInstanceLaunched;
}
public class SecondInstanceLaunchedEventArgs
{
public string[] Arguments { get; set; }
}
Unit test
[TestClass]
public class SingleInstanceManagerTests
{
[TestMethod]
public void SingleInstanceManagerTest()
{
var id = Guid.NewGuid().ToString();
var manager = new SingleInstanceManager(id);
string[] receivedArguments = null;
var correctArgCount = Environment.GetCommandLineArgs().Length;
manager.SecondInstanceLaunched += (sender, e) => receivedArguments = e.Arguments;
var instance1 = manager.Start();
Thread.Sleep(200);
var manager2 = new SingleInstanceManager(id);
Assert.IsFalse(manager2.Start());
Thread.Sleep(200);
Assert.IsTrue(instance1);
Assert.IsNotNull(receivedArguments);
Assert.AreEqual(correctArgCount, receivedArguments.Length);
var receivedArguments2 = receivedArguments;
var manager3 = new SingleInstanceManager(id);
Thread.Sleep(200);
Assert.IsFalse(manager3.Start());
Assert.AreNotSame(receivedArguments, receivedArguments2);
Assert.AreEqual(correctArgCount, receivedArguments.Length);
}
}

The downside of deandob's solution is that the lock file is created relative to the working directory, so an instance launched from another path will not be detected. You may therefore prefer a fixed path, or a temp path shared by all users.
Here is my attempt:
//second instance launch guard
var tempPath = Environment.GetEnvironmentVariable("TEMP", EnvironmentVariableTarget.Machine)
??
Path.GetTempPath();
var lockPath = Path.Combine(tempPath, "SingleInstance.lock");
await using var lockFile = File.OpenWrite(lockPath);
Here I'm trying to get the TEMP system variable at machine scope (not the user's TEMP); if it's empty, I fall back to the user's temp folder on Windows or the shared /tmp on most Linux distributions.
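One detail worth adding: as written, a second instance dies with an unhandled IOException when OpenWrite fails, so in practice you probably want the same try/catch as in deandob's answer. A sketch of the combined guard (inside an async Main, as above):
var tempPath = Environment.GetEnvironmentVariable("TEMP", EnvironmentVariableTarget.Machine)
               ?? Path.GetTempPath();
var lockPath = Path.Combine(tempPath, "SingleInstance.lock");
FileStream lockFile;
try
{
    // File.OpenWrite opens with FileShare.None, so this throws if another instance holds the file.
    lockFile = File.OpenWrite(lockPath);
}
catch (IOException)
{
    Console.WriteLine("Another instance is already running. Exiting.");
    return;
}
await using (lockFile)
{
    // run the application while the lock file is held open
}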

Related

Prevent running multiple instances of a mono app

I know how to prevent running multiple instances of a given app on Windows:
Prevent multiple instances of a given app in .NET?
This code does not work under Linux using MonoDevelop, though: it compiles and runs, but it does not prevent a second instance. How can I do this under Linux using Mono?
This is what I have tried, but the code only works on Windows, not on Linux.
static void Main()
{
Task.Factory.StartNew(() =>
{
try
{
var p = new NamedPipeServerStream("SomeGuid", PipeDirection.In, 1);
Console.WriteLine("Waiting for connection");
p.WaitForConnection();
}
catch
{
Console.WriteLine("Error another insance already running");
Environment.Exit(1); // terminate application
}
});
Thread.Sleep(1000);
Console.WriteLine("Doing work");
// Do work....
Thread.Sleep(10000);
}
I came up with this answer. Call this method, passing it a unique ID:
public static void PreventMultipleInstance(string applicationId)
{
// Under Windows this is:
// C:\Users\SomeUser\AppData\Local\Temp\
// Linux this is:
// /tmp/
var temporaryDirectory = Path.GetTempPath();
// Application ID (make sure this GUID is different across your different applications!)
var applicationGuid = applicationId + ".process-lock";
// File that will serve as our lock
var fileFullPath = Path.Combine(temporaryDirectory, applicationGuid);
try
{
// Prevents other processes from reading from or writing to this file
var _InstanceLock = new FileStream(fileFullPath, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.None);
_InstanceLock.Lock(0, 0);
MonoApp.Logger.LogToDisk(LogType.Notification, "04ZH-EQP0", "Acquired Lock", fileFullPath);
// TODO: investigate why we need a reference to the file stream. Without it, the GC releases the lock!
System.Timers.Timer t = new System.Timers.Timer()
{
Interval = 500000,
Enabled = true,
};
t.Elapsed += (a, b) =>
{
try
{
_InstanceLock.Lock(0, 0);
}
catch
{
MonoApp.Logger.Log(LogType.Error, "AOI7-QMCT", "Unable to lock file");
}
};
t.Start();
}
catch
{
// Terminate application because another instance with this ID is running
Environment.Exit(102534);
}
}
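Usage is a single call at the very top of Main; the ID below is just a placeholder:
static void Main(string[] args)
{
    // Pass any ID that is stable and unique to this application.
    PreventMultipleInstance("com.example.myapp.instance-lock");
    // ... rest of the application; the lock is held until the process exits ...
}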

Queue problems across multiple threads

There are many questions and articles on the subject of using a .NET Queue properly within a multithreaded application, but I can't find anything on our specific problem.
We have a Windows Service in which one thread receives messages and enqueues them, and another thread dequeues and processes them.
We're using lock when queuing and dequeuing, and the service ran fine for around two years without any problems. One day we noticed that thousands of messages had been logged (and so had been queued) but were never dequeued/processed; they seem to have been skipped somehow, which shouldn't be possible for a queue.
We can't replicate the circumstances that caused it as we have no real idea what caused it considering that day was no different from any of the others as far as we're aware.
The only idea we have relates to the concurrency of the queue. We're not using the ConcurrentQueue type, which we plan to switch to in the hope that it is a remedy.
One idea, from looking at the source of the Queue type, is that it uses arrays internally, which have to be resized once the buffer reaches a certain length. We hypothesised that some messages were lost while this resizing was happening.
Another idea from our development manager is that with multiple threads on a multicore processor, even though locks are used, the individual cores work on the data in their local registers, which can cause them to be working on different data. He said they don't work on the same memory, and he seems to think lock only works as expected on a single-core processor running multiple threads.
Reading more about ConcurrentQueue's use of volatile, I'm not sure this would help, as I've read that lock provides a stronger guarantee that threads see the most up-to-date state of memory.
I don't have much knowledge on this specific subject, so my question is whether the manager's idea sounds plausible, and whether we might have missed something that's required for the queue to be used properly.
Code snippet for reference (forgive the messy code, it does need refactoring):
public sealed class Message
{
public void QueueMessage(long messageId, Message msg)
{
lock (_queueLock)
{
_queue.Enqueue(new QueuedMessage() { Id = messageId, Message = msg });
}
}
public static void QueueMessage(string queueProcessorName, long messageId, Message msg)
{
lock (_messageProcessors[queueProcessorName]._queueLock)
{
_messageProcessors[queueProcessorName].QueueMessage(messageId, msg);
_messageProcessors[queueProcessorName].WakeUp(); // Ensure the thread is awake
}
}
public void WakeUp()
{
lock(_monitor)
{
Monitor.Pulse(_monitor);
}
}
public void Process()
{
while (!_stop)
{
QueuedMessage currentMessage = null;
try
{
lock (_queueLock)
{
currentMessage = _queue.Dequeue();
}
}
catch(InvalidOperationException i)
{
// Nothing in the queue
}
while(currentMessage != null)
{
IContext context = new Context();
DAL.Message msg = null;
try
{
msg = context.Messages.SingleOrDefault(x => x.Id == currentMessage.Id);
}
catch (Exception e)
{
// TODO: Handle these exceptions better. Possible infinite loop.
continue; // Keep retrying until it works
}
if (msg == null) {
// TODO: Log missing message
continue;
}
try
{
msg.Status = DAL.Message.ProcessingState.Processing;
context.Commit();
}
catch (Exception e)
{
// TODO: Handle these exceptions better. Possible infinite loop.
continue; // Keep retrying until it works
}
bool result = false;
try {
Transformation.TransformManager mgr = Transformation.TransformManager.Instance();
Transformation.ITransform transform = mgr.GetTransform(currentMessage.Message.Type.Name, currentMessage.Message.Get("EVN:EventReasonCode"));
if (transform != null){
msg.BeginProcessing = DateTime.Now;
result = transform.Transform(currentMessage.Message);
msg.EndProcessing = DateTime.Now;
msg.Status = DAL.Message.ProcessingState.Complete;
}
else {
msg.Status = DAL.Message.ProcessingState.Failed;
}
context.Commit();
}
catch (Exception e)
{
try
{
context = new Context();
// TODO: Handle these exceptions better
Error err = context.Errors.Add(context.Errors.Create());
err.MessageId = currentMessage.Id;
if (currentMessage.Message != null)
{
err.EventReasonCode = currentMessage.Message.Get("EVN:EventReasonCode");
err.MessageType = currentMessage.Message.Type.Name;
}
else {
err.EventReasonCode = "Unknown";
err.MessageType = "Unknown";
}
StringBuilder sb = new StringBuilder("Exception occured\n");
int level = 0;
while (e != null && level < 10)
{
sb.Append("Message: ");
sb.Append(e.Message);
sb.Append("\nStack Trace: ");
sb.Append(e.StackTrace);
sb.Append("\n");
e = e.InnerException;
level++;
}
err.Text = sb.ToString();
}
catch (Exception ne) {
StringBuilder sb = new StringBuilder("Exception occured\n");
int level = 0;
while (ne != null && level < 10)
{
sb.Append("Message: ");
sb.Append(ne.Message);
sb.Append("\nStack Trace: ");
sb.Append(ne.StackTrace);
sb.Append("\n");
ne = ne.InnerException;
level++;
}
EventLog.WriteEntry("Service", sb.ToString(), EventLogEntryType.Error);
}
}
try
{
context.Commit();
lock (_queueLock)
{
currentMessage = _queue.Dequeue();
}
}
catch (InvalidOperationException e)
{
currentMessage = null; // No more messages in the queue
}
catch (Exception ne)
{
StringBuilder sb = new StringBuilder("Exception occured\n");
int level = 0;
while (ne != null && level < 10)
{
sb.Append("Message: ");
sb.Append(ne.Message);
sb.Append("\nStack Trace: ");
sb.Append(ne.StackTrace);
sb.Append("\n");
ne = ne.InnerException;
level++;
}
EventLog.WriteEntry("Service", sb.ToString(), EventLogEntryType.Error);
}
}
lock (_monitor)
{
if (_stop) break;
Monitor.Wait(_monitor, TimeSpan.FromMinutes(_pollingInterval));
if (_stop) break;
}
}
}
private object _monitor = new object();
private int _pollingInterval = 10;
private volatile bool _stop = false;
private object _queueLock = new object();
private Queue<QueuedMessage> _queue = new Queue<QueuedMessage>();
private static IDictionary<string, Message> _messageProcessors = new Dictionary<string, Message>();
}
so my question is whether the manager's idea sounds plausible
Uhm, no. If all those synchronization measures only worked on single-core machines, the world would have descended into complete chaos decades ago.
and whether we might have missed something that's required for the queue to be used properly.
As far as your description goes, you should be fine. I would look at how you found out that you have this problem. Messages being logged but then vanishing without being dequeued: wouldn't that also be what you'd see if the service were simply stopped or the machine rebooted? Are you sure you lost them while your application was actually running?
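As an aside on the planned ConcurrentQueue switch: the usual replacement for a hand-rolled Queue<T> + lock + Monitor.Pulse/Wait combination is BlockingCollection<T> (backed by a ConcurrentQueue<T>), which gives you the thread safety and the blocking wake-up in one type. A minimal producer/consumer sketch, not tied to the code above:
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class MessagePump
{
    private readonly BlockingCollection<string> _queue =
        new BlockingCollection<string>(new ConcurrentQueue<string>());

    public void Enqueue(string message) => _queue.Add(message);

    public Task RunConsumer() => Task.Run(() =>
    {
        // Blocks while the collection is empty; the loop ends after CompleteAdding().
        foreach (var message in _queue.GetConsumingEnumerable())
        {
            Console.WriteLine($"Processing {message}");
        }
    });

    public void Shutdown() => _queue.CompleteAdding();
}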
You declare the object used for the lock as a private instance field, not a static one.
If you try this:
class Program
{
static void Main(string[] args)
{
Test test1 = new Test();
Task Scan1 = Task.Run(() => test1.Run("1"));
Test test2 = new Test();
Task Scan2 = Task.Run(() => test2.Run("2"));
while(!Scan1.IsCompleted || !Scan2.IsCompleted)
{
Thread.Sleep(1000);
}
}
}
public class Test
{
private object _queueLock = new object();
public async Task Run(string val)
{
lock (_queueLock)
{
Console.WriteLine($"{val} locked");
Thread.Sleep(10000);
Console.WriteLine($"{val} unlocked");
}
}
}
You will notice that the code inside the lock is executed by both tasks at the same time, because each Test instance has its own lock object.
But if you change
private object _queueLock = new object();
To
private static object _queueLock = new object();
It changes how your lock works: the lock object is now shared by every instance of the class.
Whether this is your issue depends on whether you have multiple instances of that class or everything runs within the same instance.
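To see the difference in the demo above, here is the Test class with the static lock applied (Run no longer needs to be async); the two tasks now execute their locked sections one after the other instead of in parallel:
using System;
using System.Threading;

public class Test
{
    // Shared by every instance of Test, so both tasks contend on the same lock.
    private static readonly object _queueLock = new object();

    public void Run(string val)
    {
        lock (_queueLock)
        {
            Console.WriteLine($"{val} locked");
            Thread.Sleep(10000);
            Console.WriteLine($"{val} unlocked");
        }
    }
}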

Keeping console app alive and event-driven while processes run in the background

I am trying to write a console application that acts as a "job manager" by running processes in the background. These processes would run JScript files with arguments passed in. This console application will be distributed across many machines and will pull jobs from a centralized source (e.g. a database). The purpose of this application is to eliminate the need for individualized batch files on all of these machines.
I am having trouble keeping the application alive. In the code I included, you can see that my main function makes an initial call to the JobManager's StartNewJobs() method. After this initial call, I'd like my application to be event-driven, only waking up and running when a process has exited, allowing me to start a new process. The problem I am running into is that once the main() function finishes (when the initial StartNewJobs() call returns), the console closes and the program ends.
My question is what is the proper way to keep my console application alive and allow it to be event-driven rather than procedural? I know I can probably throw in a while(true) at the end of the main function, but that seems sloppy and incorrect.
Batch file we are trying to replace:
C:\Windows\SysWOW64\cscript.exe c:\temp\somejscriptfile.js 49f1bdd8-5e6b-40cc-92bc-eb20c237a959
C:\Windows\SysWOW64\cscript.exe c:\temp\somejscriptfile.js 654e3783-a1b6-43be-8027-c7d060bf131f
...
Program.cs:
using DistributedJobs.Data;
using DistributedJobs.Logging;
using DistributedJobs.Models;
using Microsoft.Practices.EnterpriseLibrary.Common.Configuration;
using Microsoft.Practices.EnterpriseLibrary.ExceptionHandling;
using System;
namespace DistributedJobs
{
class Program
{
static void Main(string[] args)
{
//Get initial objects/settings
ILogger logger = new Logger(Properties.Settings.Default.LoggingLevel, EnterpriseLibraryContainer.Current.GetInstance<ExceptionManager>());
IDataProvider dataProvider = new SQLDataProvider();
DMSPollingJobType availableJobTypes = DMSPollingJobType.FlatFile;
if (Properties.Settings.Default.SupportsVPN)
{
availableJobTypes |= DMSPollingJobType.VPN;
}
String executableLocation = Properties.Settings.Default.ExecutableLocation;
String jsLocation = Properties.Settings.Default.JSLocation;
Int32 maxProcesses = Properties.Settings.Default.MaxProcesses;
//Create job manager and start new processes/jobs
DMSJobManager jobManager = new DMSJobManager(logger, dataProvider, availableJobTypes, executableLocation, jsLocation, maxProcesses);
jobManager.StartNewJobs();
}
}
}
JobManager.cs:
using DistributedJobs.Models;
using System.Diagnostics;
using System;
using System.Collections.Generic;
using DistributedJobs.Logging;
namespace DistributedJobs.Data
{
public class JobManager
{
private IDataProvider DataProvider;
private ILogger Logger;
private Dictionary<Job, Process> RunningProcesses;
private JobType AvailableJobTypes;
private String ExecutableLocation;
private String JSLocation;
private Int32 MaxProcesses;
public Boolean CanStartNewJob
{
get
{
Boolean canStartNewJob = false;
if (RunningProcesses.Count < MaxProcesses)
{
canStartNewJob = true;
}
foreach (KeyValuePair<Job, Process> entry in RunningProcesses)
{
if (entry.Key.JobType != JobType.FlatFile)
{
canStartNewJob = false;
break;
}
}
return canStartNewJob;
}
}
public JobManager(ILogger logger, IDataProvider dataProvider, JobType availableJobTypes, String executableLocation, String jsLocation, Int32 maxProcesses)
{
Logger = logger;
DataProvider = dataProvider;
RunningProcesses = new Dictionary<Job, Process>();
AvailableJobTypes = availableJobTypes;
ExecutableLocation = executableLocation;
JSLocation = jsLocation;
MaxProcesses = maxProcesses;
}
public void StartNewJobs()
{
while (CanStartNewJob)
{
Job newJob = DataProvider.GetNextScheduledJob(AvailableJobTypes);
if (newJob != null)
{
Process newProcess = CreateNewProcess(newJob);
RunningProcesses.Add(newJob, newProcess);
newProcess.Start();
}
}
}
public Process CreateNewProcess(Job job)
{
ProcessStartInfo startInfo = new ProcessStartInfo();
startInfo.FileName = ExecutableLocation;
startInfo.Arguments = JSLocation + " " + job.JobID.ToString();
startInfo.UseShellExecute = false;
Process retProcess = new Process()
{
StartInfo = startInfo,
EnableRaisingEvents = true
};
retProcess.Exited += new EventHandler(JobFinished);
return retProcess;
}
public void JobFinished(object sender, EventArgs e)
{
Job finishedJob = null;
foreach (KeyValuePair<Job, Process> entry in RunningProcesses)
{
if ((Process)sender == entry.Value)
{
finishedJob = entry.Key;
break;
}
}
if (finishedJob != null)
{
RunningProcesses.Remove(finishedJob);
StartNewJobs();
}
}
}
}
You could try using Application.Run() (from System.Windows.Forms). This will start a standard message loop.
So at the end of your Main method, just add Application.Run():
static void Main(string[] args)
{
//Get initial objects/settings
ILogger logger = new Logger(Properties.Settings.Default.LoggingLevel, EnterpriseLibraryContainer.Current.GetInstance<ExceptionManager>());
IDataProvider dataProvider = new SQLDataProvider();
DMSPollingJobType availableJobTypes = DMSPollingJobType.FlatFile;
if (Properties.Settings.Default.SupportsVPN)
{
availableJobTypes |= DMSPollingJobType.VPN;
}
String executableLocation = Properties.Settings.Default.ExecutableLocation;
String jsLocation = Properties.Settings.Default.JSLocation;
Int32 maxProcesses = Properties.Settings.Default.MaxProcesses;
//Create job manager and start new processes/jobs
DMSJobManager jobManager = new DMSJobManager(logger, dataProvider, availableJobTypes, executableLocation, jsLocation, maxProcesses);
jobManager.StartNewJobs();
// start message loop
Application.Run();
}
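If you'd rather not reference System.Windows.Forms just for the message loop, another common option for a console app is to block Main on a wait handle and signal it when you want to shut down; the Process.Exited callbacks keep firing on thread-pool threads while Main is parked. A sketch (here the shutdown trigger is simply Ctrl+C, as an example):
using System;
using System.Threading;

class Program
{
    private static readonly ManualResetEventSlim Shutdown = new ManualResetEventSlim(false);

    static void Main(string[] args)
    {
        // ... create the logger, data provider and jobManager as above, then:
        // jobManager.StartNewJobs();

        Console.CancelKeyPress += (sender, e) =>
        {
            e.Cancel = true;   // keep the process alive long enough to clean up
            Shutdown.Set();
        };

        // Park the main thread; Exited event handlers run on thread-pool threads.
        Shutdown.Wait();
    }
}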

How to Efficiently Read From a Pipe Stream when using IPC C#

I wrote a simplified version of my program below. Process A launches a child process (process B). I use an anonymous pipe to write information about the progress of a method running in process B. Meanwhile, a function in process A continually reads from the stream to see if a new update has come in from the pipe. If there is one, the form in process A is updated to reflect the progress. This works as expected, but I am wondering if there is a better way to accomplish it without having to continually check the stream for new progress updates.
/////////////////
///Process A ////
/////////////////
public void LaunchProcessB()
{
using (AnonymousPipeServerStream pipeServer = new AnonymousPipeServerStream(PipeDirection.In,
HandleInheritability.Inheritable))
{
var _Process = new Process();
_Process.StartInfo.FileName = exeString;
_Process.StartInfo.Arguments = pipeServer.GetClientHandleAsString();
_Process.StartInfo.RedirectStandardOutput = true;
_Process.StartInfo.RedirectStandardInput = true;
_Process.StartInfo.CreateNoWindow = true;
_Process.StartInfo.UseShellExecute = false;
_Process.Start(); //launches process B
pipeServer.DisposeLocalCopyOfClientHandle();
using (StreamReader sr = new StreamReader(pipeServer))
{
try
{
while (true)
{
string temp = sr.ReadLine();
if (temp == null) break;
int result;
if (Int32.TryParse(temp, out result))
ShowProgressPercent(result);
else ShowProgress(temp);
}
}
catch (Exception)
{
//error occured when reading from stream.
}
}
if (!_Process.Responding && !_Process.HasExited)
{
_Process.Kill();
return;
}
_Process.WaitForExit(10000);
}
}
private void ShowProgressPercent(int percentage)
{
if (percentage > currentPercentage)
{
progressBar.Value = percentage;
}
}
private void ShowProgress(string progressString)
{
labelMessage.Text = progressString;
}
/////////////////
///Process B ////
/////////////////
private static StreamWriter _progressWriter;
private static PipeStream _progressPipe;
static void Main(string[] args)
{
using (_progressPipe = new AnonymousPipeClientStream(PipeDirection.Out, args[0]))
using (_progressWriter = new StreamWriter(_progressPipe))
{
RunLongProcess();
}
}
private static void RunLongProcess()
{
//attaches events to PercentProgress and StageProgress methods.
}
private static void PercentProgress(int percentage)
{
_progressWriter.WriteLine(percentage.ToString());
_progressPipe.WaitForPipeDrain();
}
private static void StageProgress(string stage)
{
_progressWriter.WriteLine(stage);
_progressPipe.WaitForPipeDrain();
}
The while condition is not necessary. Simply read until temp is null; that's the end-of-stream signal.
Make this a while(true) loop.
I think you also need exception handling to catch the child process terminating and severing the pipe. Checking !_Process.HasExited && pipeServer.IsConnected is not enough, because it might be true at the moment of the test and switch to false immediately afterwards.
I would also add a WaitForExit at the end to make sure the system is quiesced before you continue.
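Putting those suggestions together, the reading part of process A might look roughly like this (a sketch; the names are the ones from the question):
using (StreamReader sr = new StreamReader(pipeServer))
{
    try
    {
        string temp;
        // ReadLine blocks until a line arrives and returns null once the client
        // closes its end of the pipe, so there is no polling involved.
        while ((temp = sr.ReadLine()) != null)
        {
            if (Int32.TryParse(temp, out int result))
                ShowProgressPercent(result);
            else
                ShowProgress(temp);
        }
    }
    catch (IOException)
    {
        // The child process terminated and severed the pipe.
    }
}
// Wait for the child to finish; kill it only if it has not exited in time.
if (!_Process.WaitForExit(10000))
{
    _Process.Kill();
}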

windows update application

I wrote an application that uses the WUA COM interfaces (IUpdateSearcher, IUpdate, etc.). I use this application to scan for available updates, download the updates and install them. Everything works OK until I download and install an update that shows a UI dialog during installation.
I find this update with IUpdateSearcher.Search() and can download it successfully (using IUpdateDownloader.Download()), but when I install it using IUpdateInstaller2.Install() I cannot get rid of the user interface.
My question is - how can I make this a silent installation?
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using WUApiLib;
namespace MSHWindowsUpdateAgent
{
class Program
{
static void Main(string[] args)
{
UpdatesAvailable();
EnableUpdateServices();//enables everything Windows needs in order to install updates
InstallUpdates(DownloadUpdates());
Console.Read();
}
//this is my first try.. I can see the need for abstract classes here...
//but at least it gives most people a good starting point.
public static void InstalledUpdates()
{
UpdateSession UpdateSession = new UpdateSession();
IUpdateSearcher UpdateSearchResult = UpdateSession.CreateUpdateSearcher();
UpdateSearchResult.Online = true;//checks for updates online
ISearchResult SearchResults = UpdateSearchResult.Search("IsInstalled=1 AND IsHidden=0");
//for the above search criteria refer to
//http://msdn.microsoft.com/en-us/library/windows/desktop/aa386526(v=VS.85).aspx
//Check the remarks section
foreach (IUpdate x in SearchResults.Updates)
{
Console.WriteLine(x.Title);
}
}
public static void UpdatesAvailable()
{
UpdateSession UpdateSession = new UpdateSession();
IUpdateSearcher UpdateSearchResult = UpdateSession.CreateUpdateSearcher();
UpdateSearchResult.Online = true;//checks for updates online
ISearchResult SearchResults = UpdateSearchResult.Search("IsInstalled=0 AND IsPresent=0");
//for the above search criteria refer to
//http://msdn.microsoft.com/en-us/library/windows/desktop/aa386526(v=VS.85).aspx
//Check the remarks section
foreach (IUpdate x in SearchResults.Updates)
{
Console.WriteLine(x.Title);
}
}
public static UpdateCollection DownloadUpdates()
{
UpdateSession UpdateSession = new UpdateSession();
IUpdateSearcher SearchUpdates = UpdateSession.CreateUpdateSearcher();
ISearchResult UpdateSearchResult = SearchUpdates.Search("IsInstalled=0 and IsPresent=0");
UpdateCollection UpdateCollection = new UpdateCollection();
//Accept Eula code for each update
for (int i = 0; i < UpdateSearchResult.Updates.Count; i++)
{
IUpdate Updates = UpdateSearchResult.Updates[i];
if (Updates.EulaAccepted == false)
{
Updates.AcceptEula();
}
UpdateCollection.Add(Updates);
}
//Accept Eula ends here
//If it is zero I am not sure whether it will throw an exception -- I haven't tested it.
UpdateCollection DownloadCollection = new UpdateCollection();
UpdateDownloader Downloader = UpdateSession.CreateUpdateDownloader();
for (int i = 0; i < UpdateCollection.Count; i++)
{
DownloadCollection.Add(UpdateCollection[i]);
}
Downloader.Updates = DownloadCollection;
Console.WriteLine("Downloading Updates");
IDownloadResult DownloadResult = Downloader.Download();
UpdateCollection InstallCollection = new UpdateCollection();
for (int i = 0; i < UpdateCollection.Count; i++)
{
if (DownloadCollection[i].IsDownloaded)
{
InstallCollection.Add(DownloadCollection[i]);
}
}
return InstallCollection;
}
public static void InstallUpdates(UpdateCollection DownloadedUpdates)
{
UpdateSession UpdateSession = new UpdateSession();
UpdateInstaller InstallAgent = UpdateSession.CreateUpdateInstaller() as UpdateInstaller;
InstallAgent.Updates = DownloadedUpdates;
//Starts a synchronous installation of the updates.
// http://msdn.microsoft.com/en-us/library/windows/desktop/aa386491(v=VS.85).aspx#methods
IInstallationResult InstallResult = InstallAgent.Install();
}
public static void EnableUpdateServices()
{
IAutomaticUpdates updates = new AutomaticUpdates();
if (!updates.ServiceEnabled)
{
Console.WriteLine("Not all updates services where enabled. Enabling Now" + updates.ServiceEnabled);
updates.EnableService();
Console.WriteLine("Service enable success");
}
}
}
}
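On the silent-install question itself: the WUA installer object also exposes quiet-mode flags (IUpdateInstaller2.ForceQuiet and IUpdateInstaller.AllowSourcePrompts). I have not verified these against every update type, and some updates simply refuse to install without UI, but the InstallUpdates method could be tried as:
public static void InstallUpdates(UpdateCollection DownloadedUpdates)
{
    UpdateSession UpdateSession = new UpdateSession();
    IUpdateInstaller2 InstallAgent = UpdateSession.CreateUpdateInstaller() as IUpdateInstaller2;
    InstallAgent.Updates = DownloadedUpdates;
    // Ask WUA to suppress per-update UI where the update supports it.
    InstallAgent.ForceQuiet = true;
    InstallAgent.AllowSourcePrompts = false;
    IInstallationResult InstallResult = InstallAgent.Install();
    Console.WriteLine("Installation result: " + InstallResult.ResultCode);
}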
jvelez has an excellent answer, but it does not directly answer the actual question: "how can I make this a silent installation?"
@confi - I would suggest the easiest way first:
static class Program
{
/// <summary>
/// The main entry point for the application.
/// </summary>
[STAThread]
static void Main()
{
Application.SetCompatibleTextRenderingDefault(false);
Application.EnableVisualStyles();
Thread thread = new Thread(() =>
{
Form1 form1 = new Form1();
form1.Visible = false;
form1.ShowInTaskbar = false;
Application.Run(form1);
});
thread.SetApartmentState(ApartmentState.STA);
thread.Start();
}
}
With all the methods still running in the background and your UI simply hidden, this gives you a means of making it silent.
@Justin Choponis - I need to do the same as confi. WSUS is not cutting it for many of my clients; it's impractical and bulky. If you had one network to look after and were on-site all the time, WSUS would be useful, but other than that I find it very unsatisfactory.
