Can a timer that fires continuously lead to memory leaks? - C#

In my program I write data to files when I am not able to connect to the database. A separate thread runs a timer that checks for connection availability every 25 seconds; if it can connect, it transfers the data from the files to the main database and deletes the files.
The problem is that I never stop this timer. Can this lead to memory leaks?
If I just run my program and monitor it in Task Manager, I can see the memory usage increasing continuously. If I disable the timer and run the application again, the memory stays stable.
public BackgroundWorker()
{
_backgroundWorkerThread = new Thread(new ThreadStart(ThreadEntryPoint));
_timer = new SWF.Timer();
_timer.Tick += new EventHandler(_timer_Tick);
_timer.Interval = 25 * 1000;
_timer.Enabled = true;
}
void _timer_Tick(object sender, EventArgs e)
{
bool lanAvailabe = NetworkInterface.GetIsNetworkAvailable();
if (lanAvailabe)
{
if (!GetListOfFiles())
{
return;
}
}
else
return;
}
Implementation of GetListOfFiles():
private bool GetListOfFiles()
{
string sourceDirectory = pathOfXmlFiles;
if (!Directory.Exists(sourceDirectory))
{
return false;
}
var xmlFiles = Directory.GetFiles(sourceDirectory, "*.xml");
if (!xmlFiles.Any())
{
return false;
}
foreach (var item in xmlFiles)
{
ReadXmlFile(item);
}
foreach (var item in xmlFiles)
{
if (_writtenToDb)
{
File.Delete(item);
}
}
return true;
}
The method that reads the XML files:
private void ReadXmlFile(string filename)
{
string[] patientInfo = new string[15];
using (StreamReader sr = new StreamReader(filename, Encoding.Default))
{
String line;
line = sr.ReadToEnd();
if (line.IndexOf("<ID>") > 0)
{
patientInfo[0] = GetTagValue(line, "<ID>", "</ID>");
}
if (line.IndexOf("<PatientID>") > 0)
{
patientInfo[1] = GetTagValue(line, "<PatientID>", "</PatientID>");
}
if (line.IndexOf("<PatientName>") > 0)
{
patientInfo[2] = GetTagValue(line, "<PatientName>", "</PatientName>");
}
if (line.IndexOf("<Room>") > 0)
{
patientInfo[3] = GetTagValue(line, "<Room>", "</Room>");
}
}
WriteToDb(patientInfo);
}

if i just run my program and monitor the task manager i can see the memory usage increasing continuously
Get a profiler. Task Manager is not the right tool; you can't tell from it what is going on. Rising memory usage doesn't mean you have a leak: maybe the GC simply isn't running because there is still enough free space, etc.
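As a rough sanity check before reaching for a profiler, you can force a full collection and compare the managed heap size across a few timer ticks; only if the number keeps climbing even after forced collections is something actually being retained. A minimal diagnostic sketch:
// Rough diagnostic only; a memory profiler is still the right tool.
long before = GC.GetTotalMemory(false);
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
long after = GC.GetTotalMemory(true); // heap size after a forced full collection
Console.WriteLine($"Before: {before:N0} bytes, after full GC: {after:N0} bytes");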

Related

How to correctly parse received serial data into lines?

I'm creating a program which communicates with a serial device that constantly sends data. I read data from the device every 100 ms (using a timer). I use port.ReadExisting() to receive all currently available data from the device and then try to split it into lines, because I need to check some of the received data and the easiest way is to check it line by line. The problem occurs when the device sends data which doesn't end with "\r\n" or '\n'.
In a perfect situation port.ReadExisting() returns: "sampletext\r\nsomesampletext\nsampletext\r\n"
But a problem occurs when there's no CR or LF character at the end:
First time port.ReadExisting() returns this: "text\nsamp"
Second time port.ReadExisting() returns this: "letext\r\ntext\r\n"
End result should look like this:
text
sampletext
text
But what I get looks like this:
text
samp
letext
text
My code:
This is the timer which runs every 100ms:
private void CommandTimer_Tick(object sender, EventArgs e)
{
BackgroundWorker seriaDataWorker = new BackgroundWorker();
seriaDataWorker.DoWork += (obj, p) => PrintSerialData();
seriaDataWorker.RunWorkerAsync();
}
BackgroundWorker which gets called by the timer:
private void PrintSerialData()
{
try
{
if (RandomReboot)
{
RebootWatch.Start();
}
if (COMport.IsOpen)
{
if (COMport.BytesToRead != 0)
{
SerialPrint(COMport.ReadExisting());
}
}
}
catch (System.IO.IOException SerialException)
{
return;
}
}
Function which parses received data into lines:
private void SerialPrint(string data)
{
using (var buffer = new StringReader(data))
{
string line = "";
while((line = buffer.ReadLine()) != null)
{
if (CheckForAnsw)
{
ReceivedCommandData = line;
if (ReceivedCommandData.Contains(AnswExpected))
{
ReceivedAnsw = true;
ReceivedLine = ReceivedCommandData;
ReceivedCommandData = "";
}
}
this.Invoke(new MethodInvoker(delegate
{
AppendText(TextBox_System_Log, Color.Black, line + "\r\n");
}
));
}
}
}
I know the problem is that buffer.ReadLine() treats the remainder of a string which doesn't end with a CR or LF character as a separate line, but I don't know how to fix it.
I tried using port.ReadLine() in the past, but it is way slower and causes problems for me when serial ports get disconnected.
I don't think there's an easy way to handle this with the StringReader. Instead, you can split the string yourself:
private static string _buffer = string.Empty;
private static void SerialPrint(string data)
{
// Append the new data to the leftover of the previous operation
data = _buffer + data;
int index = data.IndexOf('\n');
int start = 0;
while (index != -1)
{
var command = data.Substring(start, index - start);
ProcessCommand(command.TrimEnd('\r'));
start = index + 1;
index = data.IndexOf('\n', start);
}
// Store the leftover in the buffer
if (!data.EndsWith("\n"))
{
_buffer = data.Substring(start);
}
else
{
_buffer = string.Empty;
}
}
private static void ProcessCommand(string command)
{
Console.WriteLine(command);
}
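For illustration, feeding the two reads from the question through this method produces the three expected lines; the trailing "samp" is simply held in _buffer until the rest of the line arrives:
SerialPrint("text\nsamp"); // prints "text", keeps "samp" buffered
SerialPrint("letext\r\ntext\r\n"); // prints "sampletext" and "text", clears the buffer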
You can use AnonymousPipes to transport and buffer the incoming data and read them as lines to output them to somewhere.
Here is a little example which creates a server and client pipe stream, then writes data to the server in one task (with some newline in the data) and reads the data in a different task per line and outputs them to the console.
using System;
using System.IO;
using System.IO.Pipes;
using System.Text;
using System.Threading.Tasks;
public class Program
{
public static async Task Main()
{
(var writer, var reader) = CreatePipe();
using (writer)
using (reader)
{
var writerTask = Task.Run(async () =>
{
writer.AutoFlush = true;
writer.Write("?");
for (int i = 0; i < 100; i++)
{
if (i % 10 == 9)
{
await writer.WriteAsync("!");
await writer.WriteAsync(Environment.NewLine);
await writer.WriteAsync("?");
}
else
{
await writer.WriteAsync((i % 10).ToString());
}
await Task.Delay(100);
}
writer.Close();
});
var readerTask = Task.Run(async () =>
{
while (!reader.EndOfStream)
{
var line = await reader.ReadLineAsync();
Console.WriteLine(line);
}
});
await Task.WhenAll(writerTask, readerTask);
}
}
public static (StreamWriter, StreamReader) CreatePipe()
{
var server = new AnonymousPipeServerStream(PipeDirection.Out);
var client = new AnonymousPipeClientStream(server.GetClientHandleAsString());
return
(
new StreamWriter(server, Encoding.UTF8),
new StreamReader(client, Encoding.UTF8)
);
}
}
Try to adapt this code to your use case and comment if there are difficulties.
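For example, adapted to the serial scenario (only a sketch, assuming the CreatePipe helper above, the COMport field from the question, and a hypothetical ProcessLine handler), the 100 ms timer just forwards raw chunks into the writer, while a single background task only ever sees complete lines:
// Sketch: pipe pair created once, e.g. in the form constructor.
private StreamWriter _lineWriter;
private StreamReader _lineReader;

private void InitPipe()
{
    (_lineWriter, _lineReader) = CreatePipe();
    _lineWriter.AutoFlush = true;
    // Consumer: ReadLineAsync only completes once a full line (or end of stream) is available.
    Task.Run(async () =>
    {
        string line;
        while ((line = await _lineReader.ReadLineAsync()) != null)
            ProcessLine(line); // hypothetical per-line handler; marshal to the UI thread here if needed
    });
}

// Producer: the existing 100 ms timer forwards whatever has arrived, partial lines included.
private void CommandTimer_Tick(object sender, EventArgs e)
{
    if (COMport.IsOpen && COMport.BytesToRead != 0)
        _lineWriter.Write(COMport.ReadExisting());
}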
Your issue with \r\n and \n can be covered by using Environment.NewLine. I'm not sure what AppendText does, but if you're using it to store the values, then you're overdoing it. What you need is to store all the data first in a StringBuilder and then process it, OR process each chunk and store the results in a managed type such as an array, so that each line is defined separately. Only use the string in the presentation layer (if you have a GUI where you want the user to see the results).
So what I suggest is to store the lines in a StringBuilder, something like this:
private readonly StringBuilder _strDataBuilder = new StringBuilder();
private void PrintSerialData()
{
try
{
if (RandomReboot)
{
RebootWatch.Start();
}
if(COMport.IsOpen && COMport.BytesToRead != 0)
{
var data = COMport.ReadExisting();
if(!string.IsNullOrEmpty(data)) {
_strDataBuilder.Append(data);
}
}
}
catch (System.IO.IOException SerialException)
{
return;
}
}
private void SerialPrint()
{
var data = _strDataBuilder.ToString();
if(string.IsNullOrEmpty(data)) { return; }
var lines = data.Split(new[] { Environment.NewLine }, StringSplitOptions.None);
if(lines.Length == 0) { return; }
for(int x = 0; x < lines.Length; x++)
{
var line = lines[x];
if (CheckForAnsw)
{
ReceivedCommandData = line;
if (ReceivedCommandData.Contains(AnswExpected))
{
ReceivedAnsw = true;
ReceivedLine = ReceivedCommandData;
ReceivedCommandData = "";
}
}
this.Invoke(new MethodInvoker(delegate
{
AppendText(TextBox_System_Log, Color.Black, line + Environment.NewLine);
}
));
}
}
Storing them first makes things more maintainable and easier to fix when you want to add more processing steps or reuse the results.
SerialPrint() is unnecessary, though, if you just re-print the data in the GUI, since the data is already separated into lines. So if you do
TextBox_System_Log.Text = _strDataBuilder.ToString();
directly, it will list them as lines in the default color. However, if you intend to split the data to process each line separately (to validate it, for instance), then it is fine.
You can try code like this:
public void DataReceivedSerialPort(object sender, SerialDataReceivedEventArgs e)
{
readExistingData = "";
SerialPort sp = (SerialPort)sender;
sp.ReadTimeout = 100;
do
{
readExistingData = "";
try
{
readExistingData = sp.ReadLine();
if (readExistingData == "")
{
readExistingData = sp.ReadLine();
}
dataReadFromSerialPort += readExistingData;
}
catch
{
try
{
readExistingData = sp.ReadExisting();
dataReadFromSerialPort += readExistingData + "\r\n";
}
catch { }
}
UI.insert_new_items_into_textBoxUARTLog(readExistingData);
} while (readExistingData != "");
}

FileSystemWatcher is locking some files

I am using this code to monitor creation of files in certain folder:
_watcher = new RecoveringFileSystemWatcher(SourceFolder, "*.xml");
_watcher.Created += (_, e) =>
{
ProcessFile(e.Name);
};
RecoveringFileSystemWatcher is a FileSystemWatcher wrapper. Its constructor is:
public RecoveringFileSystemWatcher (string path, string filter)
{
_containedFSW = new FileSystemWatcher(path, filter);
}
The process works as expected, but for some files, randomly, an exception is thrown saying that the file is being used by another process.
This is the method that is launched upon file creation:
var nfo = new FileInfo(filePath);
if (nfo.Exists)
{
var archivoXml = nfo.Name;
string archivo = String.Empty;
try
{
string content = Task.Run(async () => await GetFileContent(filePath)).Result;
if (String.IsNullOrEmpty(content))
return false;
XmlDocument xml = new XmlDocument();
xml.LoadXml(content);
//The rest of the method
}
}
the method GetFileContent is this:
private async Task<string> GetFileContent(string filePath)
{
string content = String.Empty;
try
{
Console.Write("ONE - "); InfoLog.Save($"ONE {filePath}");
using (StreamReader sr = new StreamReader(filePath))
{
Console.Write("TWO - "); InfoLog.Save($"TWO {filePath}");
content = await sr.ReadToEndAsync().ConfigureAwait(false);
Console.Write($"THREE {(sr.BaseStream == null ? "Closed" : "Opened")} - "); InfoLog.Save($"THREE {(sr.BaseStream == null ? "Closed" : "Opened")} {filePath}");
sr.Close();
Console.WriteLine($"FOUR {(sr.BaseStream == null ? "Closed" : "Opened")}"); InfoLog.Save($"FOUR {(sr.BaseStream == null ? "Closed" : "Opened")} {filePath}");
}
}
catch (Exception ex)
{
InfoLog.Save($"XML file could be read -> {filePath}. See error log.");
ErrorLog.Save(ex);
}
return content;
}
Look at the log information I am writing to debug the process.
I got one case with a file called 1112186.xml.... this is recorded in the log:
18/12/2018 19:12:10 ONE D:\GestorDocumental\Origen\1112186.xml
18/12/2018 19:12:10 XML file could not be read -> D:\GestorDocumental\Origen\1112186.xml. See error log.
As you see, the exception is thrown at the "using" instruction.
Looking at the full log, I can see that the file 1112186.xml is never used before this, so the only possibility is that the FileSystemWatcher keeps the file open. I don't know why, but it seems this is happening.
It is also clear that this process is locking the file, because when I exit the console application and run it again, the file can be processed.
Any help with this, please?
thanks
Jaime
I usually use this method to check whether a file is locked. I got it from another answer on Stack Overflow.
public static bool IsFileClosed(string filepath)
{
bool fileClosed = false;
int retries = 20;
const int delay = 400; // set a delay period = retries*delay milliseconds
if (!File.Exists(filepath))
return false;
do
{
try
{
// Attempts to open then close the file in RW mode, denying other users to place any locks.
FileStream fs = File.Open(filepath, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
fs.Close();
fileClosed = true; // success
}
catch (IOException) { }
retries--;
if (!fileClosed)
Thread.Sleep(delay);
}
while (!fileClosed && retries > 0);
return fileClosed;
}
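A minimal sketch of how this could be wired into the watcher from the question (hypothetical wiring; SourceFolder, ProcessFile and InfoLog are the OP's own members):
_watcher.Created += (_, e) =>
{
    var fullPath = Path.Combine(SourceFolder, e.Name);
    if (IsFileClosed(fullPath))
        ProcessFile(e.Name); // the writer has released the file
    else
        InfoLog.Save($"File still locked after all retries -> {fullPath}");
};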
This is a new class called FileTimerWatcher (it will have a logger injected):
public FileTimerWatcher(ILogger logger) : base(logger)
{
if (timer == null)
{
// Create a timer with a 1.5 second interval.
// monitor the files after 1.5 seconds.
timer = new Timer(delay);
// Hook up the event handler for the Elapsed event.
timer.Elapsed += new ElapsedEventHandler(ProcessFolder);
timer.AutoReset = true;
timer.Enabled = true;
}
}
private void ProcessFolder(object sender, ElapsedEventArgs e)
{
var LastChecked = DateTime.Now;
string[] files = System.IO.Directory.GetFiles(SourceDirectory, somefilter, System.IO.SearchOption.TopDirectoryOnly);
foreach (string file in files)
{
ProcessFile(file); // process file here
}
}

Queue problems across multiple threads

There are many questions and articles on using a .NET Queue properly within a multi-threaded application; however, I can't find anything on our specific problem.
We have a Windows Service where messages are received and enqueued on one thread and then dequeued and processed on another.
We're using lock when enqueuing and dequeuing, and the service ran fine for around two years without any problems. One day we noticed that thousands of messages had been logged (and so had been queued) but were never dequeued/processed; they seem to have been skipped somehow, which shouldn't be possible for a queue.
We can't replicate the circumstances that caused it, as we have no real idea what did; as far as we're aware, that day was no different from any other.
The only idea we have is to do with the concurrency of the queue. We're not using the ConcurrentQueue data type, which we plan to switch to in the hope that it is a remedy.
One idea, from looking at the source of the Queue type, is that it uses arrays internally, which have to be resized once these buffers reach a certain length. We hypothesised that some messages were lost while this resizing was being done.
Another idea, from our development manager, is that using multiple threads on a multicore processor means that even though locks are used, the individual cores work on the data in their local registers, which can cause them to work on different data. He said they don't work on the same memory, and he seems to think lock only works as expected on a single-core processor running multiple threads.
Reading more about ConcurrentQueue's use of volatile I'm not sure that this would help, as I've read that using lock provides a stronger guarantee of threads using the most up-to-date state of memory.
I don't have much knowledge on this specific subject, so my question is whether the manager's idea sounds plausible, and whether we might have missed something that's required for the queue to be used properly.
Code snippet for reference (forgive the messy code, it does need refactoring):
public sealed class Message
{
public void QueueMessage(long messageId, Message msg)
{
lock (_queueLock)
{
_queue.Enqueue(new QueuedMessage() { Id = messageId, Message = msg });
}
}
public static void QueueMessage(string queueProcessorName, long messageId, Message msg)
{
lock (_messageProcessors[queueProcessorName]._queueLock)
{
_messageProcessors[queueProcessorName].QueueMessage(messageId, msg);
_messageProcessors[queueProcessorName].WakeUp(); // Ensure the thread is awake
}
}
public void WakeUp()
{
lock(_monitor)
{
Monitor.Pulse(_monitor);
}
}
public void Process()
{
while (!_stop)
{
QueuedMessage currentMessage = null;
try
{
lock (_queueLock)
{
currentMessage = _queue.Dequeue();
}
}
catch(InvalidOperationException i)
{
// Nothing in the queue
}
while(currentMessage != null)
{
IContext context = new Context();
DAL.Message msg = null;
try
{
msg = context.Messages.SingleOrDefault(x => x.Id == currentMessage.Id);
}
catch (Exception e)
{
// TODO: Handle these exceptions better. Possible infinite loop.
continue; // Keep retrying until it works
}
if (msg == null) {
// TODO: Log missing message
continue;
}
try
{
msg.Status = DAL.Message.ProcessingState.Processing;
context.Commit();
}
catch (Exception e)
{
// TODO: Handle these exceptions better. Possible infinite loop.
continue; // Keep retrying until it works
}
bool result = false;
try {
Transformation.TransformManager mgr = Transformation.TransformManager.Instance();
Transformation.ITransform transform = mgr.GetTransform(currentMessage.Message.Type.Name, currentMessage.Message.Get("EVN:EventReasonCode"));
if (transform != null){
msg.BeginProcessing = DateTime.Now;
result = transform.Transform(currentMessage.Message);
msg.EndProcessing = DateTime.Now;
msg.Status = DAL.Message.ProcessingState.Complete;
}
else {
msg.Status = DAL.Message.ProcessingState.Failed;
}
context.Commit();
}
catch (Exception e)
{
try
{
context = new Context();
// TODO: Handle these exceptions better
Error err = context.Errors.Add(context.Errors.Create());
err.MessageId = currentMessage.Id;
if (currentMessage.Message != null)
{
err.EventReasonCode = currentMessage.Message.Get("EVN:EventReasonCode");
err.MessageType = currentMessage.Message.Type.Name;
}
else {
err.EventReasonCode = "Unknown";
err.MessageType = "Unknown";
}
StringBuilder sb = new StringBuilder("Exception occured\n");
int level = 0;
while (e != null && level < 10)
{
sb.Append("Message: ");
sb.Append(e.Message);
sb.Append("\nStack Trace: ");
sb.Append(e.StackTrace);
sb.Append("\n");
e = e.InnerException;
level++;
}
err.Text = sb.ToString();
}
catch (Exception ne) {
StringBuilder sb = new StringBuilder("Exception occured\n");
int level = 0;
while (ne != null && level < 10)
{
sb.Append("Message: ");
sb.Append(ne.Message);
sb.Append("\nStack Trace: ");
sb.Append(ne.StackTrace);
sb.Append("\n");
ne = ne.InnerException;
level++;
}
EventLog.WriteEntry("Service", sb.ToString(), EventLogEntryType.Error);
}
}
try
{
context.Commit();
lock (_queueLock)
{
currentMessage = _queue.Dequeue();
}
}
catch (InvalidOperationException e)
{
currentMessage = null; // No more messages in the queue
}
catch (Exception ne)
{
StringBuilder sb = new StringBuilder("Exception occured\n");
int level = 0;
while (ne != null && level < 10)
{
sb.Append("Message: ");
sb.Append(ne.Message);
sb.Append("\nStack Trace: ");
sb.Append(ne.StackTrace);
sb.Append("\n");
ne = ne.InnerException;
level++;
}
EventLog.WriteEntry("Service", sb.ToString(), EventLogEntryType.Error);
}
}
lock (_monitor)
{
if (_stop) break;
Monitor.Wait(_monitor, TimeSpan.FromMinutes(_pollingInterval));
if (_stop) break;
}
}
}
private object _monitor = new object();
private int _pollingInterval = 10;
private volatile bool _stop = false;
private object _queueLock = new object();
private Queue<QueuedMessage> _queue = new Queue<QueuedMessage>();
private static IDictionary<string, Message> _messageProcessors = new Dictionary<string, Message>();
}
so my question is whether the manager's idea sounds plausible
Uhm, no. If all those synchronization measures only worked on single-core machines, the world would have descended into complete chaos decades ago.
and whether we might have missed something that's required for the queue to be used properly.
As far as your description goes, you should be fine. I would look at how you found out that you have this problem. Messages being logged but then vanishing without being properly dequeued: wouldn't that be exactly what you'd see if the service was simply stopped or the machine rebooted? Are you sure you lost them while your application was actually running?
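If you do switch to ConcurrentQueue&lt;T&gt; as planned, the explicit locks around Enqueue/Dequeue go away and the empty-queue case becomes a TryDequeue instead of catching InvalidOperationException. A rough sketch only, not a drop-in replacement (field names follow the question):
private readonly ConcurrentQueue<QueuedMessage> _queue = new ConcurrentQueue<QueuedMessage>();

public void QueueMessage(long messageId, Message msg)
{
    _queue.Enqueue(new QueuedMessage() { Id = messageId, Message = msg });
    WakeUp(); // wake the processing thread
}

public void Process()
{
    while (!_stop)
    {
        QueuedMessage currentMessage;
        while (_queue.TryDequeue(out currentMessage))
        {
            // ... existing per-message processing ...
        }
        lock (_monitor)
        {
            if (_stop) break;
            Monitor.Wait(_monitor, TimeSpan.FromMinutes(_pollingInterval));
        }
    }
}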
You declare the object used for the lock as a private instance field.
If you try this:
class Program
{
static void Main(string[] args)
{
Test test1 = new Test();
Task Scan1 = Task.Run(() => test1.Run("1"));
Test test2 = new Test();
Task Scan2 = Task.Run(() => test2.Run("2"));
while(!Scan1.IsCompleted || !Scan2.IsCompleted)
{
Thread.Sleep(1000);
}
}
}
public class Test
{
private object _queueLock = new object();
public async Task Run(string val)
{
lock (_queueLock)
{
Console.WriteLine($"{val} locked");
Thread.Sleep(10000);
Console.WriteLine($"{val} unlocked");
}
}
}
You will notice that the code inside the lock is executed even while another thread is already running inside it, because each instance locks on its own object.
But if you change
private object _queueLock = new object();
To
private static object _queueLock = new object();
It changes how your lock works.
Now, whether this is your issue depends on whether you have multiple instances of that class or everything runs within the same instance.

How to Efficiently Read From a Pipe Stream when using IPC C#

I wrote a simplified version of my program below. Process A launches a child process (Process B). I use an anonymous pipe to write information about the progress of a method running in Process B. Meanwhile, a function in Process A continually reads from a stream to see if a new update has come in from the pipe. If there is one, the form in Process A is updated to reflect the progress. This works as expected; however, I am wondering whether there is a better way to accomplish this without having to continually check the stream for new progress updates.
/////////////////
///Process A ////
/////////////////
public void LaunchProcessB()
{
using (AnonymousPipeServerStream pipeServer = new AnonymousPipeServerStream(PipeDirection.In,
HandleInheritability.Inheritable))
{
var _Process = new Process();
_Process.StartInfo.FileName = exeString;
_Process.StartInfo.Arguments = pipeServer.GetClientHandleAsString();
_Process.StartInfo.RedirectStandardOutput = true;
_Process.StartInfo.RedirectStandardInput = true;
_Process.StartInfo.CreateNoWindow = true;
_Process.StartInfo.UseShellExecute = false;
_Process.Start(); //launches process B
pipeServer.DisposeLocalCopyOfClientHandle();
using (StreamReader sr = new StreamReader(pipeServer))
{
try
{
while (true)
{
string temp = sr.ReadLine();
if (temp == null) break;
int result;
if (Int32.TryParse(temp, out result))
ShowProgressPercent(result);
else ShowProgress(temp);
}
}
catch (Exception)
{
//error occured when reading from stream.
}
}
if (!_Process.Responding && !_Process.HasExited)
{
_Process.Kill();
return;
}
_Process.WaitForExit(10000);
}
}
private void ShowProgressPercent(int percentage)
{
if (percentage > currentPercentage)
{
progressBar.Value = percentage;
}
}
private void ShowProgress(string progressString)
{
labelMessage.Text = progressString;
}
/////////////////
///Process B ////
/////////////////
private StreamWriter _progressWriter;
private PipeStream _progressPipe;
static int Main(string[] args)
{
using (_progressPipe = new AnonymousPipeClientStream(PipeDirection.Out, args[0]))
using (_progressWriter = new StreamWriter(_progressPipe))
{
RunLongProcess();
}
return 0;
}
private void RunLongProcess()
{
//attaches events to PercentProgress and StageProgress methods.
}
private void PercentProgress(int percentage)
{
_progressWriter.WriteLine(percentage.ToString());
_progressPipe.WaitForPipeDrain();
}
private void StageProgress(string stage)
{
_progressWriter.WriteLine(stage);
_progressPipe.WaitForPipeDrain();
}
The while condition is not necessary. Simply read until temp is null. That's the end signal of the stream.
Make this a while(true) loop.
I think you also need to add exception handling to catch the process terminating and severing the pipe. !_Process.HasExited && pipeServer.IsConnected is not enough because it might be true but immediately switch to false after the test.
I would also add a WaitForExit at the end to make sure the system has quiesced before you continue.
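Putting those suggestions together, the read loop might look roughly like this (same variables as in the question; only a sketch):
using (StreamReader sr = new StreamReader(pipeServer))
{
    try
    {
        string temp;
        while ((temp = sr.ReadLine()) != null) // null signals end of stream
        {
            int result;
            if (Int32.TryParse(temp, out result))
                ShowProgressPercent(result);
            else
                ShowProgress(temp);
        }
    }
    catch (IOException)
    {
        // Process B terminated and severed the pipe mid-read.
    }
}
_Process.WaitForExit(10000); // give the child time to quiesce before continuing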

how to make service act dynamically based on service running condition

Hi friends, I am trying to make my service act dynamically. I have set a timer for my service of about 2 minutes. If it is doing a huge amount of work and exceeds that 2-minute limit, I need to check the service's condition: if work is still pending, the running instance should continue until it finishes.
I found the code below by googling. I have a method that needs to cooperate with the service below; can anyone help me?
public static void StartService(string serviceName, int timeoutMilliseconds)
{
ServiceController service = new ServiceController(serviceName);
try
{
TimeSpan timeout = TimeSpan.FromMilliseconds(timeoutMilliseconds);
service.Start();
service.WaitForStatus(ServiceControllerStatus.Running, timeout);
}
catch
{
// ...
}
}
As of now I am using the logic below:
protected override void OnStart(string[] args)
{
// my service name
Workjob("FTSCSVGenerator");
// ad 1: handle the Elapsed event; CsvGenFromDatabase is the method to execute
timerjob.Elapsed += new ElapsedEventHandler(CsvGenFromDatabase);
// ad 2: set the interval from DueTime (in milliseconds)
timerjob.Interval = Convert.ToDouble(DueTime);
// ad 3: enable the timer
timerjob.Enabled = true;
eventLog1.WriteEntry("my service started");
}
protected override void OnStop()
{
eventLog1.WriteEntry("my service stopped");
}
private void Workjob(string servicename )
{
ServiceController servicecsv = new ServiceController(servicename);
if ((servicecsv.Status.Equals(ServiceControllerStatus.Stopped)) || (servicecsv.Status.Equals(ServiceControllerStatus.StopPending)))
{
// Start the service if the current status is stopped.
servicecsv.Start( );
}
else
{
// Stop the service if its status is not set to "Stopped".
servicecsv.Stop();
}
}
I have built services that operate in a similar manner before; my advice would be NOT to start and stop the service from external code. Instead, apply the timer methodology within the service itself, which should always be running. On each timer elapse, do the work and then return to an idle state, which alleviates the need to start and stop.
Further, I would protect the "stop" of the service so that it is not allowed while the service is "working".
Sample Code
Note: I employ a process I call "zeroing" with my timer. Zeroing, in my context, is the process of getting the events to fire at zero seconds of every minute. To do that, I first set the timer to fire every second and check whether the seconds part of the current time is zero; once that occurs, I switch the timer to elapse every minute. I do this to give myself some sanity while testing.
Also, my scheduling is configurable, so every minute when the timer "ticks" I check my configuration to see whether the process should execute. I do so with the following XML configuration:
<?xml version="1.0" encoding="utf-8"?>
<ScheduleDefinition xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<ScheduleInterval>1</ScheduleInterval>
<ScheduleUnits>min</ScheduleUnits>
<DailyStartTime>1753-01-01T08:00:00</DailyStartTime>
<ExcludedWeekDays>
<string>Sunday</string>
<string>Saturday</string>
</ExcludedWeekDays>
<ExcludedDates>
<string>12/25</string>
<string>02/02</string>
<string>03/17</string>
</ExcludedDates>
<DailyRunTimes>
<!-- code ommitted for size // -->
</DailyRunTimes>
</ScheduleDefinition>
Finally, this code sample is for a data-sync service, so the references to "DataMigrationService" and "DataMigrationManager" are my own custom classes, used as an abstraction to give me an object to control within the service.
... here's the code:
using System;
using System.Diagnostics;
using System.Reflection;
using System.ServiceProcess;
using System.Threading;
using System.Xml;
using System.Xml.Serialization;
using DataMigration.Configuration;
using DataMigration.ObjectModel;
namespace DataSyncService
{
public partial class DataSyncService : ServiceBase
{
#region Private Members
private System.Timers.Timer _timer = null;
private SimpleScheduleManager.ScheduleDefinition _definition = null;
private DataMigrationManager _manager = new DataMigrationManager();
#endregion
#region Constructor(s)
public DataSyncService()
{
AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(AssemblyResolver.Resolve);
InitializeComponent();
}
~DataSyncService()
{
_manager = null;
_definition = null;
_timer = null;
}
#endregion
#region Public Method(s)
protected override void OnStart(string[] args)
{
Assembly assembly = Assembly.GetExecutingAssembly();
_manager.ProcessMonitor.Logger.Debug("Assembly Version: ", assembly.GetName().FullName);
assembly = null;
SetScheduleFromConfigurationFile();
_timer = new System.Timers.Timer(1000);
_timer.AutoReset = true;
_timer.Enabled = true;
_timer.Elapsed += new System.Timers.ElapsedEventHandler(_timer_ZeroingProcess);
_timer.Start();
}
protected override void OnStop()
{
_timer.Stop();
_timer.Enabled = false;
_timer = null;
// block if the Process is active!
if (_manager.State == DataMigrationState.Processing)
{
// I invented my own CancellableAsyncResult (back in the day), now you can use CancellationTokenSource
CancellableAsyncResult result = _manager.RequestCancel() as CancellableAsyncResult;
while (!result.IsCompleted) { Thread.Sleep(ServiceConstants.ThreadSleepCount); }
try
{
result.EndInvoke();
}
catch (Exception ex)
{
ProcessMonitorMessage message = ProcessMonitorMessage.GetErrorOccurredInstance();
message.EventType = ProcessMonitorEventType.ProcessAlert;
message.Severity = ProcessMessageSeverity.ErrorStop;
message.SubjectLine = "Error while stopping service. ";
message.EventDescription = ex.Message;
_manager.ProcessMonitor.ReportError(message);
}
}
}
#endregion
#region Private Method(s)
private bool MigrationIsScheduledToRunNow()
{
DateTime now = DateTime.Now;
foreach (string dowString in _definition.ExcludedWeekDays)
{
if (now.DayOfWeek.ToString().Equals(dowString))
{
Trace.WriteLine("Today is " + dowString, "Excluded by Schedule definition");
return false;
}
}
foreach (string datePart in _definition.ExcludedDates)
{
string dateString = datePart + "/2008"; // 2008 is a leap year so it "allows" all 366 possible dates.
DateTime excludedDate = Convert.ToDateTime(dateString);
if (excludedDate.Day.Equals(now.Day) && excludedDate.Month.Equals(now.Month))
{
Trace.WriteLine("Today is " + datePart, "Excluded by Schedule definition");
return false;
}
}
foreach (DateTime runTime in _definition.DailyRunTimes)
{
if (runTime.Hour.Equals(now.Hour) && runTime.Minute.Equals(now.Minute))
{
Trace.WriteLine("Confirmed Scheduled RunTime: " + runTime.TimeOfDay.ToString(), "Included by Schedule definition");
return true;
}
}
return false;
}
/// <summary>
/// Load Scheduling Configuration Options from the Xml Config file.
/// </summary>
private void SetScheduleFromConfigurationFile()
{
string basePath = AppDomain.CurrentDomain.BaseDirectory;
if (basePath.EndsWith("\\")) { basePath = basePath.Substring(0, basePath.Length - 1); }
string path = string.Format("{0}\\Scheduling\\scheduledefinition.xml", basePath);
_manager.ProcessMonitor.Logger.Debug("Configuration File Path", path);
XmlSerializer serializer = new XmlSerializer(typeof(SimpleScheduleManager.ScheduleDefinition));
XmlTextReader reader = new XmlTextReader(path);
reader.WhitespaceHandling = WhitespaceHandling.None;
_definition = serializer.Deserialize(reader) as SimpleScheduleManager.ScheduleDefinition;
reader = null;
serializer = null;
}
#endregion
#region Timer Events
private void _timer_ZeroingProcess(object sender, System.Timers.ElapsedEventArgs e)
{
if (DateTime.Now.Second.Equals(0))
{
_timer.Interval = 60000;
_timer.Elapsed -= new System.Timers.ElapsedEventHandler(_timer_ZeroingProcess);
_timer.Elapsed += new System.Timers.ElapsedEventHandler(_timer_Elapsed);
_timer_Elapsed(sender, e);
}
}
private void _timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
_manager.ProcessMonitor.Logger.Info("Timer Elapsed", DateTime.Now.ToString());
if (MigrationIsScheduledToRunNow())
{
switch (_manager.State)
{
case DataMigrationState.Idle:
_manager.ProcessMonitor.Logger.Info("DataMigration Manager is idle. Begin Processing.");
_manager.BeginMigration();
break;
case DataMigrationState.Failed:
_manager.ProcessMonitor.Logger.Warn("Data Migration is in failed state, Email <NotificationRecipients> alerting them.");
break;
default:
_manager.ProcessMonitor.Logger.Warn("DataMigration Manager is still processing. Skipping this iteration.");
break;
}
}
}
#endregion
}
}
