C# - Best way to continuously check a process

Scenario: an Azure WebJob that gets all the Vendor records from NetSuite via WSDL.
Problem: the dataset is too large. Even with the service timeout set to 12 minutes, the call still times out and the code fails.
NetSuite has an async process that basically runs whatever you want on the server and returns a JobId that allows you to check the status of the job on the server.
What I currently do is make a search call first, asking for all the Vendor records to be processed on the server. After I get the JobId, I wrote a void recursion that checks whether the job is finished on the server, with Thread.Sleep set to 10 minutes.
private static bool ChkProcess(VendorsService vendorService, string jobId)
{
    var isJobDone = false;

    // Recursion: poll the server every 10 minutes until the job reports finished
    void ChkAsyncProgress(bool isFinish)
    {
        if (isFinish) return;
        var chkJobProgress = vendorService.NsCheckProcessStatus(jobId);
        if (chkJobProgress.OperationResult.IsFinish)
        {
            isJobDone = true;
            return; // job is done, no need to sleep again
        }
        Thread.Sleep(TimeSpan.FromMinutes(10));
        ChkAsyncProgress(isJobDone);
    }

    ChkAsyncProgress(false);
    return isJobDone;
}
It works, but is there a better approach?
Thanks

I think that since you're already working with Azure, you can implement a really low-cost solution for this with Service Bus (possibly free, depending on how frequently your job runs).
Basically, a Service Bus is a queue where you enqueue messages (which can be objects with properties, so they could also carry the result of the elaboration).
An Azure Function of type ServiceBusTrigger automatically listens for new messages arriving on the Service Bus and gets triggered when one does (you can also enqueue messages that only become visible after a certain future time).
So, at the end of the WebJob code, you could enqueue a message marking that the WebJob has finished its elaboration.
The Azure Function is notified as soon as the message lands in the queue, and you can retrieve the data without constantly polling for job completion; Azure takes care of all of that for you at a ridiculous price and with no effort on your part.
Also, these functions aren't priced by time but by execution, so you pay only when a message is effectively put in the queue.
They come with a certain number of free executions, so you might not need to pay anything at all.
Here is some Microsoft sample code for doing so.
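For illustration, here's a minimal sketch of such a trigger function (the queue name, connection setting name, and message contents are assumptions, not something from the question):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class JobFinishedFunction
{
    // Fires as soon as a message lands on the (hypothetical) "job-finished" queue.
    // "ServiceBusConnection" is an app setting holding the namespace connection string.
    [FunctionName("JobFinished")]
    public static void Run(
        [ServiceBusTrigger("job-finished", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        // The message body could carry the NetSuite JobId, or even the result itself.
        log.LogInformation($"WebJob reported completion: {message}");
    }
}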

Related

Azure Service Bus MessageLockLostException when Completing Locked Message

I'm getting a MessageLockLostException when performing a complete operation on Azure Service Bus after a long operation of 30 minutes to over an hour. I want this process to scale and be resilient to failures, so I keep hold of the message lock and renew it well within the default lock duration of 1 minute. However, when I try to complete the message at the end, I get a MessageLockLostException even though I can see that all the lock renewals occurred at the correct time.
I want to scale this up in the future, but there is currently only one instance of the application, and I can confirm that the message still exists on the Service Bus subscription after it errors, so the problem is definitely around the lock.
Here are the steps I take.
Obtain a message and configure a lock
messages = await Receiver.ReceiveAsync(1, TimeSpan.FromSeconds(10)).ConfigureAwait(false);
var message = messages[0];
var messageBody = GetTypedMessageContent(message);
Messages.TryAdd(messageBody, message);

// Renew the lock on a timer so it never reaches the default 1-minute expiry
LockTimers.TryAdd(
    messageBody,
    new Timer(
        async _ =>
        {
            if (Messages.TryGetValue(messageBody, out var msg))
            {
                await Receiver.RenewLockAsync(msg.SystemProperties.LockToken).ConfigureAwait(false);
            }
        },
        null,
        TimeSpan.FromSeconds(Config.ReceiverInfo.LockRenewalTimeThreshold),
        TimeSpan.FromSeconds(Config.ReceiverInfo.LockRenewalTimeThreshold)));
Perform the long running process
Complete the message
internal async Task Complete(T message)
{
    if (Messages.TryGetValue(message, out var msg))
    {
        await Receiver.RenewLockAsync(msg.SystemProperties.LockToken);
        await Receiver.CompleteAsync(msg.SystemProperties.LockToken).ConfigureAwait(false);
    }
}
The code above is a stripped-down version of what's there; I removed some try/catch error handling and logging we have, but I can confirm that when debugging the issue I can see the timer execute on time. It's just the CompleteAsync that fails.
Additional info:
The Service Bus topic has partitioning enabled.
I have tried renewing at 80% of the lock duration (48 seconds), 30% (18 seconds) and 10% (6 seconds).
I've searched around for an answer, and the closest thing I found was this article, but it's from 2016.
I couldn't get it to fail in a standalone console application, so I don't know if it's something I'm doing in my application, but I can confirm that the lock renewal occurs for the duration of the processing and returns the correct DateTime for the updated lock. I'd expect that if the lock were truly lost, the renewals would fail as well, not just the CompleteAsync.
I'm using the Microsoft.Azure.ServiceBus NuGet package, version 4.1.3.
My application is .NET Core 3.1 and uses a Service Bus wrapper package written in .NET Standard 2.1.
The message completes if you don't hold onto it for a long time, and occasionally completes even when you do.
Any help or advice on how I could complete my Service Bus message successfully after an hour would be great.
The issue here wasn't with my code; it was with partitioning on the Service Bus topic. If you search around, there are some issues on the Microsoft GitHub about completion of messages on partitioned topics. The fix I used was the subscription forwarding feature: forward the message to a new topic with partitioning disabled, then read the message from that new topic. With that in place I was able to use the exact same code to keep the message locked for a long time and still complete it successfully.
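For anyone wanting to replicate that setup, here's a rough sketch using the ManagementClient from the same Microsoft.Azure.ServiceBus package (the topic and subscription names are hypothetical):

using System.Threading.Tasks;
using Microsoft.Azure.ServiceBus.Management;

public static class ForwardingSetup
{
    public static async Task ConfigureAsync(string connectionString)
    {
        var mgmt = new ManagementClient(connectionString);

        // A new topic with partitioning explicitly disabled.
        if (!await mgmt.TopicExistsAsync("work-unpartitioned"))
        {
            await mgmt.CreateTopicAsync(new TopicDescription("work-unpartitioned")
            {
                EnablePartitioning = false
            });
        }

        // Forward everything arriving on the old (partitioned) subscription
        // to the new topic; the long-running processor then reads from there.
        var subscription = await mgmt.GetSubscriptionAsync("work-partitioned", "processor");
        subscription.ForwardTo = "work-unpartitioned";
        await mgmt.UpdateSubscriptionAsync(subscription);
    }
}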

ASP.NET server-side code: does it keep running after the user logs out? [closed]

To give this question context: we have an ASP.NET MVC project which requires you to be authenticated to use the system (a typical SaaS product). The project includes an inactivity timer which logs the user out if they leave the screen alone for too long. It's an SPA-type project, and Web API is used to get/post the relevant data.
I'm currently developing a routine that archives a potentially huge amount of data, and the process itself is fine. What I'm not sure of is this: once the process is started, a POST is sent to Web API and the server-side code starts running; does it continue to run if the inactivity timeout occurs, or if the user logs out manually for some reason?
I assume it would, but I don't like to rely on assumptions.
EDIT: For example,
for the comments/answers below: the screen has a list of tick boxes for the data to archive, so there is no fixed list of data, and this project does need to process the task.
The following code runs on the client side (checks etc. omitted; the data variable contains all the true/false values of the ticks):
self.Running = true;
self.showProgress();
http.ajaxRequest("post", "/api/archive/runarchive", data)
    .done(function () {
        self.Running = false;
    })
    .fail(function () {
        self.Running = false;
        app.showMessage("You do not have permission to perform this action!");
    });
For reference, the showProgress function picks up progress to display on screen. It also runs when accessing the screen, so that if an archive process is still running it can be displayed:
self.showProgress = function () {
    http.ajaxRequest("get", "/api/archive/getarchiveprocess")
        .done(function (result) {
            if (result.ID == -1) {
                $("#progressBar").hide();
                $("#btnArchive").show();
                if (self.Running) setTimeout(self.showProgress, 2000);
                else app.showMessage("The Archive Process has finished.");
            }
            else {
                $("#progressBar").show();
                $("#btnArchive").hide();
                $("#progressBarInner").width(result.Progress + '%');
                $("#progressBarInner").attr("data-original-title", result.Progress + '%');
                setTimeout(self.showProgress, 2000);
            }
        });
};
Server Side:
[HttpPost]
public void RunArchive(dynamic data)
{
    // Add table row entry for the archive process for reference and progress
    // Check each tick and update tables/fields etc
    // Code omitted as very long and not needed for example
    // Table row for reference is edited during checks, for the showProgress function
}
So basically I'm asking whether the RunArchive() function on the controller will keep running until it's finished, despite the user logging off and becoming unauthenticated in some way. I'm aware that an IIS or app pool refresh, etc., would stop it.
It sounds like Web API is the one doing the heavy work, and once that starts it will continue to run regardless of what happens on the UI side of things.
That being said, there is a timeout for Web API requests that you can control in web.config.
You might want to consider another alternative. Whenever you're talking about heavy processing tasks, you're better off offloading those to another service.
Your API is supposed to be responsive and accessible by your users, and it needs to respond fast to allow for a better experience. If you get 100 users doing heavy work, your API will basically crumble.
The API could simply send commands to a queue of work that needs to be run, and another service can pick them up and execute them. This keeps your API lightweight while the work still gets done.
You're talking about archiving, which probably involves a database, and there is no reason why you can't have something else do that job.
You could keep track of jobs in the database: build a table which holds statuses, and once a job is done, the external service changes the status in the database and your UI can then show the result.
So the API could work like this:
add a message to the queue;
add job details to the db with a status of "new", for example, and a unique id which allows the queue item to be linked to that record;
Service B picks up the job from the queue and updates the status in the db to "running";
when the job finishes, Service B updates the status to "complete";
the UI reflects these statuses so the users know what's going on.
Something like this would make for a better user experience, I think.
Feel free to change whatever doesn't make sense; it's a bit hard to give suggestions without knowing the details of what needs to be done.
This Service B could be a Windows service, for example, or whatever else can do the job. User permissions come into play at the beginning only: a work item is added to the queue only if the user has permission to initiate it. This gives you the certainty that only authorized jobs are added.
After that, Service B won't care about user permissions and will run the job to the end, irrespective of whether the user is logged in or not.
This is largely guesswork at this point, but you should be able to get an idea of how to do this; a rough sketch of the API side follows below.
If you have more specific requirements, you should add those to the initial question.
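For illustration, a loose sketch of what the API side of that flow could look like in Web API (all type and member names here are assumptions, not from the question):

using System;
using System.Threading.Tasks;
using System.Web.Http;

// Hypothetical shapes; the real ones depend on the project.
public class ArchiveRequest { public bool[] Ticks { get; set; } }

public interface IJobStore { Task InsertAsync(Guid jobId, string status); }
public interface IJobQueue { Task EnqueueAsync(Guid jobId, ArchiveRequest request); }

public class ArchiveController : ApiController
{
    private readonly IJobStore _jobStore;
    private readonly IJobQueue _queue;

    public ArchiveController(IJobStore jobStore, IJobQueue queue)
    {
        _jobStore = jobStore;
        _queue = queue;
    }

    // The API only records and enqueues the job, then returns immediately;
    // Service B dequeues it, does the heavy archiving, and updates the status row.
    [HttpPost]
    public async Task<IHttpActionResult> RunArchive(ArchiveRequest request)
    {
        var jobId = Guid.NewGuid();
        await _jobStore.InsertAsync(jobId, "new");  // the UI polls this row
        await _queue.EnqueueAsync(jobId, request);  // Service B picks this up
        return Ok(jobId);
    }
}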
Even if the process isn't killed by the user logging out, you also need to consider that IIS can recycle app pools, and by default it is set to do so once a day, as well as on memory contention; either of these will kill your long-running process.
I would highly recommend you check out Hangfire.io, which is designed to help with long-running processes in ASP.NET sites.
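For a sense of what that looks like, a minimal Hangfire sketch (the ArchiveService type and its parameters are hypothetical); Hangfire persists the job, so it survives app pool recycles:

using Hangfire;
using System.Web.Http;

public class ArchiveService
{
    // Hypothetical worker: Hangfire serializes the call and replays it on a
    // background server, retrying automatically if the process is recycled.
    public void Run(bool[] ticks) { /* archiving work here */ }
}

public class ArchiveJobController : ApiController
{
    [HttpPost]
    public IHttpActionResult RunArchive(bool[] ticks)
    {
        // Enqueue instead of doing the work inline; returns immediately.
        BackgroundJob.Enqueue<ArchiveService>(svc => svc.Run(ticks));
        return Ok();
    }
}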

Is there any way to redirect the page first, then execute the remaining code?

I am new to Azure Web Apps. Is there any way to redirect the page first and then execute the remaining code? I am stuck in a situation where I have to redirect my page first, then execute the rest. I have deployed my code on an Azure Web App, which has a request timeout of about 4 minutes (not configurable), but my code takes approximately 15 minutes to execute. I want to redirect to the main page and execute the remaining code in the background. I have tried threads and parallel programming with no luck: I cannot get around the time limit, and my web page gets a request timeout every time. Can anyone suggest a way?
Thanks for the help!
/* functionA and functionB do not execute after redirecting. */
private static async Task<int> functionA(int para1, int para2)
{
    int temp1 = await functionB(para1, para2);
    return temp1;
}

private static async Task<int> functionB(int para1, int para2)
{
    return para1 + para2; // placeholder int result
}

/* This method will execute first */
private async Task<string> functionC(int para1, int para2, int para3)
{
    Console.WriteLine("hello world");
    Response.Redirect("www.xyz.com");          // the redirect is sent here...
    int temp = await functionA(para1, para2);  // ...but the request still waits for this
    return temp.ToString(); // return string type value
}
If you've got heavy processing that will result in an HTTP timeout, I suggest looking into offloading the processing to a WebJob or Azure Function. It would work as follows:
Your Azure Web App receives an HTTP request for a long-running operation. It gathers the necessary information, creates a Service Bus queue message, and fires the message off. Your Web App then responds to the user, telling them that processing has begun.
Provision a separate WebJob or Azure Function that monitors your Service Bus queue for messages. When a message is received, the WebJob/Function performs the processing.
You will probably want to tell your user when the operation has completed and what the result is. You have a few options: the slickest would be to use SignalR to push a notification to your users when the operation completes; a less sophisticated approach would be to have your WebJob/Function update a database record, then have your HTTP clients poll for the result.
I've personally used this pattern with Service Bus Queues/WebJobs/SignalR, and have been very pleased with the results.
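For illustration, a sketch of the Web App side of this pattern (the queue name, connection string placeholder, and the JobRequest type are assumptions):

using System.Text;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.Azure.ServiceBus;
using Newtonsoft.Json;

public class ReportController : ApiController
{
    // Hypothetical names: the queue "long-running-jobs" and the connection
    // string would come from configuration in a real app.
    private static readonly IQueueClient Queue =
        new QueueClient("<service-bus-connection-string>", "long-running-jobs");

    [HttpPost]
    public async Task<IHttpActionResult> StartJob(JobRequest request)
    {
        var body = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(request));
        await Queue.SendAsync(new Message(body));

        // Respond immediately; the WebJob/Function does the 15-minute work.
        return Ok("Processing has begun.");
    }
}

public class JobRequest { /* parameters for the long-running operation */ }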
Asynchronous operations with Azure Storage queues and WebJobs can also help in situations like the one stated. I referred to this:
https://dev.office.com/patterns-and-practices-detail/2254

Trigger WebJob at a particular time after record added to a database

I want to trigger an Azure WebJob 24 hours after I have added a record to a database using .NET. Obviously there will be multiple tasks for the WebJob to handle, each at its designated time. Is there a way (in the Azure libraries for .NET) to schedule these tasks?
I am free to use message queues, but I want to avoid unnecessary polling of the WebJob for new messages.
If you want to trigger the execution of a WebJob 24 hours after a record is inserted into a SQL database, I would definitely use Azure Queues for this. So after you insert the record, just add a message to the queue.
To do this you can easily leverage the initialVisibilityDelay parameter that can be passed to the CloudQueue.AddMessage() method. This will make the message invisible for 24 hours in your case, and then it will appear and be processed by your WebJob. You don't have to schedule anything; just have a continuous WebJob listening on the queue.
Here's some sample code:
public void AddMessage(T message, TimeSpan visibilityDelay)
{
    var serializedMessage = JsonConvert.SerializeObject(message);
    var queue = GetQueueReference(message);
    // timeToLive: null (use the default); initialVisibilityDelay: hide the message for the given delay
    queue.AddMessage(new CloudQueueMessage(serializedMessage), null, visibilityDelay);
}

private static CloudQueue GetQueueReference(T message)
{
    var storageAccount = CloudStorageAccount.Parse("Insert connection string");
    var queueClient = storageAccount.CreateCloudQueueClient();
    var queueReference = queueClient.GetQueueReference("Insert Queue Name");
    queueReference.CreateIfNotExists();
    return queueReference;
}
Hope this helps
Since the event of adding a record to the database is the trigger here, you can use the Azure Management Libraries to create an Azure Scheduler job that executes 24 hours after the DB record is inserted. Azure Scheduler jobs can do only three things: make HTTP requests, make HTTPS requests, or put a message in a queue. Since you do not want to poll queues, here are two options:
Deploy the existing WebJob as a Web API where each task is reachable by a unique URL, so that the scheduler job can execute the right HTTP/HTTPS request.
Create a new Web API which accepts requests (like a man in the middle) and programmatically runs the existing WebJob on demand, again using the Azure Management Libraries.
Please let me know if any of these strategies help.
Invoking a WebJob from your website is not a good idea; instead, you can add the WebJob code inside your website and simply call that code. You can still easily use the WebJob SDK from inside your website.
https://github.com/Azure/azure-webjobs-sdk-samples
The reason we wouldn't recommend invoking the WebJob from your website is that the invocation contains a secret you'd rather not store on your website (deployment credentials).
Recommendation:
To separate the WebJob and website code, the best thing to do is to communicate using a queue: the WebJob listens on the queue, and the website pushes requests to it, roughly as sketched below.
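A minimal sketch of the listening side using the WebJobs SDK (the queue name is an assumption):

using Microsoft.Azure.WebJobs;

public class Functions
{
    // Continuous WebJob function: fires whenever a message appears on the
    // (hypothetical) "website-requests" storage queue pushed by the website.
    public static void ProcessRequest([QueueTrigger("website-requests")] string message)
    {
        // Handle the request here, outside the website process.
    }
}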

Why does sending many emails asynchronously block the site? (using Action.BeginInvoke)

I created a page to send thousands of emails to our clients, almost 8K emails.
The sending process takes hours, but after a while I couldn't access any page in the site hosting it (they just sit waiting), except for static files (images etc.).
Using: IIS 6 and .NET 4.0
Code:
public static bool Send(MailSettings settings, Action<string, string[], bool> Sent = null)
{
    System.Net.Mail.SmtpClient client;
    ...
    foreach (...)
    {
        try { client.Send(message); } catch { ...client.Dispose();... }
        // Fire-and-forget callback on a thread-pool thread for each email
        Sent.BeginInvoke(stringValue, stringArray, boolValue, null, null);
        if (count++ > N)
        {
            count = 1;
            System.Threading.Thread.Sleep(1000);
        }
    }
    ...
}

public void SentComplete(string email, string[] value, bool isSent)
{
    ... // DB logging
}
Note: other sites using the same application pool were fine!
Questions:
Is there an IIS 6.0 limitation on the number of threads for the same website?
Any ideas whether my code is causing the performance issues? Am I using Action correctly?
There are many things wrong with this code:
Your try/catch block is very odd. You are disposing an object in one iteration, but it can still be used by later ones. Try a using block instead.
There is a maximum time for an ASP.NET request to execute.
Don't put Thread.Sleep in ASP.NET!
Yes, there is a maximum thread count in the ASP.NET pool.
If you end up blocked for one reason, you can also be blocked by the session (if you have one). Does it work if you open a different browser?
You're firing off a whole bunch of actions to be executed on the thread pool. There is a maximum number of threads the thread pool will create; after that, work items sent to the pool simply get queued up. Since you're flooding the thread pool with so many operations, you're preventing it from ever having an opportunity to work on the items added by ASP.NET to handle pages. Static items don't need to push work to the thread pool, so those can still be served.
You shouldn't be firing off so many items in parallel. You should limit the degree of parallelism to a reasonably small fixed amount, and let that handful of workers each process a large share of your operations, so the threads in the thread pool can work on other things as well; for instance, something along the lines of the sketch below.
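As an illustration of that advice, a sketch using Parallel.ForEach with a capped degree of parallelism (the MailJob type and the choice of 4 workers are assumptions, not from the question):

using System.Collections.Generic;
using System.Threading.Tasks;

public class MailJob { public System.Net.Mail.MailMessage Message; }

public static class BulkMailer
{
    // At most 4 emails are in flight at once, leaving the rest of the
    // thread pool free to service ASP.NET page requests.
    public static void SendAll(IEnumerable<MailJob> jobs)
    {
        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
        Parallel.ForEach(jobs, options, job =>
        {
            using (var client = new System.Net.Mail.SmtpClient())
            {
                client.Send(job.Message); // each worker chugs through many emails
            }
        });
    }
}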
We regularly send 10,000 client emails. I store all the details of the emails in a database and then call a web service to send them. This just chugs through them and affects nothing else. I do put the thread to sleep (in the web service) for 100 ms between each call to Send; if I don't, firing off so many emails seems to overwhelm our mail server and we get some odd things happening.
