In my Web API application I need to do something like in this topic: Best way to run a background task in ASP.Net web app and also get feedback? In the application a user can upload an Excel file, whose data is then imported into tables in the database. It all works, but the import can take a long time (about 20 minutes when the Excel file has many rows), and while it runs the page is blocked and the user has to wait. I need this import to run in the background.
I have a controller with this POST method:
[HttpPost]
public async Task<IHttpActionResult> ImportFile(int procId, int fileId)
{
    string importErrorMessage = String.Empty;

    // Get the Excel file and read it
    string path = FilePath(fileId);
    DataTable table = GetTable(path);

    // Add a "process started" record to the Logs table
    using (var transaction = db.Database.BeginTransaction())
    {
        try
        {
            db.uspAddLog(fileId, "Process Started", "The process is started");
            transaction.Commit();
        }
        catch (Exception e)
        {
            transaction.Rollback();
            importErrorMessage = e.Message;
            return BadRequest(importErrorMessage);
        }
    }

    // Using myHelper, run a stored procedure that imports the data from the
    // Excel file. The procedure also adds a record to the Logs table when it
    // finishes.
    using (myHelper helper = new myHelper())
        helper.StartImport(procId, fileId, table, ref importErrorMessage);

    if (!String.IsNullOrEmpty(importErrorMessage))
        return BadRequest(importErrorMessage);

    return Ok(true);
}
I also have a GET method that returns information about the file and its import status:
[HttpGet]
[ResponseType(typeof(FileDTO))]
public IQueryable<FileDTO> GetFiles(int procId)
{
    return db.LeadProcessControl.Where(a => a.ProcessID == procId)
             .Project().To<FileDTO>();
}
It returns JSON like this:
{
    "FileName": "name.xlsx",
    "FileID": 23,
    "ProcessID": 11,
    "Status": "Started"
}
This method feeds a grid:
File name | Status | Button to run import | Button to delete file
The Status comes from the Logs table, and FileDTO holds the most recent value. For example, after a file is uploaded the status is "File uploaded"; when I run the import it becomes "Started", and when the import finishes it becomes "Complete". But because the page is locked while the import runs, by the time the user sees anything the status is always "Complete".
So I need the procedure to run in the background, and the GET method should return the new Status whenever it changes. Any suggestions?
Adding async to a method doesn't make the call asynchronous from the client's point of view. It just indicates that the thread handling the current request can be reused to process other requests while it waits on network or disk IO. The client still gets a response only once the method completes; in other words, async is entirely a server-side concern and has nothing to do with the client call. You need to start your long-running process on a separate thread, as shown below. Best practice, though, is not to do such long-running processing in a web app at all; do it in a separate Windows Service instead.
[HttpPost]
public async Task<IHttpActionResult> ImportFile(int procId, int fileId)
{
    string importErrorMessage = String.Empty;

    // Get the Excel file and read it
    string path = FilePath(fileId);
    DataTable table = GetTable(path);

    // Add a "process started" record to the Logs table
    using (var transaction = db.Database.BeginTransaction())
    {
        try
        {
            db.uspAddLog(fileId, "Process Started", "The process is started");
            transaction.Commit();
        }
        catch (Exception e)
        {
            transaction.Rollback();
            importErrorMessage = e.Message;
            return BadRequest(importErrorMessage);
        }
    }

    // Start the long-running import on a new thread
    Task.Factory.StartNew(() =>
    {
        using (myHelper helper = new myHelper())
        {
            helper.StartImport(procId, fileId, table, ref importErrorMessage);
            // This code runs on a background thread, so you cannot return
            // anything to the client here. Store the status in the database
            // instead.
        }
    });

    // Always return Ok to indicate that the background process has started
    return Ok(true);
}
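One caveat worth adding: a bare Task.Factory.StartNew is not tracked by the ASP.NET runtime, so an app-pool recycle can kill the import mid-run. On .NET 4.5.2 or later, HostingEnvironment.QueueBackgroundWorkItem registers the work with the runtime, which then delays shutdown while it executes. A minimal sketch of the same call using it:
// Sketch only: the same import, but registered with the ASP.NET runtime
// (System.Web.Hosting, .NET 4.5.2+).
HostingEnvironment.QueueBackgroundWorkItem(cancellationToken =>
{
    string error = String.Empty;
    using (myHelper helper = new myHelper())
    {
        helper.StartImport(procId, fileId, table, ref error);
        // Persist the outcome to the Logs table; the HTTP response has
        // already been sent by the time this runs.
    }
});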
I created a POST API that saves a file to a directory.
Will asynchronous code make my API handle scalability better when multiple requests come in from clients?
Currently, the code works synchronously.
Should I make every method asynchronous? And where should I place the await keyword?
The tasks:
Task 1: Read the request content (XML)
Task 2: Create a directory if it does not already exist
Task 3: Make filenames unique
Task 4: Save the file to the directory
[System.Web.Mvc.HttpPost]
public IHttpActionResult Post(HttpRequestMessage request)
{
    try
    {
        string contentResult = string.Empty;
        ValidateRequest(ref contentResult, request);
        //contentResult = "nothing";
        //Validation of the posted XML
        //XmlReaderSettings(contentResult);
        using (StringReader s = new StringReader(contentResult))
        {
            doc.Load(s);
        }

        string path = MessagePath;
        //Directory creation
        DirectoryInfo dir = Directory.CreateDirectory($@"{path}\PostRequests");
        string dirName = dir.Name;
        //Format file name
        var uniqueFileName = UniqueFileNameFormat();
        doc.Save($@"{path}\{dirName}\{uniqueFileName}");
    }
    catch (Exception e)
    {
        LogService.LogToEventLog("Error occurred while receiving a message from messagedistributor: " + e.ToString(), System.Diagnostics.EventLogEntryType.Error);
        throw; // rethrow without resetting the stack trace
    }
    LogService.LogToEventLog("Message received successfully from messagedistributor.", System.Diagnostics.EventLogEntryType.Information);
    return new ResponseMessageResult(Request.CreateResponse((HttpStatusCode)200));
}
Yes, it should.
When you use async with network or IO calls, you do not block threads, and they can be reused to process other requests.
But if you have only one drive and other clients are doing the same job, you will not gain speed; overall system health will still be better with async calls, though.
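As a rough illustration, here is a minimal sketch of the handler with its IO awaited. This assumes the doc field and helper members from the question; ValidateRequest's internals are unknown, so the body is read directly with ReadAsStringAsync here:
[System.Web.Mvc.HttpPost]
public async Task<IHttpActionResult> Post(HttpRequestMessage request)
{
    // Network IO: await the request body instead of blocking the thread.
    string contentResult = await request.Content.ReadAsStringAsync();

    using (StringReader s = new StringReader(contentResult))
    {
        doc.Load(s);
    }

    // Directory creation is a cheap metadata call with no async counterpart.
    DirectoryInfo dir = Directory.CreateDirectory($@"{MessagePath}\PostRequests");

    // Disk IO: XmlDocument has no SaveAsync, so write its text asynchronously.
    string fullPath = $@"{MessagePath}\{dir.Name}\{UniqueFileNameFormat()}";
    using (var writer = new StreamWriter(fullPath))
    {
        await writer.WriteAsync(doc.OuterXml);
    }

    return new ResponseMessageResult(Request.CreateResponse((HttpStatusCode)200));
}
Note that making the action async helps scalability (threads are freed while the awaits are pending), not the latency of any single request.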
I have a C# WinForms application running on Raspbian with Mono. It has a timer. When OnTimedEvent fires, I check whether I have exclusive access to the file I want to upload (to make sure it has finished being written to disk), then attempt the upload. If the upload succeeds, I move the file to an archive folder; otherwise I leave it there and wait for the next timer event. I have no problems when connected to the Internet, but when I test without a connection and the upload fails, the second OnTimedEvent gets an exception when checking whether the same file is ready (again). I am getting:
Error message: Sharing violation on path 'path'
HResult: -2147024864
Method to check if file is ready:
public static bool IsFileReady(string filename)
{
    // If the file can be opened for exclusive access, it is no longer
    // locked by another process.
    try
    {
        using (var inputStream = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return inputStream.Length > 0;
        }
    }
    catch (Exception)
    {
        //log
        throw; // rethrow without resetting the stack trace
    }
}
This is what executes on the OnTimedEvent:
var csvFiles = from f in di.GetFiles()
               where f.Extension == ".csv"
               select f; // get CSV files in the upload folder

foreach (var file in csvFiles)
{
    // Check that the file is done being written before trying to move it.
    if (IsFileReady(file.FullName))
    {
        bool IsUploadSuccess = await WritingCSVFileToS3Async(file); // upload file to S3
        if (IsUploadSuccess)
        {
            // Move to the archive folder if the upload succeeded; otherwise
            // leave it for the next upload attempt.
            File.Move(file.FullName, archivePath + file.Name);
        }
    }
}
From what I can understand, my first FileStream (from File.Open) still has the file locked when the second event fires. However, I've added .Close() and .Dispose() to the IsFileReady method, and that doesn't seem to help.
Any help would be appreciated!
EDIT: Below is the WritingCSVFileToS3Async method.
static async Task<bool> WritingCSVFileToS3Async(FileInfo file)
{
    try
    {
        client = new AmazonS3Client(bucketRegion);

        // Put the object: set ContentType and add metadata.
        var putRequest = new PutObjectRequest
        {
            BucketName = bucketName,
            Key = file.Name,
            FilePath = file.FullName,
            ContentType = "text/csv"
        };
        //putRequest.Metadata.Add("x-amz-meta-title", "someTitle"); // don't need metadata at this time

        PutObjectResponse response = await client.PutObjectAsync(putRequest);
        return response.HttpStatusCode == System.Net.HttpStatusCode.OK;
    }
    catch (AmazonS3Exception e)
    {
        ErrorLogging.LogErrorToFile(e);
        return false;
    }
    catch (Exception e)
    {
        ErrorLogging.LogErrorToFile(e);
        return false;
    }
}
Also, I ran the same application on Windows, and am getting a similar exception:
The process cannot access the file 'path' because it is being used by another process.
I believe I've found the problem. I noticed that I was not catching the client timeout exception for the PUT request (no Internet connection). My timer interval was 20 seconds, which is shorter than the S3 client timeout (30 seconds), so the client still had the file tied up when the second timer event fired, hence the sharing violation. I increased the timer interval to 60 seconds, and I now catch the client timeout exception and can handle it before the next timer event.
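For reference, a sketch of how the client timeout and the timer interval might be kept apart (Timeout is a standard ClientConfig property in the AWS SDK for .NET; the timer field and the specific values here are illustrative):
// Keep the S3 client timeout comfortably shorter than the timer interval,
// so a failed upload releases the file before the next tick.
var config = new AmazonS3Config
{
    RegionEndpoint = bucketRegion,
    Timeout = TimeSpan.FromSeconds(30)
};
client = new AmazonS3Client(config);

timer.Interval = 60000; // 60 seconds; hypothetical System.Timers.Timer from the app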
Thanks for your help.
I am working on an ASP.NET WebForms project (legacy code). In my button click event I send an SMS message for every row in the data set populated by this call:
var customerSMS = BusinessLayer.SMS.SmsSetup.GetAllCustomerSMS(OfficeId);
Fetching the data (about 1000 rows) and doing all the computing takes around 15 seconds, and then the loop validates each row and sends the SMS, which also takes time. I want to run this task in the background and redirect the user to the index page while the process continues until the loop finishes. I am new to this and still learning this beautiful language C#. I have gone through the excellent material on asynchronous programming with async/await and the multithreading approach, but have only got the hang of them in simple WinForms applications. Any reference, code snippet, or best-practice approach with a simple explanation for my case would be helpful.
My button click event code:
protected void ReturntoDashboard_Click(object sender, EventArgs e)
{
    sms = Everest.Net.BusinessLayer.SMS.SmsSetup.GetSmsSetUp(OfficeId);
    if (sms.EnableSmsData && sms.SmsCount > 0)
    {
        #region Loan Section
        var smsLoan = Everest.Net.BusinessLayer.SMS.SmsSetup.GetLoanId(s.Sms_AccountNumber);
        var loanId = BusinessLayer.SMS.SmsSetup.GetLoanIdValue(s.Sms_AccountNumber);
        var dateexceeded = BusinessLayer.SMS.SmsSetup.IsDateExceeded(loanId);
        if (smsLoan != null && dateexceeded == true)
        {
            foreach (Common.SMS.SMSSetup sm in smsLoan)
            {
                var smsClosingBalanceLoan = BusinessLayer.SMS.SmsSetup.GetAmountForLoanAlert(sm.LoanId,
                    BusinessLayer.Core.DateConversion.GetCurrentServerDate().AddDays(sms.DaysbeforeLoanalerts).ToString());
                if (smsClosingBalanceLoan != null)
                {
                    if (smsClosingBalanceLoan.LoanAmountToPay > 0)
                    {
                        int smsSentAlertCount = sms.LoanAlertCount;
                        var logCount = BusinessLayer.SMS.SmsSetup.GetLoanSmsAlertSentCount(DateTime.Now.AddDays(-smsSentAlertCount).ToString("yyyy-MM-dd"), DateTime.Now.ToString("yyyy-MM-dd"), sm.LoanAccountNumber);
                        if (logCount < smsSentAlertCount)
                        {
                            smsLog = new Everest.Net.Common.SMS.SMSSetup();
                            finalMessage = "Dear Member, Your Loan accnt " + sm.LoanAccountNumber + " with Principal" + "+" + "Int Amnt: Rs." + smsClosingBalanceLoan.LoanAmountToPay + " need to be payed.Thank You," + officeName.OfficeName;
                            smsLog.LogServiceType = "Loan";
                            smsLog.LogSmsType = s.Sms_SmsType;
                            smsLog.LogSmsMessage = finalMessage;
                            smsLog.LogCustomerId = s.CustomerId.ToString();
                            smsLog.LogAccountNumber = s.Sms_AccountNumber;
                            smsLog.LogAccountType = s.Sms_AccountType;
                            smsLog.LogSmsSentDate = BusinessLayer.Core.DateConversion.GetCurrentServerDate();
                            smsLog.LogSmsFailedDate = "";
                            smsLog.LogSentStatus = true;
                            smsLog.LogUserId = UserId;
                            smsLog.LogSmsFailedMessage = "";
                            try
                            {
                                var result = Everest.Net.BusinessLayer.SMS.smsParameters.SendSMS(sms.FromNum, sms.Token, sms.Url, cellNum, finalMessage);
                            }
                            catch (Exception ex)
                            {
                                smsLog.LogSmsFailedDate = System.DateTime.Now.ToString("MM/dd/yyyy HHmmss");
                                smsLog.LogSentStatus = false;
                                smsLog.LogSmsFailedMessage = ex.Message;
                                Everest.Net.BusinessLayer.SMS.SmsSetup.InsertSMSLog(smsLog);
                            }
                            sms = Everest.Net.BusinessLayer.SMS.SmsSetup.GetSmsSetUp(OfficeId);
                            sms.SmsCount = sms.SmsCount - 1;
                            Everest.Net.BusinessLayer.SMS.SmsSetup.UpdateSmsSetup(sms);
                            Everest.Net.BusinessLayer.SMS.SmsSetup.InsertSMSLog(smsLog);
                        }
                    }
                }
            }
        }
        #endregion
    }
}
The ideal solution would remove the responsibility of sending the SMS from the web application itself. Instead, the web application should create a database record containing the message and recipient addresses, and a separate background job (e.g. a Windows Service) should poll the database and send SMS messages when needed. This is the best solution in terms of fault tolerance and auditability, because there is a permanent record of the messaging job, which can be resumed if the system fails.
That being said, maybe you don't want to go to all that trouble. If you feel strongly that you wish to send the SMS directly from the ASP.NET application, you will need to create a Task and queue it to run using QueueBackgroundWorkItem. You will need to refactor your code a bit.
Move all the logic for sending the SMS into a separate function that accepts all the information needed as parameters. For example,
static void SendSMS(string[] addresses, string messagetext)
{
    // Put your SMS code here
}
When you need to call the function, queue it as a background item
HostingEnvironment.QueueBackgroundWorkItem(a => SendSMS(addresses, messageText));
If your worker task needs to access its own cancellation token (e.g. if it is supposed to loop until cancelled), it is passed as an argument to the lambda expression. So you could modify the prototype
static void SendSMS(string[] addresses, string messagetext, CancellationToken token)
{
    while (!token.IsCancellationRequested)
    {
        // Put your code here
    }
}
and pass it thus:
HostingEnvironment.QueueBackgroundWorkItem(token => SendSMS(addresses, messageText, token));
Placing the task in the background queue ensures that ASP.NET keeps track of the thread, doesn't try to garbage collect it, and shuts it down properly when the application pool needs to shut down.
After queuing the background operation, your page can render its content as usual and conclude the HTTP response while the task continues to execute.
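Applied to the question's button click, the wiring might look roughly like this (a sketch: SendSmsForOffice is a hypothetical method that wraps the loop from the question, and the redirect target is a placeholder):
protected void ReturntoDashboard_Click(object sender, EventArgs e)
{
    // Capture everything the background job needs before queuing it; the work
    // item must not touch page or request state after the response completes.
    int officeId = OfficeId;
    int userId = UserId;

    HostingEnvironment.QueueBackgroundWorkItem(token => SendSmsForOffice(officeId, userId, token));

    // Return to the dashboard immediately; the SMS loop keeps running.
    Response.Redirect("~/Default.aspx", false); // hypothetical index page
}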
Problem: in ContinueFileOpenPicker, reading the file succeeds when debugging (even in a release build), but fails when the release app runs on its own.
I read a text file on my Windows Phone 8.1 device using a file picker.
In debug mode, the file picker returns the file, the app shows a wait while being restored, and the read SUCCEEDS.
But when I run the app directly, the file picker returns to my app straight away without waiting, and within seconds the app crashes, having perhaps read some lines but not all.
The code for reading is as follows.
public async void ContinueFileOpenPicker(FileOpenPickerContinuationEventArgs args)
{
    try
    {
        // args.Files holds the StorageFile(s) returned by the picker.
        int nRecords = await ImportACUFile(args.Files[0]);
    }
    catch (Exception ex)
    {
        // Swallowing the exception here hides the crash cause; log it at least.
    }
}
public async Task<int> ImportACUFile(StorageFile acufile)
{
    using (var db = new SQLite.SQLiteConnection(DBPath))
    {
        var lines = await Windows.Storage.FileIO.ReadLinesAsync(acufile);
        foreach (var line in lines)
        { ..... }
        .....
    }
}
Can anyone help me identify the problem?
THANKS A LOT IN ADVANCE!!!
LONG
I have a process that retrieves HTML from a remote site and parses it. I pass several URLs into the method, so I would like to Ajaxify the process and show an on-screen notification each time a URL finishes parsing. For example, this is what I am trying to do:
List<string> urls = ... // load up with an arbitrary # of urls
foreach (var url in urls)
{
    string html = GetContent(url);
    // DO SOMETHING
    // COMPLETED.. SEND NOTIFICATION TO SCREEN (HOW DO I DO THIS?)
}
public static string GetContent(string url)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "GET";

    // Dispose the response as well as the stream and reader.
    using (var response = request.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var reader = new StreamReader(stream, Encoding.UTF8))
    {
        return reader.ReadToEnd();
    }
}
In each iteration of the loop, I want to show that the URL completed and that processing is moving on to the next one. How can I accomplish this?
The first thing you need to worry about is the fact (I'm assuming) that you're running a potentially long-running operation in ASP.NET code. This becomes a problem when you run into IIS timeouts (by default, 90 seconds). Assume you're processing ten URLs, each of which takes 15 seconds to complete reader.ReadToEnd() – your code will time out and get killed after the sixth URL.
You might be thinking "I can just crank up the timeout," but that's not really a good answer; you're still under time pressure.
The way I solve problems like this is to move long-running operations into a stand-alone Windows Service, then use WCF to communicate between ASP.NET code and the Service. The Service can run a thread pool that executes requests to process a group of URLs. (Here's an implementation that allows you to queue work items.)
Now, from your web page, you can poll for status updates via AJAX requests. The handler in your ASP.NET code can use WCF to pull the status information from the Service process.
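As a rough sketch of what that service contract might look like (the interface and member names here are illustrative, not from an actual implementation):
using System.Collections.Generic;
using System.ServiceModel;

// Hypothetical WCF contract between the ASP.NET app and the Windows Service.
[ServiceContract]
public interface IUrlProcessor
{
    // Submit a batch of URLs; returns the ID of the new work unit.
    [OperationContract]
    int SubmitWorkUnit(List<string> urls);

    // Returns how many URLs in the given work unit have been processed so far.
    [OperationContract]
    int GetProcessedCount(int workUnitId);
}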
A way to do this might be to assign each submitted work unit a unique ID and return that ID to the client. The client can then poll for status by sending an AJAX request for the status of work unit n. In the Service, keep a List of work units with their statuses (locking it as appropriate to avoid concurrency problems).
public class WorkUnit {
    public int ID { get; set; }
    public List<string> URLs { get; set; }
    public int Processed { get; set; }
}

// Note: 'var' is not valid for a field declaration, so the type is explicit here.
private List<WorkUnit> workUnits = new List<WorkUnit>();

private void ExecuteWorkUnit(int id) {
    var unit = GetWorkUnit(id);
    foreach (var url in unit.URLs) {
        string html = GetContent(url);
        // do whatever else...
        lock (workUnits) unit.Processed++;
    }
}

public WorkUnit GetWorkUnit(int id) {
    lock (workUnits) {
        // Left as an exercise for the reader
    }
}
You'll need to fill in methods to add a work unit, return the status of a given work unit, and deal with the thread pool.
I've used a similar architecture with great success.