I have a really long WebAPI request that basically does the following:
1. retrieves a list of item categories from the db
2. for each category, retrieve all the items in the category
Now, the entire process takes a very long time and I don't want the user to wait until it is all over; as soon as a category has finished loading I want it returned to the client.
Does anyone know how I can do that? Send a request and get progress notifications from the server whenever a part of the request has finished?
You could use SignalR to send the data from the server to the client when it's available.
The other option is polling from the client. The client makes the initial request, which triggers a server side process that prepares the data and keeps it somewhere (in memory, in a database). Then the client polls the server for new available data until the server process finishes.
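A very rough sketch of the polling option with Web API (the controller, the routes, the in-memory job store and the LoadCategories/LoadItems calls are my own placeholders, not anything from your project):

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Web.Http;

public class CategoryJobsController : ApiController
{
    // jobId -> categories that have finished loading but haven't been fetched yet
    private static readonly ConcurrentDictionary<Guid, ConcurrentQueue<object>> Jobs =
        new ConcurrentDictionary<Guid, ConcurrentQueue<object>>();

    [HttpPost, Route("api/categoryjobs")]
    public IHttpActionResult Start()
    {
        var jobId = Guid.NewGuid();
        var queue = new ConcurrentQueue<object>();
        Jobs[jobId] = queue;

        Task.Run(() =>
        {
            foreach (var category in LoadCategories())       // hypothetical db call
            {
                var items = LoadItems(category);             // hypothetical db call
                queue.Enqueue(new { category, items });      // publish as soon as it's ready
            }
        });

        return Ok(jobId);   // the client polls with this id
    }

    [HttpGet, Route("api/categoryjobs/{jobId}")]
    public IHttpActionResult Poll(Guid jobId)
    {
        var ready = new List<object>();
        object finished;
        while (Jobs[jobId].TryDequeue(out finished))
        {
            ready.Add(finished);
        }
        return Ok(ready);   // everything that finished since the last poll
    }
}

The client calls Start once, keeps the returned id, then polls the second URL every few seconds until it has everything; a real version would also report when the job is complete and clean up old entries.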
You need to break up your request. Use a loop: once the items from the first category are downloaded, do something with them before going on to the second category.
So your request will run inside a loop. You can use jQuery, or page methods if you are using ASP.NET WebForms.
PushStreamContent might help you:
http://weblogs.asp.net/andresv/asynchronous-streaming-in-asp-net-webapi
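For what it's worth, a minimal sketch of that approach (the controller and the LoadCategories/LoadItems calls are assumptions for illustration): each category is serialized and flushed to the response stream as soon as it has been loaded, so the client can start reading before the whole request has finished.

using System.IO;
using System.Net.Http;
using System.Web.Http;
using Newtonsoft.Json;

public class CategoriesController : ApiController
{
    [HttpGet]
    public HttpResponseMessage GetAll()
    {
        var response = Request.CreateResponse();
        response.Content = new PushStreamContent(async (outputStream, content, context) =>
        {
            using (var writer = new StreamWriter(outputStream))
            {
                foreach (var category in LoadCategories())        // hypothetical db call
                {
                    var items = LoadItems(category);              // hypothetical db call
                    // one JSON object per line, flushed immediately
                    await writer.WriteLineAsync(JsonConvert.SerializeObject(new { category, items }));
                    await writer.FlushAsync();
                }
            }
        }, "application/json");
        return response;
    }
}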
Related
What are the possible ways to send multiple responses to the client side within a single Ajax call?
Scenario
1. Import 200 records from Excel through Ajax.
2. Read the file records one by one.
3. On each iteration I need to send some data to the client side, get it back with a base64 image on the server side, and then save that image to the database.
4. Do this for all iterations.
5. After all iterations, show the Ajax success message.
How can I achieve this within one Ajax request?
So finally we did it with SignalR.
We can achieve this with WebSockets or SignalR. SignalR opens a communication channel to the client side, and over it we can send data back to the server again.
Note: don't forget to increase the size of the SignalR message buffer if you want to send a large file through SignalR, e.g.
GlobalHost.Configuration.DefaultMessageBufferSize = 200
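A bare-bones sketch of the hub side with ASP.NET SignalR 2.x (the hub, the client callback names and the ReadRecordsFromExcel/SaveImageToDatabase helpers are assumptions, not your actual code):

using Microsoft.AspNet.SignalR;

public class ImportHub : Hub
{
    // called from the browser over the SignalR connection instead of a plain Ajax POST
    public void ImportRecords()
    {
        var records = ReadRecordsFromExcel();            // hypothetical helper
        for (int i = 0; i < records.Count; i++)
        {
            // push the current record to the calling client; the client answers
            // by calling SaveImage below with the base64 image for this record
            Clients.Caller.recordReady(i, records[i]);
        }
        Clients.Caller.importFinished();
    }

    // called back from the client with the image for one record
    public void SaveImage(int recordIndex, string imageBase64)
    {
        SaveImageToDatabase(recordIndex, imageBase64);   // hypothetical helper
    }
}

The JavaScript side subscribes to recordReady/importFinished on the hub proxy and calls SaveImage for each record, so everything flows over the single SignalR connection instead of one Ajax round trip per record.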
I have a pretty big video file I upload to a web service via multipart/form-data.
It takes ~30 seconds to arrive and I would prefer not to wait that long simply to access the parameters I send along with the file.
My question is simple: can I access parameters sent with the form without waiting for the video payload to be uploaded?
Can this be done using headers or any other methods?
Streaming vs. Buffering
It's about how the webserver is set up. For IIS you can enable Streaming.
Otherwise, by default, IIS will use 'buffering' - the whole request is loaded into memory first (IIS's memory that you can't get to) before your app running in IIS can get it.
Not using IIS? You have to figure out how to get the webserver to do the same thing.
How to stream using IIS:
Streaming large file uploads to ASP.NET MVC
Note the way the file is read in the inner loop:
byte[] rgbBody = new byte[4096];   // reusable read buffer
int cbRead;
// copy the request body to disk chunk by chunk, as it arrives
while ((cbRead = clientRequest.InputStream.Read(rgbBody, 0, rgbBody.Length)) > 0)
{
    fileStream.Write(rgbBody, 0, cbRead);
}
Here, instead of just saving the data like that question does, you will have to parse whatever xml/json/etc. contains the file parameters you speak of, and expect the video to be sent afterwards. You can process the parameters right away if it's a quick process, then read the rest of the video, or you can hand them off to a background thread.
You probably won't be able to parse it by just dumping what you have into a json or xml parser; there will be an unclosed tag or } at the top that isn't closed until after the video data is uploaded (however that is done). Or, if it's multipart data from a form submission, as you imply, you will have to parse that partial upload yourself instead of just asking IIS for the post data.
So this will be tricky. You can start by writing 1k at a time to a log file with a timestamp, to prove that you're getting the data as it comes; after that it's just a coding headache.
Getting this to work also means you'll have to have some control over the client and how it sends the data.
That's because you'll at least have to ensure it sends the file parameters FIRST!
Which concerns me, because if you have control of the client, why can't you take the simple route (as Nobody and Nkosi imply) and use two requests? You mention you need one. Why not write client-side JS that sends the parameters first in an XHR and then the file in a second request, using a correlation ID in both to tie them together? (The server could return the ID from the first request and you could send it in the second.)
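For illustration, a rough server-side sketch of that two-request idea (the controller, the routes, the VideoMetadata DTO and the Save* helpers are all made up, and it assumes Web API attribute routing is enabled):

using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;

public class VideoMetadata
{
    public string Title { get; set; }
    public string Description { get; set; }
}

public class UploadController : ApiController
{
    // request 1: small and fast - just the parameters, returns a correlation id
    [HttpPost, Route("api/uploads")]
    public IHttpActionResult PostMetadata(VideoMetadata metadata)
    {
        var correlationId = Guid.NewGuid();
        SaveMetadata(correlationId, metadata);           // hypothetical persistence
        return Ok(new { correlationId });
    }

    // request 2: the big multipart body, tied back to request 1 by the id in the route
    [HttpPost, Route("api/uploads/{correlationId}/video")]
    public async Task<IHttpActionResult> PostVideo(Guid correlationId)
    {
        var provider = new MultipartMemoryStreamProvider();
        await Request.Content.ReadAsMultipartAsync(provider);
        using (var video = await provider.Contents[0].ReadAsStreamAsync())
        {
            SaveVideo(correlationId, video);             // hypothetical persistence
        }
        return Ok();
    }
}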
Obviously, if you just have a form with some inputs and a file upload and do a plain submit, then you need one request ;-) But if you have control over the client side you're not stuck with that.
Good luck, there is some advanced programming here, but nothing super high-tech. You will make it work!!!
If you don't have control over the server code, you are probably stuck: if the server app's webserver is buffering, the server app won't get anything early. Of course, wanting to do something with the file parameters first really implies you have control of the server side ;-)
I have an issue with using a database from a thread in my ASP.NET application.
When my application starts I want to start a thread called "BackgroundWorker" with it, which runs in the background until the whole application is stopped.
The problem is that I have massive problems with the dbContext in that thread.
If I try to start the worker in my Startup.cs in the "ConfigureServices" or "Configure" methods and then initialize the dbContext in the worker's constructor like "dbContext = new ApplicationContext()", it tells me that the connection is not configured when I try to work on the database inside the while(true) loop.
If I write my own controller for the worker, which receives an ApplicationContext in its constructor and then starts a thread like this when I call the controller once with a GET request:
public BackgroundWorker(ChronicusContext dbContext)
{
    _dbContext = dbContext;
    _messageService = new MailMessageService();
}

// GET: api/backgroundworker
[HttpGet]
[Route("start")]
public void StartWorker()
{
    //Thread thread = new Thread(this.DoBackGroundWork);
    Thread thread = new Thread(() => DoBackGroundWork(this._dbContext));
    thread.Start();
}

public void DoBackGroundWork(ChronicusContext _dbContext)
{
    while (true)
    {
        if (_dbContext.PollModels.Any()) //Here is the exception
        {
            ...
        }
    }
}
Then I receive a System.ObjectDisposedException inside the while(true) loop, saying that the object is already disposed.
I tried these and similar things in many different ways but always receive exceptions like these two, or that the database connection is closed.
Can somebody help me and tell me, how this works?
Thank you!
Generally, server-side multithreading in web applications does not happen often and is, most of the time, a huge no-no.
Conceptually, your server is already "multithreaded": it handles many HTTP requests from clients/users/other servers. In mobile and web architecture/design, your server(s) process multiple requests while your clients make asynchronous calls and deal with waiting for responses from long-running calls like your API method StartWorker.
Think of this scenario: you make a request to your WebAPI method StartWorker, and the client making the request waits for a response. Putting the work on another thread does nothing, because the client is still waiting for that response.
For example, let's say your client is an HTML page with an Ajax call. You call StartWorker via Ajax to load data into an HTML table. From a UX perspective you will want to show a progress spinner while the long-running StartWorker responds to your page's Ajax request. When StartWorker responds, the Ajax callback fills the HTML table with the response, so StartWorker has to respond with the data. If StartWorker responded before the work was done, you would have to send a push notification instead (via SignalR, for example) when the other thread completes and has the data you need for the HTML table.
Hopefully you can see that the call to the WebAPI method takes the same amount of time from the Ajax request/response perspective, so multithreading becomes pointless in this scenario, which is the most common web application scenario.
You can have your client UI load other UI elements, showing a progress spinner in HTML table UI area, until your database call is complete and responds with the data to your Ajax call. This way your users know things are happening and something is still loading.
If you still need your additional thread in your API for your project needs, I believe you have to be using Entity Framework 6 or greater to support asynchronous queries, see this tutorial:
http://www.codeproject.com/Tips/805923/Asynchronous-programming-in-Web-API-ASP-NET-MVC
UPDATE
Now that I know you need to run a SQL query on a repeating schedule, and that you have an Azure Web App: use Azure Automation if you are using SQL Azure, or create a SQL Server Agent job if you are using a SQL Server instance as your backend.
DbContext is not thread safe. You need to create a new context from inside your thread.
public void DoBackGroundWork()
{
    // create a fresh context that belongs to this thread instead of
    // reusing the request-scoped one that gets disposed
    ChronicusContext anotherContext = new ChronicusContext();
    while (true)
    {
        if (anotherContext.PollModels.Any())
        {
            ...
        }
    }
}
My ASP.NET application generates a dynamic PDF. Sometimes this takes a while and is quite a heavy process. Actually I don't want my users to wait for the PDF, just send it to their mail after it is generated.
So I tried a web service. I'm passing an id (to get the data from the database) and some strings to the web service's method.
But even with a web service (even with asynchronous calls) the client only receives its response after the PDF is generated. So the user still has to wait.
So I'm kinda stuck; there must be a way I'm overlooking.
You don't need a webservice in order to get the ability to make asynchronous invocations.
You can just use ThreadPool.QueueUserWorkItem() as a fire-and-forget approach in the ASPX page, then return a reply with some sort of "work item id" - like a receipt or an order number.
Generate the PDF in the WaitCallback you pass to QUWI.
When the PDF is ready, that WaitCallback can send an email, or whatever.
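Something along these lines in the ASPX code-behind (the button handler, GeneratePdf, SendPdfByMail, UserEmail and lblStatus are all placeholders for your own page and logic):

protected void btnGenerate_Click(object sender, EventArgs e)
{
    var workItemId = Guid.NewGuid();   // the "receipt" you hand back to the user

    ThreadPool.QueueUserWorkItem(state =>
    {
        var id = (Guid)state;
        byte[] pdf = GeneratePdf(id);              // hypothetical: the slow part
        SendPdfByMail(UserEmail, pdf);             // hypothetical: mail it when done
    }, workItemId);

    // respond right away, before the PDF exists
    lblStatus.Text = "Your PDF is being generated and will be mailed to you (ref " + workItemId + ").";
}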
Use a webservice if you want the function to be accessible, outside the webpage. Don't use it strictly for asynchrony.
The issue is that in your ASP.NET page code you must be invoking the web service synchronously, so the page waits until the web service returns. You should try invoking the web service asynchronously (or on a different thread) and then not wait for it to complete. Typically, the Visual Studio generated proxy already has asynchronous overloads that you may use.
Alternatively, you may modify your web service code: when a request to your web method comes in, start the PDF generation on a different thread so that your web method can return immediately, indicating to your client (the page, in this case) that the request has been successfully scheduled for processing.
There are two ways that I know of.
First way:
In the ASP.NET code-behind (in the xxx.aspx.cs file) you can define a void method and then call it by starting a thread, like below.
protected void SendMail(object prms)
{
    int id = int.Parse(prms.ToString());
    // mail sending process
}

// starting the SendMail method asynchronously
Thread trd = new Thread(SendMail);
trd.Start(idValue);
Second way:
You can create a mail sender page like "SendMail.aspx", then make an Ajax request to it from JavaScript and not wait for any response. You can pass the id value to the aspx page as a request parameter.
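For the second way, the code-behind of SendMail.aspx can be as small as something like this (SendMailFor is a placeholder for your own mail logic); the JavaScript just fires the request with the id and never waits for the response:

// SendMail.aspx.cs
protected void Page_Load(object sender, EventArgs e)
{
    int id;
    if (int.TryParse(Request.QueryString["id"], out id))
    {
        SendMailFor(id);   // load the record by id, build the mail and send it
    }
}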
I've got several web services: asmx, WCF. A couple of them have methods which take a lot of time to process, but the input data for these methods is small and does not take much time to transfer on the wire. I want to move to a non-synchronous model: the client passes data to the service, the service answers that the data transfer was correct, and then processes it on a background thread without a connection to the client. So after the transfer the connection should be closed. Is that possible? Can you help me with articles, or maybe just a Google search term?
John is right - once you close an HTTP connection, it is done. You can't get back to the same process.
So if you can use another technology that allows duplex on one connection (e.g. WCF), do it!
However,
if you have no choice but to use webservices,
here are three ways to make it work. You may get timeouts on any of them.
Option 1:
Forget the part about 'client answers data was correct.' Just have each thread make its request and wait for the data.
Option 2:
Now, assuming that won't work and you must do the validation, this way requires the client to make 2 requests.
First request: returns valid/invalid.
Second request: returns the long-running results.
Variation of option 2:
If you have timeout problems, you could have the first request generate a GUID or unique database key, start another process, pass it this key, and return the key to the client. (That assumes you can get the server to allow you to start a process - it depends on security settings/needs; if not, you may be able to start an async thread and have it keep running after the web service call ends.) The process does the long task and, when finished, updates the row in the database for that unique id with the result plus a 'done' flag. The second request by the client can always return immediately: if the processing is not done, it says so; if it is done, it returns the results. The client repeats this every 5 seconds or so until done.
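A crude sketch of that variation as an ASMX service (the SaveRequestRow/MarkRequestDone/DoLongRunningWork/LoadRequestStatus helpers stand in for the database table described above, and it uses a thread-pool work item rather than a separate process):

using System;
using System.Threading;
using System.Web.Services;

public class LongRunningService : WebService
{
    // first call: validate, hand back a ticket, kick the real work off in the background
    [WebMethod]
    public string StartWork(string data)
    {
        string ticket = Guid.NewGuid().ToString();
        SaveRequestRow(ticket, data);                     // hypothetical: persist "not done yet"
        ThreadPool.QueueUserWorkItem(_ =>
        {
            string result = DoLongRunningWork(data);      // hypothetical: the slow part
            MarkRequestDone(ticket, result);              // hypothetical: set done flag + result
        });
        return ticket;                                    // the client polls with this
    }

    // second call: always returns quickly, just reads the row for that ticket
    [WebMethod]
    public string CheckWork(string ticket)
    {
        return LoadRequestStatus(ticket);                 // hypothetical: "pending" or the result
    }
}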
Hacks, I know, but we don't always have a choice for the technology we use.
Don't do this with ASMX web services. They weren't designed for that. If you must do it with ASMX, then have the ASMX pass the data off to a Windows Service that will do the actual work, in the background.
This is more practical with WCF.
We have been writing stuff to interact with the UK gov website and the way they handle something similar is that you send your request and data to the server and it responds saying, roughly, "thanks very much - we're processing it now, please call back later using this id" - all in an XML message. You then, at some point later, send a new http request to the service saying, essentially, "I'm enquiring about the status of this particular request id" and the server returns a result that says either it has processed OK, or processed with errors, or is still processing, please try again in xx seconds.
Similar to option 2 described previously.
It's a polling solution rather than a callback or 2 way conversation but it seems to work.
The server will need to keep, or have access to, some form of persistent table or log for each request's state - it can contain, e.g., the id, the original request, the current stage in the workflow, any error messages so far, the result (if any), etc. And the web service should probably pass the bulk of the request off to a separate Windows service, as already mentioned.