I have an MVC and WebAPI application that needs to log activities performed by the users back to my database. This is almost always a single insert into a table that has fewer than 5 columns (i.e. very little data is crossing the wire). The data access layer I am currently using is Entity Framework 6.
Every once in a while, I'll get a large number of users needing to log that they performed a single activity. In this case, "large number" could be a couple hundred requests per second. This typically lasts only a few minutes at most. The rest of the time, I see very manageable traffic to the site.
When the traffic spikes, some of my clients get timeout errors because the page doesn't finish loading until the server has inserted the data into the database. Now, actually inserting the data into the database isn't necessary for the user to continue using the application, so I could cache these requests somewhere locally and batch insert them later.
Are there any good solutions for ASP.NET MVC to buffer incoming request data and then batch insert it into the database every few seconds?
As for my environment, I have several servers running Server 2012 R2 in a load-balanced web farm. I would prefer to stay stateless if at all possible, because users might hit a different server on each request.
When the traffic spikes, some of my clients are getting timeout errors because the page doesn't finish loading until the server has inserted the data into the database.
I would suggest using a message queue. Have the website rendering code simply post an object representing the action to the queue, and have a separate process (e.g. a Windows Service) read off the queue and write to the database using Entity Framework.
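For example, with MSMQ the controller side could be as small as this (a sketch only; the queue path, the ActivityLogEntry type, and the EF context are illustrative names, not your actual setup):

```csharp
using System;
using System.Data.Entity;   // EF6
using System.Messaging;

// Illustrative message type; adapt it to whatever your activity log actually records.
public class ActivityLogEntry
{
    public int Id { get; set; }
    public int UserId { get; set; }
    public string Action { get; set; }
    public DateTime OccurredAtUtc { get; set; }
}

// Hypothetical EF6 context used by the Windows Service side.
public class MyDbContext : DbContext
{
    public DbSet<ActivityLogEntry> ActivityLog { get; set; }
}

public static class ActivityLogger
{
    // The queue must exist (e.g. created once with MessageQueue.Create(QueuePath)).
    private const string QueuePath = @".\Private$\ActivityLog";

    // Called from the MVC/WebAPI action: cheap, so the page isn't blocked on the insert.
    public static void Enqueue(ActivityLogEntry entry)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(ActivityLogEntry) });
            queue.Send(entry);
        }
    }

    // Called from the Windows Service: receive a message and write it with EF6.
    public static void DrainOne()
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(ActivityLogEntry) });
            var entry = (ActivityLogEntry)queue.Receive().Body;  // blocks until a message arrives
            using (var db = new MyDbContext())
            {
                db.ActivityLog.Add(entry);
                db.SaveChanges();
            }
        }
    }
}
```

During a spike the queue simply grows while the service catches up, instead of the page waiting on the insert.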
UPDATE
Alternatively, you could log each access to a file (fast), and have a separate process read the file and write the information into your database.
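A sketch of that file-based variant might look like this (path and line format are made up):

```csharp
using System;
using System.IO;

// Append one line per activity (fast), and let a separate process parse the file
// and batch-insert the rows later. Path and line format are illustrative.
public static class FileActivityLogger
{
    private static readonly object Sync = new object();
    private const string LogPath = @"C:\Logs\activity.log";

    public static void Log(int userId, string action)
    {
        var line = string.Format("{0:o}|{1}|{2}{3}", DateTime.UtcNow, userId, action, Environment.NewLine);
        lock (Sync)   // one log file per web server, so an in-process lock is enough
        {
            File.AppendAllText(LogPath, line);
        }
    }
}
```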
I prefer the message queue option, but it does add another piece of architecture.
We are using Angular 9 as the front end, .NET Core 3.1 as the backend, and SQL Server as the database.
Each user sends a single request to the server, but the server takes almost 90 minutes to carry out the operations for each request.
We need a way for the backend to hold all the requests, or to execute a few requests at a time, without overlapping the calculations of requests that are already running.
We are thinking of introducing a queue in the backend; we already tried Parallel.ForEach, but it messes up some of the calculations.
Is there any other way you can suggest?
Thanks in advance
It sounds to me like you have an asynchronous operation. There are multiple possible solutions from my point of view; I will describe two simple ones which you can easily extend.
First, you have to separate triggering the "job" from querying its state.
Given the limited information about the whole system, I assume you have an API running on a server. You could use the file system to persist the running jobs and query them via a separate request: one request schedules the "job", and the other returns information about its state.
If there is no file system, you can also use a database for this, or just store the state in memory (but be aware that an unexpected restart of the program will drop your information).
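As a rough sketch of that split, assuming an ASP.NET Core API and an in-memory store (all names are illustrative; swap the dictionary for the file system or a database as described above):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

// Sketch only: one endpoint schedules the long-running job and returns an id, the
// other reports its state. Replace the in-memory dictionary with the file system or
// a database if the state must survive restarts.
[ApiController]
[Route("api/jobs")]
public class JobsController : ControllerBase
{
    private static readonly ConcurrentDictionary<Guid, string> Jobs =
        new ConcurrentDictionary<Guid, string>();

    [HttpPost]
    public IActionResult Start()
    {
        var id = Guid.NewGuid();
        Jobs[id] = "Running";

        // Fire-and-forget just for illustration; a hosted service or a job library
        // is more robust in production.
        Task.Run(() =>
        {
            RunNinetyMinuteCalculation();   // placeholder for the real work
            Jobs[id] = "Completed";
        });

        return Accepted(new { jobId = id });
    }

    [HttpGet("{id}")]
    public IActionResult Status(Guid id) =>
        Jobs.TryGetValue(id, out var state) ? Ok(new { state }) : (IActionResult)NotFound();

    private static void RunNinetyMinuteCalculation() { /* the 90-minute operation */ }
}
```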
I achieved this functionality using Hangfire: https://docs.hangfire.io/en/latest/
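For reference, a minimal Hangfire sketch might look like this (the connection string name, the single worker, and ICalculationService are assumptions for illustration, not from your project):

```csharp
using Hangfire;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

// Illustrative service contract for the 90-minute calculation.
public interface ICalculationService { void Run(int requestId); }

public static class HangfireSetup
{
    public static void ConfigureServices(IServiceCollection services, IConfiguration config)
    {
        // Persist jobs in SQL Server so they survive restarts, and keep a single
        // worker so the heavy calculations never overlap.
        services.AddHangfire(h => h.UseSqlServerStorage(config.GetConnectionString("HangfireDb")));
        services.AddHangfireServer(options => options.WorkerCount = 1);
    }
}

public class RequestsController
{
    // Enqueue the calculation and return immediately; the client can then poll a
    // status endpoint (or Hangfire's dashboard) instead of waiting on the request.
    public string Start(int requestId)
    {
        return BackgroundJob.Enqueue<ICalculationService>(s => s.Run(requestId));
    }
}
```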
I'm trying to come up with a design for an application (C#/Avalonia) which will allow creating views for data coming from multiple sources. The overall idea is to link the sources and present outcome using various visualization components.
There are multiple sources of data:
Database 1
Database 2/3/4
SOAP
Database 1 is going to be used to store everything related to the application itself (users, permissions and so on).
Databases 2-4+ are only data feeds.
SOAP - this is where I struggle; I'm not quite sure how to handle it. There could be 10-50 concurrent instances of the application running, and each of them could request the same data update from SOAP (the provider's restrictions make that impossible).
What I was thinking was to take the following approach:
Request initial data from SOAP
Cache the data in database 1 with a timestamp
Define a delay between the requests
Once a user requests a data update from SOAP, check if we should return cached or fresh data based on timestamp and delay value.
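A rough sketch of that staleness check, assuming the payload and its timestamp are cached in Database 1 (type and member names are made up):

```csharp
using System;

// Return the cached copy unless it is older than the configured delay,
// otherwise refresh from SOAP and update the cache in Database 1.
public class SoapFeedCache
{
    private readonly TimeSpan _minDelay = TimeSpan.FromMinutes(5);

    public CachedFeed GetFeed()
    {
        var cached = LoadFromDatabase1();   // payload + timestamp, or null if never fetched
        if (cached != null && DateTime.UtcNow - cached.FetchedAtUtc < _minDelay)
            return cached;                  // still fresh enough, don't hit the provider

        var fresh = RequestFromSoapProvider();
        fresh.FetchedAtUtc = DateTime.UtcNow;
        SaveToDatabase1(fresh);
        return fresh;
    }

    // Placeholders for the real data access and SOAP client code.
    private CachedFeed LoadFromDatabase1() { return null; }
    private CachedFeed RequestFromSoapProvider() { return new CachedFeed(); }
    private void SaveToDatabase1(CachedFeed feed) { }
}

public class CachedFeed
{
    public DateTime FetchedAtUtc { get; set; }
    public string Payload { get; set; }
}
```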
This approach leads to an issue when a user terminates the application in the middle of requesting new data:
User 1 requests new data and marks the database so that no further requests are processed
User 2 requests new data - nothing happens at this stage, so it waits and queries again
User 1 terminates - no new data for any users
Is the approach completely wrong, and would trying to handle it on the client only be a dead end?
I'm working on a Cloud-Hosted ZipFile creation service.
This is a Cross-Origin WebApi2 service used to provide ZipFiles from a file system that cannot host any server side code.
The basic operation goes like this:
User makes a POST request with a string[] of Urls that correlate to file locations
WebApi reads the array into memory, and creates a ticket number
WebApi returns the ticket number to the user
AJAX callback then redirects the user to a web address with the ticket number appended, which returns the zip file in the HttpResponseMessage
In order to handle the ticket system, my design approach was to set up a global Dictionary that paired a randomly generated 10-digit number with a List<String> value, and the dictionary was paired with a Queue storing 10,000 entries at a time. ([Reference here][1])
This is partially due to the fact that WebApi does not support Cache.
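Roughly, the ticket store is this kind of thing (a sketch with illustrative names, not the actual code):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;

// Because every web server holds its own copy of this store, a ticket created on
// one machine simply does not exist on the others - which is exactly the
// "key was not present" failure behind the load balancer.
public static class TicketStore
{
    private const int Capacity = 10000;
    private static readonly ConcurrentDictionary<string, List<string>> Tickets =
        new ConcurrentDictionary<string, List<string>>();
    private static readonly ConcurrentQueue<string> Order = new ConcurrentQueue<string>();
    private static readonly Random Rng = new Random();

    public static string Add(List<string> urls)
    {
        string ticket;
        lock (Rng) { ticket = Rng.Next(1000000000, int.MaxValue).ToString(); }  // 10-digit number
        Tickets[ticket] = urls;
        Order.Enqueue(ticket);

        string oldest;
        if (Order.Count > Capacity && Order.TryDequeue(out oldest))
        {
            List<string> removed;
            Tickets.TryRemove(oldest, out removed);
        }
        return ticket;
    }

    public static List<string> Get(string ticket)
    {
        return Tickets[ticket];   // throws if the ticket was created on a different server
    }
}
```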
When I make my AJAX call locally, it works 100% of the time. When I make the call remotely, it works about 20% of the time.
When it fails, this is the error I get:
The given key was not present in the dictionary.
Meaning, the ticket number was not found in the Global Dictionary Object.
We (with the help of Stack) tracked down the issue to multiple servers in the Cloud.
In this case, there are three.
That doesn't mean there is a one-in-three chance of this working; what seems to be going on is this:
Calls made while the browser is on the cloud site work 100% of the time because the same machine handles the whole operation end-to-end
Calls made from other sites work far less often because there is no continuity between the machine that takes the AJAX call and the machine that takes the subsequent redirect to the website to download the file. It's simple luck of the draw whether the same machine handles both.
Now, I'm sure we could create a database to handle requests, but that seems like a lot more work to maintain state among these machines.
Is there any non-database way for these machines to maintain the same Dictionary across all sessions that doesn't involve setting up a fourth machine just to handle the queue?
Is the reason for the dictionary simply to have a queue of operations?
It seems you either need:
A third machine that hosts the queue (despite your objection). If you're using Azure, an obvious choice might be the distributed Azure Cache Service.
To forget about the dictionary and just have the server package and deliver the requested result, perhaps in an asynchronous operation.
If your ASP.NET web app uses session state, you will need to configure an external session state provider (either the Redis Cache Service or a SQL Server session state provider).
There's a step-by-step guide here.
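If you go the shared-cache route suggested above, a sketch with StackExchange.Redis could look like this (the connection string, key format, and expiry are illustrative):

```csharp
using System;
using System.Collections.Generic;
using StackExchange.Redis;

// Keep the ticket -> URL list in a shared Redis cache instead of a per-server
// dictionary, so whichever machine receives the redirect can resolve the ticket.
public class RedisTicketStore
{
    private static readonly ConnectionMultiplexer Redis =
        ConnectionMultiplexer.Connect("your-cache.redis.cache.windows.net:6380,password=...,ssl=True");

    public string Add(IEnumerable<string> urls)
    {
        var ticket = Guid.NewGuid().ToString("N");
        Redis.GetDatabase().StringSet("ticket:" + ticket, string.Join("\n", urls),
                                      TimeSpan.FromMinutes(30));   // tickets expire on their own
        return ticket;
    }

    public string[] Get(string ticket)
    {
        string value = Redis.GetDatabase().StringGet("ticket:" + ticket);
        return value == null ? null : value.Split('\n');
    }
}
```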
I have an ASP.NET website which processes requests using a 3rd-party exe. Currently my workflow is:
User accesses website using any browser and fills out a form with job details
Website calls a WCF self hosted windows service which is listening on a port
Windows service launches 3rd party exe to process the job and returns the result to website
Website displays the returned result to the user
The above website was a prototype which now needs to be turned into a production-ready deployment. I realize that the above architecture has many points that could break. For example, if the machine is powered off or if the Windows service crashes and is no longer listening on the port, all current requests will stop processing. To make the architecture more robust, I am considering the following:
User accesses website using any browser and fills out a form with job details
Website writes out the job details to a database
Windows service, which is polling the database every 10 seconds for a new job, picks up the job and executes it using the 3rd-party application. The results are written back to the database (a rough sketch of this polling loop follows the list).
Website, which has now started polling the database, picks up the results and displays them to the user.
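The polling loop referenced above could be as simple as this sketch (connection string, table/column names, and exe path are illustrative):

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;
using System.Threading;

// Every 10 seconds the Windows service claims one pending job, runs the
// 3rd-party exe, and writes the result back to the database.
public class JobPoller
{
    private const string ConnectionString = "Server=.;Database=Jobs;Integrated Security=true";
    private readonly Timer _timer;

    public JobPoller()
    {
        _timer = new Timer(_ => Poll(), null, TimeSpan.Zero, TimeSpan.FromSeconds(10));
    }

    private void Poll()
    {
        using (var conn = new SqlConnection(ConnectionString))
        {
            conn.Open();

            // Atomically claim a single pending job so two service instances
            // never pick up the same row.
            var claim = new SqlCommand(
                @"UPDATE TOP (1) dbo.Jobs SET Status = 'Running'
                  OUTPUT inserted.Id, inserted.Arguments
                  WHERE Status = 'Pending';", conn);

            int id;
            string args;
            using (var reader = claim.ExecuteReader())
            {
                if (!reader.Read()) return;   // nothing to do this tick
                id = reader.GetInt32(0);
                args = reader.GetString(1);
            }

            string result;
            var psi = new ProcessStartInfo("thirdparty.exe", args)
            {
                UseShellExecute = false,
                RedirectStandardOutput = true
            };
            using (var process = Process.Start(psi))
            {
                result = process.StandardOutput.ReadToEnd();
                process.WaitForExit();
            }

            var update = new SqlCommand(
                "UPDATE dbo.Jobs SET Status = 'Completed', Result = @result WHERE Id = @id", conn);
            update.Parameters.AddWithValue("@result", result);
            update.Parameters.AddWithValue("@id", id);
            update.ExecuteNonQuery();
        }
    }
}
```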
The second architecture provides me with more logging capabilities, and jobs that are still in the queue can be started again. However, it involves a large amount of polling, which may not be scalable. Can anyone recommend a better architecture?
Instead of polling, I would go with MSMQ or RabbitMQ.
That way you could off-load your processing to multiple consumers of the queue (possibly on servers separate from the web server) and process more requests in parallel.
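A rough sketch of the publishing side with the classic RabbitMQ .NET client (host, queue name, and message shape are illustrative); the Windows service would consume from the same queue:

```csharp
using System.Text;
using RabbitMQ.Client;

// The website publishes the job details instead of writing to a polled table;
// one or more consumers (possibly on other servers) work through the queue in parallel.
public static class JobPublisher
{
    public static void Publish(string jobDetailsJson)
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            // Durable queue so jobs survive a broker restart.
            channel.QueueDeclare(queue: "jobs", durable: true, exclusive: false,
                                 autoDelete: false, arguments: null);

            var body = Encoding.UTF8.GetBytes(jobDetailsJson);
            channel.BasicPublish(exchange: "", routingKey: "jobs",
                                 basicProperties: null, body: body);
        }
    }
}
```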
I have implemented the same architecture in one of my applications, where users make multiple requests that need processing. So I have:
Users go to the website, select parameters, etc., and submit the request
The request is stored in a database table with all the details, the user name, etc.
A service watches the database table and picks up requests in FIFO order (sketched below)
After a request is processed, its status is updated to Failed or Completed in the database table against that requestId, which users can see on the website
The service picks up the next request if there is one; otherwise it stops
The service runs every 30 minutes
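The FIFO pickup mentioned above could be done with a single claim query, roughly like this (table/column names and connection handling are illustrative):

```csharp
using System.Data.SqlClient;

// Claim the oldest pending request, skipping rows that another worker instance has
// already locked; the ORDER BY inside the CTE is what makes the pickup FIFO.
public static class RequestQueue
{
    private const string ClaimOldestPendingSql = @"
        WITH next AS (
            SELECT TOP (1) *
            FROM dbo.Requests WITH (ROWLOCK, UPDLOCK, READPAST)
            WHERE Status = 'Pending'
            ORDER BY CreatedAt
        )
        UPDATE next
        SET Status = 'Processing'
        OUTPUT inserted.RequestId, inserted.Parameters;";

    // Returns the claimed request id, or null when there are no pending rows.
    public static int? TryClaimNext(SqlConnection openConnection, out string parameters)
    {
        parameters = null;
        using (var cmd = new SqlCommand(ClaimOldestPendingSql, openConnection))
        using (var reader = cmd.ExecuteReader())
        {
            if (!reader.Read()) return null;
            parameters = reader.GetString(1);
            return reader.GetInt32(0);
        }
    }
}
```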
I have a REST web service that is used for communication with multiple clients (some sort of chat), but I have to write all the changes to the database as soon as a client communicates something, and then inform all the other clients when a change is made.
I basically get a POST request, and I have to reply as soon as an entry is modified.
Right now I make my thread sleep for 1 second and keep recreating the context every second for each request, and if there are changes in the database I send the response.
This looks ugly to me, and I wonder if there is some event or async method that would notify me when a specific entry in the database is modified?
Thank you in advance.
If you're using MS SQL Server, you might have success with using Sql Dependencies. Here's a link to a brief tutorial: C# & SqlDependency - Monitoring your database for data changes.
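A minimal SqlDependency sketch (Service Broker must be enabled on the database; the connection string and table/column names are illustrative):

```csharp
using System;
using System.Data.SqlClient;

// The OnChange handler fires once per registration, so re-subscribe after each
// notification. The query must list columns explicitly and use two-part table names.
public class MessageWatcher
{
    private const string ConnectionString = "Server=.;Database=Chat;Integrated Security=true";

    public void Start()
    {
        SqlDependency.Start(ConnectionString);
        Subscribe();
    }

    private void Subscribe()
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand("SELECT Id, Body, ModifiedAt FROM dbo.Messages", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
            {
                // Notify the waiting clients here, then register again.
                Subscribe();
            };

            conn.Open();
            cmd.ExecuteReader().Close();   // executing the command registers the subscription
        }
    }

    public void Stop() => SqlDependency.Stop(ConnectionString);
}
```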
Microsoft's successor to Notification Services, the SQL Server 2008 R2 Complex Event Processing (CEP) technology, might also serve your purposes, but I know nothing about it beyond what's on the web page.