Best way to prevent race conditions in a multi-instance web environment? - c#

Say you have an Action in ASP.NET MVC in a multi-instance environment that looks something like this*:
public void AddLolCat(int userId)
{
    var user = _Db.Users.ById(userId);
    user.LolCats.Add(new LolCat());
    user.LolCatCount = user.LolCats.Count();
    _Db.SaveChanges();
}
When a user repeatedly presses a button or refreshes, race conditions can occur, making it possible for LolCatCount to differ from the actual number of LolCats.
Question
What is the common way to fix these issues? You could fix it client-side in JavaScript, but that might not always be possible, e.g. when something happens on a page refresh, or because someone is screwing around in Fiddler.
I guess you have to make some kind of a network based lock?
Do you really have to suffer the extra latency per call?
Can you tell an Action that it is only allowed to be executed once per User?
Is there any common pattern already in place that you can use? Like a Filter or attribute?
Do you return early, or do you really lock the process?
When you return early, is there an 'established' response / response code I should return?
When you use a lock, how do you prevent thread starvation with (semi) long running processes?
* Just a contrived example shown for brevity. Real-world examples are a lot more complicated.

Answer 1: (The general approach)
If the data store supports transactions you could do the following:
// requires a reference to System.Transactions
using (var trans = new TransactionScope(TransactionScopeOption.Required,
    new TransactionOptions { IsolationLevel = IsolationLevel.Serializable }))
{
    var user = _Db.Users.ById(userId);
    user.LolCats.Add(new LolCat());
    user.LolCatCount = user.LolCats.Count();
    _Db.SaveChanges();
    trans.Complete();
}
This will lock the user record in the database, making other requests wait until the transaction has been committed.
Answer 2: (Only possible with single process)
Enabling session state and using the session causes implicit locking between requests from the same user (session):
Session["TRIGGER_LOCKING"] = true;
Answer 3: (Example specific)
Deduce the number of LolCats from the collection instead of keeping track of it in a separate field and thus avoid inconsistency issues.
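For example, a minimal sketch of that idea, assuming Entity Framework (the [NotMapped] attribute keeps the derived property from being persisted as a column):

public class User
{
    public virtual ICollection<LolCat> LolCats { get; set; }

    // Always derived from the collection, never stored,
    // so it cannot drift out of sync with the actual rows.
    [NotMapped]
    public int LolCatCount
    {
        get { return LolCats.Count; }
    }
}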
Answers to your specific questions:
I guess you have to make some kind of a network based lock?
Yes, database locks are common.
Do you really have to suffer the extra latency per call?
say what?
Can you tell an Action that it is only allowed to be executed once per User?
You could implement an attribute that uses the implicit session locking or some custom variant of it but that won't work between processes.
Is there any common pattern already in place that you can use? Like a Filter or attribute?
Common practice is to use locks in the database to solve the multi-instance issue; no filter or attribute that I know of. (A sketch of such a per-user database lock follows these answers.)
Do you return early, or do you really lock the process?
Depends on your use case. Commonly you wait ("lock the process"). However, if your database store supports the async/await pattern, you would do something like
var user = await _Db.Users.ByIdAsync(userId);
This frees the thread to do other work while waiting for the lock.
When you return early, is there an 'established' response / response code I should return?
I don't think so, pick something that fits your use case.
When you use a lock, how do you prevent thread starvation with (semi) long running processes?
I guess you should consider using queues.
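As referenced above, here is a minimal sketch of a per-user database lock using SQL Server's sp_getapplock (my own illustration, not from the original answer; the resource name and timeout are arbitrary, and it needs System.Data plus System.Data.SqlClient). Since every instance shares the one database, this serializes across the whole farm:

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var tx = conn.BeginTransaction())
    {
        using (var cmd = new SqlCommand("sp_getapplock", conn, tx))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@Resource", "AddLolCat:" + userId);
            cmd.Parameters.AddWithValue("@LockMode", "Exclusive");
            cmd.Parameters.AddWithValue("@LockTimeout", 5000); // ms
            var ret = cmd.Parameters.Add("@ReturnVal", SqlDbType.Int);
            ret.Direction = ParameterDirection.ReturnValue;
            cmd.ExecuteNonQuery();

            // sp_getapplock returns >= 0 on success, < 0 on timeout/failure.
            if ((int)ret.Value < 0)
                throw new TimeoutException("Could not acquire per-user lock");
        }

        // ... load, modify and save the user's data on this connection ...

        // The lock is owned by the transaction, so committing (or rolling
        // back) releases it automatically.
        tx.Commit();
    }
}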

By "multi-instance" you're obviously referring to a web farm or maybe a web garden situation where just using a mutex or monitor isn't going to be sufficient to serialize requests.
So... do you have just one database on the back end? Why not just use a database transaction?
It sounds like you probably don't want to force serialized access to this one section of code for all user ids, right? You want to serialize requests per user id?
It seems to me that the right thinking about this is to serialize access to the source data, which is the LolCats records in the database.
I do like the idea of disabling the button or link in the browser for the duration of a request, to prevent the user from hammering away on the button over and over again before previous requests finish processing and return. That seems like an easy enough step with a lot of benefit.
But I doubt that is enough to guarantee the serialized access you want to enforce.
You could also implement shared session state and implement some kind of a lock on a session-based object, but it would probably need to be a collection (of user ids) in order to enforce the serializable-per-user paradigm.
I'd vote for using a database transaction.

I suggest, and personally use, a mutex in this case.
I have written a class that handles the mutex here: Mutex release issues in ASP.NET C# code, but you can make your own.
So, based on the class from that answer, your code will look like this:
public void AddLolCat(int userId)
{
    // Prefix the number with some text: the key is just an integer,
    // so a prefix makes the lock name more specific and avoids
    // conflicts with other locks that are also keyed by an integer.
    var gl = new MyNamedLock("SiteName." + userId.ToString());
    try
    {
        // Enter lock
        if (gl.enterLockWithTimeout())
        {
            var user = _Db.Users.ById(userId);
            user.LolCats.Add(new LolCat());
            user.LolCatCount = user.LolCats.Count();
            _Db.SaveChanges();
        }
        else
        {
            // log the error
            throw new Exception("Failed to enter lock");
        }
    }
    finally
    {
        // Leave lock
        gl.leaveLock();
    }
}
Here the lock is based on the user, so different users will not block each other.
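For reference, a minimal sketch of what such a named-lock class might look like (this is my own reconstruction, not the class from the linked answer). A named Mutex serializes across all processes on one machine, e.g. a web garden, but not across machines in a farm:

using System;
using System.Threading;

public class MyNamedLock : IDisposable
{
    private readonly Mutex _mutex;
    private bool _isAcquired;

    public MyNamedLock(string name)
    {
        // The "Global\" prefix makes the mutex visible machine-wide.
        _mutex = new Mutex(false, @"Global\" + name);
    }

    public bool enterLockWithTimeout(int timeoutMs = 30000)
    {
        try
        {
            _isAcquired = _mutex.WaitOne(timeoutMs);
        }
        catch (AbandonedMutexException)
        {
            // A previous holder died without releasing; we now own it.
            _isAcquired = true;
        }
        return _isAcquired;
    }

    public void leaveLock()
    {
        // Safe to call from a finally block even if the lock was never acquired.
        if (_isAcquired)
        {
            _mutex.ReleaseMutex();
            _isAcquired = false;
        }
    }

    public void Dispose()
    {
        leaveLock();
        _mutex.Dispose();
    }
}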
About Session Lock
If you use the ASP.NET session in your call, then you may win a free lock "ticket" from the session: the session is locked on each call until the page is returned.
Read about that in these Q&As:
Web app blocked while processing another web app on sharing same session
Does ASP.NET Web Forms prevent a double click submission?
jQuery Ajax calls to web service seem to be synchronous

Well, MVC is stateless, meaning that you'll have to handle this yourself manually. From a purist perspective I would recommend preventing the multiple presses with a client-side lock, although my preference is to disable the button and apply an appropriate CSS class to show its disabled state. My reasoning is that we cannot fully determine the consumer of the action, so while you give the example of Fiddler, there is no way to truly determine whether multiple clicks are applicable or not.
However, if you wanted to pursue a server-side locking mechanism, this article provides an example that stores the requester's information in the server-side cache and returns an appropriate response depending on the timeout / actions you want to implement.
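To make that concrete, a minimal sketch of such a guard as an MVC action filter (the attribute name, cache key, and status code are my own choices, and it assumes session state is enabled; note that MemoryCache is per-process, so this alone does not cover the multi-instance case):

using System;
using System.Net;
using System.Runtime.Caching;
using System.Web.Mvc;

public class ThrottleDoubleSubmitAttribute : ActionFilterAttribute
{
    // Window during which a repeat request is rejected.
    public int Seconds { get; set; } = 1;

    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        var key = "throttle:"
            + filterContext.ActionDescriptor.ActionName + ":"
            + filterContext.HttpContext.Session.SessionID;

        // AddOrGetExisting is atomic: it returns null only for the first
        // caller to store the key, so a non-null result means a duplicate.
        var existing = MemoryCache.Default.AddOrGetExisting(
            key, true, DateTimeOffset.UtcNow.AddSeconds(Seconds));

        if (existing != null)
        {
            filterContext.Result = new HttpStatusCodeResult(
                (int)HttpStatusCode.Conflict, "Duplicate request");
        }
    }
}

Usage would then just be decorating the action, e.g. [ThrottleDoubleSubmit(Seconds = 2)].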
HTH

One possible solution is to avoid the redundancy which can lead to inconsistent data.
i.e. If LolCatCount can be determined at runtime, then determine it at runtime instead of persisting this redundant information.

Related

Best practice to avoid double request from View in ASP.NET

I have the following controller action:
[HttpPost]
public ActionResult SomeMethod(string foo, object bar)
{
    // Some logic
}
Now suppose that from the view that Action is called from a Button or from Ajax (with some edits), and I don't want to receive a double request.
What is the best approach to handle it from server side?
Update
You'd first have to define the time interval that would meet the
criteria of a double request – Jonesopolis
Let's suppose that in this case a double request is when the difference in time between the first and second call is less than 1s.
Frankly, you can't, at least not totally. There are certain things you can do server-side, but none are fool-proof. The general idea is that you need to identify the POST in some way. The most common approach is to set a hidden input with a GUID or similar. Then, when a request comes in, you record that GUID somewhere. This could be in the session, in a database, etc. Then, before processing the request, you check whatever datastore you're using for that GUID. If it exists, it's a duplicate POST, and if not, you can go ahead.
However, the big stipulation here is that you have to record that somewhere, and doing that takes some period of time. It might only be milliseconds, but that could be enough time for a duplicate request to come in, especially if the user is double-clicking a submit button, which is most often the cause of a double-submit.
A web server just responds to requests as they come in, and importantly, it has multiple threads and perhaps even multiple processes serving requests simultaneously. HTTP is a stateless protocol, so the server doesn't care whether the client has made the same request before, because it effectively doesn't know that the client has made the same request before. If two duplicate requests are being served virtually simultaneously on two different threads, then it's a race to see whether one can mark the other as a duplicate before the other checks to see if it's a duplicate. In other words, most of the time you're just going to be out of luck, and both requests will go through no matter what you try to do server-side to stop duplicates.
The only reliable way to prevent double submits is to disable the submit button on submit using JavaScript. Then, the user can effectively only click once, even if they double-click. That still doesn't help you if the user disables JavaScript, of course, but that's becoming more and more rare.
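To illustrate the hidden-GUID approach described above, a minimal sketch using the session as the store (action and key names are illustrative, not from the answer; session state also gives you ASP.NET's per-user request serialization mentioned elsewhere on this page):

public ActionResult SomeForm()
{
    // Issue a one-time token and render it as a hidden input in the form.
    ViewBag.RequestToken = Guid.NewGuid().ToString("N");
    return View();
}

[HttpPost]
public ActionResult SomeMethod(string foo, string requestToken)
{
    var seen = Session["postedTokens"] as HashSet<string> ?? new HashSet<string>();

    // HashSet.Add returns false when the token was already recorded,
    // i.e. this POST is a duplicate.
    if (!seen.Add(requestToken))
        return new HttpStatusCodeResult(409, "Duplicate submission");

    Session["postedTokens"] = seen;

    // ... process the request ...
    return RedirectToAction("Success"); // hypothetical follow-up action
}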
Look, be careful with this approach: you will add a lot of complexity to control this on the server side.
First, you need to recognize when the multiple requests are coming from the same user.
I don't think controlling this on the server side is the best way.
However, if you really want that... look: https://stackoverflow.com/a/218919/2892830
That answer suggests maintaining a list of tokens. In your case, just check whether the same token was received more than once.
At a minimum, handle the double click in the client event listener, e.g.:
UseSubmitBehavior="false"
OnClientClick="this.disabled=true; this.value='Please wait';"
Check ASP.NET Life cycle
Check Request/Redirect
Add test code to see who is responsible
if (IsPostBack)
{
    _CtrlName = thisPage.Request.Params.Get("__EVENTTARGET");
    if (_CtrlName != null && _CtrlName == myButton.ID)
    {
        // Do your thing
    }
}
Check IsPostBack in Page_Load and use it correctly to prevent duplicate requests.
protected void Page_Load(object sender, EventArgs e)
{
    if (!IsPostBack)
    {
        // first-load-only work goes here; it is skipped on postbacks
    }
}

Create Critical section/Lock for a specific user

I am developing an eCommerce website which will be accessed by multiple users. I have a method UpdateUserAmount() which must be synchronized for a specific user i.e. we should not access the method UpdateUserAmount() simultaneously for the same user.
I am thinking of using a new static lock object for each user. I will create this object when the user enters the critical section and delete it when he leaves.
This seems inefficient to me as we are creating as many objects as there are simultaneous users. Is there an efficient method of achieving the same?
UpdateUserAmount(int amt) {
    user.amount += amt; // critical section
}
I am thinking of using a new [...] lock object for each user.
That's exactly what you'll need to do.
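A minimal sketch of that per-user lock, assuming a single process (GetUserFromDb and SaveUserToDb are hypothetical stand-ins). ConcurrentDictionary.GetOrAdd hands every thread asking for the same user id the same lock object, so you only ever hold one small object per active user:

using System.Collections.Concurrent;

public class UserAmountService
{
    private static readonly ConcurrentDictionary<int, object> _userLocks =
        new ConcurrentDictionary<int, object>();

    public void UpdateUserAmount(int userId, int amt)
    {
        var userLock = _userLocks.GetOrAdd(userId, _ => new object());
        lock (userLock)
        {
            // critical section: one thread per user id at a time
            var user = GetUserFromDb(userId); // hypothetical data access
            user.Amount += amt;
            SaveUserToDb(user);                // hypothetical data access
        }
    }
}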

Conditional locking based on value

I am writing a web service that allows users to create jobs within the system. Each user has an allowance of the number of jobs they can create. I have a method which checks that the user has some remaining credits which looks like this:
private bool CheckRemainingCreditsForUser(string userId)
{
    lock (lockObj)
    {
        var user = GetUserFromDB(userId);
        if (user.RemainingCredit == 0) return false;
        RemoveOneCreditFromUser(user);
        SaveUserToDB(user);
        return true; // the original was missing a return on this path
    }
}
The problem I can see with this approach is that if multiple different users make a request at the same time they will only get processed one at a time which could cause performance issues to the client. Would it be possible to do something like this?
private bool CheckRemainingCreditsForUser(string userId)
{
    // If there is a current lock on the value of userId, wait;
    // if not, take a lock on the value of userId.
    var user = GetUserFromDB(userId);
    if (user.RemainingCredit == 0) return false;
    RemoveOneCreditFromUser(user);
    SaveUserToDB(user);
    // Release the lock on the value of userId.
    return true;
}
This would mean that requests with different userIds could be processed at the same time, but requests with the same userId would have to wait for the previous request to finish
Yes, you could do that with a Dictionary<string, object> that links a lock object to every userId.
The problem would be cleaning up that Dictionary every so often.
But I would verify first that there really is a bottleneck here. Don't fix problems you don't have.
The alternative is to have an (optimistic) concurrency check in your db and just handle the (rare) conflict cases, as sketched below.
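A minimal sketch of that optimistic alternative, assuming Entity Framework against SQL Server (the rowversion column and retry loop are my illustration; GetUserFromDB/SaveUserToDB are the question's own helpers, assumed to call SaveChanges):

public class User
{
    public string Id { get; set; }
    public int RemainingCredit { get; set; }

    [Timestamp] // SQL Server rowversion; EF adds it to every UPDATE's WHERE clause
    public byte[] RowVersion { get; set; }
}

private bool CheckRemainingCreditsForUser(string userId)
{
    while (true)
    {
        var user = GetUserFromDB(userId); // must load fresh on each attempt
        if (user.RemainingCredit == 0) return false;
        RemoveOneCreditFromUser(user);
        try
        {
            SaveUserToDB(user);
            return true;
        }
        catch (DbUpdateConcurrencyException)
        {
            // Another request changed the row first; loop and retry
            // against the reloaded state.
        }
    }
}

No locks are held, so unrelated users never wait on each other, and two concurrent requests for the same user simply cost one retry.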
Instead of locking in every method, why aren't you using a singleton that manages the users' rights?
It would be responsible for giving out the remaining allowances AND managing them at the same time, without losing the thread-safe code.
By the way, a method named CheckRemainingCreditsForUser should not remove allowances, since the name doesn't imply it. You may be the only developer on this project, but it won't hurt to split this into two methods, for reusability and code comprehension.
EDIT: And this object should also hold the Users dictionary.

SQL insert statement executed twice

What is the best way to prevent a user's double click or page refresh from executing an SQL insert statement twice? I've tried disabling the button after the click, but the result is not really good. I am expecting that it is possible to do it from code-behind, something more like an SQL commit and rollback.
Perhaps the PRG Wikipedia article can help you:
Post/Redirect/Get (PRG) is a common design pattern for web developers
to help avoid certain duplicate form submissions and allow user agents
to behave more intuitively with bookmarks and the refresh button.
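In MVC terms, a minimal sketch of PRG (names are illustrative; the point is that the POST ends in a redirect, so refreshing repeats the harmless GET rather than the INSERT):

[HttpPost]
public ActionResult CreateOrder(OrderViewModel model)
{
    var order = MapToOrder(model); // hypothetical mapping helper
    _db.Orders.Add(order);
    _db.SaveChanges();

    // After the redirect, the browser's "current page" is the GET below,
    // so F5 re-issues the GET, not the insert.
    return RedirectToAction("Confirmation", new { id = order.Id });
}

[HttpGet]
public ActionResult Confirmation(int id)
{
    return View(_db.Orders.Find(id));
}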
If you wish to protect against this you're going to need the server to be aware that the current user has already begun an action, and can't begin it again until a condition is met.
You need to be able to identify that user amongst the many that are potentially visiting your site. This is most simply done using SessionState, but if you have no other need for SessionState and wish to scale your application massively, a simple random cookie to identify the user can be used as a prefix for any keys that you use to place items into the server cache.
Let's say you used SessionState. You'd do something like the following (pseudo):
public void StartAction()
{
    // 'as' cannot be used with a non-nullable value type,
    // so read the flag as a nullable bool.
    var inProgress = HttpContext.Current.Session["actionInProgress"] as bool? ?? false;
    if (!inProgress)
    {
        try
        {
            HttpContext.Current.Session["actionInProgress"] = true;
            MySqlController.DoWork();
        }
        finally
        {
            HttpContext.Current.Session["actionInProgress"] = false;
        }
    }
}
The above does not account for the following:
Catching exceptions and/or closing any connections in your finally block
Queueing up subsequent actions as a result of the next clicks on your client (this pseudo-code just returns if the action is already in progress)
I've gone for the simplest solution, but in reality a better practice would be to have this encompassed as a service which runs asynchronously, so that you can monitor the progress both for the benefit of the user and to prevent multiple parallel processes.

Inserting data in background/async task what is the best way?

I have a very quick/lightweight MVC action that is requested very often, and I need to maintain minimal response time under heavy load.
What I need to do is, from time to time and depending on conditions, insert a small amount of data into SQL Server (log a unique id for statistics, for ~1-5% of requests).
I don't need the inserted data for the response, and if I lose some of it because of an application restart or the like, I'll survive.
I imagine that I could queue the inserts somehow and do them in the background, maybe even with some kind of buffering - like waiting until the queue collects 100 inserts and then making them in one pass.
I'm pretty sure somebody must have done/seen such an implementation before; there's no need to reinvent the wheel, so if somebody could point me in the right direction, I would be thankful.
You could trigger a background task from your controller action that will do the insertion (fire and forget):
public ActionResult Insert(SomeViewModel model)
{
    Task.Factory.StartNew(() =>
    {
        // do the inserts
    });
    return View();
}
Be aware though that IIS could recycle the application at any time which would kill any running tasks.
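If you are on ASP.NET 4.5.2 or later, a slightly safer variant of the same fire-and-forget idea (a sketch) is HostingEnvironment.QueueBackgroundWorkItem, which registers the work with the host so that shutdown is briefly delayed for queued items and your code receives a cancellation token when a recycle does happen:

using System.Web.Hosting;

public ActionResult Insert(SomeViewModel model)
{
    HostingEnvironment.QueueBackgroundWorkItem(cancellationToken =>
    {
        // do the inserts, checking cancellationToken so an app
        // recycle can stop the work gracefully
    });
    return View();
}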
Create a class to hold the data that needs to be pushed to the server, and a queue to hold the pending objects:
Queue<LogData> loggingQueue = new Queue<LogData>();

public class LogData
{
    public object DataToLog { get; set; } // the original omitted the property's type
}
Then create a timer or some other mechanism within the app that is triggered every now and then to post the queued data to the database, as sketched below.
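A minimal sketch of that timer-driven flush (interval and batch size are arbitrary; ConcurrentQueue replaces the plain Queue so request threads can enqueue without extra locking, and BulkInsert is a hypothetical helper that writes the whole batch in one round-trip):

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

public static class LogBuffer
{
    private static readonly ConcurrentQueue<LogData> _queue =
        new ConcurrentQueue<LogData>();

    // Fires every 10 seconds and drains up to 100 items per pass.
    private static readonly Timer _flushTimer =
        new Timer(_ => Flush(), null, 10000, 10000);

    public static void Enqueue(LogData item)
    {
        _queue.Enqueue(item);
    }

    private static void Flush()
    {
        var batch = new List<LogData>();
        LogData item;
        while (batch.Count < 100 && _queue.TryDequeue(out item))
        {
            batch.Add(item);
        }

        if (batch.Count > 0)
        {
            BulkInsert(batch); // hypothetical batch insert helper
        }
    }
}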
I agree with @Darin Dimitrov's approach, although I would add that you could simply use this task to write to MSMQ on the machine. From there you could write a service that reads the queue and inserts the data into the database. That way you could throttle the service that reads data, or even move the queue onto a different machine.
If you wanted to take this one step further you could use something like nServiceBus and a pub/sub model to write the events into the database.
