I have a Like button. When it is pressed, a controller action should add a record to my Likes table, and if the button is pressed again, it should remove the Like from the Likes table.
Here is the code (simplified):
var like = db.Likes.Find(id);
if (like == null)
{
    db.Likes.Add(new Like("SomeData", UserPosedLike));
}
else
{
    db.Likes.Remove(like);
}
db.SaveChanges();
It's work fine at 95% of times, but if i press button very fast 2-3 times, it throws error – “Null ref”, some data in Like is NULL (but this data can’t be null, because it references to other dbField(User)). Also, sometimes it adds two likes, but it shouldn’t.
I think it happens because of:
First action(press) – read db, start processing.
Second action (second press) – read db, start processing. (first action not saved data yet).
First action(press) – save data.
Second action(pess) – save data.
Here we got 2 likes. Probably with “null error” similar problem.
I received advice to lock this block of code, but I think it’s wrong. Also I thinking about Optimistic Concurrency, currently reading about it.
I’m not good with db, any help will be great - good book, code, article or advice.
Thank you!
You need to do the read and the write inside a single transaction.
using (var db = new YourContext())
{
    using (var t = db.Database.BeginTransaction())
    {
        try
        {
            var like = db.Likes.Find(id);
            if (like == null)
            {
                db.Likes.Add(new Like("SomeData", UserPosedLike));
            }
            else
            {
                db.Likes.Remove(like);
            }
            db.SaveChanges();
            t.Commit();
        }
        catch
        {
            t.Rollback();
        }
    }
}
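One caveat on the sketch above: at the default isolation level, two overlapping requests can still both read "no like yet" and both insert. Entity Framework's BeginTransaction has an overload that accepts a System.Data.IsolationLevel, so the same pattern can be tightened roughly like this; a conflicting request may then block or deadlock, in which case you roll back and let the client retry:

using System.Data; // for IsolationLevel

using (var db = new YourContext())
using (var t = db.Database.BeginTransaction(IsolationLevel.Serializable))
{
    try
    {
        var like = db.Likes.Find(id);
        if (like == null)
        {
            db.Likes.Add(new Like("SomeData", UserPosedLike));
        }
        else
        {
            db.Likes.Remove(like);
        }
        db.SaveChanges();
        t.Commit();
    }
    catch
    {
        // A conflicting concurrent request may block or deadlock here;
        // roll back and let the client retry the click.
        t.Rollback();
        throw;
    }
}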
I received advice to lock this block of code, but I think it’s wrong.
It is wrong: an in-process lock only serializes requests within a single process, it blocks request threads, and it stops helping as soon as the application runs on more than one process or server.
Also, as someone suggested in the comments, you can prevent users from clicking the button multiple times: disable the button and show a spinner while the request is in flight.
I've created an app in C#/.NET WPF. The app uses an SQLite database, but the database is shared with other programs.
So occasionally my app and the others run at the same time, and more rarely they try to write to the db at the same time, and all the apps crash at that point.
So I want to patch my app to wait until the db is no longer locked before doing anything with it.
I've been thinking about a dirty try/catch loop with a limited number of attempts, but that seems too dirty to me (and a waste of resources).
The other program shows a visual indicator while it uses the db, so I thought a solution could involve user action: when the database is locked, a MessageBox opens asking the user to wait until the other program has finished before clicking OK to continue.
Is there a way to test whether the database is locked without try/catch?
Following the comments, I tried a try/catch approach. I'm still not convinced it is the cleanest method, because I don't like the idea of waiting for an exception; an expected exception isn't really exceptional, in my opinion.
Maybe someone will come up with a solution that doesn't use this subterfuge, but I haven't found a better match for my expectations.
Thank you for your comments.
using System.Threading;   // Thread.Sleep
using System.Windows;     // MessageBox
using SQLite;             // sqlite-net

// Tries to take (and immediately release) an exclusive lock.
// If another process holds a lock, BEGIN EXCLUSIVE throws.
public bool IsDatabaseLocked(string dbPath)
{
    bool locked = true;
    using (SQLiteConnection connection = new SQLiteConnection(dbPath))
    {
        try
        {
            connection.Execute("BEGIN EXCLUSIVE");
            connection.Execute("COMMIT");
            locked = false;
        }
        catch (SQLiteException)
        {
            // "database is locked" error: another process is writing.
        }
    }
    return locked;
}

public void WaitForDbToBeUnlocked(string dbPath)
{
    int i = 0;
    while (IsDatabaseLocked(dbPath))
    {
        Thread.Sleep(100); // avoid spinning at full speed
        i++;
        if (i > 10)
        {
            MessageBox.Show("Please release the database manually, or wait.");
            i = 0;
        }
    }
}
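As a less manual alternative (just a sketch, assuming the same sqlite-net package as above): SQLite can do the waiting for you if you give the connection a busy timeout, so a locked database is retried internally for a while instead of failing immediately.

using System;
using SQLite; // sqlite-net, as above

public SQLiteConnection OpenWithBusyTimeout(string dbPath)
{
    var connection = new SQLiteConnection(dbPath);
    // Wait up to 5 seconds for other writers to release their locks
    // before a "database is locked" error is raised.
    connection.BusyTimeout = TimeSpan.FromSeconds(5);
    return connection;
}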
My answer was inspired by the following question:
C# - How to detect if SQLite DB is locked?
I have a Windows service that uploads data to a database and an MVC app that uses that service. The way it works today is something like this:
Upload(someStuff);
WriteLog("Uploaded someStuff");

ReadData(someTable);
WriteLog("Reading someTable-data");

Drop(oldValues);
WriteLog("Dropping old values");

private void Upload(var someStuff)
{
    using (var conn = new connection(connectionstring))
    {
        // perform query
    }
}

private void WriteLog(string message)
{
    using (var conn = etc..)
    // Insert into log-table
}

private string ReadData(var table)
{
    using etc..
    // Query
}

// You get the gist.
The client can then see the current status of the upload through a query against the log table.
I want to be able to perform a rollback if something fails. My first thought was to call BeginTransaction() at the start and transaction.Commit() at the very end, but that would make my status messages behave badly: they would jump from "starting upload" straight to the last step, where they would sit for a long time before "Done".
I want the user to be able to see whether the process is stuck on a specific step, but I still want to be able to perform a full rollback if something unexpected happens.
How do I achieve this?
Edit:
I don't seem to have been clear in my question. If I use a separate connection for the logging, that would indeed more or less work. The problem is that the actual work executes super fast, so the status messages would fly past so quickly that the user wouldn't even be able to see them before the final "committing" message, which takes 99% of the upload time.
Design your table so that it has a status flag: (P)ending, (A)ctive, (D)eleted. To perform an upload, new records are created as Pending (status P); your very final step is to change the current Active records to Deleted and the Pending records to Active (you can do that in a transaction). At your leisure, you can then purge the status D (deleted) records at some later time.
In the event of an error, the Pending records simply become Deleted.
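A minimal sketch of that final swap, assuming SQL Server via ADO.NET; the table and column names are purely illustrative:

using System.Data.SqlClient;

// Retire the currently active rows, then promote the pending ones,
// all inside one short transaction so readers always see a consistent set.
private void PromotePendingRows(string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var tx = conn.BeginTransaction())
        {
            using (var retire = new SqlCommand(
                "UPDATE UploadedValues SET Status = 'D' WHERE Status = 'A'", conn, tx))
            using (var promote = new SqlCommand(
                "UPDATE UploadedValues SET Status = 'A' WHERE Status = 'P'", conn, tx))
            {
                retire.ExecuteNonQuery();
                promote.ExecuteNonQuery();
            }
            tx.Commit();
        }
    }
}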
I am using Entity Framework 6, C#, and MySQL InnoDB as our db engine.
I have the following code to "insert on duplicate update" a record:
public async Task AddHostIdToGroup(HostsToGroup hostToGroup)
{
    using (var context = new MaintDbContext())
    {
        HostsToGroup htg = context.HostsToGroup.SingleOrDefault(hs => hs.HostId == hostToGroup.HostId);
        if (htg != null)
        {
            htg.GroupId = hostToGroup.GroupId;
            await context.SaveChangesAsync();
        }
        else
        {
            context.HostsToGroup.Add(hostToGroup);
            await context.SaveChangesAsync();
        }
    }
}
The code itself looks fine for an insert-or-update.
Still, on our production server I occasionally see duplicate-key errors.
My initial thought was that this is a race condition.
What can I do to prevent these errors, or how should I handle them?
Your problem is not necessarily the code but what triggers it. Say the code is triggered by a button in an ASP.NET app: if a user double-clicks the button, it can fire two simultaneous requests that result in two entities being created.
So either fix your front end/entry point to eliminate the double-create scenario, push everything through a synchronous service/queue that lets you deduplicate, or use Where instead of SingleOrDefault and handle the duplicates in code.
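A sketch of that last option, reusing the context and entity names from the question: query with Where so existing duplicates are tolerated and cleaned up instead of tripping SingleOrDefault.

using System.Linq;
using System.Threading.Tasks;

public async Task AddHostIdToGroup(HostsToGroup hostToGroup)
{
    using (var context = new MaintDbContext())
    {
        var existing = context.HostsToGroup
            .Where(hs => hs.HostId == hostToGroup.HostId)
            .ToList();

        if (existing.Count > 0)
        {
            // Keep the first row, point it at the new group, and remove any
            // extra rows that earlier races managed to insert.
            existing[0].GroupId = hostToGroup.GroupId;
            context.HostsToGroup.RemoveRange(existing.Skip(1));
        }
        else
        {
            context.HostsToGroup.Add(hostToGroup);
        }

        await context.SaveChangesAsync();
    }
}

Note that this only cleans up after the race; it does not stop two simultaneous requests from both inserting, which is why the front-end fix or a queue is still worth having.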
I want to make a RESTful API (or use any other approach that gets it done, really) that runs in a loop and performs a specified task every day at the same hour.
Specifically, I want it to call a foreign API, say at midnight every day, request the specified data, and update the database accordingly. I know how to make a request to an API and have it do something, but I want it to run automatically so I don't have to interact with it at all, not even to make the requests.
The reason is that I'm working on a project that spans multiple platforms (and even on a single platform there would be several users), and I can't call the foreign API (mainly because it's a trial account; this is a school project) every time a user logs in or clicks a button on each platform.
I don't know how to do that (or whether it's even possible) with a web service. I've tried a web form doing it asynchronously with BackgroundWorker, but got nowhere.
I thought I might have better luck here with more experienced people.
Hope you can help me out.
Thanks in advance,
Fábio.
Don't know if I get it right, but it seems to me that the easiest way to do what you want (have a program scheduled to work at a given time, every day) is to use Windows Scheduler to schedule your application to run always on the specific time you want.
I managed to get there, thanks to the help of #Pedro Gaspar - LoboFX.
I didn't want the Windows Scheduler because I want the behaviour reflected in the code, and I don't necessarily have access to the server it will run on. That said, what got me there was something like this:
private static string LigacaoBD = "something";
private static Perfil perfil = new Perfil(LigacaoBD);

protected void Page_Load(object sender, EventArgs e)
{
    Task.Factory.StartNew(() => teste());
}

private void teste()
{
    bool verif = false;
    while (true)
    {
        // At the configured time (here 22:12:00, UTC+1), reset the flag so a new insert happens.
        if (DateTime.UtcNow.Hour + 1 == 22 && DateTime.UtcNow.Minute == 12 && DateTime.UtcNow.Second == 0)
            verif = false;

        if (!verif)
        {
            int resposta = perfil.Guardar(DateTime.UtcNow.ToString());
            verif = true;
        }

        Thread.Sleep(1000);
    }
}
It's inserting into the database through a class library. And with this loop it garantees that it only inserts once (hence the bool) and when it gets to the specified hour, minute and second it resets, allowing it to insert again. If something happens that the servers goes down, when it gets back up it inserts anyway. The only problem is that if it's already inserted and the server goes down it will insert again. But for that there are stored procedures. Well, not for the DateTime.UtcNow.ToString() but that was just a test.
I have an MVC3/.NET 4 application which uses Entity Framework (4.3.1, Code First).
I have wrapped EF in a Repository/Unit of Work pattern as described here:
http://www.asp.net/mvc/tutorials/getting-started-with-ef-using-mvc/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
Typically, as the article explains, when I need to create a new record I've been doing this:
public ActionResult Create(Course course)
{
    unitOfWork.CourseRepository.Add(course);
    unitOfWork.Save();
    return RedirectToAction("Index");
}
However, when more than simply saving a record to the database is required, I wrap the logic in what I've called an IService. For example:
private ICourseService courseService;

public ActionResult Create(Course course)
{
    courseService.ProcessNewCourse(course);
    return RedirectToAction("Index");
}
In one of my services I have something like the following…
public void ProcessNewCourse(Course course)
{
    // Save the course to the database...
    unitOfWork.CourseRepository.Add(course);
    unitOfWork.Save();

    // Generate a PDF and email some people about the new course being created,
    // which requires more use of the unitOfWork...
    var someInformation = unitOfWork.AnotherRepository.GetStuff();
    var myPdfCreator = new PdfCreator();

    IEnumerable<People> people = unitOfWork.PeopleRepository.GetAllThatWantNotifiying(course);
    foreach (var person in people)
    {
        var message = "Hi " + person.FullName;
        var attachment = myPdfCreator.CreatePdf();
        // etc...
        smtpClient.Send();
    }
}
The above isn't the actual code (my app has nothing to do with courses, I'm using view models, and I have separated the PDF creation and the email message out into other classes), but the gist of what is going on is as above.
My problem is that generating the PDF and emailing it out takes some time. The user just needs to know that the record has been saved to the database, so I thought I would move the code below unitOfWork.Save() into an asynchronous method. The user can then be redirected, and the server can happily take its time processing the emails, attachments, and whatever else I need it to do after the save.
This is where I'm struggling.
I've tried a few things; the current attempt is the following in CourseService…
public class CourseService : ICourseService
{
    private delegate void NotifyDelegate(Course course);
    private NotifyDelegate notifyDelegate;

    public CourseService()
    {
        notifyDelegate = new NotifyDelegate(this.Notify);
    }

    public void ProcessNewCourse(Course course)
    {
        // Save the course to the database...
        unitOfWork.CourseRepository.Add(course);
        unitOfWork.Save();

        notifyDelegate.BeginInvoke(course, null, null);
    }

    private void Notify(Course course)
    {
        // All the stuff below unitOfWork.Save() moved here.
    }
}
My questions/problems:
I'm randomly getting the error "There is already an open DataReader associated with this Command which must be closed first." in the Notify() method.
Is it something to do with the fact that I'm trying to share the unitOfWork, and therefore a DbContext, across threads?
If so, can someone be kind enough to explain why this is a problem?
Should I be giving a new instance of unitOfWork to the Notify method?
Am I using the right patterns/classes to invoke the method asynchronously, or should I be using something along the lines of:
new System.Threading.Tasks.Task(() => { Notify(course); }).Start();
I must say I've become very confused by the terms asynchronous, parallel, and concurrent!
Any links to articles ("C# async for idiots") would be appreciated!
Many thanks.
UPDATE:
A little more digging got me to this SO answer: https://stackoverflow.com/a/5491978/192999 which says:
"Be aware though that EF contexts are not thread safe, i.e. you cannot use the same context in more than one thread."
...so am I trying to achieve the impossible? Does this mean I should create a new IUnitOfWork instance for my new thread?
You could create a polling background thread that performs the lengthy operation separately from your main flow. This thread could scan the database for new items (or items marked for processing). This solution is pretty simple and ensures that jobs get done even if your application crashes (they will be picked up when the polling thread starts again).
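A rough sketch of that polling idea; the entity, context, and table names below are invented for illustration, and the key point is that every pass creates its own context, so nothing is shared across threads:

using System;
using System.Data.Entity;   // EF Code First, as in the question
using System.Linq;
using System.Threading;

// Illustrative entity and context; your real model will differ.
public class CourseNotification
{
    public int Id { get; set; }
    public int CourseId { get; set; }
    public bool Processed { get; set; }
}

public class NotificationContext : DbContext
{
    public DbSet<CourseNotification> CourseNotifications { get; set; }
}

public class NotificationWorker
{
    public void Start()
    {
        var worker = new Thread(() =>
        {
            while (true)
            {
                // A fresh context (unit of work) per pass.
                using (var db = new NotificationContext())
                {
                    var pending = db.CourseNotifications.Where(n => !n.Processed).ToList();
                    foreach (var notification in pending)
                    {
                        Send(notification);            // PDF + SMTP work goes here
                        notification.Processed = true;
                    }
                    db.SaveChanges();
                }
                Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        });
        worker.IsBackground = true;
        worker.Start();
    }

    private void Send(CourseNotification notification)
    {
        // Build the PDF and send the email for this notification.
    }
}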
You could also use a synchronised queue if it isn't terrible for a request to be 'lost' in the case where your application crashes after the document is requested but before it is generated and sent.
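And a sketch of the in-memory queue variant, again with invented names: the controller enqueues the course after unitOfWork.Save(), and a single long-running consumer drains the queue, creating a fresh unit of work per item. Anything still queued is lost if the process dies, as noted above.

using System.Collections.Concurrent;
using System.Threading.Tasks;

public static class NotificationQueue
{
    private static readonly BlockingCollection<Course> Queue = new BlockingCollection<Course>();

    // Called from ProcessNewCourse after unitOfWork.Save().
    public static void Enqueue(Course course)
    {
        Queue.Add(course);
    }

    // Called once at application start-up.
    public static void StartConsumer()
    {
        Task.Factory.StartNew(() =>
        {
            foreach (var course in Queue.GetConsumingEnumerable())
            {
                // Create a fresh unit of work / context here,
                // then generate the PDF and send the emails.
            }
        }, TaskCreationOptions.LongRunning);
    }
}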
One thing is almost certain, as rikitikitik said: you will need a new unit of work, which means a separate transaction.
You could also look at Best threading queue example / best practice.