Today in an interview I faced a question that I had no clue about. Being a newbie to ASP.NET MVC, I had no other way but to ask the experts about it.
"In your MVC application, data is stored in an RDBMS. On a .cshtml page there is a button to update an existing record in the DB.
Now suppose two users from different parts of the globe are trying to update the same record at the same time, but by the time the second user hits the submit button, you have already submitted your update. In that case the second user's submit won't go through. Instead, the page is reloaded for the second user with the updated info, discarding his changes, and only then can he go on with his update."
How could you achieve that in ASP.NET MVC?
I thought it might also involve some coding on the DB side, but I have no clue how to achieve it.
You need to implement a strategy for managing concurrency. How you do this depends largely on the business rules of the application and the type of conflict that arises when two or more people attempt to change the same record.
Most often, you will add a column called RowVersion of the timestamp (rowversion) type. Whenever you display records to be updated, you also select the current RowVersion value and usually store it in a hidden field. The update operation includes that RowVersion value, and before you commit the update you compare the RowVersion value you posted with the current one in the database. If they are different, someone else updated the row during the time it took you to fetch the record and submit your change. How you proceed from there is determined by the application's business rules.
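For illustration, a minimal sketch of that flow in an MVC controller with Entity Framework might look like the following. The Product, ProductEditViewModel and ShopContext names and the MapToViewModel helper are made up; the RowVersion property is assumed to be mapped with the [Timestamp] attribute and round-tripped through a hidden field in the view:

using System.ComponentModel.DataAnnotations;
using System.Linq;
using System.Web.Mvc;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }

    [Timestamp] // maps to a SQL Server rowversion column
    public byte[] RowVersion { get; set; }
}

public class ProductEditViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public byte[] RowVersion { get; set; } // rendered as a hidden field in the view
}

[HttpPost]
public ActionResult Edit(ProductEditViewModel model)
{
    using (var db = new ShopContext())
    {
        var product = db.Products.Find(model.Id);

        // Compare the RowVersion posted from the hidden field with the one currently in the database.
        if (!product.RowVersion.SequenceEqual(model.RowVersion))
        {
            // Someone else updated the row while this user was editing:
            // redisplay the page with the fresh values and discard the stale edit.
            ModelState.AddModelError("", "The record was changed by another user; the current values are shown.");
            return View(MapToViewModel(product)); // MapToViewModel is a hypothetical mapping helper
        }

        product.Name = model.Name;
        db.SaveChanges();
        return RedirectToAction("Index");
    }
}

Whether you redisplay the current values, attempt a merge, or reject the edit outright is, again, a business decision.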
Have asynchronous tasks do the database operations, so even if another user tries to update, the first process runs in the background asynchronously.
I am building an ASP.NET Core web app in C#. Clients can add an "inquiry" and the target response time is less than 30 minutes. Table "Inquiries" has many columns, among them: Id, Inquiry, and Reply.
Here is the scenario:
10h00: Client A adds an inquiry
10h04: No reply yet
10h09: No reply yet
10h10: Employee B gets notified that an inquiry has been added and has no reply yet
10h20: Employee B gets notified again that the inquiry still has no reply
10h21: Employee B adds a reply and stops getting notified (B was being notified every 10 minutes)
What would you say is the best approach to get this to work? I suppose I can write a C# function to check for empty "Reply" cells every 30 seconds, but I think it would exhaust memory when the database grows bigger, right? Can you point me in the right direction?
There are a few options here.
Personally, I would schedule an event in a secondary service to trigger tasks within a time window, but that may be overkill in your case.
Within your Inquiries table you can use a last-modified or last-updated field (you probably already have one) to determine when a row was last altered.
If you don't have a requirement to keep your DB scripts under source control, you can use stored procedures to manage this data and control the creation of these date/time stamps.
I would recommend updating your query to have a WHERE clause on both the last-updated time AND the null Reply condition. That way no unnecessary records are processed and checked.
Edit:
Since you already have a creation date for the inquiries in the database, compare it to the current time in your retrieval query to determine the alert status.
About memory/performance impact:
The (C# application) memory impact is based on the number of items returned, not the number of items in the database. So if you add a WHERE clause to your query, you can make sure memory isn't allocated to rows you don't care about.
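As a rough sketch of the secondary-service idea in ASP.NET Core, the polling could live in a BackgroundService that only pulls unanswered inquiries older than the threshold. The AppDbContext, the Inquiry entity and its CreatedAt/Reply properties are assumptions based on your table description:

using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

public class InquiryReminderService : BackgroundService
{
    private readonly IServiceScopeFactory _scopeFactory;

    public InquiryReminderService(IServiceScopeFactory scopeFactory)
        => _scopeFactory = scopeFactory;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            using (var scope = _scopeFactory.CreateScope())
            {
                var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();
                var cutoff = DateTime.UtcNow.AddMinutes(-10);

                // Only unanswered inquiries past the threshold are pulled into memory.
                var overdue = await db.Inquiries
                    .Where(i => i.Reply == null && i.CreatedAt <= cutoff)
                    .ToListAsync(stoppingToken);

                foreach (var inquiry in overdue)
                {
                    // Notify the employee here (email, SignalR, etc.).
                }
            }

            await Task.Delay(TimeSpan.FromMinutes(1), stoppingToken);
        }
    }
}

You would register it with services.AddHostedService<InquiryReminderService>(); the key point is that the WHERE clause keeps the result set small no matter how big the table grows.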
First, I'm talking about a loose-coupling scenario: we don't use the DbContext directly on the client side; it's used in the service layer instead.
So DbContext cannot track changes for the update action the way it normally would. We could use DbContext directly in our project, but I have a feeling that makes the code fairly tightly coupled to Entity Framework. I always prefer to create a separate service layer (even when the project is a Windows desktop application, which would otherwise be fairly well suited to consuming DbContext directly).
So in that loose-coupling scenario, we need to detect changes ourselves without the help of DbContext. There are several options; the one I'm asking about here is detecting changes by comparing the old instance with the new instance. The old instance can be queried from the database, something like this:
public void UpdateItem(Item item){
using(var db = new SomeDbContext()){
var oldItem = db.Set<Item>().Find(item.SomeKey);
db.Set<Item>().Attach(item);
//this will actually update some EntityState for the item's properties
detectChanges(oldItem, item);
db.SaveChanges();
}
}
The cost of the above method is that it requires one extra query to find the old item. Moreover, it can be dangerous if the new item was only partially loaded (for example, only some of the item's properties were loaded because only those were of interest to a specific view). When that's the case, detectChanges may set the wrong EntityState on the item's properties, which in turn will unexpectedly clear the values of the missing properties.
So I'm a bit hesitant at this point. I am open to better approaches to saving/updating an item in this scenario.
You should consider using a rowversion (timestamp) column to handle unintended updates: https://www.infoworld.com/article/3085390/application-development/how-to-handle-concurrency-conflicts-in-entity-framework.html
At that point you wouldn't need to detect changes. Just attach the entity, mark it as modified and save. If the object you are saving is stale, EF will throw a concurrency exception, which you can handle by notifying the user, attempting a merge, or simply failing. It's up to you.
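A minimal sketch of that attach-and-save flow with EF6, assuming Item carries a RowVersion property configured as a concurrency token and NotifyConflict is a placeholder for whatever conflict handling you choose:

using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public void UpdateItem(Item item)
{
    using (var db = new SomeDbContext())
    {
        // No old-vs-new comparison needed: attach the detached entity and mark it modified.
        db.Set<Item>().Attach(item);
        db.Entry(item).State = EntityState.Modified;

        try
        {
            // The RowVersion the client originally read travels with the entity,
            // so EF adds it to the WHERE clause of the UPDATE statement.
            db.SaveChanges();
        }
        catch (DbUpdateConcurrencyException)
        {
            // The row was changed (or deleted) since the client read it:
            // notify the user, reload, merge, or fail, depending on your rules.
            NotifyConflict(item);
        }
    }
}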
I am new to ASP.NET. I have a web form with 40 text boxes, 18 drop-downs and 29 checkboxes. I need to load all the data from these controls into a SQL Server database table using a stored procedure. I don't want to pass all the parameters one by one and update the table.
Is there any other, shorter way to do it? Please help.
I am not sure I follow your question here.
The simplest way would be to create a model object, and a method that accepts that model object containing all the data from your form. This method then feeds the stored procedure call.
This is the common way to do it, and the fastest. You need to assign the data to data fields; there is no real way around that, but of course it can be done in many different ways.
If you want a more specific answer, you will have to elaborate on what you mean by:
"I don't want to pass all the parameters one by one and update the table."
I can't come up with a scenario where it would be desirable to pass each parameter one by one into an update statement...
But you will have to mention each field you want to update in your SQL stored procedure.
If that is your only question:
UPDATE table-name
SET column-name = value, column-name = value, ...
WHERE condition
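On the C# side, one way to avoid hand-writing roughly 90 parameter assignments is to map the model's properties to parameters by reflection. This is only a sketch: the RegistrationFormModel class, the usp_SaveRegistration procedure and the assumption that its parameter names match the property names are all made up for illustration:

using System;
using System.Data;
using System.Data.SqlClient;

public class RegistrationFormModel
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public bool AcceptsTerms { get; set; }
    // ... the remaining textbox/dropdown/checkbox values as properties
}

public void SaveRegistration(RegistrationFormModel model, string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("usp_SaveRegistration", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;

        // Map every public property to a parameter of the same name,
        // so the C# side stays a single loop instead of one line per control.
        foreach (var prop in typeof(RegistrationFormModel).GetProperties())
        {
            cmd.Parameters.AddWithValue("@" + prop.Name, prop.GetValue(model) ?? DBNull.Value);
        }

        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

The stored procedure still has to list every column it updates, as noted above; the model object just keeps the calling code small.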
If you used a model in your view and Razor syntax to render the relevant controls, then, provided a form is defined, posting back to a controller action that accepts the same model as its parameter will handle binding all of the properties back to the model.
You then either pass the model on as-is, or use its properties, and send them down to Entity Framework, or any other ORM you might be using, and let it handle the update operation.
[HttpPost]
public ActionResult YourPostMethod(YourModelUsedInView model)
{
    // process your model here and send it down to the DB.
    return RedirectToAction("Index");
}
I'm working on a small project just to get a better understanding of the MVVM pattern in C#.
I work with a Microsoft SQL Server database with three tables: customer, location and address.
Every customer can have one or more locations, and every location has a specific address.
My current thought on how to accomplish this:
First, insert the customer.
To insert the location, get the highest customer_id and insert the location with that max(customer_id).
Then, to insert the address, get the max(location_id) and insert the address with that location_id.
Is there a better way to do this?
I haven't found any tutorial with an example of inserting data into more than one table, especially not using SQL Server.
And my next problem is: what should I bind to my TextBoxes, so that I can insert their content?
I thought about having a save button that executes a method in which I insert the data from the bound TextBoxes. Should I do this with commands instead of writing a method?
Thanks already!
SQL Server has an OUTPUT clause which you can use,
something like:
INSERT INTO MyTable (CustomerName)
OUTPUT INSERTED.ID
VALUES (@CustomerName)
Then you can store the inserted customer's actual ID and do the remaining inserts in separate queries.
As for your second question: yes, you should do it with a Command bound to a method in your ViewModel.
The most advisable approach for your situation is to use a transaction.
Example: https://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqltransaction(v=vs.110).aspx
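Putting both answers together, a rough sketch of the three inserts inside one SqlTransaction could look like this (table and column names are made up, and the generated ids are captured with OUTPUT INSERTED.Id):

using System;
using System.Data.SqlClient;

public int SaveCustomer(string connectionString, string customerName, string locationName, string street)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (var tran = conn.BeginTransaction())
        {
            try
            {
                // Insert the customer and capture its generated id.
                int customerId;
                using (var cmd = new SqlCommand(
                    "INSERT INTO Customer (Name) OUTPUT INSERTED.Id VALUES (@name)", conn, tran))
                {
                    cmd.Parameters.AddWithValue("@name", customerName);
                    customerId = (int)cmd.ExecuteScalar();
                }

                // Insert the location for that customer and capture its id.
                int locationId;
                using (var cmd = new SqlCommand(
                    "INSERT INTO Location (CustomerId, Name) OUTPUT INSERTED.Id VALUES (@customerId, @name)", conn, tran))
                {
                    cmd.Parameters.AddWithValue("@customerId", customerId);
                    cmd.Parameters.AddWithValue("@name", locationName);
                    locationId = (int)cmd.ExecuteScalar();
                }

                // Insert the address for that location.
                using (var cmd = new SqlCommand(
                    "INSERT INTO Address (LocationId, Street) VALUES (@locationId, @street)", conn, tran))
                {
                    cmd.Parameters.AddWithValue("@locationId", locationId);
                    cmd.Parameters.AddWithValue("@street", street);
                    cmd.ExecuteNonQuery();
                }

                tran.Commit();
                return customerId;
            }
            catch
            {
                // If any insert fails, nothing is persisted.
                tran.Rollback();
                throw;
            }
        }
    }
}

This also avoids the max(customer_id) approach from the question, which can pick up another user's row under concurrent inserts.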
I'd like to display a repeating event on a date/time display in an app. This date time display can take the form of a calendar, but it could also just be a list of upcoming events.
What is the best way to handle tracking this event that can repeat?
For example: Should the event be stored once in the database and projected out / repeated several times in the display code? Should the event be stored several times and then just rendered?
I did something like this before and based my schema on SQL Server's sysschedules table:
http://technet.microsoft.com/en-us/library/ms178644.aspx
The schema linked above lets you store the schedule for a job (event). You can then calculate which dates the event occurs on based on that schedule. This may be a lengthy calculation, so I would try to cache the result somewhere.
I think it depends on the type of event. Is it like Christmas, where once it comes along and happens you really aren't interested in it until the next occurrence? Or is it a task like "make sure I call my mom every month", where if it happened and you missed it you wouldn't want it to go away?
One way I recently implemented the latter was to have a record with next_occurrence (date) and reoccurence_period (weekly, monthly, yearly, etc.) columns, so that as the next occurrence approached, the item would show up in the list. Once it had passed, the list item would show a recycle icon that, when pressed, would update the record to the next future occurrence.
Again, I'm not sure if this applies to your situation, but it worked well for mine.
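To make the "store it once and roll it forward" idea concrete, here is a small sketch (the class, the column mapping and the period values are purely illustrative) of advancing a stored event to its next future occurrence, e.g. when the recycle icon is pressed:

using System;

public class RecurringEvent
{
    public DateTime NextOccurrence { get; set; }   // maps to the next_occurrence column
    public string ReoccurencePeriod { get; set; }  // maps to the reoccurence_period column
}

public static class RecurrenceHelper
{
    // Advance the event until its next occurrence is in the future.
    public static void RollForward(RecurringEvent evt, DateTime now)
    {
        while (evt.NextOccurrence <= now)
        {
            switch (evt.ReoccurencePeriod)
            {
                case "weekly":
                    evt.NextOccurrence = evt.NextOccurrence.AddDays(7);
                    break;
                case "monthly":
                    evt.NextOccurrence = evt.NextOccurrence.AddMonths(1);
                    break;
                case "yearly":
                    evt.NextOccurrence = evt.NextOccurrence.AddYears(1);
                    break;
                default:
                    throw new ArgumentException("Unknown period: " + evt.ReoccurencePeriod);
            }
        }
    }
}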