I am working on an app that has a database of information. The user would look up entries and could save them to a personal list.
My thinking is that the SQLite database would have two tables: one for the information and one for the personal list.
When the user presses the button to save an entry to their list, a method is called that creates a personal list object and copies each field of the selected entry into the identical field of the personal list object.
So that the user knows an entry has been saved, it appears differently, e.g. labeled as already saved to the personal list. To track this, each entry would have a variable that defaults like this:
var isSavedToList = false;
When the method is called to save the entry to the personal list, this variable is changed to true.
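Roughly, the save method would look something like this (the class names, fields, and Insert/Update calls are placeholders in the style of sqlite-net, not my actual code):
// Copy the selected entry into a personal-list row, then flag the original.
void SaveToPersonalList(InfoEntry entry)
{
    var listItem = new PersonalListItem
    {
        Title = entry.Title, // copy each field across...
        Body = entry.Body    // ...into the identical field
    };
    db.Insert(listItem);        // stand-in for the actual SQLite insert
    entry.IsSavedToList = true; // the flag behind the "already saved" label
    db.Update(entry);           // persist the flag
}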
My question is: can I add rows to the SQLite database (after the app has launched and people are using it) without changing the isSavedToList variables on each user's device? Since each person's list will be different, I don't want the flags to be reset when I update the database with new entries.
Yes, you can. I presume you mean you're doing some kind of program update that adds new rows to the main table. Add your new rows with isSavedToList set to false (they can't possibly be true, because the rows are new and the user doesn't know about them yet).
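For example, a minimal sketch of the update, assuming Microsoft.Data.Sqlite (the Info table and its columns are placeholder names). New rows always start with the flag cleared, so the flags on rows the user has already saved are never touched:
// Insert a new entry with IsSavedToList = 0; existing rows are left alone.
using Microsoft.Data.Sqlite;

void AddNewEntry(string dbPath, string title, string body)
{
    using (var connection = new SqliteConnection($"Data Source={dbPath}"))
    {
        connection.Open();
        var cmd = connection.CreateCommand();
        cmd.CommandText =
            "INSERT INTO Info (Title, Body, IsSavedToList) VALUES ($title, $body, 0)";
        cmd.Parameters.AddWithValue("$title", title);
        cmd.Parameters.AddWithValue("$body", body);
        cmd.ExecuteNonQuery();
    }
}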
I have an ASP.NET web application that calls a customer to a station. Five employees run this application simultaneously; when they see a customer walk in, they click a ButtonGetCustomer to call the customer to their station.
Here is my issue: I am getting the data from SQL Server and storing it in a DataTable. Sometimes, when two or more clerks click at the same time, they call the same customer.
Any ideas on how to prevent this from happening?
I had a similar problem with thousands of people clicking the same button trying to claim a limited number of spots. Here is a similar solution:
When a clerk clicks your button, run a stored procedure to mark that customer as seen.
Your SPROC will first check whether the customer is already marked as seen; if so, quit (I use RAISERROR to pass a message back and catch the SqlException in code, so you can tell the clerk which customer has already been called).
If the customer hasn't been seen, the next thing your SPROC does is mark them as seen.
So the clerk who clicked the button either succeeds and sees the customer, or gets a message saying the customer has already been seen.
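One way to make the check-and-mark step atomic is to do it in a single conditional UPDATE; here is a minimal sketch from C# (the Customers table and the CustomerId/Seen columns are assumed names, not from your schema):
// Atomic test-and-set: the UPDATE only succeeds if the customer is still
// unseen, so two clerks clicking at once cannot both claim the same customer.
using System.Data.SqlClient;

bool TryClaimCustomer(string connectionString, int customerId)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "UPDATE Customers SET Seen = 1 WHERE CustomerId = @id AND Seen = 0", conn))
    {
        cmd.Parameters.AddWithValue("@id", customerId);
        conn.Open();
        // One row affected means this clerk won the race; zero means the
        // customer was already claimed, so show the "already seen" message.
        return cmd.ExecuteNonQuery() == 1;
    }
}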
The problem you are experiencing is a concurrency problem. Try wrapping the read of the DataTable in a lock statement (one of several synchronization options available). The records you plan to return to the calling thread should be flagged so they are not picked up by other threads; try something like this:
private readonly Object _syncObject = new Object();

private DataTable yourDataReadMethod()
{
    lock (_syncObject)
    {
        // Read the records to return to the calling thread.
        DataTable records = ReadUnclaimedCustomers(); // placeholder data-access call

        // Flag the records as read so they are not given to other threads.
        // You might need an expiration date in case a call is not completed
        // in a timely manner.
        MarkAsClaimed(records); // placeholder
        return records;
    }
}
Furthermore, if you are updating a record after a call takes place, you should compare the database's last-updated date with the date persisted in the client form; if they differ, raise an exception, because that means someone else has already updated the record.
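For example, a hypothetical version of that check (the Customers table and the Seen/LastUpdated columns are assumed names):
// Optimistic concurrency: only update if the row hasn't changed since the
// client read it. Zero rows affected means someone else got there first,
// so raise an error and tell the clerk.
using System;
using System.Data.SqlClient;

bool TryUpdateCustomer(string connectionString, int customerId, DateTime lastUpdatedSeenByClient)
{
    const string sql = @"
        UPDATE Customers
        SET    Seen = 1, LastUpdated = GETUTCDATE()
        WHERE  CustomerId = @id AND LastUpdated = @lastUpdated";
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@id", customerId);
        cmd.Parameters.AddWithValue("@lastUpdated", lastUpdatedSeenByClient);
        conn.Open();
        return cmd.ExecuteNonQuery() == 1;
    }
}
Hopefully that helps.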
In my business logic, there is a function that creates a unique value (a session id) using a random function.
I have to be sure that the session id is unique before I store it in the database.
So I generate a new session id until I find one that is not yet in the database.
But there might be a race condition between checking the database for existing session ids and writing the new one.
The functions for reading from and writing to the database use two different connections.
How can I manage this?
I cannot use auto increments, because the next session should not be guessable.
I think you can make this column UNIQUE (add a database constraint), try to insert the new row, and check whether the insert returns an error about a duplicate value. Considering that duplicates are very rare, this is probably the fastest and safest method.
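For example, a rough sketch of the insert-and-retry loop (System.Data.SqlClient and the Sessions table are assumptions; 2627 and 2601 are SQL Server's duplicate-key error numbers):
// Keep generating ids until an insert succeeds; the UNIQUE constraint is the
// authoritative check, so the check-then-write race disappears.
using System;
using System.Data.SqlClient;

string CreateSessionId(string connectionString)
{
    while (true)
    {
        string sessionId = Guid.NewGuid().ToString("N"); // or your own random generator
        try
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "INSERT INTO Sessions (SessionId) VALUES (@id)", conn))
            {
                cmd.Parameters.AddWithValue("@id", sessionId);
                conn.Open();
                cmd.ExecuteNonQuery();
                return sessionId; // the insert succeeded, so the id is unique
            }
        }
        catch (SqlException ex) when (ex.Number == 2627 || ex.Number == 2601)
        {
            // Duplicate value: extremely rare; loop around and try a new id.
        }
    }
}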
Currently we are using Session to store DataTables in our pages so that we don't have to hit the database to get the same DataTable again and again. But my worry is that this uses server memory, and if a large number of users log in some day, the server's response will become slow and our application might crash as well.
Please tell me: is it a good idea to store DataTables in Session, or should we get them from the database every time?
As a general rule of thumb I would say don't use session; I haven't had to use it for a long time. As soon as you move into a web farm situation, session either gets a lot slower, a lot more complicated, or both.
Whether you will get away with it or not really depends on how much data you are storing in session, and how many users will be active within the session timeout period.
There are a lot of caching and in memory database options available today that may be a better option. Finally, while the solution as described sounds questionable, I wouldn't optimize the existing solution until you have actually measured a problem.
This depends on what is being stored in the DataTables. In any case, I would use the ASP.NET Cache to store them, for the following reasons:
Cache has expiry, which means an item can be removed automatically based on a sliding or absolute expiration time.
Cached items are automatically evicted if the process's memory pressure gets too high.
You can make a cached item specific to one user, or global to all users, based on its key.
for example:
// personalized cache item
string personalCacheKey = string.Format("MyDataTable_{0}", (int)Session["UserID"]);
DataTable myPersonalDataTable = (DataTable)Cache[personalCacheKey];
if (myPersonalDataTable == null)
{
myPersonalDataTable = database.dosomething();
Cache.Insert(personalCacheKey, myPersonalDataTable, null, Cache.NoAbsoluteExpiration, new TimeSpan(0, 30, 0)); // 30 minutes
}
// global (non user specific) cached item
string globalCacheKey = "MyDataTable";
DataTable globalDataTable = (DataTable)Cache[globalCacheKey];
if (globalDataTable == null)
{
globalDataTable = database.dosomething();
Cache.Insert(globalCacheKey, globalDataTable, null, Cache.NoAbsoluteExpiration, new TimeSpan(0, 30, 0)); // 30 minutes (again)
}
The issue that you have now, however, is what happens when the underlying data gets updated, and whether it is acceptable for your application to present "old" cached data. If it is not acceptable, you will have to forcibly remove the item from the cache; there are a few mechanisms for that.
You can set up a SqlCacheDependency (which I have never personally used), or you can just clear out the cached object yourself using Cache.Remove(cacheKey).
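For example (the write call is a placeholder, mirroring the database.dosomething() placeholder above):
// After a write, evict the cached copy so the next read repopulates it.
database.dosomething();      // placeholder for the actual update
Cache.Remove("MyDataTable"); // same key as the global cached item above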
It is preferable to store commonly used data in memory; that's good logic. However, Session data exists for the life of that session, and hence for that one user; as you already said, over the lifetime of each user's session this can consume valuable resources on the server side.
What you may want to consider using instead is the Cache object, as it serves the same purpose but with expiration.
DataTable users;
if (Cache["users"] == null)
{
    users = getUsers(customer); // load from the database
    Cache.Add("users", users, null, System.Web.Caching.Cache.NoAbsoluteExpiration,
        new TimeSpan(0, 60, 0), System.Web.Caching.CacheItemPriority.Default, null);
}
else
{
    users = (DataTable)Cache["users"];
}
There are several ways to keep data in memory between requests in .NET:
(1) ViewState
(2) Cache
(3) Session
(4) Cookies
But I would go for the "Cache" object.
If you can't increase memory on the web server, then the obvious answer is to not store the data in session state and to get it from the database every time.
The problem with this is the impact it will have on your database: are you just moving the problem from the web server to the database server?
It is much easier to scale out web servers than it is to scale up/out databases (and often cheaper, if you're using something like SQL Server).
If your DataTable holds only a small number of records and does not contain sensitive data, then you can use ViewState as well, but keep the data small: this approach serializes the data and stores it on the client side, then sends it back to the server on every postback.
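For example, a rough sketch inside a page's code-behind (the key name and the data-access call are placeholders):
// Stash a small DataTable in ViewState between postbacks. Remember the whole
// table is serialized into the page and travels with every round trip.
DataTable lookupTable;
if (ViewState["LookupTable"] == null)
{
    lookupTable = database.dosomething(); // placeholder data-access call
    ViewState["LookupTable"] = lookupTable;
}
else
{
    lookupTable = (DataTable)ViewState["LookupTable"];
}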
I have a Search.aspx page that displays records matching user-entered search criteria. It connects to one specific database out of 50 SQL Server databases, depending on the query string passed: Search.aspx?ID=1 connects to the 1MyDB database,
and Search.aspx?ID=2 connects to the 2MyDB database, which may reside on the same or a different server. This works fine.
The problem: I need to display the total count of visitors for each query-string ID, across these different databases that are all accessed from the same page, Search.aspx.
Please suggest the best method to get the total visitors for a specific ID.
Should I store the count in the database by creating a new table and inserting the count, or set up about 50 application variables in Global.asax and read the counter from there?
Help Appreciated...!
Fifty application variables doesn't look good: you're going to have a lot of work to display them, and you'd have to change the code every time you add another database. A table is better. Create a table with id and count fields, and every time the page is hit, check whether the id is already in the table: if so, increment that row's count field; if not, insert the id with count = 1.
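A minimal sketch of that logic (the VisitorCounts table and its Id/HitCount columns are assumed names; under heavy traffic the UPDATE-then-INSERT pair can still race, so you may want a retry or a MERGE):
// Increment the counter for this id, inserting the row on the first hit.
using System.Data.SqlClient;

void CountVisit(string connectionString, int id)
{
    const string sql = @"
        UPDATE VisitorCounts SET HitCount = HitCount + 1 WHERE Id = @id;
        IF @@ROWCOUNT = 0
            INSERT INTO VisitorCounts (Id, HitCount) VALUES (@id, 1);";
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@id", id);
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}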