Accessing MDF file thread safety [closed] - c#

Is MDF file access (when attached to SQLEXPRESS) thread safe?
I have a local MDF file deployed along with my WPF client application.
I'm using the MDF file to persist some client-specific settings. There may be many threads SELECTing and UPDATEing the same rows at the same time, and thus accessing the file via the SQL connection provider simultaneously.
What I'm asking is whether I can treat thread synchronization the same way as I do with remote SQL Server databases (just leave all the work to the SQL connection provider), or do I have to wrap my DB calls inside a critical section?
Thanks!

I'm really confused by the question: it's a Microsoft SQL Server database running on a database engine that manages table and row locking and resolves conflicts, as long as you use optimistic concurrency in your WHERE clauses when updating rows. For example:
UPDATE settings SET A = 'val' WHERE A = 'old val'
So of course it's thread safe.
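To make that concrete, here is a minimal client-side sketch (not the poster's actual code): each thread opens its own SqlConnection to the attached MDF and performs an optimistic-concurrency update. The connection string, Settings table, and column names are made up for illustration.

using System.Data.SqlClient;

class SettingsStore
{
    // Hypothetical connection string for a local MDF attached to SQLEXPRESS.
    const string ConnStr =
        @"Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Settings.mdf;Integrated Security=True";

    // Optimistic concurrency: the UPDATE only succeeds if the row still holds the old value.
    public static bool TryUpdate(string key, string oldValue, string newValue)
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            "UPDATE Settings SET Value = @new WHERE [Key] = @key AND Value = @old", conn))
        {
            cmd.Parameters.AddWithValue("@new", newValue);
            cmd.Parameters.AddWithValue("@key", key);
            cmd.Parameters.AddWithValue("@old", oldValue);
            conn.Open();
            return cmd.ExecuteNonQuery() == 1;  // 0 rows affected means another thread won the race
        }
    }
}

Because every thread gets its own connection and the engine serializes the conflicting updates, no client-side critical section is needed.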

An MDF file is the primary data file of a SQL Server database.
You can't write to the MDF file directly (at least in theory); you access it through the SQL Server engine and its clients.
As for multiple SELECTs and UPDATEs, you rely on the database transaction isolation levels (read committed, read uncommitted, serializable, snapshot).
Isolation Levels in the Database Engine
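If you need a particular isolation level for a group of statements, you can request it explicitly on the transaction. A small sketch, assuming the same hypothetical Settings.mdf and Settings table as above:

using System.Data;
using System.Data.SqlClient;

class IsolationExample
{
    public static void ReadThenUpdate()
    {
        using (var conn = new SqlConnection(
            @"Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Settings.mdf;Integrated Security=True"))
        {
            conn.Open();
            // Ask the engine for serializable isolation for this unit of work.
            using (SqlTransaction tx = conn.BeginTransaction(IsolationLevel.Serializable))
            {
                var read = new SqlCommand("SELECT Value FROM Settings WHERE [Key] = @k", conn, tx);
                read.Parameters.AddWithValue("@k", "Theme");
                object current = read.ExecuteScalar();

                var write = new SqlCommand("UPDATE Settings SET Value = @v WHERE [Key] = @k", conn, tx);
                write.Parameters.AddWithValue("@v", "Dark");
                write.Parameters.AddWithValue("@k", "Theme");
                write.ExecuteNonQuery();

                tx.Commit();  // other sessions see the read-then-write as one atomic unit
            }
        }
    }
}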

Related

Server Database Setup file [closed]

I've created a multi-user application in WinForms, and it has a database on a server which is on another computer. So far, I have set up the server database (MySQL) manually and created each table in the database manually as well. I was wondering: is there a way (maybe an exe file) to automate the entire process?
I've done some searching on the internet, but most of the answers I found cover the configuration of the database, which I already went through when I did the manual setup. I developed the application and the database separately, so the only thing related to the database in my application is the connection string. I did manage to publish the application, and now I need to find a way to publish my server database.
With MySQL you cannot automate the install of the MySQL software itself unless you have a paid-for enterprise licence or you release your software as open source under the GPL.
To automate the MySQL installation, you can use the MySQL Installer console.
What you can definitely automate is the deployment of your MySQL data structures and the configuration of the MySQL instance.
The former requires an SQL script that creates the database and all other database objects (tables, indexes, views, users, etc.) that you can execute as part of an install process. The latter requires overwriting or adding your settings to the my.cnf / my.ini configuration files.
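As a rough sketch of the schema-deployment part, assuming MySQL Connector/NET (MySql.Data) and a schema.sql file shipped alongside the installer (the server address, credentials, and file name are placeholders):

using System.IO;
using MySql.Data.MySqlClient;

class SchemaDeployer
{
    public static void Deploy()
    {
        // schema.sql contains the CREATE DATABASE / CREATE TABLE / CREATE USER statements.
        string script = File.ReadAllText("schema.sql");

        using (var conn = new MySqlConnection("Server=db-host;Uid=installer;Pwd=secret;"))
        {
            conn.Open();
            // MySqlScript handles multi-statement scripts, including DELIMITER changes.
            new MySqlScript(conn, script).Execute();
        }
    }
}

Running this once from your installer (or from a small console exe) reproduces the database structure on any server you point it at.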

How does ODBC allow more than one session to access and modify the database [closed]

I'm working on an ASP.NET web site application that uses ODBC to store data in a Microsoft SQL Server 2008 R2 database.
My understanding of ODBC is that it is an open database connection, where the connection to the database can be kept open, and during this time only "one connection" can modify, update, or insert into the database, until that connection is closed and another one is opened.
The application is designed to allow multiple user sessions to open connections to the database at the same time, and I don't see any client code that is handling concurrency issues regarding insertions or modifications to the database.
How are multiple users in separate sessions (for example, three users on three separate web browsers) able to modify, update, and insert information safely into one database, particularly when all three users are modifying, deleting, or inserting into the same row at the same time? Does the database automatically provide a lock and wait for that lock to free up from one user session to the next to resolve this concurrent data access issue?
You cannot modify the same row in the same table at the same time as someone else; this is inherent in how the statements sent over the ODBC connection are handled. For instance, if UserA is updating Table1 and UserB is also trying to perform updates on Table1, and there are rows in common between the two users, ODBC will throw an error with feedback about the data currently being locked, or something similar (depending on how you're accessing ODBC).
Regardless of optimistic or pessimistic ODBC record locking, only one user can make modifications to the same data at the same time.
However, in certain circumstances multiple users can make changes to the same table if no conflicts occur between the modifications taking place (this is where the locking types differ).
FYI - This is not really a coding question.
Each connection is isolated. You could have thousands of connections.
A web server acts on behalf of many HTTP connections, so you don't need a lot of database connections.
It is possible to share a connection across threads but it is not a good practice. You could get the response from another session.
The database manages isolation with locks. Only one person can have an update lock at a time.
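A small sketch of that model, assuming System.Data.Odbc against a SQL Server DSN (the DSN name, credentials, and Accounts table are made up): each session opens its own connection, and the engine blocks a second conflicting UPDATE until the first transaction commits and releases its update lock.

using System.Data.Odbc;

class AccountRepository
{
    // Hypothetical DSN; each web request / user session gets its own connection.
    const string ConnStr = "DSN=MySqlServerDsn;UID=app;PWD=secret;";

    public static void UpdateBalance(int accountId, decimal delta)
    {
        using (var conn = new OdbcConnection(ConnStr))
        {
            conn.Open();
            using (OdbcTransaction tx = conn.BeginTransaction())
            using (var cmd = new OdbcCommand(
                "UPDATE Accounts SET Balance = Balance + ? WHERE Id = ?", conn, tx))
            {
                cmd.Parameters.AddWithValue("@delta", delta);  // ODBC parameters are positional
                cmd.Parameters.AddWithValue("@id", accountId);
                cmd.ExecuteNonQuery();   // takes an update lock on the row
                tx.Commit();             // releases the lock; a blocked session can now proceed
            }
        }
    }
}

If two sessions call UpdateBalance for the same account at the same time, the second one simply waits on the row lock rather than corrupting the data.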

Remote and local databases in C# [closed]

I am working on a WPF application that requires access to a remote database. The problem is:
The app does not have consistent access to the internet
There are multiple instances of the application running
My thought was to get a local copy of the database, log all the interactions with the local database (or find some way to queue the interactions for later use), then have the option to sync the remote and local databases (send the local commands to the remote database, drop the local database, get the remote database).
This article on MSDN was pretty helpful, but I have some concerns. The main purpose of the queue is to store updates and inserts to the local table, but this route does not look like it stores the parameters for later use (it has them commented out).
Any suggestions or thoughts on the best way to handle this?
Thanks!
You should probably look at message queues (the Microsoft version is called MSMQ and is built into Windows; other message queues are available). They are designed for exactly this sort of scenario.
Essentially, your application writes an event to its local message queue. MSMQ will periodically attempt to send it to the remote queue (on the database server in this case), providing reliable message delivery.
On the database side you typically have a listener watching the queue and writing any events it receives to the database.
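A minimal sketch of that pattern with System.Messaging (the queue paths and the PendingChange type are hypothetical):

using System.Messaging;

public class PendingChange
{
    public string Sql { get; set; }          // e.g. parameterized command text
    public string[] Parameters { get; set; } // captured parameter values
}

public static class ChangeQueue
{
    // Remote private queue on the database server; MSMQ stores the message locally
    // and forwards it whenever the network is available.
    const string RemoteQueuePath = @"FormatName:DIRECT=OS:dbserver\private$\AppChanges";

    public static void Enqueue(PendingChange change)
    {
        using (var queue = new MessageQueue(RemoteQueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(PendingChange) });
            queue.Send(change, "pending-change");
        }
    }

    // On the database server, a listener receives each change and applies it.
    public static void DrainOnce(string localQueuePath)
    {
        using (var queue = new MessageQueue(localQueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(PendingChange) });
            Message msg = queue.Receive();   // blocks until a message arrives
            var change = (PendingChange)msg.Body;
            // ... execute change.Sql against the real database here ...
        }
    }
}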

What is the best way to store database records locally? [closed]

I have an email application. Currently I store all the emails of all email accounts in SQL Server, across 60 tables. Every 20 minutes, all emails are also written to an XML file. Storing them in XML is for offline use.
Offline: if there is no database connection, the application loads all the data from the XML file into a DataSet (60 tables). I use this DataSet to perform all offline operations, like reading mail or checking mail.
This mechanism takes a lot of memory when there are thousands of emails in the XML file. I think it takes so much memory because of the DataSet. I don't dispose of the DataSet because all the data is stored in it and it is used for the offline operations.
Can anybody suggest a better mechanism for this situation?
You can use SQL Server Compact Edition to store the mails locally. You'll be able to use the same queries to access the data.
This is a client-side SQL Server that keeps its tables in a file. You don't install a SQL Server instance on the client; you only reference the System.Data.SqlServerCe.dll library and use the SqlServerCe-specific connection objects to access the data.
Microsoft download page is here: https://www.microsoft.com/en-us/download/details.aspx?id=30709
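A rough sketch of what that looks like in code (the Mails.sdf file name, table, and columns are placeholders):

using System;
using System.Data.SqlServerCe;
using System.IO;

public static class LocalMailStore
{
    const string ConnStr = "Data Source=Mails.sdf";

    public static void EnsureCreated()
    {
        if (File.Exists("Mails.sdf")) return;

        new SqlCeEngine(ConnStr).CreateDatabase();   // creates the local .sdf file
        using (var conn = new SqlCeConnection(ConnStr))
        {
            conn.Open();
            new SqlCeCommand(
                "CREATE TABLE Mails (Id INT IDENTITY(1,1) PRIMARY KEY, Subject NVARCHAR(200), Body NTEXT)",
                conn).ExecuteNonQuery();
        }
    }

    public static void PrintSubjects()
    {
        // Queries look like regular SQL Server queries, but run against the local file,
        // so only the rows you actually read are pulled into memory.
        using (var conn = new SqlCeConnection(ConnStr))
        using (var cmd = new SqlCeCommand("SELECT Id, Subject FROM Mails", conn))
        {
            conn.Open();
            using (SqlCeDataReader reader = cmd.ExecuteReader())
                while (reader.Read())
                    Console.WriteLine(reader.GetInt32(0) + ": " + reader.GetString(1));
        }
    }
}

Unlike a 60-table DataSet, the data stays on disk and is only loaded as you query it, which addresses the memory problem.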

Local files or remote server [closed]

I've developed a desktop application (an accountant app). This application is going to import invoice data and do a lot of other things. All data will be stored in a remote database (SQL Server). The app needs some pre-loaded data to work properly, like a list containing a bunch of cities, etc. My question is: is it better to have this data (cities, states, zip codes) stored in the remote database, or is it better to use XML or CSV files and deploy them with each individual instance of the application? This data will be updated eventually, and it will be used frequently by the users.
Well, you can either use an external file or store the data in the database. I think storing it in the database is the better approach, since you won't have to distribute another extra file and all of your data will be in one place. Don't forget to seed the data every time you clear the database, though.
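A minimal sketch of that seeding step, assuming a hypothetical Cities table and a cities.csv file bundled only with the seeder (one "Name,State,ZipCode" entry per line):

using System.Data.SqlClient;
using System.IO;

public static class ReferenceDataSeeder
{
    public static void SeedCities(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Only seed when the reference table is empty.
            var count = (int)new SqlCommand("SELECT COUNT(*) FROM Cities", conn).ExecuteScalar();
            if (count > 0) return;

            foreach (string line in File.ReadLines("cities.csv"))
            {
                string[] parts = line.Split(',');
                var insert = new SqlCommand(
                    "INSERT INTO Cities (Name, State, ZipCode) VALUES (@n, @s, @z)", conn);
                insert.Parameters.AddWithValue("@n", parts[0]);
                insert.Parameters.AddWithValue("@s", parts[1]);
                insert.Parameters.AddWithValue("@z", parts[2]);
                insert.ExecuteNonQuery();
            }
        }
    }
}

Keeping the reference data in the remote database means every client sees the same list, and updates to it don't require redeploying the application.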
