I have a billing application written in C# that uses an Access database (.mdb file) for storage. I'd like the application to support multiple users sharing access to the database but this fails with an error message similar to
The database has been opened exclusively by another user, or you do not have permission to open it
What do I do to allow multiple users to access the file?
If you really have to do it (and shame on you for using Access for that), you have to realize Access is FILE BASED.
Basically, you connect to the database file (using a driver), so to allow multiple concurrent users you must have a network in place with a file share on which you put the database. You also need to make sure the database is opened in a way that allows multi-user access - I seem to remember this being a setting on the database that then forced it to create a lock file.
THAT SAID: move to SQL Server. Access really starts showing its negative sides the moment you go network/multi-user, and Microsoft has discouraged this usage for about 10 years now. I stopped doing Access multi-user 15 years ago because it made no sense.
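If you do go the file-share route, here is a minimal sketch of what that looks like from C#, assuming the .mdb sits on a UNC share and every client opens it in shared mode (the path and provider version below are placeholders - adjust to your environment):

```csharp
using System.Data.OleDb;

class SharedAccessDemo
{
    static void Main()
    {
        // UNC path to the shared .mdb -- placeholder, point it at your share.
        const string dbPath = @"\\FILESERVER\Billing\billing.mdb";

        // "Share Deny None" opens the file in full shared mode, so the
        // engine creates the .ldb lock file and coordinates concurrent users.
        string connStr =
            "Provider=Microsoft.Jet.OLEDB.4.0;" +
            $"Data Source={dbPath};" +
            "Mode=Share Deny None;";

        using (var conn = new OleDbConnection(connStr))
        {
            // Fails with the "opened exclusively" error only when another
            // user holds an exclusive lock on the file.
            conn.Open();
            // ... run commands ...
        }
    }
}
```

Note that every user needs read, write, create, and delete permission on the share folder, because the engine must be able to create and remove the .ldb/.laccdb lock file there.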
Related
Hi, I have a WPF application developed using C# .NET, framework 4.5.
I was able to implement the create, retrieve and delete operations against SQLite using EF6, and was also able to password-protect the connection to the SQLite db.
My problem is that I want to secure the db file itself. I went through many links and googled for almost 3 days and came across various options, but some are commercial and the open ones are in C - nothing for .NET.
So my next approach was to keep the file hidden from the general user, as I don't want them to delete it - if the file is deleted, all the information in it is lost.
I would like to understand the best practices out there for securing a SQLite database file (please note the database is already secured with a key).
I tried the following things:
Storing the file in the AppData folder, but it is accessible and easy to remove; someone could also take the database file and try to brute-force the password, which may not be worth much :)
Storing the file in the ProgramData folder, but if the user who installed the Windows app is not an admin, it may throw an exception.
I came across IsolatedStorage, which seems to be hidden from the general user, and also lets us define the scope of the storage. The problem is that I am not able to use it for .sqlite, as I do not know the exact path, and to initialize the SQLite connection we require an exact path.
What would be the best way to secure the file? I do not want users to read, update or delete it using other applications; it should be accessible only through my application.
(Just comment for this post)
A SQLite database can normally be read and written by other applications, not just your app. For the problem of others opening, deleting, or copying the SQLite db, I think you can lock the db file with a file lock (there are many kinds of file-locking approaches on Google). When you want to read data from your db, you unlock the file, open the connection, and when you are done you lock the db file again.
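The locking idea from the comment above can be sketched in plain .NET: hold the file open with FileShare.None while your app is idle, and release the handle only around your own SQLite connections. This is a sketch, not a complete solution - it only protects the file while your application is running:

```csharp
using System.IO;

class DbFileGuard
{
    private FileStream _lock;
    private readonly string _path;

    public DbFileGuard(string path) { _path = path; }

    // While this handle is open, other processes cannot open, delete,
    // or overwrite the file -- they get a sharing violation instead.
    public void Lock() =>
        _lock = new FileStream(_path, FileMode.Open,
                               FileAccess.ReadWrite, FileShare.None);

    // Release the handle just before opening your own SQLite connection,
    // and call Lock() again after you close it.
    public void Unlock()
    {
        _lock?.Dispose();
        _lock = null;
    }
}
```

Be aware that nothing stops a user from deleting the file while your application is not running, so this complements (rather than replaces) the password on the database.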
I built a software for a farm using C#; the program was meant to keep track of the inventory and the financial exchanges related to the farm's work.
The software was built to be installed on the manager's computer, who then entered the farm's data and retrieved reports and so on, while the accountant used the same PC to use the financial part of the program with a different account.
Now the farm's business has grown and more users need to use the system. Can I move the database with the old data to a server so users can log in to the system from different PCs at the same time and continue the old tasks?
If I can - what do I change in my code?
P.S. the database was done in MS Access.
Not a lot of information to go on here. I can tell you that Access is a file-based database system, so while you could put the database files on a server or a NAS device and serve multiple users in principle, you should expect to run into the usual problems of Windows file sharing - VERY SLOW performance, as a minimum.
It is also possible that the database was limited to a single user at a time. Without more information it is impossible to know whether the developers allowed for multi-user at all, or whether, if several people open the file at once, one person may overwrite another's data, leading to corruption.
The short answer is that if the original developers are no longer around and you cannot ask the question of them then you probably need a new dedicated application to do the work which would mean either a complete rewrite or an alternative commercial application.
For multi-user Microsoft SQL Server, MySql, or even Firebird or another dedicated database back end would be the way to go. The front end could be anything - Winforms, WPF, even a web application if that is what you want, but it would have to be written.
I hope that this is helpful.
We have an MS Access database (accdb) out on our network, that multiple users will edit & read by means of a .NET application. I am aware that a server db such as SQL Server would be the best choice for this situation, but currently that's out of my hands.
The .Net application will use both ADO.Net (ie OleDBConnections) and the tools inside the Microsoft.Office.Interop.Access.Dao namespace to connect to the database.
I have read a few articles and posts about multiple users connecting to Access, and there seems to be some confusion and differing opinions about Access's capabilities in this regard. How/can I achieve the following in my application:
Establish connections to write to the database, that will lock the entire database (all records and tables) until the connection is ended. If other users attempting to write simultaneously are halted by an exception, that is okay.
Establish connections designated as read-only, that have no conflicts with any other user's actions.
Thanks.
To open an ACCDB in exclusive mode you need to add this key/value to your connection string:
Mode=Share Exclusive;
This will block other users from connecting to the same database until you close and dispose the connection that opened the database this way.
If I remember correctly, the possible values (pretty self-explanatory) for the Mode keyword in the JET connection string are:
Mode='Read';
Mode='Write';
Mode='ReadWrite';
Mode='Share Deny None';
Mode='Share Deny Read';
Mode='Share Deny Write';
Mode='Share Exclusive';
I have tried various combinations of the flags above, but I can't find a simple solution that allows a single connection to be opened in ReadWrite while subsequent connections fall back to read-only automatically. Probably in your scenario the best path is to have a local connection (not a global one), try to open it in Share Exclusive when you need to write to the database, and catch the exception if you cannot open the database, giving the user a Retry option. Not an easy path, I know. Let's see if a user with better knowledge of MS Access can give a more elaborate solution.
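The try-exclusive-then-fall-back path can be sketched like this, assuming the ACE OLE DB provider (the provider version and catch granularity are assumptions; you may want to inspect the exception's error code before deciding to fall back):

```csharp
using System.Data.OleDb;

static class AccessOpener
{
    // Try to open the database exclusively for writing; on failure,
    // fall back to a read-only connection and report which one we got.
    public static OleDbConnection Open(string dbPath, out bool readOnly)
    {
        string baseStr = "Provider=Microsoft.ACE.OLEDB.12.0;" +
                         $"Data Source={dbPath};";
        try
        {
            var conn = new OleDbConnection(baseStr + "Mode=Share Exclusive;");
            conn.Open();                 // throws if anyone else is connected
            readOnly = false;
            return conn;
        }
        catch (OleDbException)
        {
            // Someone else holds the database; open read-only instead
            // (or surface a Retry prompt to the user here).
            var conn = new OleDbConnection(baseStr + "Mode=Read;");
            conn.Open();
            readOnly = true;
            return conn;
        }
    }
}
```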
I agree with the comment above (and your own assessment) that this is not the best database tool to use in situations of concurrency; however, things are a lot better now.
I find this answer well written, comprehensive, and with a balanced view of the strengths and weaknesses of Access regarding concurrency issues.
I'm trying to understand how making a query to a .mdb file works.
Suppose the file is located on a share drive, PC2, I open it programmatically from PC1.
When I make a connection to a .mdb file, I assume no "instance" of MS Access is started on the PC2 (since it's a simple file server). Is this correct?
When I make a SQL query, does it have to copy the table locally, and run the query then return my results and toss away the table and any excess data?
What happens if I "order by" on a query? is the entire query returned, then locally ordered, or somehow ordered remotely?
I'm sure I have other questions, but I'm trying to understand how connecting to an MDB file works from a remote location. (we have a decent amount of latency where I am located, so a particular query can take 9 seconds, which in my case is unacceptable, I'm trying to understand how this is working and if it can be improved).
I'm running with c# in this case, I don't expect that should make much difference, but may in your response.
When I make a connection to a .mdb file, I assume no "instance" of MS Access is started on the [remote machine] (since it's a simple file server). Is this correct?
Yes. The application will be interacting with a copy of the Access Database Engine on the local machine, which in turn retrieves the information from the database file on the remote machine.
When I make a SQL query, does it have to copy the table locally, and run the query then return my results and toss away the table and any excess data?
Not necessarily. Depending on the indexing scheme of the table(s) involved, the Access Database Engine may only need to retrieve the relevant indexes and then determine the specific pages in the data file that contain the records to be retrieved. In some cases it may need to retrieve the entire table (e.g., when a full table scan is required), but that is not always the case.
What happens if I "order by" on a query? is the entire query returned, then locally ordered, or somehow ordered remotely?
The Access documentation says that indexes will speed up sort operations (ref: here), suggesting that the Access Database Engine can retrieve the required rows from the remote file in sorted order.
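So if your slow 9-second query sorts or filters on an unindexed column, adding an index is usually the cheapest fix. A sketch of doing that from C# (the table and column names here are made up for illustration):

```csharp
using System.Data.OleDb;

class IndexDemo
{
    static void Main()
    {
        // Placeholder path to the remote .mdb on the file share.
        string connStr = @"Provider=Microsoft.Jet.OLEDB.4.0;" +
                         @"Data Source=\\PC2\share\data.mdb;";
        using (var conn = new OleDbConnection(connStr))
        {
            conn.Open();
            // Hypothetical table/column: index the column you ORDER BY,
            // so Jet can fetch rows via the (much smaller) index pages
            // instead of pulling and scanning the whole table.
            var cmd = new OleDbCommand(
                "CREATE INDEX idx_orders_date ON Orders (OrderDate)", conn);
            cmd.ExecuteNonQuery();
        }
    }
}
```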
Your instincts are correct, mdb/mde dbs are just glorified text files which must be processed locally. Here are some tips on network performance: http://www.granite.ab.ca/access/performancefaq.htm
But since SQL Server Express is free, there is almost no excuse for not migrating, especially since Access has a tool to manage that for you. In a low volume multi-user environment (2-10 for example), MS Access can work ok, but for enterprise solutions where a higher volume of users and/or transactions is in any way possible, you are dicing with disaster.
To add to Gord's answer...
Access databases are accessed through Windows file page locks. My understanding was that Microsoft added this page locking specifically for use by MS Access (but is also available for any file through the Windows API).
Because the instance is local and collisions and conflicts are handled through file page locks, client-server contention is an issue, and Access has known problems here. It's why one should switch to SQL Server Express (also free) whenever possible. But, yes, MS Access has a certain level of convenience; SSE has a bigger footprint and a far less friendly GUI.
All desktop databases have client/server issues. Gord's answer matches my knowledge. The point of indices is to reduce the amount of table data that needs to be pulled locally. Pulling the index is a relatively small task in comparison to the table data. This is standard index optimisation, although I would say it is even more important for desktop databases due to the remote data and, ugh, file paging.
In general the Access (JET) engine does NOTHING remotely. It's all file data grabs, executed locally in the local MSA/Jet engine. You know this because the engine is installed locally and doesn't have to be installed on the file host. It is, however, a convenient quick-and-dirty way of dispersing processing loads. :)
I have a client who has a product-based website with hundreds of static product pages that are generated by Microsoft Access reports and pushed up to the ISP via FTP (it is an old design). We are thinking about getting a little more sophisticated and creating a data-driven website, probably using ASP.NET MVC.
Here's my question. Since this is a very small business (a handful of employees), I'd like to avoid enterprise patterns like web services if I can. How does one push updated product information to the website, batch-style? In a SQL Server environment, you can't just push up a new copy of the database, can you?
Clarification: The client already has a system at his facility where he keeps all of his product information and specifications. I would like to refresh the database at the ISP with this information.
You don't mention what exactly the data source is, but the implication is that it's not already in SQL Server. If that's the case, have a look at SSIS.
If the source data is in SQL Server, then I think you'd want to be looking at either transactional replication or log shipping to sync the two databases.
If you are modernizing, and it is a handful of employees, why would you push the product info out batch style?
I don't know exactly what you mean by "data driven", but why not allow the ASP.NET app to query the SQL Server product catalog database directly? Why generate static pages at all?
UPDATE: ok, I see, the real question is, how to update the SQL database running at the ISP.
You create an admin panel so the client can edit the data directly on the server. It is perfectly reasonable to have the client keep all their records on the server as long as the server is backed up nightly. Many cloud and virtual services offer easy ways to do replicated backups.
The additional benefit of this model is that more than one user can be adding or updating records at a time, making the workforce a lot more scalable. Likewise, the users can log in from anywhere they have a web browser to add new records, fix mistakes made in old records, etc.
EDIT: This approach assumes you can convince the client to abandon their current data entry system in favor of a centralized web-based management panel. Even if this isn't the case, the SQL database can be hosted on the server and the client's application could be made to talk to that so you're only ever using one database. From the sounds of it, it's a set of Access forms and macros which you should have source access to.
Assuming that there is no way to sync the data directly between your legacy system DB (is it in Access, or is Access just running the reports) and the SQL Server DB on the website (I'm not aware of any):
The problem with "pushing" the data directly into the SQL server will be that "old" (already in the DB) records won't be updated, but instead removed and then recreated. This is a big problem with foreign keys. Plus, I really don't like the idea of giving the client any access to the db at all.
So considering that, I find that the best is to write a relatively simple page that takes an uploaded file and updates the database. The file will likely be CSV, possibly XML. After a few iterations of writing these pages over the years, here's what I've come up with:
1. Show file upload box.
2. On next page load, save the file to a temp location.
3. Loop through each line (element in XML) and validate all the data. Foreign keys, especially, but also business validations. You can also validate that the header row exists, etc. Don't update the database.
3a. If invalid data exists, save an error message to an array.
4. At the end of the looping, show the view.
4a. If there were errors, show the list of error messages and tell them to re-upload the file.
4b. If there were no errors, create a link that has the file location from #2 and a confirmation flag.
5. After the file location and confirm flag have been submitted, run the loop from #3 again, but with an if (confirmed) {} statement that actually makes the updates to the db.
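The validate-then-commit loop above can be sketched as a single method that runs twice, once with confirmed = false and once with confirmed = true. The CSV layout (Sku,Name,Price) and the validation rules here are made-up examples; the real database write is abstracted behind a delegate:

```csharp
using System;
using System.Collections.Generic;

static class ProductImport
{
    // Pass 1 (confirmed == false): validate only, collect error messages.
    // Pass 2 (confirmed == true): same loop, but actually write rows.
    public static List<string> Process(string[] lines, bool confirmed,
                                       Action<string[]> saveRow)
    {
        var errors = new List<string>();
        for (int i = 1; i < lines.Length; i++)   // skip the header row
        {
            var fields = lines[i].Split(',');
            if (fields.Length != 3)
            {
                errors.Add($"Line {i + 1}: expected 3 fields");
                continue;
            }
            if (!decimal.TryParse(fields[2], out _))
                errors.Add($"Line {i + 1}: bad price '{fields[2]}'");
            // ... foreign key and business validations would go here ...
        }

        // Only touch the database on the confirmed pass, and only
        // if the whole file validated cleanly.
        if (confirmed && errors.Count == 0)
            for (int i = 1; i < lines.Length; i++)
                saveRow(lines[i].Split(','));    // e.g. INSERT/UPDATE here
        return errors;
    }
}
```

Keeping both passes in one method guarantees the commit pass applies exactly the same rules the preview pass showed the user.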
EDIT: I saw your other post. One of the assumptions I made is that the databases won't be the same; i.e., the legacy app will have a table or two (maybe just products), but the new app will have orders, products, categories, etc. This will complicate "just uploading the file".
Why do you need to push anything?
You just need to create a product management portion of the webpage and a secondly a public facing portion of the webpage. Both portions would touch the same SqlServer database.
.NET has the ability to monitor a database and check for updates; then you can run a query to push the data elsewhere.
Or use SQL to push the data with a trigger on the table(s) in question.
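The "monitor a database" part maps to SqlDependency (query notifications), which requires Service Broker to be enabled on the database. A sketch, with a placeholder connection string and a hypothetical dbo.Products table:

```csharp
using System;
using System.Data.SqlClient;

class ChangeMonitor
{
    // Placeholder connection string -- adjust to your server.
    const string ConnStr =
        "Server=.;Database=Catalog;Integrated Security=true;";

    static void Main()
    {
        SqlDependency.Start(ConnStr);          // requires Service Broker
        Subscribe();
        Console.ReadLine();                    // keep the process alive
        SqlDependency.Stop(ConnStr);
    }

    static void Subscribe()
    {
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();
            // Query notifications require two-part table names and an
            // explicit column list (no SELECT *).
            var cmd = new SqlCommand(
                "SELECT ProductId, Name FROM dbo.Products", conn);
            var dep = new SqlDependency(cmd);
            dep.OnChange += (s, e) =>
            {
                // Subscriptions fire once; re-subscribe, then push the
                // changed data wherever it needs to go.
                Subscribe();
            };
            cmd.ExecuteReader().Dispose();     // executing registers the subscription
        }
    }
}
```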
Is this what you were looking for?
You can try Dynamic Data Web Application.
You should have a service that regularly updates the data in the target DB. It will probably run on your source data machine (where the Access DB is).
The service can use SSIS or ADO.NET to write the data. You can do this over the web, assuming you have TCP/IP access to the server.
Please check when the updates are done and how long they take. If you can do the updates during the night you are fine. If not, you should check whether you can still access the web during the import; that is sometimes not the case.
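For the ADO.NET route, one plausible shape is to stream rows from the local Access file into a staging table on the remote SQL Server with SqlBulkCopy. Everything here - connection strings, table names - is a placeholder:

```csharp
using System.Data.OleDb;
using System.Data.SqlClient;

class NightlyPush
{
    static void Main()
    {
        // Placeholder connection strings: local Access source, remote SQL target.
        string src = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                     @"Data Source=C:\data\products.accdb;";
        string dst = "Server=isp.example.com;Database=Web;" +
                     "User Id=app;Password=...;";

        using (var source = new OleDbConnection(src))
        using (var target = new SqlConnection(dst))
        {
            source.Open();
            target.Open();
            var reader = new OleDbCommand("SELECT * FROM Products", source)
                             .ExecuteReader();
            using (var bulk = new SqlBulkCopy(target))
            {
                // Load into a staging table, then MERGE server-side so
                // existing keys are updated rather than dropped/recreated.
                bulk.DestinationTableName = "dbo.ProductsStaging";
                bulk.WriteToServer(reader);   // streams rows over TCP/IP
            }
        }
    }
}
```

Loading into a staging table and merging server-side sidesteps the delete-and-recreate foreign key problem mentioned in the earlier answer.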
Use wget (or any HTTP client that can POST a file) to push the new data file to the MVC app; once the data is received by the action, the MVC app invokes the processing/importing of the data (maybe in a worker process if you don't want long requests).