Scalable MS Access ASP.NET app - C#

I am constrained by the following, no way around it:
Read-Only Data: Microsoft Access
Jet 4.0 OLE DB
ASP.NET 2.0
Shared Host, very little control.
O/R Mapper: LLBLGen Pro
The app is a read-only tool that reads a large number of Microsoft Access databases in the App_Data folder. It works fine most of the time.
Under load, it starts failing when accessing the Access MDBs.
What is the best strategy for accessing the Access MDBs that limits these errors? Right now I try, Thread.Sleep(500) on an error, then try again.
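For reference, a slightly more robust version of that retry, as a minimal sketch (the query, retry count, and delays are illustrative assumptions, not from the app), backs off exponentially instead of sleeping a fixed 500 ms:

    using System.Data;
    using System.Data.OleDb;
    using System.Threading;

    // Retry with exponential backoff; illustrative only.
    static DataTable QueryWithRetry(string mdbPath, string sql)
    {
        string connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + mdbPath + ";";
        for (int attempt = 0; ; attempt++)
        {
            try
            {
                using (OleDbConnection conn = new OleDbConnection(connStr))
                using (OleDbDataAdapter adapter = new OleDbDataAdapter(sql, conn))
                {
                    DataTable table = new DataTable();
                    adapter.Fill(table);    // Fill opens and closes the connection itself
                    return table;
                }
            }
            catch (OleDbException)
            {
                if (attempt >= 4) throw;                // give up after five tries
                Thread.Sleep(250 * (1 << attempt));     // 250, 500, 1000, 2000 ms
            }
        }
    }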

There may be ways to modify the isolation/concurrency/locking options when accessing the Access databases to eliminate the overhead of managing locks. Perhaps try "Mode=Share Deny None;" in the connection string. I would not use this if you are modifying data in any way at any time, though, as it pretty much throws out all the isolation/concurrency management you get with a database. Use at your own risk.
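For example, a Jet connection string with that mode set might look like this (the file name is a placeholder; |DataDirectory| resolves to App_Data in ASP.NET):

    // Read-only shared access; denies no one. File name is illustrative.
    string connStr =
        "Provider=Microsoft.Jet.OLEDB.4.0;" +
        "Data Source=|DataDirectory|\\MyData.mdb;" +
        "Mode=Share Deny None;";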

How frequently does the data change? If it's read-only, can you load the data from the databases into a cache and read it from there instead of directly from the databases?
What kind of errors are you getting, specifically? I assume they are connection errors?
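On the caching idea above, a minimal sketch using the ASP.NET cache, assuming the data fits in memory (the key, table, and one-hour expiry are made up, and QueryWithRetry is a loader like the retry sketch in the question):

    // Load once, then serve every request from memory.
    DataTable products = HttpRuntime.Cache["Products"] as DataTable;
    if (products == null)
    {
        products = QueryWithRetry(
            HttpContext.Current.Server.MapPath("~/App_Data/MyData.mdb"),
            "SELECT * FROM Products");
        // Re-read from the MDB at most once per hour.
        HttpRuntime.Cache.Insert("Products", products, null,
            DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);
    }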

This is a horrible solution, but if you truly are "lost on a desert island with only these tools" and the Access databases are completely read-only, then create multiple copies of each of them and allow only a certain number of connections into any one of them at a time. For example, if you have two Access databases, MdbAA and MdbBB, then create copies like:
MdbAA01
MdbAA02
MdbAA03
MdbBB01
MdbBB02
MdbBB03
Then when a request comes in for MdbAA, see how many requests are currently accessing MdbAA01; if that is over the threshold, try MdbAA02, and so on (see the sketch below). Do the same for any requests to the MdbBB file.
Like I said, this is a very bad solution, but if you truly have no choice then it might work for you. Realistically, though, it sounds like the app has outgrown Access (and the shared host), so it is time to upgrade the architecture.
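A rough sketch of that copy-selection logic, assuming three copies per file and a fixed threshold (all names and numbers are made up, and only one database's counters are shown for brevity):

    using System.Threading;

    static readonly int[] activeCounts = new int[3];   // one counter per copy of MdbAA
    const int Threshold = 5;

    // Returns the file to use; the caller must Interlocked.Decrement the slot when done.
    static string PickCopy(out int slot)
    {
        for (int i = 0; i < activeCounts.Length; i++)
        {
            if (Thread.VolatileRead(ref activeCounts[i]) < Threshold)
            {
                Interlocked.Increment(ref activeCounts[i]);
                slot = i;
                return string.Format("MdbAA{0:00}.mdb", i + 1);   // MdbAA01.mdb, ...
            }
        }
        slot = 0;
        Interlocked.Increment(ref activeCounts[0]);
        return "MdbAA01.mdb";   // all copies busy: fall back to the first
    }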

Spend a bit of money and get some SQL storage. How much time have you spent fitting a crutch to a broken-sounding system?
If the project is worth doing, then it's worth investing some cold hard cash.
If a business is trying to force you down this route, explain why the option you have is not viable. If you suggested the option in the first place, grow a pair and explain why you were wrong, but also how you can fix it.
Sorry if that comes off as flippant.

Related

Is there absolutely no protection for Windows applications on dotnet?

I am in the middle of developing an application in WinForms, and there just doesn't seem to be any protection against decompilation of the executables and other generated assemblies; there are tools that decompile them literally within seconds. There are obfuscators available; however, I am specifically looking for a free one.
I store some pretty sensitive strings within the application, and though I try my best to keep them encrypted or in the local SQLite database, there is always that one single point of failure that leaves the entire application vulnerable. I checked out a couple of obfuscators, both open-source and commercial offerings. The open-source one seems to be broken, and the commercial ones are priced quite steeply, which is unaffordable for an indie developer like me.
I am aware that an executable has to run in memory, and this in itself makes it vulnerable. And a determined attacker can eventually decompile an application. However, I want to make this process as expensive as possible. At the very least I would want to protect the strings within my application.
My question is, is there just no way to protect an application assembly from getting reverse engineered if I decide not to use any of the expensive options available?
There's lots of protection in Windows. But it's all there to protect your users from you, not the other way around.
The simple fact of decompilation is: "As long as a computer can still execute it, it can also still decompile it."
Execution is a process that translates binary into actions.
Decompilation is a process that translates binary into code.
If one is blocked, the other is blocked as well.
Obfuscation can make it harder to read the decompiled code. But that is about all it can do.
I store some pretty sensitive strings within the application, and though I try my best to keep them encrypted or in the local sqlite database, there is always that one, single point of failure that leaves the entire application vulnerable
Every string is only as safe as the place you keep it. The same applies to encryption keys. There are two limited workarounds:
If it is about comparing user input to something in the backend, like you do with passwords, password-style security can work. Modern password security means not even the administrator can recover the password, yet you can still compare user input against it (see the sketch after this list).
You could move the strings into a separate application. Instead of giving the application the SQL Server connection string, you give it access to a web service that you control. Only the web service actually knows how to contact the database.
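A minimal sketch of the first workaround, hashing with PBKDF2 via Rfc2898DeriveBytes (the iteration count and sizes are illustrative choices, not recommendations):

    using System.Security.Cryptography;

    // Store only salt + hash; verify by re-deriving from the input.
    static byte[] HashPassword(string password, byte[] salt)
    {
        using (var kdf = new Rfc2898DeriveBytes(password, salt, 10000))
            return kdf.GetBytes(32);    // 256-bit derived key
    }

    static bool Verify(string input, byte[] salt, byte[] storedHash)
    {
        byte[] candidate = HashPassword(input, salt);
        int diff = candidate.Length ^ storedHash.Length;
        for (int i = 0; i < candidate.Length && i < storedHash.Length; i++)
            diff |= candidate[i] ^ storedHash[i];
        return diff == 0;    // compares without an early exit
    }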
It depends on your practical scenario what you can do. But in the end, if a string or any other piece of information is to be usable, it is vulnerable in memory.

Best way to handle large amount of permanent data

I'm developing a PC app in Visual Studio where I'm showing the status of hundreds of sensors that are connected via WiFi. The thing is that I need to hold on to the sensor data even after I close the app, so I'm considering some form of permanent storage. These are the options I've considered:
1) My Sensor object is relatively compact with only a few properties. I could serialize all the objects before closing the app and load them every time the app starts anew.
2) I could throw all the properties (which are mostly strings and doubles) into a simple text file and create a custom protocol for storage and retrieval.
3) I could integrate a database with my app. Someone told me this is the best way to go about it, but I'm a bit hesitant seeing as I'm not familiar with DBs.
Which method would yield the best results in terms of resource usage and speed? Or is there some other, better way to go about this?
The first thing you need to do is understand your problem. For example, when the program is running, do you need to have everything in memory at the same time, or do you work with your sensors one at a time?
What is a "large amount of data"? For example, to me that will never be less than a million records (or a billion in some cases).
Once you know that, you shouldn't be scared of using something just because you are not familiar with it. Otherwise you are not looking for the best solution to your problem, you are just hacking around it in a way that feels comfortable.
That being said, you have several ways of doing this. Like you said, you can serialize the data, store it as JSON, and a few other alternatives, but if we are talking about a "large amount of data that we want to persist", I would always call for the use of a database (the name says a lot). If you don't need to have everything in memory at the same time, then I believe this is your best option.
I personally don't like them (again, a personal choice), but one way of keeping your objects without learning much SQL is to use an ORM like NHibernate (you will also need to learn how to use it so you don't make things slower).
If you need to have everything loaded at the same time (most often that is not the case, so be sure of this), you need to know what you want to keep and serialize it. If you want that data to be readable by another tool or organized in a given way, consider a data format like XML or JSON (a sketch follows).
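A minimal sketch of the serialization route using XmlSerializer, assuming a simple Sensor class (the class and file name are made up):

    using System.Collections.Generic;
    using System.IO;
    using System.Xml.Serialization;

    public class Sensor
    {
        public string Name { get; set; }
        public double Value { get; set; }
    }

    // Save on exit, load on startup.
    static void Save(List<Sensor> sensors, string path)
    {
        var xs = new XmlSerializer(typeof(List<Sensor>));
        using (var fs = File.Create(path)) xs.Serialize(fs, sensors);
    }

    static List<Sensor> Load(string path)
    {
        if (!File.Exists(path)) return new List<Sensor>();
        var xs = new XmlSerializer(typeof(List<Sensor>));
        using (var fs = File.OpenRead(path)) return (List<Sensor>)xs.Deserialize(fs);
    }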
Also, you can use a memory-mapped file.
The file is permanent and keeps the data between program runs.
So you just keep your data structures in the mapped area, and that's it.
The MSDN documentation is here:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa366556%28v=vs.85%29.aspx
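In .NET 4 and later this is wrapped by System.IO.MemoryMappedFiles; a minimal sketch (the file name, size, and struct are made up):

    using System.IO;
    using System.IO.MemoryMappedFiles;

    struct SensorRecord { public int Id; public double Value; }

    // Persist a fixed-size record directly in a file-backed mapping.
    static void WriteRecord(SensorRecord rec)
    {
        using (var mmf = MemoryMappedFile.CreateFromFile(
                   "sensors.dat", FileMode.OpenOrCreate, null, 4096))
        using (var accessor = mmf.CreateViewAccessor())
            accessor.Write(0, ref rec);   // flushed to the file on dispose
    }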
Since you need to load all the data once at the start of the program, the database case seems doubtful. A database is necessary when you need to load small bits of data many times.
So the first two options seem preferable. I would advise hiding the specific solution behind an interface (see the sketch below); then you can change it later.
Standard .NET serialization of the sensor array is probably simpler, and it will be easier to extend.
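The interface idea, as a minimal sketch (names are made up), so swapping serialization for a database later touches one line:

    using System.Collections.Generic;

    // The rest of the app only ever talks to this.
    public interface ISensorStore
    {
        List<Sensor> Load();
        void Save(List<Sensor> sensors);
    }

    public class XmlSensorStore : ISensorStore
    {
        private readonly string path;
        public XmlSensorStore(string path) { this.path = path; }
        public List<Sensor> Load() { /* XmlSerializer, as sketched earlier */ return new List<Sensor>(); }
        public void Save(List<Sensor> sensors) { /* XmlSerializer, as sketched earlier */ }
    }

    // Later, if it outgrows files: public class SqliteSensorStore : ISensorStore { ... }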

Looking for the most painless non-RDBMS storage method in C#

I'm writing a simple program that will run entirely client-side. (Desktop programming? Do people still do that?) I need a simple way to store trivial amounts of data in a structured form, but I really don't see any need to use a database system. What's more, some of the data needs to be serialized and passed around to different users, like some kind of "file" or perhaps a "document". (Has anyone ever done that before?)
So, I've looked at using .NET DataSets, LINQ, and direct XML manipulation, and they all seem like they would get the job done, but I would like to know, before I dive into any of them, if there's one method that is generally regarded as easier to code than the others. As I said, the amount of data to be stored is trivial; even if one hundred people all used the same machine, we're not talking about more than 10 MB, so performance is not as large a concern as codeability/maintainability. Thank you all in advance!
Sounds like LINQ to XML is a good option for this; a quick sketch follows the links below.
Link 1
Link 2
Tons of info out there on this.
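A minimal sketch of the LINQ to XML route (element names and the file name are made up):

    using System.Linq;
    using System.Xml.Linq;

    // Write a small structured document...
    var doc = new XDocument(
        new XElement("People",
            new XElement("Person",
                new XAttribute("Name", "Ada"),
                new XAttribute("Age", 36))));
    doc.Save("people.xml");

    // ...and query it back.
    var names = XDocument.Load("people.xml")
        .Descendants("Person")
        .Select(p => (string)p.Attribute("Name"));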
Without knowing anything else about your app, .NET DataSets would likely be your easiest option, because WriteXml and ReadXml already exist.
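For example (table, column, and file names are made up):

    using System.Data;

    var ds = new DataSet("AppData");
    DataTable people = ds.Tables.Add("People");
    people.Columns.Add("Name", typeof(string));
    people.Rows.Add("Ada");

    ds.WriteXml("appdata.xml", XmlWriteMode.WriteSchema);   // the schema keeps the types

    var loaded = new DataSet();
    loaded.ReadXml("appdata.xml");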
Any serialization API should do fine here. I would recommend something that is contract-based (not BinaryFormatter, which is type-based), as that will keep it usable over time (as your assembly changes).
So I would build a basic object model (DTO) and use any of:
XmlSerializer
DataContractSerializer
protobuf-net (you all knew it was coming...)
OO, simple, and easy. And easy to use for passing fragments of the data around (either between users or to a central server).
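A minimal sketch of the DTO idea with DataContractSerializer (the DTO and file name are made up; protobuf-net uses a very similar contract style):

    using System.IO;
    using System.Runtime.Serialization;

    [DataContract]
    public class PersonDto
    {
        [DataMember(Order = 1)] public string Name { get; set; }
        [DataMember(Order = 2)] public int Age { get; set; }
    }

    static void Save(PersonDto dto, string path)
    {
        var dcs = new DataContractSerializer(typeof(PersonDto));
        using (var fs = File.Create(path))
            dcs.WriteObject(fs, dto);
    }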
I would choose an embedded database. Using something like SQLite doesn't seem like overkill to me. You could even try its C# port (http://code.google.com/p/csharp-sqlite/).
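If you go that way, a minimal sketch using the System.Data.SQLite ADO.NET provider (an assumption on my part; the linked C# port exposes a similar API, and all names here are made up):

    using System.Data.SQLite;

    using (var conn = new SQLiteConnection("Data Source=app.db;Version=3;"))
    {
        conn.Open();
        using (var cmd = new SQLiteCommand(
            "CREATE TABLE IF NOT EXISTS People (Name TEXT, Age INT)", conn))
        {
            cmd.ExecuteNonQuery();
        }
    }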

Should I store localization content in the application state

I am developing my first multilingual C# site, and everything is going OK except for one crucial aspect. I'm not 100% sure what the best option is for storing strings (typically single words) that will be translated in code from my code-behind pages.
On the front end of the site I am going to use ASP.NET resource files for the wording on the pages. That part is fine. However, this site makes XML calls, and the XML responses are only ever in English. I have been given an Excel sheet with all the words that will be returned by the XML, broken down into the different languages, but I'm not sure how best to store/access this information. There are roughly 80 words x 7 languages.
I am thinking about creating a dictionary object for each language in my global.asax file at application start and keeping them in memory. The plus side of doing this is that each dictionary object only has to be created once (until IIS restarts) and can be accessed by any user without needing to be rebuilt; the downside is that I have 7 dictionary objects constantly stored in memory. The server is Windows 2008 64-bit with 4 GB of RAM, so should I even be concerned about the memory taken up by this method?
What do you guys think would be the best way to store/retrieve different language words that would be used by all users?
Thanks for your input.
Rich
From what you say, you are looking at 560 words which need to differ based on locale. This is a drop in the ocean. The resource file method you have contemplated is fit for purpose, and I would recommend using it. Resource files integrate with controls, so you will get the most out of them.
If it did trouble you, you could put the data in a sliding cache (with a sliding expiration of 20 minutes, for example), but I do not see anything wrong with your choice of solution.
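A minimal sketch of that sliding cache (the key, language code, and loader are made up):

    // Cache the per-language dictionary; it expires after 20 idle minutes.
    var words = HttpRuntime.Cache["words-fr"] as Dictionary<string, string>;
    if (words == null)
    {
        words = LoadWordsForLanguage("fr");   // hypothetical loader for the Excel data
        HttpRuntime.Cache.Insert("words-fr", words, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(20));
    }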
OMO
Cheers,
Andrew
P.S. Have a read through this to see how you can find and bind values in different resource files to controls and literals, and use them programmatically:
http://msdn.microsoft.com/en-us/magazine/cc163566.aspx
As long as you are aware of the impact of doing so, then yes, storing this data in memory would be fine (as long as you have enough of it). Once you know what is appropriate for the current user, tossing it into memory is fine.
You might look at something like memcached Win32 or Velocity, though, to offload the storage to another app server. Use this even in your local application for the time being; that way, when it is time to push this to another server or grow your app, you have a clear separation of concerns defined at your caching layer.
Keep in mind that the more languages you support, the more stuff you are storing in memory; keep an eye on the amount of data held on your lone app server, as this could become overwhelming in time. Also, make sure the keys you are using are specific to the language, otherwise you might find that you are storing a menu in German for an English user (see the sketch below).
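A minimal sketch of the language-specific-keys point (the key format and loader are made up):

    // Key by language so a German menu never reaches an English user.
    static Dictionary<string, string> GetWords(string languageCode)
    {
        string key = "words-" + languageCode;   // e.g. "words-de"
        var words = HttpRuntime.Cache[key] as Dictionary<string, string>;
        if (words == null)
        {
            words = LoadWordsForLanguage(languageCode);   // hypothetical loader
            HttpRuntime.Cache.Insert(key, words);
        }
        return words;
    }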

Hold global data for an ASP.net webpage

I am currently working on a large-scale website, that is very dynamic, and so needs to store a large volume of information in memory on a near-permanent basis (things like configuration settings for the checkout, or the tree used to implement the menu structure).
This information is not session-specific, it is consistent for every thread using the website.
What is the best way to hold this data globally within ASP.NET, so it can be accessed when needed instead of being re-loaded on each use?
Any AppSettings in web.config are automatically cached (i.e., they aren't read from the XML every time you need to use them).
You could also manually manipulate the cache yourself (a short sketch follows the links below).
Edit: Better links...
Add items to the cache
Retrieve items from the cache
Caching Application Data
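A minimal sketch of both points (the setting name, cache key, and BuildMenuTree are made up):

    // AppSettings: read as often as you like; ASP.NET caches the parsed file.
    string checkoutMode =
        System.Configuration.ConfigurationManager.AppSettings["CheckoutMode"];

    // Manual cache: build once, rebuild only if evicted.
    object menu = HttpRuntime.Cache["MenuTree"];
    if (menu == null)
    {
        menu = BuildMenuTree();   // hypothetical builder for the menu structure
        HttpRuntime.Cache.Insert("MenuTree", menu);
    }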
It's not precisely clear whether your information is session-specific or not... if it is, then use the ASP.NET Session object. Given your description of the scale, you probably want to look at storing the state in SQL Server:
http://support.microsoft.com/kb/317604
That's the 101 approach. If you're looking for something a little beefier, then check out memcached (that's pronounced Mem-Cache-Dee):
http://www.danga.com/memcached/
That's the system that apps like Facebook and Twitter use.
Good luck!
Using the ASP.NET caching feature is a good option, I think. In addition to John's answer, you can use the Microsoft Patterns & Practices team's Caching Application Block.
This is a good video exploring the different ways to can retain application state.
http://www.asp.net/learn/3.5-videos/video-11.aspx
It touches on the Application object, which is global to the whole application and all users, and shows you how to create a hit counter (obviously, instead of storing an integer you could store objects). If you need to make changes, you do need to use a lock for concurrency, as the sketch below shows; I'm not sure how it handles LARGE amounts of data, because I've never had to keep that much there.
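A minimal sketch of that hit counter with the lock (the key name is made up):

    // HttpApplicationState needs Lock/UnLock around a read-modify-write.
    Application.Lock();
    try
    {
        int hits = (Application["HitCount"] as int?) ?? 0;
        Application["HitCount"] = hits + 1;
    }
    finally
    {
        Application.UnLock();
    }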
I usually keep things like that in the Application object.
If the pages are dependent upon one another and they post to one another, you could use the page's request object. Probably not the answer you're looking for, but definitely one of the lightest on memory.
I have run into the same situation in the past and found an interface to be the most scalable solution. The Application cache may be the answer today, but will it scale to meet your needs?
If you need to scale up, you may find cookies or some type of temporary database storage to do the trick. Simply add a new method to your interface, and have the implementation chosen by a "mode" setting in web.config (a sketch follows).
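A minimal sketch of choosing the "mode" from web.config (the setting name and implementations are made up):

    // web.config: <appSettings><add key="GlobalStoreMode" value="Cache"/></appSettings>
    using System.Web;

    public interface IGlobalStore
    {
        object Get(string key);
        void Set(string key, object value);
    }

    public class CacheStore : IGlobalStore
    {
        public object Get(string key) { return HttpRuntime.Cache[key]; }
        public void Set(string key, object value) { HttpRuntime.Cache.Insert(key, value); }
    }

    public static class GlobalStoreFactory
    {
        public static IGlobalStore Create()
        {
            string mode = System.Configuration.ConfigurationManager.AppSettings["GlobalStoreMode"];
            switch (mode)
            {
                // case "Database": return new DatabaseStore();  // add when needed
                default: return new CacheStore();
            }
        }
    }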
