I'm building a web app that needs to interact with an Access database. The Access database is about 200 MB and I don't want to upload the entire thing...just the contents of one table. I've used Microsoft.Office.Interop.Access in the past on a desktop app, but when I tried it in a web app there were some cryptic permission issues on the web server (I think) that need to be ferreted out.
As far as I understand it, I can:
1 - upload the entire database and select the data
2 - use Interop and figure out the permission issues
Is there an option 3 or 4?
Thanks guys.
The location of the Access file doesn't matter as long as it is accessible locally or over the network and the NETWORK SERVICE account of the web server (if it is a Win2K3 or higher server; otherwise it's the ASP.NET account) has access to that location.
So no need to download or upload anything.
Also... the fact that your back end is dealing with an Access database shouldn't be visible to, or be of any concern to, the client...
OTOH, if you are looking for a solution to "manage a database through a web interface", then maybe it's better to look at something like this... (It's for SQL Server, but migrating from Access to SQL Server isn't that big an issue ;-)
If you want to code it yourself, I think this post can come in handy.
No need for Interop; just use an OleDbConnection with the right connection string.
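For what it's worth, here's a minimal sketch of reading that one table over OleDb instead of Interop; the share path, table name, and columns are placeholders, and it assumes the ACE OLEDB provider is installed on the web server.

    using System;
    using System.Data.OleDb;

    class TableReader
    {
        static void Main()
        {
            // Placeholder path: the .accdb sits on a share the web server's service account can reach.
            var connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\\fileserver\data\big.accdb;";

            using (var conn = new OleDbConnection(connStr))
            using (var cmd = new OleDbCommand("SELECT CustomerId, Name FROM Customers", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Pull only the columns you need; the rest of the 200 MB file never moves.
                        Console.WriteLine("{0}: {1}", reader["CustomerId"], reader["Name"]);
                    }
                }
            }
        }
    }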
I don't know if I understood your problem, but maybe you could upload the table data as a CSV file, then parse every line and use a SQL query to INSERT the data into the Access database.
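If you go the CSV route, the per-line insert could look roughly like this; the file path, table, and column names are made up for illustration, and a real import should use a proper CSV parser for quoted fields.

    using System.Data.OleDb;
    using System.IO;

    class CsvImporter
    {
        static void Main()
        {
            var connStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\site.accdb;";

            using (var conn = new OleDbConnection(connStr))
            {
                conn.Open();
                foreach (var line in File.ReadLines(@"C:\uploads\table-export.csv"))
                {
                    var parts = line.Split(','); // naive split; fine only if fields never contain commas

                    // OleDb uses positional '?' parameters rather than named ones.
                    using (var cmd = new OleDbCommand(
                        "INSERT INTO MyTable (Name, Amount) VALUES (?, ?)", conn))
                    {
                        cmd.Parameters.AddWithValue("?", parts[0]);
                        cmd.Parameters.AddWithValue("?", parts[1]);
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }
    }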
I am developing a desktop app in C# using the Entity Framework database-first approach, and it is required to be highly secure, i.e. no one can access its database without logging into the application and no one can extract the data directly from the database file. The database has 20 tables. I tried to encrypt the database, but when the application starts I will have to decrypt the database file in order to connect and leave it decrypted until the user exits the application.
No offence, I am a noob at SQL Server, and I want to create a highly secured SQL Server database for my C# desktop application that can be opened only by that application. I know that there are two authentication modes by which you can connect to SQL Server, i.e. Windows authentication and mixed-mode authentication, but I don't want my database to be opened using Windows authentication.
Is there a way by which only one user can open the database with a password?
If you used SQL Server Compact, the database engine is only really there for your app, which is probably the closest fit; however, it doesn't stop people taking the file and putting it into SQL Server Express/SQL Server.
Similarly, embedding SQL Server Express may help, but again the files are still there.. and ..
If SQL Server is running as an instance, then yes, you may control it with a username/password, but anyone with admin rights to the instance can get into it.
You would also need to consider how they would back up the data.
Yes. You can easily handle this.
Go to Security >> Logins and create a user with a password. You can also give them access to a specific database, and you can assign roles in the Server Roles section.
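Once that login exists, the application connects with SQL authentication instead of Windows authentication. This is only a sketch; the instance, database, and login names are placeholders.

    using System.Data.SqlClient;

    class Program
    {
        static void Main()
        {
            // SQL authentication: only someone who knows this login's password can open the database.
            var connStr = "Data Source=.\\SQLEXPRESS;" +      // assumed instance name
                          "Initial Catalog=AppDb;" +          // the one database the login is mapped to
                          "User ID=app_user;Password=YourStrongPasswordHere;";

            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();
                // Run queries as app_user, which only has rights on AppDb.
            }
        }
    }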
I have a game I've been working on that I want to do a sort of "cloud saving" with. My issue is securely uploading save files so that we don't expose our website or FTP server. Right now I'm using FTP with a severely restricted account that has access to /saves, but it also has access to each user's save directory. Malicious destruction of save data was solved with some clever design, and it's not what I'm worried about. I am worried about someone getting hold of the FTP account I use to log in (which wouldn't be too hard, because it has to be stored in code) and using it to make multiple connections and upload massive files. I don't want to place an upload restriction on the account, because all of my users have to use the same account for uploading, and I don't want legitimate users running into issues. However, this still presents an issue. Users have a WordPress username and password they use to launch the game, and the launcher validates permissions through WordPress. Ideally, when people buy the game I'd like to create a directory for them, as well as a username, a password, and an upload limit of probably 10 MB/day, but I doubt our hosting service provides this, so I'm looking at alternative methods.
tl;dr How do I restrict users of my game to a specific directory with an upload limit, potentially without using FTP? I tried to do uploading with PHP before, but it's generally frowned upon when a remote PHP script tries to access files on a user's machine without any sort of FORM element. I guess it might work if I could initiate some sort of upload from the client... I'd still have to find a way to prevent malicious uploads, though.
Any ideas, anyone? This is something I'd really like to do, and to do it I need to make it secure against attacks.
Thanks!
Isn't this the kind of problem that web services were created to solve? You can create a web service and integrate it with your user database, so your game would call the service to upload and download the data with an authentication token from WordPress. It won't stop anyone from DDoSing your web service, but at least there is no risk of a leaked password. Do note, according to this article, there's a hard limit on the uploaded data of 4 MB. Of course you can simply split the file before sending it and handle the joining at the server.
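As a rough illustration of that idea (not a drop-in implementation), here's an ASP.NET Web API style endpoint that checks a token before writing the save file; WordPressAuth is a hypothetical stand-in for whatever you use to validate the WordPress token, and the 4 MB cap mirrors the default ASP.NET request limit mentioned above.

    using System.IO;
    using System.Threading.Tasks;
    using System.Web.Http;

    // Hypothetical helper: ask WordPress whether this token belongs to a real user.
    static class WordPressAuth
    {
        public static string Validate(string token)
        {
            // Call your WordPress auth endpoint here; return the username, or null if invalid.
            return null;
        }
    }

    public class SaveController : ApiController
    {
        private const int MaxSaveBytes = 4 * 1024 * 1024; // default ASP.NET request size limit

        [HttpPost]
        public async Task<IHttpActionResult> Upload(string token)
        {
            var user = WordPressAuth.Validate(token);
            if (user == null)
                return Unauthorized();

            byte[] data = await Request.Content.ReadAsByteArrayAsync();
            if (data.Length > MaxSaveBytes)
                return BadRequest("Save file too large.");

            // Each user can only ever write inside their own directory, so no shared FTP account.
            string dir = Path.Combine(@"D:\saves", user);
            Directory.CreateDirectory(dir);
            File.WriteAllBytes(Path.Combine(dir, "save.dat"), data);

            return Ok();
        }
    }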
The place where I work has 2 servers and a load balancer. The setup is horrible since I have to manually make sure both servers have the same files. I know there are ways to automate this, but it has not been implemented yet, hopefully soon (I have no control over this). I wrote an application that collects a bunch of information from a user, then creates a folder named after the user's email on one of the servers. The problem is that I can't control which server the folder gets created on. Say a user comes in, fills in his stuff, and his folder gets created on server 1; the user goes away for a while and comes back to the site, but this time the load balancer sends him to server 2. Now the user does something that needs to be saved into his folder, but since it wasn't created on this server, an error occurs. What can I do about this? Any suggestions?
Thanks
It sounds like you could solve a few issues by implementing a cloud file service, such as Amazon S3 (http://aws.amazon.com/s3/), for the file writes; a rough upload sketch follows the points below.
Disk size management would no longer be a concern.
Files are now written to and read from S3, so the load-balancer concern is solved.
You get the benefits of a semi-edge network with AWS (not truly edge, but in my experience better than most internally hosted solutions).
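A rough sketch of what the write path could look like with the AWS SDK for .NET; the bucket name and key layout are assumptions, not a prescription.

    using System.Threading.Tasks;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    class S3FileStore
    {
        static async Task Main()
        {
            var client = new AmazonS3Client(RegionEndpoint.USEast1);

            var request = new PutObjectRequest
            {
                BucketName = "my-user-files",                  // hypothetical bucket
                Key = "users/user@example.com/profile.json",   // one "folder" per user email
                FilePath = @"C:\temp\profile.json"
            };

            await client.PutObjectAsync(request);

            // Both servers behind the load balancer read and write the same bucket,
            // so it no longer matters which one handled the original request.
        }
    }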
Don't store your data in the file system, store it in a database.
If you really can't avoid using the file system, you could look at storing the files in a network share both servers have access to. This would be a terrible hack, however.
It sounds like you may be having a session state issue. It sounds odd the way you describe it, but have a look at this article. It's old, but it covers the basics. If it doesn't help, try googling "asp.net session state web farm":
http://ondotnet.com/pub/a/dotnet/2003/03/24/sessionstate.html
Use NAS or SAN to centralize storage. That same network-accessible storage can hold the shared configuration that IIS can be set up to use.
Web Deploy v2 was just released by Microsoft; I would encourage the powers that be to investigate it, along with Application Request Routing and the greater Web Farm Framework.
This is a normal infrastructure setup. Below are the two commonly used solutions for the situation you are in.
1 - If you have network attached storage available (e.g. NetApp filers), you can use this storage to centrally store all of the user files that need to be available across all servers in your web farm.
2 - Redesign your application to store all user-specific data in a database (a quick sketch of this option follows below).
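For the second option, storing each user's files as rows keyed by email means either server can read and write them; here's a minimal sketch with made-up table and column names.

    using System.Data.SqlClient;
    using System.IO;

    class UserFileStore
    {
        static void Main()
        {
            var connStr = "Data Source=DBSERVER;Initial Catalog=SiteDb;Integrated Security=SSPI;";
            byte[] contents = File.ReadAllBytes(@"C:\temp\upload.bin");

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "INSERT INTO UserFiles (Email, FileName, Contents) VALUES (@e, @n, @c)", conn))
            {
                cmd.Parameters.AddWithValue("@e", "user@example.com");
                cmd.Parameters.AddWithValue("@n", "upload.bin");
                cmd.Parameters.AddWithValue("@c", contents);   // Contents is a varbinary(max) column
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }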
I'm trying to figure out the membership module in ASP.NET.
Currently I have a site that I have programmed on my computer; I'm using ASP.NET membership and registering users through the ASP.NET configuration tool. So all the data is on my computer.
Now I want to deploy the web site and all the users in the membership tables to a hosting server.
I know I have to get the membership tables onto that server. What's the best way to move the site to the server so everything will work?
Do I have to run aspnet_regsql on the server itself to create all the tables?
Is it better to program the site on the server from the beginning so I don't have to move the whole lot?
Can I access the ASP.NET Configuration tool when the site is on a hosting server?
The best way to do it would be to make a backup of your database on your development computer and then restore it onto the hosted computer as your database. Your hosting company should give you a way to do this. By doing it that way you will be copying over all of the tables, procedures, and any other DB objects created by aspnet_regsql and any users or login information you already created.
The one change you will need to make is in your web.config file. The web.config file should have the connection string you are using to connect your website to the database, and that will need to change so that it connects to your database at your hosting company. Check with the FAQ of your host and they will probably tell you what the connection string should be.
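As a quick sanity check after the move, you can dump the connection string the membership provider is actually using; "LocalSqlServer" is the default key ASP.NET membership uses, so swap in your own name if you changed it.

    using System;
    using System.Configuration;

    class ConnectionStringCheck
    {
        static void Main()
        {
            // Should print the hosting company's server, not your local development machine.
            var cs = ConfigurationManager.ConnectionStrings["LocalSqlServer"];
            Console.WriteLine(cs == null ? "connection string not found" : cs.ConnectionString);
        }
    }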
If your host is different from that, please post who you are hosting with and what type of plan you have, so we have some idea of what to look at.
Your comment about sharing code is a completely different topic, but I think I can send you in the right direction. Check out http://github.com or https://www.mercurial-scm.org/. Git and Mercurial are both distributed version control systems. That is the best way to share code for remote teams (and non-remote ones, too). Then you would just push your code from your source control to your site whenever you want to make updates.
I need to let a company push information up to my site.
The best way to explain what I am talking about is to explain how it is currently done with their previous website:
This company uploads a CSV file to an FTP set up by the website. The website then processes the CSV file and puts it into an SQL database so that it can be used by the website.
In this case, I am the website and I am working with the company. Both sides are willing to change what they do. So my question is...
What is the best way to accept batch information like this? Is there a more automated way that doesn't involve FTP? In the future I may have a lot of companies wanting to do this, and I'd hate to have to set up accounts for each one.
The project is C# ASP.NET MSSQL
Let me know if you need more information...
Set up a web service to accept incoming data. That way you can validate immediately and reject bad data before it ever gets into your system.
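For instance, a sketch of such an endpoint in ASP.NET Web API; the OrderRecord shape and the validation rules are placeholders for whatever the partner's rows actually contain.

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Http;

    public class OrderRecord
    {
        public string Sku { get; set; }
        public int Quantity { get; set; }
    }

    public class ImportController : ApiController
    {
        [HttpPost]
        public IHttpActionResult Post(List<OrderRecord> records)
        {
            if (records == null || records.Count == 0)
                return BadRequest("Empty batch.");

            // Reject the whole batch up front instead of letting bad rows into MSSQL.
            int invalid = records.Count(r => string.IsNullOrEmpty(r.Sku) || r.Quantity < 0);
            if (invalid > 0)
                return BadRequest(invalid + " invalid rows; batch rejected.");

            // Insert the validated rows here, e.g. with SqlBulkCopy or parameterized INSERTs.
            return Ok(new { imported = records.Count });
        }
    }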
If you want to eliminate FTP, you could allow them to upload files to your site using the FileUpload control. Once the file is uploaded, you can do your server-side processing.
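A rough sketch of the server-side handler for a WebForms FileUpload control; it assumes a control named csvUpload declared in the page markup and an Uploads folder under the site root.

    using System;
    using System.IO;

    public partial class Upload : System.Web.UI.Page
    {
        protected void UploadButton_Click(object sender, EventArgs e)
        {
            if (!csvUpload.HasFile)
                return;

            // Only accept the expected file type before doing any processing.
            if (!csvUpload.FileName.EndsWith(".csv", StringComparison.OrdinalIgnoreCase))
                return;

            string path = Server.MapPath("~/Uploads/" + Path.GetFileName(csvUpload.FileName));
            csvUpload.SaveAs(path);

            // Hand off to the same server-side processing used for the FTP-delivered files.
        }
    }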
EDIT: From the OP's comments it seems to be an automated process. That said, if their process generates the file, you could:
Allow them to continue their current process, which would involve them generating their file and placing it somewhere it could be accessed via a URI with authentication; you could then fetch this file on a schedule and process it. From what it seems, right now they generate a file and upload it to your FTP server, so there seems to be a manual element to begin with.
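The "pull it on a schedule" variant could be as simple as the following, run from a scheduled task or Windows service; the URL and credentials are placeholders.

    using System.Net;

    class FeedFetcher
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                // Credentials the partner gives you for their authenticated export URL.
                client.Credentials = new NetworkCredential("feedUser", "feedPassword");
                client.DownloadFile("https://partner.example.com/exports/latest.csv",
                                    @"D:\feeds\latest.csv");
            }

            // Then run the same CSV-to-MSSQL processing you already have.
        }
    }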