I have a game I've been working on that I want to add a sort of "cloud saving" to. My issue is securely uploading save files without exposing our website or FTP server. Right now, I'm using FTP with a severely restricted account that has access to /saves, but that account also has access to every user's save directory. Malicious destruction of save data was solved with some clever design, and it's not what I'm worried about.

What I am worried about is someone getting hold of the FTP account I use to log in (which wouldn't be too hard, because it has to be stored in the client code) and using it to make multiple connections and upload massive files. I don't want to place an upload restriction on the account, because all of my users have to share that one account for uploading, and I don't want legitimate users running into issues. That shared account is the core of the problem.

Users have a WordPress username and password they use to launch the game, and the launcher validates permissions through WordPress. Ideally, when people buy the game I'd like to create a directory for them, along with a username, password, and upload limit of probably 10 MB/day, but I doubt our hosting service provides this, so I'm looking at alternate methods.
tl;dr How do I restrict users of my game to a specific directory with an upload limit, ideally without using FTP? I tried doing uploads with PHP before, but it's generally frowned upon for a remote PHP script to try to access files on a user's machine without any sort of FORM element. It might work if I could initiate some sort of upload from the client... I'd still have to find a way to prevent malicious uploads, though.
Any ideas, anyone? This is something I'd really like to do, and to do it I need to make it secure against attacks.
Thanks!
Isn't this the kind of problem that web services were created to solve? You can create a web service, integrate it with your user database, and have your game call the service to upload and download the data with an authentication token from WordPress. It won't stop anyone from DDoSing your web service, but at least there's no risk of a leaked password. Do note, according to this article, there's a limit on uploaded data of 4 MB (the ASP.NET default). Of course, you can simply split the file before sending and handle the joining on the server.
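For illustration, here's a minimal sketch of such an endpoint in ASP.NET Web API, assuming a hypothetical WordPressAuth.ValidateToken helper that maps a token back to a username (the saves path and file name are placeholders too):

    using System.IO;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Http;

    public class SaveUploadController : ApiController
    {
        private const long MaxSaveBytes = 10 * 1024 * 1024; // e.g. a 10 MB cap per save

        [HttpPost]
        public async Task<HttpResponseMessage> Upload(string token)
        {
            // Hypothetical helper: resolves the WordPress auth token to a username,
            // or returns null if the token is invalid/expired.
            string user = WordPressAuth.ValidateToken(token);
            if (user == null)
                return Request.CreateResponse(HttpStatusCode.Unauthorized);

            byte[] data = await Request.Content.ReadAsByteArrayAsync();
            if (data.LongLength > MaxSaveBytes)
                return Request.CreateResponse(HttpStatusCode.RequestEntityTooLarge);

            // One directory per user; sanitize 'user' before using it in a path
            // so a crafted name can't escape the saves root.
            string dir = Path.Combine(@"D:\saves", user);
            Directory.CreateDirectory(dir);
            File.WriteAllBytes(Path.Combine(dir, "save.dat"), data);

            return Request.CreateResponse(HttpStatusCode.OK);
        }
    }

Because the service decides the target directory and size limit per authenticated user, the shared FTP credentials disappear from the client entirely.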
Does anybody know how to share cookies between two Windows users?
I have a Windows 10 machine with two users: one is an admin and the second is an operator.

The admin logs into the system and then goes to the website, where he sets up some config. This config includes a specific value that should be stored locally on the machine, and the operator shouldn't know anything about it. So the admin sets a cookie { someKey: someValue } and then logs out of Windows.

After this, the operator logs into Windows, opens the same website, and should have access to this cookie { someKey: someValue }.

I searched around the web and found nothing about this; I only found solutions involving saving to the file system, sending via tokens, or saving the MAC address with a value into a DB. None of these are suitable for me. I know that sharing cookies and storing them locally isn't secure, but I need to implement this feature.

The web project targets the Chrome browser and is built on ASP.NET MVC, AngularJS, and MS SQL for DB storage. Can anybody help me with this cookie issue?
There is no way to do this. First, every browser has its own way to store and retrieve cookies. It is impossible to write something that will work for every platform and every version.

Second, there is security. You can't just copy some files and expect this to work. Browser developers aren't stupid enough to leave such a big security loophole in their software.
You are mixing Windows applications, which have full control over the system, with a web application that resides entirely within the browser. You should find a better way. You could use a certificate installed on the machine to validate the user, but it seems to me there are better options, like simply logging in.
Cookies are a browser component that all major browsers store in user-specific directories. If you could switch to the HTML5 storage API and configure the storage to use a folder both users have access to (I don't know if this is possible), you could have client-side shared data. Most probably, you can't. And certainly not using cookies.

Disclaimer: I haven't used the storage API.

Edit: Just checked. The storage API also stores its data in user-specific folders, so you cannot use it either.
"In practice, "client-side storage" means data is passed to the browser's storage API, which saves it on the local device in the same area as it stores other user-specific information, e.g. preferences and cache. Beyond saving data, the APIs let you retrieve data, and in some cases, perform searches and batch manipulations." Source: https://www.html5rocks.com/en/tutorials/offline/storage/
We have a developer debugging tool that helps manipulate the security section of a database our product depends on. The tool's purpose is to inject state into the database to reduce the time needed to create test scenarios. The database is not a typical database that one can manipulate using SQL; rather, it is a binary file that only our tool can manipulate. This is a C# application.

If this tool got outside our company (say someone emailed it to a customer, who shared it somewhere public), that could open up a lot of security issues.

We'd like to build intelligence into this tool so that it is only usable within the company or on the network of a partner with whom we have shared it. We have no knowledge of the partners' networks.

I am wondering: what are the suggested ways of implementing this?
Like:
Ping the company Active Directory server or Exchange server, and allow the tool to be used if one of these servers is reachable.

Package a certificate with the tool that expires a month from the build date, and always check whether the cert has expired before allowing use of the tool (see the sketch after this list).

A modification of (2): make every user request a key to unlock the tool after a specific date.
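A rough sketch of what option (2) could look like ("tool.cer" is a hypothetical bundled certificate file):

    using System;
    using System.Security.Cryptography.X509Certificates;

    static class UsageGate
    {
        public static bool IsToolAllowed()
        {
            // Load the certificate that ships alongside the tool.
            var cert = new X509Certificate2("tool.cer");

            // Refuse to run once the bundled certificate has expired. This only
            // deters casual use: anyone determined can patch a .NET binary, so
            // server-side checks remain the stronger option.
            return DateTime.UtcNow <= cert.NotAfter.ToUniversalTime();
        }
    }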
Before we go implement a solution, I am wondering if there is already a library that does this.
Thanks
I'm assuming you host the file inside your organization and all parties just access it somehow. If you give both the data and the tools to modify it to external partners, there is nothing really stopping them from modifying the data as they please (short of legal/administrative action, but that is outside the scope of SO).

There is also really not much you can do to protect code running on a user's machine, regardless of whether it is C# or natively compiled code. .NET code is a bit easier to modify and bypass protections in, but if you are concerned about securing access to a file, you need to protect the files/servers rather than worry about client-side code.

The usual solution to such a problem is authentication and authorization: only allow authenticated users to access the file, and only accept changes from authorized users.

If you use file-based storage, then inside your organization regular Windows domain accounts would work for authentication and regular file system permissions would work for authorization.

For outside partners, you would probably need a server to perform the modification of the file(s), with authentication/authorization possibly handled via ADFS or OAuth.
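As a rough sketch of that last point, the partner-facing server could expose something like the following (ASP.NET Web API with [Authorize], assuming OAuth/ADFS bearer authentication is already configured; SecurityDb and the StateChange shape are placeholders):

    using System.Net;
    using System.Net.Http;
    using System.Web.Http;

    // Placeholder shape for a single change request; your real schema will differ.
    public class StateChange
    {
        public string Section { get; set; }
        public string Value { get; set; }
    }

    [Authorize(Roles = "SecurityDbEditors")] // assumes bearer-token auth is set up
    public class SecurityDbController : ApiController
    {
        [HttpPost]
        public HttpResponseMessage ApplyChange(StateChange change)
        {
            // Validate the change itself, not just the caller.
            if (change == null || string.IsNullOrEmpty(change.Section))
                return Request.CreateResponse(HttpStatusCode.BadRequest);

            SecurityDb.Apply(change); // hypothetical wrapper around the binary file format
            return Request.CreateResponse(HttpStatusCode.OK);
        }
    }

This way the binary file never leaves your infrastructure, and the tool itself carries no secrets worth stealing.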
I am developing a desktop application in C#. The application requires users to log in before using it.

I plan to have an XML file on a server (which is not publicly viewable).

When the user logs in to the application with a username and password, it would check the information against the XML file online and allow the user to proceed.

This is my approach; I have not started coding it yet.

I would like to know whether this approach is good enough, or whether there are other approaches that would be better and more feasible.

Please change the tags associated with the question if anyone feels they are not the right ones.

Thanks
EDIT 1.
I would add another level of username/password protection to use that XML file online, one that only the application would know.
You can create a WCF service around your XML file to authenticate the users. That way, you don't need to expose the file to the public.
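A minimal sketch of what that service contract might look like, assuming the XML file maps usernames to password hashes (UserStore.LoadUsers is a hypothetical parser for it):

    using System.Collections.Generic;
    using System.ServiceModel;

    [ServiceContract]
    public interface IAuthService
    {
        [OperationContract]
        bool Authenticate(string username, string passwordHash);
    }

    public class AuthService : IAuthService
    {
        public bool Authenticate(string username, string passwordHash)
        {
            // Hypothetical helper: parses users.xml on the server into name -> hash pairs.
            IDictionary<string, string> users = UserStore.LoadUsers("users.xml");

            string stored;
            return users.TryGetValue(username, out stored) && stored == passwordHash;
        }
    }

The client only ever talks to the service endpoint; the XML file stays private on the server.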
It could be, depending on your exact scenario and requirements. A couple of things you may want to think about:
Consider storing hashes of the passwords in the database rather than the passwords themselves. (And then send the hashed password over the wire rather than the password itself.) That way, if either your database or connection is ever compromised, you won't be exposing passwords. (A hashing sketch follows this list.)
Consider sending the authentication data over an SSL connection so it cannot be seen by eavesdroppers. (Especially if you choose to send raw passwords over the wire.)
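For the hashing point, here is a minimal sketch using .NET's built-in PBKDF2 implementation (Rfc2898DeriveBytes); the iteration count and sizes are illustrative, not a recommendation:

    using System;
    using System.Security.Cryptography;

    static class PasswordHasher
    {
        // Derives a 32-byte hash from the password and salt via PBKDF2.
        // Tune the iteration count for your hardware.
        public static string Hash(string password, byte[] salt)
        {
            using (var kdf = new Rfc2898DeriveBytes(password, salt, 10000))
                return Convert.ToBase64String(kdf.GetBytes(32));
        }

        // A fresh random salt, stored alongside the hash.
        public static byte[] NewSalt()
        {
            var salt = new byte[16];
            using (var rng = RandomNumberGenerator.Create())
                rng.GetBytes(salt);
            return salt;
        }
    }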
The place where I work has two servers and a load balancer. The setup is horrible, since I have to manually make sure both servers have the same files. I know there are ways to automate this, but it has not been implemented yet (hopefully soon; I have no control over this). I wrote an application that collects a bunch of information from a user, then creates a folder named after the user's email on one of the servers. The problem is that I can't control which server the folder gets created on. Say a user comes in, fills in his stuff, and his folder gets created on server 1; the user goes away for a while, then comes back to the site, but this time the load balancer sends him to server 2. Now anything he does that needs to be saved into his folder causes an error, because the folder was never created on this server. What can I do about this? Any suggestions?
Thanks
It sounds like you could solve a few of these issues by using a cloud file service for the file writes, such as Amazon S3 (http://aws.amazon.com/s3/); a sketch follows below.

Disk size management would no longer be a concern.

Files are written to and read from S3, so the load balancer concern is solved.

You get the benefits of a semi-edge network with AWS (not truly edge, but in my experience better than most internally hosted solutions).
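A rough sketch of what the write path might look like with the AWS SDK for .NET (the synchronous API; the bucket name and key layout are assumptions):

    using Amazon.S3;
    using Amazon.S3.Model;

    static class UserFileStore
    {
        public static void Save(string userEmail, string fileName, string localPath)
        {
            // Credentials and region are read from config/environment by the SDK.
            using (var s3 = new AmazonS3Client())
            {
                s3.PutObject(new PutObjectRequest
                {
                    BucketName = "my-user-files",     // assumed bucket name
                    Key = userEmail + "/" + fileName, // one "folder" per user email
                    FilePath = localPath
                });
            }
        }
    }

Since both servers read and write the same bucket, it no longer matters which one the load balancer picks.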
Don't store your data in the file system, store it in a database.
If you really can't avoid using the file system, you could look at storing the files in a network share both servers have access to. This would be a terrible hack, however.
It sounds like you may be having a session state issue. It sounds odd the way you describe it, but have a look at this article. It's old, but it covers the basics. If it doesn't help, try googling "asp.net session state web farm".
http://ondotnet.com/pub/a/dotnet/2003/03/24/sessionstate.html
Use NAS or SAN storage to centralize it. That same network-accessible storage can hold the shared configuration that IIS can be set up to use.

Web Deploy v2 was just released by Microsoft; I would encourage the powers that be to investigate that, along with Application Request Routing and the greater Web Farm Framework.
This is a normal infrastructure setup. Below are the two commonly used solutions for the situation you are in:

If you have network-attached storage available (e.g. NetApp), you can use it to centrally store all of the user files that need to be available across every server in your web farm.

Redesign your application to store all user-specific data in a database (a sketch of this option follows below).
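As a sketch of option 2, the folder-per-email approach could become a row per file in MSSQL (the table and column names here are made up):

    using System.Data.SqlClient;
    using System.IO;

    static class UserData
    {
        public static void SaveFile(string connectionString, string userEmail,
                                    string fileName, string localPath)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "INSERT INTO UserFiles (UserEmail, FileName, Content) " +
                "VALUES (@email, @name, @content)", conn))
            {
                cmd.Parameters.AddWithValue("@email", userEmail);
                cmd.Parameters.AddWithValue("@name", fileName);
                cmd.Parameters.AddWithValue("@content", File.ReadAllBytes(localPath));
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }

Both servers then read and write the same database, so the "wrong server" problem disappears.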
I need to let a company push information up to my site.
The best way to explain what I am talking about is to explain how it is currently done with their previous website:
This company uploads a CSV file to an FTP server set up by the website. The website then processes the CSV file and loads it into an SQL database so that the data can be used by the site.
In this case, I am the website and I am working with the company. Both sides are willing to change what they do. So my question is...
What is the best way to accept batch information like this? Is there a more automated way that doesn't involve FTP? In the future I may have a lot of companies wanting to do this, and I'd hate to have to set up accounts for each one.
The project is C# ASP.NET MSSQL
Let me know if you need more information...
Set up a web service to accept incoming data. That way you can validate immediately and reject bad data before it ever gets into your system.
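A minimal sketch of such a service in ASP.NET Web API, assuming the rows arrive as JSON; the Record shape and the RecordRepository helper are placeholders for your real schema:

    using System.Collections.Generic;
    using System.Net;
    using System.Net.Http;
    using System.Web.Http;

    // Placeholder row shape; replace with the columns your CSV actually carries.
    public class Record
    {
        public string Id { get; set; }
        public string Name { get; set; }
    }

    public class BatchController : ApiController
    {
        [HttpPost]
        public HttpResponseMessage Import(List<Record> rows)
        {
            if (rows == null || rows.Count == 0)
                return Request.CreateResponse(HttpStatusCode.BadRequest, "Empty batch");

            foreach (var row in rows)
            {
                // Reject bad data before it ever reaches SQL.
                if (string.IsNullOrEmpty(row.Id))
                    return Request.CreateResponse(HttpStatusCode.BadRequest,
                                                  "Missing Id on one or more rows");
            }

            RecordRepository.SaveAll(rows); // hypothetical bulk insert into MSSQL
            return Request.CreateResponse(HttpStatusCode.OK);
        }
    }

Each company gets an API credential instead of an FTP account, which scales better as more companies come on board.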
If you want to eliminate FTP, you could let them upload files to your site using the FileUpload control. Once the file is uploaded, you can do your server-side processing.
EDIT: From the OP's comments, this seems to be an automated process. That said, since their process generates the file, you could:

Allow them to continue their current process, which would involve them generating their file and placing it somewhere it can be accessed via a URI with authentication; you could then fetch this file on a schedule and process it (a sketch follows below). From the sound of it, right now they generate a file and upload it to your FTP server, so there seems to be a manual element to begin with.
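A bare-bones sketch of that pull approach (the URL and credentials are placeholders):

    using System.Net;

    static class PartnerFeed
    {
        // Pulls the latest CSV from the partner over HTTPS with basic credentials.
        public static string Download()
        {
            using (var client = new WebClient())
            {
                client.Credentials = new NetworkCredential("feeduser", "feedpassword");
                return client.DownloadString("https://partner.example.com/export/latest.csv");
            }
        }
    }

You could run this from a Windows scheduled task or a timer in your application, then feed the result into your existing CSV-to-SQL processing.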