I made a UWP app for the Microsoft Store. However, user data automatically saved in the LocalState folder is deleted every time the app is updated. I want the data to be retained across updates, so I'm considering suggesting that users save their data themselves in the Documents folder or somewhere similar to avoid losing it, but I don't want to bother them. Where should I save user data?
The roaming folder will become unusable in the future, and I don't want to use Azure because of its fee.
The common approach is to store the data in some remote location, for example in the cloud. You would typically use a service of some kind to request and save the data.
If you think Azure is too expensive, you'll have to find a cheaper storage solution. The principle is the same regardless of which storage provider you use.
As mentioned in the docs, roaming data is (or at least will be) deprecated. The recommended replacement is Azure App Service.
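As a minimal sketch of the idea, assuming a hypothetical HTTP endpoint (the URL and payload below are made up; any cheap storage service with an HTTP API works the same way):

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class RemoteStore
{
    static readonly HttpClient client = new HttpClient();

    // Hypothetical endpoint; swap in whatever storage service you pick.
    public static async Task SaveUserDataAsync(string userId, string json)
    {
        var payload = new StringContent(json, Encoding.UTF8, "application/json");
        HttpResponseMessage response =
            await client.PostAsync($"https://example.com/api/userdata/{userId}", payload);
        response.EnsureSuccessStatusCode(); // throws if the service rejected the save
    }
}
```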
I need to create a cloud storage application using ASP.NET MVC, C# and integrate it with Azure storage.
I currently have a functional interface which allows users to register and securely stores their details in an SQL database. I also have a basic file uploader using Azure Blob storage that was created using this tutorial as a guideline.
My question regards how to give users their own container/page so that their files are only accessible by them. At the moment, the file uploader and Azure container is shared so that anybody with an account can view and edit the uploads. I want to restrict this so that each user has their own individual space that cannot be read or modified by others.
I have searched for answers but cannot find anything that suits my needs. Any advice would be greatly appreciated.
My question regards how to give users their own container/page so that their files are only accessible by them.
One way to achieve this is by assigning a container to each user. When a user signs up, you create a blob container for them as part of the registration process and store the container's name along with the other details about the user. When the user signs in, you fetch this information and show only the files from that container. Similarly, when the user uploads files, you save them to that container only.
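A rough sketch of that flow, using the current Azure.Storage.Blobs SDK for illustration (the container naming scheme and helper names here are my own, not prescribed; adapt to whatever storage SDK your tutorial used):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static class UserStorage
{
    // At registration: create one container per user and persist its name with
    // the user record. Container names allow only lowercase letters, digits and
    // hyphens, so derive the name from a safe user id (e.g. a lowercased Guid).
    public static async Task<string> CreateUserContainerAsync(
        BlobServiceClient service, string userId)
    {
        string containerName = "user-" + userId;
        await service.CreateBlobContainerAsync(containerName);
        return containerName; // store this alongside the user's other details
    }

    // At upload time: write only into that user's container.
    public static Task UploadAsync(
        BlobServiceClient service, string containerName, string fileName, Stream content) =>
        service.GetBlobContainerClient(containerName).GetBlobClient(fileName).UploadAsync(content);

    // At sign-in: list only that container, so users never see each other's files.
    public static async Task ListAsync(BlobServiceClient service, string containerName)
    {
        var container = service.GetBlobContainerClient(containerName);
        await foreach (var blob in container.GetBlobsAsync())
            Console.WriteLine(blob.Name);
    }
}
```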
A few things you would need to consider:
You can't set a hard limit on the size of a container, so a container can grow as large as your storage account allows. If you want to restrict how much data a user can upload, you will need to manage that outside of storage, in your application. You may also want to look into Azure File Service if that's a requirement: in Azure File Service you can restrict the size of a share (the equivalent of a blob container).
You may even want to load-balance your users across multiple storage accounts to achieve better throughput. If you decide to go down this route, then along with the container name you would also need to store the storage account name in the user information.
Store the document names in a separate SQL database linked to the user's account. Then display to the user only those files whose names are linked specifically to them, and you should be on your way! I've used this architecture before, and it works like a charm. You should, however, attach a Guid or some other unique identifier to each filename before implementing this model.
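A sketch of that naming scheme (the UserFiles table and its columns are hypothetical; adapt them to your own schema):

```csharp
using System;
using System.Data.SqlClient;
using System.IO;

static class FileNaming
{
    // Prefix each stored name with a Guid so two users uploading "photo.jpg"
    // can never collide, and record the mapping in SQL so you can list only
    // the current user's files under their original display names.
    public static string RegisterUpload(SqlConnection connection, Guid userId, string originalFileName)
    {
        string displayName = Path.GetFileName(originalFileName);
        string blobName = $"{Guid.NewGuid():N}_{displayName}";

        using (var cmd = new SqlCommand(
            "INSERT INTO UserFiles (UserId, BlobName, DisplayName) VALUES (@uid, @blob, @name)",
            connection))
        {
            cmd.Parameters.AddWithValue("@uid", userId);
            cmd.Parameters.AddWithValue("@blob", blobName);
            cmd.Parameters.AddWithValue("@name", displayName);
            cmd.ExecuteNonQuery();
        }
        return blobName; // use this as the blob name in storage
    }
    // When listing: SELECT ... FROM UserFiles WHERE UserId = @uid, and show DisplayName.
}
```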
I am trying to put some files and folders in a place that my WinRT application can access and read files from. These files and folders must be locked and permanent, even when I uninstall the application.
There are 4 main categories of folders that you have access to from a Windows Store Application:
1. /AppData/Local/<yourpackage>
2. Standard available libraries (Music, Pictures, Videos)
3. Libraries that require elevated permissions (Documents)
4. User-defined folders (through a FilePicker)
Typically you will store data specific to your application in the first kind and data for the user in a library or a user-defined folder.
When your application is uninstalled, its package folder under /AppData/Local is removed, so nothing is left there (note that this contains the directories you get through ApplicationData.Current.LocalFolder and ApplicationData.Current.RoamingFolder).
Therefore, in order to keep data persistent across installations, you will have to save it in a different directory. Note, however, that in the case of option 4 the user might not grant access to the folder they picked the previous time, or might not want to give the application access to the Documents library.
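Within a single installation you can at least remember a folder the user has picked, so they are only prompted once. A minimal sketch using the FolderPicker and the FutureAccessList (the settings key name is made up; note the access-list token itself lives in app data, so it will not survive an uninstall, only updates):

```csharp
using System.Threading.Tasks;
using Windows.Storage;
using Windows.Storage.AccessCache;
using Windows.Storage.Pickers;

static class DataFolder
{
    const string TokenKey = "dataFolderToken"; // hypothetical settings key

    // Ask once, then remember the grant across app sessions and updates.
    public static async Task<StorageFolder> PickAndRememberAsync()
    {
        var picker = new FolderPicker { SuggestedStartLocation = PickerLocationId.DocumentsLibrary };
        picker.FileTypeFilter.Add("*"); // the picker requires at least one filter
        StorageFolder folder = await picker.PickSingleFolderAsync();
        if (folder != null)
        {
            string token = StorageApplicationPermissions.FutureAccessList.Add(folder);
            ApplicationData.Current.LocalSettings.Values[TokenKey] = token;
        }
        return folder;
    }

    // Later, reopen the folder without prompting again.
    public static async Task<StorageFolder> ReopenAsync()
    {
        if (ApplicationData.Current.LocalSettings.Values[TokenKey] is string token)
            return await StorageApplicationPermissions.FutureAccessList.GetFolderAsync(token);
        return null;
    }
}
```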
I think a more solid solution would be to save the data in a remote location so that the user can download it when they install the application again, rather than relying on everything being saved on the user's device.
With regards to the locking: you could look into the DataProtectionProvider and see if that suits your needs.
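A minimal sketch of what that looks like (the "LOCAL=user" descriptor ties decryption to the current user account; it keeps other users from reading the bytes, though not the same user's other processes):

```csharp
using System.Threading.Tasks;
using Windows.Security.Cryptography;
using Windows.Security.Cryptography.DataProtection;
using Windows.Storage.Streams;

static class Protector
{
    // Encrypt so only the current user (or machine, with "LOCAL=machine") can decrypt.
    public static async Task<IBuffer> ProtectAsync(string text)
    {
        var provider = new DataProtectionProvider("LOCAL=user");
        IBuffer plain = CryptographicBuffer.ConvertStringToBinary(text, BinaryStringEncoding.Utf8);
        return await provider.ProtectAsync(plain);
    }

    // The parameterless constructor is used for the decrypt direction.
    public static async Task<string> UnprotectAsync(IBuffer protectedBuffer)
    {
        var provider = new DataProtectionProvider();
        IBuffer plain = await provider.UnprotectAsync(protectedBuffer);
        return CryptographicBuffer.ConvertBinaryToString(BinaryStringEncoding.Utf8, plain);
    }
}
```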
To reiterate Jeroen's response:
To save data locally in a place that won't be deleted when the app is uninstalled, you must use the file picker to have the user choose a location.
Saving the data remotely is the only option that doesn't require user interaction.
It's possible to save data in the app's roaming settings, as those are retained for some period of time after the app is uninstalled. If the user has the app installed elsewhere, that data will be retained continuously. If there are no other installations, it will be retained for around a month.
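For small values this is just the roaming settings dictionary; a tiny illustration ("lastOpenedProject" is a made-up key, and the per-app roaming quota applies):

```csharp
using Windows.Storage;

static class RoamingExample
{
    // Small values roam to the user's other devices and survive
    // a short window after uninstall.
    public static void Save(string projectName)
    {
        ApplicationData.Current.RoamingSettings.Values["lastOpenedProject"] = projectName;
    }

    public static string Load()
    {
        // The indexer returns null when the key is absent (e.g. first run on a device).
        return ApplicationData.Current.RoamingSettings.Values["lastOpenedProject"] as string;
    }
}
```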
I recently wrote a blog post addressing this exact scenario in more detail, including how to get a persistent user identity for any data your store remotely: http://www.kraigbrockschmidt.com/2014/07/31/persisting-data-after-uninstall/.
I'm writing a service in C# that is supposed to run on a Windows platform as the "local system" account. I can store small amounts of data in the registry, but if I want to store more data in a file, where should I place that file? And how do I protect that data from modification by users with lower access rights?
The general approach is to use the ProtectedData class to encrypt the data, then store it anywhere on disk where the application can write, e.g. in a subfolder of the LocalApplicationData special folder, whose location you can obtain using the Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData) call.
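A sketch along those lines (the subfolder and file names are hypothetical; CurrentUser scope here means the service's own account, so users under other accounts cannot decrypt the file):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;

static class ServiceState
{
    // Hypothetical subfolder/file names; pick your own.
    static readonly string Dir = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "MyService");
    static readonly string FilePath = Path.Combine(Dir, "state.bin");

    public static void Save(string data)
    {
        // CurrentUser scope = the account the service runs under (Local System here),
        // so the DPAPI key is out of reach of lower-privileged users.
        byte[] protectedBytes = ProtectedData.Protect(
            Encoding.UTF8.GetBytes(data), optionalEntropy: null,
            scope: DataProtectionScope.CurrentUser);
        Directory.CreateDirectory(Dir);
        File.WriteAllBytes(FilePath, protectedBytes);
    }

    public static string Load()
    {
        byte[] plain = ProtectedData.Unprotect(
            File.ReadAllBytes(FilePath), null, DataProtectionScope.CurrentUser);
        return Encoding.UTF8.GetString(plain);
    }
}
```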
An update to the previous answer that may save people using this solution some troubleshooting time (posting as an answer since I don't have enough reputation to comment):
For the Local System Windows service account, Environment.SpecialFolder.LocalApplicationData may also be resolved to C:\Windows\SysWOW64\config\systemprofile\AppData\Local instead of C:\Windows\System32\Config\SystemProfile\AppData\Local.
Using the already-mentioned CommonApplicationData instead of LocalApplicationData avoids this issue.
Information source: https://www.jamescrowley.net/2014/02/24/appdata-location-when-running-under-system-user-account/
I have a project in MVC 3 which I am going to convert into a Windows Azure project, and I will use blob storage.
My question is that I have some user-specific information in my project which is stored in the ASP.NET profile and in a database table, like the names of images uploaded by a specific user, among other things. So while converting, should we also remove this profile and database table info, keep this information as it is, or just change the location where the images and other data are saved (i.e. from an on-premises hard disk to a Windows Azure data center)?
Sorry if this is an odd question, but I am pretty new to Azure. Can anybody please explain?
The easiest method would be to use SQL Database (previously known as SQL Azure) and edit your connection string to point to SQL Database (instead of your on-premises SQL database). You will also need to migrate your SQL database schema/data from on-premises to SQL Database. This way your migration will be much easier, and you will not need to make any significant changes to your code.
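The change is typically confined to the connection string; a sketch with made-up server, database and user names:

```csharp
using System.Data.SqlClient;

static class Db
{
    // Hypothetical names; only the target changes, the data-access code stays as-is.
    const string AzureConnectionString =
        "Server=tcp:myserver.database.windows.net,1433;" +
        "Initial Catalog=mydatabase;User ID=myuser;Password=<password>;Encrypt=True;";

    public static SqlConnection Open()
    {
        var connection = new SqlConnection(AzureConnectionString);
        connection.Open();
        return connection;
    }
}
```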
If you decide to use Azure Table Storage instead, you will have to make significant changes to your code to store user-specific data in a key-value type of storage. Depending on how complex your tables are, this may or may not be a suitable choice.
As you mentioned, you have user-specific information in ASP.NET profile/database tables; however, your application also requires users to upload images, and this is an important factor while migrating your application. While migrating your MVC 3 application to Windows Azure Cloud Services, you will need to move images or any static content you keep on local disk storage to Windows Azure Blob Storage (persisted network storage). The required code change is to read and write data to and from Azure Blob Storage in place of your local machine's storage. This code change is a must; otherwise the image data uploaded by users will not persist and is prone to being lost if a VM is reimaged, among other reasons.
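The shape of that change, roughly, using the current Azure.Storage.Blobs SDK for illustration (container and blob names are illustrative):

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static class ImageStore
{
    // Before: File.WriteAllBytes / File.OpenRead against the VM's local disk.
    // After: the same operations against a blob container that survives reimaging.
    public static async Task SaveAsync(BlobContainerClient container, string imageName, Stream image)
    {
        await container.CreateIfNotExistsAsync();
        await container.GetBlobClient(imageName).UploadAsync(image, overwrite: true);
    }

    public static Task<Stream> OpenAsync(BlobContainerClient container, string imageName) =>
        container.GetBlobClient(imageName).OpenReadAsync();
}
```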
Others may have different ideas, but I would personally prefer SQL Database as the choice that reduces complexity during migration, as you can pretty much migrate your database tables to SQL Database with just a few lines of code changed.
I'd like to write a process in a worker role to download (sync) a batch of files under a folder (directory) to a local mirrored folder (directory).
Is there a timestamp (or a way to get one) for when the folder (directory) was last updated?
The folder (directory) structure is not known in advance, but simply put, I want to download whatever is there to local storage as soon as it changes. Apart from recursing through it and setting up a timer to check repeatedly, what other smart ideas do you have?
(edit) P.S. I found many solutions for syncing files from local storage to Azure storage, but the same principle for local files cannot be applied to Azure blobs. I am still looking for the easiest way to download (sync) files to local storage as soon as they change.
Eric, I believe the concept you're trying to implement isn't really that effective for your core requirement, if I understand it correctly.
Consider the following scenario:
Keep your views in the blob storage.
Implement Azure (AppFabric) Cache.
On a web request, store any view file in the cache if it's not yet there, with an unlimited (or very long) expiration time.
Enable the local cache on your web role instances with a short expiration time (e.g. 5 minutes).
Create a (single, separate) worker role, outside your web roles, which scans your blobs' ETags for changes at an interval, and resets the view's cache key for any blob that changed (see the sketch after this list).
Get rid of those ugly "workers" inside of your web roles :-)
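A rough sketch of that worker's scan loop, written against the current Azure.Storage.Blobs SDK for illustration (the container name and cache callback are placeholders):

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

static class ViewWatcher
{
    // Remembers the last seen ETag per blob; any mismatch means the view changed.
    public static async Task ScanAsync(
        BlobContainerClient views, Dictionary<string, ETag> known, Action<string> resetCacheKey)
    {
        await foreach (BlobItem item in views.GetBlobsAsync())
        {
            ETag current = item.Properties.ETag ?? default;
            if (!known.TryGetValue(item.Name, out ETag last) || last != current)
            {
                known[item.Name] = current;
                resetCacheKey(item.Name); // evict this view from the shared cache
            }
        }
    }
}
```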
There are a few things to think about in this scenario:
Your updated views will reach the web role instances within "local cache expiration time + worker scan interval". The lower those values, the more distributed cache requests and blob storage transactions you generate.
The Azure AppFabric Cache is the only Azure service preventing the whole platform from being truly scalable. You have to choose the best cache plan based on the overall size (in MB) of your views, the number of your instances, and the number of simultaneous cache requests required per instance.
Consider caching the compiled views inside your instances (not in the AppFabric cache), and reset this local cache based on the dedicated AppFabric cache key(s). This will greatly improve performance, as rendering the output HTML becomes as easy as injecting the model into the pre-compiled views.
Of course, the cache-retrieval code in your web roles must be able to retrieve the view from the primary source (storage) if it is unable to retrieve it from the cache for whatever reason.
My suggestion is to create an abstraction on top of the blob storage so that no one writes directly to the blobs. Then submit a message to the Azure Queue service whenever a new file is written, and have the file receiver poll that queue for changes. There is no need to scan the entire blob store recursively.
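A sketch of both sides of that queue, using the current Azure.Storage.Queues SDK for illustration (the queue name and download callback are made up):

```csharp
using System;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Azure.Storage.Queues.Models;

static class ChangeFeed
{
    // Writer side (inside the abstraction): announce each blob just written.
    public static async Task AnnounceAsync(QueueClient queue, string blobName)
    {
        await queue.CreateIfNotExistsAsync();
        await queue.SendMessageAsync(blobName);
    }

    // Receiver side: poll the queue instead of rescanning the whole container.
    public static async Task DrainAsync(QueueClient queue, Func<string, Task> downloadBlob)
    {
        QueueMessage[] batch = await queue.ReceiveMessagesAsync(maxMessages: 16);
        foreach (QueueMessage message in batch)
        {
            await downloadBlob(message.Body.ToString()); // mirror just this blob locally
            await queue.DeleteMessageAsync(message.MessageId, message.PopReceipt);
        }
    }
}
```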
As far as the abstraction goes, use an Azure web role or worker role to authenticate and authorize your clients, and have it write to the blob store(s). You can implement the abstraction using HttpHandlers or WCF to directly handle the IO requests.
This abstraction will allow you to overcome the 5000-file blob limitation you mention in the comments above, and will allow you to scale out and provide additional features to your customers.
I'd be interested in seeing your code when you have a chance. Perhaps I can give you some more tips or code fixes.