A requirement for the project is for my WinForms system to be able to see a central file repository (a central folder of scanned documents). Users need to be able to read, delete, and save files.
To do this, the server had a shared folder and every client had a mapped drive to that folder.
From experience I have found mapped drives to be a real pain. Sometimes you hit security issues, sometimes mapped drives just go wonky.
I would prefer to avoid this design and instead have a web service that publishes a list of files, another that returns a file, another that deletes a file, and so on.
But it feels like this must have been done plenty of times before. Rather than rolling my own, has anyone implemented or seen something similar?
The server is Windows with IIS, and I am on the MS stack, so it would make sense to have something that fits into that.
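For what it's worth, here is a minimal sketch of the kind of service described above, assuming an ASP.NET MVC controller; the repository path, action names, and (absent) authorization are all placeholders, not a finished design:

    using System.IO;
    using System.Linq;
    using System.Web;
    using System.Web.Mvc;

    // Hypothetical controller exposing list/download/save/delete over HTTP
    // instead of a mapped drive. Lock down access before using anything like this.
    public class RepositoryController : Controller
    {
        private const string RepositoryPath = @"D:\ScannedDocuments"; // assumed location

        // GET /Repository/List - file names as JSON
        public ActionResult List()
        {
            var files = Directory.GetFiles(RepositoryPath).Select(Path.GetFileName);
            return Json(files, JsonRequestBehavior.AllowGet);
        }

        // GET /Repository/Download?name=scan001.pdf - streams one file back
        public ActionResult Download(string name)
        {
            var safeName = Path.GetFileName(name); // strip any path segments
            return File(Path.Combine(RepositoryPath, safeName), "application/octet-stream", safeName);
        }

        // POST /Repository/Save - accepts an uploaded file
        [HttpPost]
        public ActionResult Save(HttpPostedFileBase upload)
        {
            upload.SaveAs(Path.Combine(RepositoryPath, Path.GetFileName(upload.FileName)));
            return new HttpStatusCodeResult(200);
        }

        // POST /Repository/Delete?name=scan001.pdf
        [HttpPost]
        public ActionResult Delete(string name)
        {
            System.IO.File.Delete(Path.Combine(RepositoryPath, Path.GetFileName(name)));
            return new HttpStatusCodeResult(200);
        }
    }

The WinForms clients would then talk plain HTTP (WebClient/HttpWebRequest) instead of relying on a mapped drive.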
I'm refactoring a WPF application and cleaning up how settings are stored. I've re-written most of it to use the application's settings (Properties.Settings.Default). This is technically working, but it seems to generate rather ugly paths in the %appdata% folder, such as:
C:\Users\me\AppData\Local\Company_Name_With_Underscores\program.exe_Url_xs3vufrvyvfeg4xv01dvlt54k5g2xzfr\3.0.1.0\
Each version also gets its own version-numbered folder, and these never get cleaned up unless, apparently, I do it manually via file I/O functions.
Other programs don't seem to follow this format, including Microsoft programs, so I'm under the impression this is one of those 'technically the best way, but not practical, so nobody uses it' solutions. Is this the case, or am I missing something that would make it more practical?
I ask mainly because I can foresee issues if we ever have to direct a customer to one of these folders to check or send us a file from there.
I'm using .NET 4.0 right now.
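If the goal is a predictable, customer-friendly location, one alternative is to bypass the built-in provider and serialize a small settings class yourself into a path you control under %appdata%. A sketch, assuming made-up company/product/property names and plain XML serialization (works on .NET 4.0):

    using System;
    using System.IO;
    using System.Xml.Serialization;

    // Hypothetical settings class - the properties are placeholders.
    public class AppSettings
    {
        public string LastOpenedFile { get; set; }
        public int WindowWidth { get; set; }

        // Predictable, human-readable path: %AppData%\MyCompany\MyProduct\settings.xml
        private static readonly string SettingsPath = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyCompany", "MyProduct", "settings.xml");

        public void Save()
        {
            Directory.CreateDirectory(Path.GetDirectoryName(SettingsPath));
            using (var stream = File.Create(SettingsPath))
                new XmlSerializer(typeof(AppSettings)).Serialize(stream, this);
        }

        public static AppSettings Load()
        {
            if (!File.Exists(SettingsPath))
                return new AppSettings(); // defaults on first run
            using (var stream = File.OpenRead(SettingsPath))
                return (AppSettings)new XmlSerializer(typeof(AppSettings)).Deserialize(stream);
        }
    }

The trade-off is that you give up the built-in upgrade and scoping behaviour of Properties.Settings.Default, but the folder is easy to point a customer at.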
I am designing a Windows Service whose main purpose is to monitor the content of network shares. I already know the pros and cons of the FileSystemWatcher class, and I will probably use it with some custom enhancements. One thing that bothers me is that I still don't know how to get information about who exactly modified the shared files. I think it could be extracted somehow from the permissions mechanism in Windows, but how? Do you have any ideas how to get at least the login of the person who accessed and modified the shared content?
I can use either C# or PowerShell.
The best way to do this is to enable advanced file auditing on the servers you need this information on. If that isn't an option (it can get quite inefficient on servers with high disk IO), you can try using FileSystemWatcher to get the same results. Here is an example of how to do that!
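Below is a minimal FileSystemWatcher sketch; the share path is a placeholder. Note that FileSystemWatcher by itself only tells you what changed, not who changed it, so in practice you would still correlate the event time with the Security event log (e.g. event ID 4663) once auditing is enabled:

    using System;
    using System.IO;

    class ShareMonitor
    {
        static void Main()
        {
            // Assumed UNC path to the monitored share.
            var watcher = new FileSystemWatcher(@"\\fileserver\scans")
            {
                IncludeSubdirectories = true,
                NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite | NotifyFilters.Size
            };

            watcher.Created += (s, e) => Console.WriteLine("Created: {0}", e.FullPath);
            watcher.Changed += (s, e) => Console.WriteLine("Changed: {0}", e.FullPath);
            watcher.Deleted += (s, e) => Console.WriteLine("Deleted: {0}", e.FullPath);
            watcher.Renamed += (s, e) => Console.WriteLine("Renamed: {0} -> {1}", e.OldFullPath, e.FullPath);
            watcher.Error   += (s, e) => Console.WriteLine("Watcher error: {0}", e.GetException());

            watcher.EnableRaisingEvents = true;
            Console.WriteLine("Watching... press Enter to stop.");
            Console.ReadLine();
        }
    }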
I have a website and I am migrating to version 2. It's a staggered roll-out as opposed to a big bang. What I intend to do is send a certain amount of traffic to the new site and the rest to the old version. I prefer not to do this on the load balancer (for several reasons I do not want to go into, as they are business driven rather than technical). I have a limited number of servers, so I do not wish to put the new code on dedicated servers. I also want to ensure the URL does not differ, so as not to confuse the customer (e.g. if he hits http://example.com/somepage, I do not want to redirect him to http://new.example.com/somepage).
The version two code is totally new and is not, and will not be, part of the version one codebase. The two must be isolated. Again, this is business driven, but it is not negotiable.
I am using IIS and ASP.NET MVC 4.0. I would prefer a code-based solution (HttpModules and putting something in the query string, for example) but cannot see how this would work.
Other than putting tonnes of iRules on the load balancer, I am happy to hear any kind of solution, whether it is hardware, software, or middleware.
The only way you could conceivably achieve this is via a load balancer. Otherwise, you run into the simple problem that a single domain can't be attached to multiple applications, and multiple applications can't reside in the same document root. You could use a subdomain like new.example.com, but you explicitly said you don't want that. The only other alternative would be to house the new app in a virtual directory of the old one. Then you could keep the domain the same, but you'd have to redirect from http://example.com/somepage to http://example.com/new/somepage. Also, the Web.config of the old application will apply to any other application housed in a virtual directory beneath it, so great pains would have to be taken to make sure there's no config bleed-through.
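On the bleed-through point: a commonly used mitigation (shown here as a sketch, assuming the new app sits in a child application under the old site) is to wrap the parent's system.web section in a location element with inheritInChildApplications="false", so those settings stop flowing down:

    <!-- Parent (old application) Web.config -->
    <configuration>
      <location path="." inheritInChildApplications="false">
        <system.web>
          <!-- existing parent settings stay here unchanged -->
        </system.web>
      </location>
    </configuration>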
Recently we started a pretty large project (a C# XNA game).
It seemed a pretty obvious solution to store all the files on a remote server, use a database for file "versions", and have the patcher download the newer versions and delete anything obsolete.
Now this is all nice in theory; we even found a service with the space for it (SkyDrive with the 25GB offer).
The problem came up when it got to file manipulation.
We're looking for something that:
Can programmatically download/upload files to/from SkyDrive (for the patch maker).
Has a secure way of containing the username/password.
Allow me to explain both.
The thing is, we had to create the SkyDrive on my personal account (the 25GB offer is only available to old users). I'm not very happy with someone getting my password; even though I'll obviously change it to something completely obscure, they would still get access to most of my other Hotmail/MSN related stuff. (I guess that's a reason to remake it all, then?) So, if possible, I would like to secure the actual username/password inside the program. Since it's .NET, compiled on demand, and can easily be decompiled, I doubt real security is achievable in this case (if it is possible, please do tell me how).
On top of that, there's no efficient and official SkyDrive API. This means there's an even bigger security hole (see the previous paragraph), and the communication won't necessarily work as expected. It also means there may be slowness in communication, which is bad if you have 1000 users downloading the same file.
So, to sum all of this up:
What is the proper way (read: API) to use SkyDrive as a storage server for a patcher, considering it's linked to my personal account?
Small side note: if I must, I can be evil and get our slow artist to host the server.
Edit 1:
The idea is that anyone can download the client, but initiating anything requires an active account in our database. As such, the files themselves don't have a problem being read by everyone. So I'll add the following: how do I programmatically get direct download links from SkyDrive if the files are public? The current links lead to their web UI. And I mean programmatically (maybe at upload time), to avoid doing it all by hand.
This is a bad idea.
Given #1:
Use a public folder to store your assets and grant everyone access to it
Use HttpClient to download the files from the public folder anonymously in your patcher client (see the sketch after this list)
Use the SkyDrive desktop client to synchronize the public folder from a 'build' machine
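A rough sketch of the anonymous download step, assuming the public folder exposes a direct HTTP URL per file; the URL and file name below are placeholders:

    using System.IO;
    using System.Net.Http;
    using System.Threading.Tasks;

    class PatchDownloader
    {
        // Placeholder: in practice this would be the direct link to one public file.
        const string FileUrl = "https://example.com/public/patch/assets_v2.pak";

        static void Main()
        {
            DownloadAsync(FileUrl, "assets_v2.pak").GetAwaiter().GetResult();
        }

        static async Task DownloadAsync(string url, string localPath)
        {
            using (var client = new HttpClient())
            using (var response = await client.GetAsync(url))
            {
                response.EnsureSuccessStatusCode();
                using (var file = File.Create(localPath))
                    await response.Content.CopyToAsync(file);
            }
        }
    }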
Hi, I need your help if you are an expert in MOSS.
I have a system that needs to upload documents into a MOSS document library.
I decided that the easiest approach for a phase 1 system would simply be to map a network path to a MOSS Document library.
The whole thing seems too easy. After that, it's a straight copy using System.IO.
What I would like to know, is this method reliable enough to be used in a production system?
Speculation would be great, but if you have real experience with working with MOSS in this way, your answer would mean a lot.
Thanks.
So long as you do the proper error checking around the copy, it's fine, provided you bear in mind the standard caveats with SharePoint document libraries and file naming conventions.
SharePoint does not allow some characters in file names that NTFS and FAT do. These will cause an error when you try to copy the files to the document library, regardless of how you do it, so you will need to sanitise your filenames beforehand.
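Purely as an illustration, a small helper that replaces the characters commonly listed as invalid for SharePoint (MOSS 2007-era) file names; check the exact blocked set against your version:

    using System.Linq;

    static class FileNameSanitizer
    {
        // Characters commonly rejected by SharePoint 2007 document libraries (assumed list).
        private static readonly char[] Invalid =
            { '~', '"', '#', '%', '&', '*', ':', '<', '>', '?', '/', '\\', '{', '|', '}' };

        public static string Sanitize(string fileName)
        {
            var cleaned = new string(fileName.Select(c => Invalid.Contains(c) ? '_' : c).ToArray());
            return cleaned.Trim('.', ' '); // leading/trailing periods are also rejected
        }
    }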
The only downside to using a network path to the webdav interface of SharePoint is that if you stress it too much (a large copy of a lot of files), you can easily overwhelm it and it will cause the network share to become unavailable for a period of time. If you are talking about a few files every now and then, several an hour for example, it should be fine.
You are better off reading the files off the network path and then using the Object Model API or the web services to upload them.
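As a rough sketch of the Object Model route (this has to run on the SharePoint server itself; the site URL, library name, and source folder below are placeholders):

    using System.IO;
    using Microsoft.SharePoint;

    class DocumentUploader
    {
        static void Main()
        {
            const string siteUrl = "http://mossserver/sites/scans";      // placeholder
            const string libraryName = "Shared Documents";               // placeholder
            const string sourceFolder = @"\\fileserver\incoming";        // placeholder

            using (SPSite site = new SPSite(siteUrl))
            using (SPWeb web = site.OpenWeb())
            {
                SPFolder library = web.GetFolder(libraryName);

                foreach (string path in Directory.GetFiles(sourceFolder))
                {
                    using (FileStream stream = File.OpenRead(path))
                    {
                        // true = overwrite an existing file with the same name
                        library.Files.Add(Path.GetFileName(path), stream, true);
                    }
                }
            }
        }
    }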
You can use timer jobs, which can be scheduled to run at a convenient time. The timer job can read its configuration settings from an XML file.
This system would be easier to maintain and troubleshoot than a straight copy using System.IO.
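A bare-bones shape of such a timer job, assuming MOSS 2007's SPJobDefinition; the job name is made up and the Execute body is only a placeholder (deployment and scheduling are omitted):

    using System;
    using Microsoft.SharePoint.Administration;

    public class DocumentImportJob : SPJobDefinition
    {
        // Parameterless constructor required by the timer service for serialization.
        public DocumentImportJob() : base() { }

        public DocumentImportJob(string name, SPWebApplication webApp)
            : base(name, webApp, null, SPJobLockType.Job) { }

        public override void Execute(Guid targetInstanceId)
        {
            // Placeholder: read the XML configuration, then copy pending files
            // into the document library via the Object Model, as sketched above.
        }
    }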