I have been working on a project recently where, eventually, I need two (or more) applications on different machines to be able to access the same file on one of them without uploading it to a server first.
Thus, the best idea I could come up with was to create something like a mini temporary file server, where the desired file is served over the machine's IP address and the other machine can access it via a URL like "http://xxx.xxx.xxx.xxx/path/file.ext".
I have good experience with C#, but I have never tried this approach before, so any help is appreciated, either in achieving this approach or with any other method that allows cross-internet access to a file on a machine.
Thanks in advance.
[Edit]
This operation has to be done without prior port forwarding. I don't know if that is possible, but if it is not, I guess I might need to do something like streaming the file to a PHP server first. Again, any help is appreciated.
Set up a virtual directory pointing to a local directory in IIS.
Application A writes the file to the local directory.
Application B reads the file over HTTP.
Otherwise, you could use a network drive for both applications to read/write from.
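For illustration, a rough sketch of both sides, assuming the IIS virtual directory is named "files" and maps to C:\SharedFiles on a machine reachable as serverA (all of those names are made up here):

using System.IO;
using System.Net;

// Application A (runs on the machine hosting the IIS virtual directory):
File.WriteAllText(@"C:\SharedFiles\data.txt", "hello from application A");

// Application B (runs on the other machine) reads the same file over HTTP:
using (var client = new WebClient())
{
    client.DownloadFile("http://serverA/files/data.txt", @"C:\Temp\data.txt");
}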
Currently, I have a web based C# application (ServiceStack) that has an XML file it relies on to generate things client side. I no longer want to store this file on the client side. I need a way to use the repository I wrote to edit the file when it is stored on the server.
I have tried the following:
Finding the location of the XML file from within the service. (It says the service's current directory is windows/system32, because it's running as a service. Makes sense now.)
Putting the file in the same project side by side. (same problem)
This already works:
Retrieve the XML (with a hard-coded path :c)
Deserialize the XML
Add to / remove from the XML as needed
Save
The key constraint is that the file must live where the API lives. I don't think I am understanding the way this works very well, and I'd greatly appreciate some help.
Use System.Web.Hosting.HostingEnvironment.MapPath to find the path of the file relative to your application, assuming it's hosted via IIS.
In ServiceStack you'd normally use the Virtual File System to resolve files, e.g.:
base.VirtualFileSources.GetFile("path/to/file.xml").ReadAllText()
Otherwise, if you just want the path, you can use IAppHost.MapProjectPath(), available from v4.5.4, to resolve a file path relative to your web app consistently in all App Hosts, e.g.:
HostContext.AppHost.MapProjectPath("~/path/to/file.xml")
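Putting that together with the workflow in the question, here is a rough sketch; the MyConfig type and the "~/App_Data/config.xml" path are placeholders, not anything from the original post:

using System.IO;
using System.Xml.Serialization;
using ServiceStack;

var path = HostContext.AppHost.MapProjectPath("~/App_Data/config.xml");
var serializer = new XmlSerializer(typeof(MyConfig));

// Retrieve + deserialize
MyConfig config;
using (var stream = File.OpenRead(path))
    config = (MyConfig)serializer.Deserialize(stream);

// Add/remove entries as needed
config.Items.Add("new entry");

// Save
using (var stream = File.Create(path))
    serializer.Serialize(stream, config);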
I'm working on a Windows Forms project (Visual Studio 2010, C#).
I want to send some files to computers on our network, but they don't have a "listener" as in a client/server solution. I do have a username/password for them. Is there any way to send files knowing this information? As I said, I do not want to build a client/server solution.
Can't I use "Impersonate" somehow?
If you know the username and password and the client is within the same domain, you might be able to use a UNC path with authentication (the administrative $ share) and send the files to the client PC. Something like: \\clientPC\c$. Once authenticated, you can just use file I/O, e.g. File.Copy(..., @"\\clientPC\c$\yourfile.txt") to send the file.
You can use the class posted here for UNC authentication.
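In case that link doesn't survive, here is a rough sketch of the same idea using the WNetAddConnection2 Win32 API via P/Invoke; the machine name, share, and credentials are placeholders:

using System.IO;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Auto)]
class NETRESOURCE
{
    public int dwScope;
    public int dwType;
    public int dwDisplayType;
    public int dwUsage;
    public string lpLocalName;
    public string lpRemoteName;
    public string lpComment;
    public string lpProvider;
}

[DllImport("mpr.dll", CharSet = CharSet.Auto)]
static extern int WNetAddConnection2(NETRESOURCE netResource, string password, string username, int flags);

// Authenticate against the remote machine's administrative share...
var resource = new NETRESOURCE { dwType = 1 /* RESOURCETYPE_DISK */, lpRemoteName = @"\\clientPC\c$" };
int result = WNetAddConnection2(resource, "password", @"DOMAIN\user", 0);

// ...then ordinary file I/O works against the UNC path
if (result == 0) // NO_ERROR
    File.Copy(@"C:\local\yourfile.txt", @"\\clientPC\c$\yourfile.txt", true);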
Is this in the same domain as your machine? If so, do you have the ability to create a share? If you can, you may be able to just set up a share and transfer the files as you would locally. If this is possible, there is no point in creating an elaborate application for a trivial need.
I built something similar for a company I used to work for: the "client" exposed a share, and I (the "server") simply used File.Move() to transfer files.
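Once the share exists, the transfer really is just ordinary file I/O; the share name and paths below are made up for illustration:

using System.IO;

// The "client" exposes \\clientPC\DropFolder; the "server" simply moves the file onto it
File.Move(@"C:\Outbox\report.pdf", @"\\clientPC\DropFolder\report.pdf");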
Yes, you can use impersonation.
Read this article; maybe it will be helpful.
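For what it's worth, a rough sketch of the impersonation route on .NET Framework (LogonUser + WindowsIdentity.Impersonate); the machine, share, and credentials are placeholders:

using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Security.Principal;

[DllImport("advapi32.dll", SetLastError = true, CharSet = CharSet.Auto)]
static extern bool LogonUser(string user, string domain, string password,
    int logonType, int logonProvider, out IntPtr token);

const int LOGON32_LOGON_NEW_CREDENTIALS = 9;  // use the credentials only for outbound network access
const int LOGON32_PROVIDER_WINNT50 = 3;

IntPtr token;
if (LogonUser("user", "DOMAIN", "password",
              LOGON32_LOGON_NEW_CREDENTIALS, LOGON32_PROVIDER_WINNT50, out token))
{
    using (WindowsImpersonationContext ctx = WindowsIdentity.Impersonate(token))
    {
        // While impersonating, copy the file to the remote machine's share
        File.Copy(@"C:\local\file.txt", @"\\targetPC\share\file.txt", true);
    }
    // In production code, also close the token handle (kernel32 CloseHandle)
}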
I have built a web application which uses two web front-end servers; users are randomly directed to either one through the same URL. The web app has specific functionality to upload and download files. When a file is uploaded, it is stored in a specific directory on the server to which it was uploaded.
The issue is that when a user uploads a file to the folder on Server 1, any user trying to download that same file from Server 2 will not be able to, as it only exists on the server where it was uploaded.
What's the best way of solving this? I've been looking at:
- Using a SAN; the problem here is I don't want to change or create a domain
- Writing a Windows Service; I'd prefer to avoid this if possible, as I've not done it before, but I will give it a go if necessary
Thanks in advance!
Joe
Unless I'm missing something very obvious, all you need is a shared location. This could be a network share addressed through a UNC path, a folder on an FTP server, a database, anything at all, as long as it is:
- shared,
- accessible from both web servers, and
- writable by the web application's service account (read/write permissions).
From your requirements, a network share on a file server (perhaps one of the two web servers, the load balancer, or ideally a new server entirely) would be the simplest method.
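Concretely, the only change in the web app is that the upload/download code points at the UNC path instead of a local folder; the server and share names below are placeholders:

using System.IO;
using System.Configuration;

// web.config: <appSettings><add key="UploadFolder" value="\\fileserver\uploads" /></appSettings>
string uploadFolder = ConfigurationManager.AppSettings["UploadFolder"];

// Upload handler (either web server) - postedFile is the HttpPostedFile from the request
string savePath = Path.Combine(uploadFolder, Path.GetFileName(postedFile.FileName));
postedFile.SaveAs(savePath);

// Download handler (either web server) - requestedFileName comes from the request
Response.TransmitFile(Path.Combine(uploadFolder, requestedFileName));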
How do I map an FTP share folder to a local drive using C#? Is there any class library available for this?
I need to achieve the same functionality as NetDrive (http://www.netdrive.net/) offers, but using FTP.
Maybe you can leverage someone else's work and get a head start. The makers of NetDrive, for instance, offer an SDK; I'm not sure what that requires or costs, however. But it might be worth an inquiry, no?
And maybe, if you combine this approach for defining a remote FTP site as a Windows network share with this code in C# to mount a network share as a drive, you might get your job done :-)
It's a very tall order - if you actually want a local drive, e.g. X:\, to access an FTP site, you'd surely need to write a driver. Not an easy task.
If you want it to simply appear in Windows Explorer somewhere - you can use the shell extension as Marco's answer suggests. But don't expect to be able to treat it like a regular drive.
To create a virtual drive, or a folder on an existing drive, and expose remote FTP server contents that way, you need a filesystem driver. Or you can use our Callback File System (CBFS) product, which lets you write the code in user mode and includes a pre-created filesystem driver. CBFS includes a sample, SFTPDisk, that does exactly what you need, but with the SFTP protocol (SFTP is not FTP but SSH File Transfer Protocol).
Note that in FTP there's no function to upload a block to the middle of an existing file. This makes some file write operations trickier than with SFTP or a local filesystem: you may need to cache the whole file and upload it asynchronously when it's closed by the client.
There's a project in C# to use an FTP folder as a virtual drive -> http://amalgam.codeplex.com/ (it uses Dokan).
There's a freeware program that does the same thing -> http://www.ferrobackup.com/ftpuse/
Will data traffic go through the host application program, or will it be handled remotely, in scenarios where C#'s File.Copy is used:
File.Copy(@"\\SERVER13\LOL\ROFL.txt", @"\\SERVER13\ROFL.txt");
Cheers n thx!
First of all, you have a small bug in the path of the destination file.
Second, there is no remote copy operation. There is a remote move operation (a rename, but with a destination in another directory) like MoveFile (see the native API http://msdn.microsoft.com/en-us/library/aa365239%28VS.85%29.aspx).
UPDATED: You probably come from Unix and know the rcp utility, but that works by way of the remote shell service (rshd), not direct file system features. You can also use the PsExec utility from Sysinternals (see http://technet.microsoft.com/en-us/sysinternals/bb897553.aspx) to start a program on the remote computer, but all of this is outside the scope of programming.
It will go through the local application. The file system does not know what the application is going to do with the bytes it reads from the share, or where the bytes that are written to the share come from.
In addition, the application does not know (in the case of DFS) if the two shares are on the same machine.
If you want to let the server handle it, you have to remotely run a copy program.
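For example, with the PsExec route mentioned above, you could launch a copy command on SERVER13 itself so the file data never crosses the network to your machine. This assumes psexec.exe is available and you have admin rights on the remote machine, and the local paths on SERVER13 are only guesses at where the shares map:

using System.Diagnostics;

// Runs "cmd /c copy" on SERVER13 itself; the copy happens entirely on the remote machine
var psi = new ProcessStartInfo
{
    FileName = "psexec.exe",
    Arguments = @"\\SERVER13 cmd /c copy C:\LOL\ROFL.txt C:\ROFL.txt",
    UseShellExecute = false
};
using (var process = Process.Start(psi))
{
    process.WaitForExit();
}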