This isn't a very complicated scenario really, but as I start to type out the problem I'm realizing how convoluted it can become textually. Let me try and be very clear:
First, the set up...
I have a C#/ASP.NET web application that is publicly facing on my main domain (www), let's call it www.mysite.com. Nothing fancy, just a front-end that connects to SQL to display records.
Then, I have a second C#/ASP.NET web application that is secured using forms authentication running on a subdomain, let's call it admin.mysite.com. This is a very light-weight CMS system to administer the public site.
Now, the problem...
Both of these sites run fine for basic tasks; however, my problem arises when I try to access the file system for uploading. My web host requires subdomains to run as virtual directories under the main application in IIS (so the subdomain actually resolves/redirects to www.mysite.com/admin when you type in admin.mysite.com), and because of this I am unable to write to my website root from the subfolder.
Let me explain a little more...
The CMS system (running as a virtual directory) gives the admin the ability to upload photos for display on the main site; the target folder is www.mysite.com/images. When attempting disk access from the root app, I am able to write to the virtual directory, but I cannot do the opposite (write to the root from the virtual directory); I get security violations. If I can only upload to the /admin/ virtual directory, the entire point is moot, because it's a secured folder that the public can't see!
The only solution I can think of is to upload the files to the /admin/ virtual directory, then call a URL in the root that moves the files from /admin/ back to the root, but that is a complete hack.
I hope this post makes sense. Anyone else experience anything like this? The bottom line is that it seems virtual directories ONLY have access to themselves, and not their parent directories, no matter what credentials are used.
Thanks!
Somewhat unrelated to your question.
GoDaddy is not a great host because of issues like the one you've described. I recently went on the search for a host and wholeheartedly recommend JodoHost.com.
Some posts I wrote on the matter:
http://www.ocdprogrammer.com/post/2009/12/16/The-search-for-a-web-host.aspx
http://www.ocdprogrammer.com/post/2010/01/03/JODOHostcom.aspx
However, the behavior you are seeing seems very normal to me: a subdomain not being able to access the root domain. With a dedicated server you could overcome this, but it is one of the drawbacks of a shared host.
In that case, I think I would consider using a database.
Here's how I would do it:
The files are uploaded into the /admin directory
The admin app writes the URL of the file into the database that it shares with the root app
The root app loads the URL from the database and uses it to point to the file.
I think that is the best way to approach it under such circumstances; a rough sketch follows below.
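A minimal sketch of the steps above, assuming the two apps share a connection string and a common table; the table/column names (SiteImages, Url, UploadedOn) and the "SharedDb" connection string key are assumptions, not from the post:

    // Hedged sketch: the admin app records the public URL of each uploaded
    // image in a table shared with the root app, and the root app reads the
    // URLs back when rendering pages.
    using System.Collections.Generic;
    using System.Configuration;
    using System.Data.SqlClient;

    public static class ImageCatalog
    {
        private static readonly string ConnString =
            ConfigurationManager.ConnectionStrings["SharedDb"].ConnectionString;

        // Called by the admin app after the file has been saved to disk.
        public static void RegisterUpload(string publicUrl)
        {
            using (var conn = new SqlConnection(ConnString))
            using (var cmd = new SqlCommand(
                "INSERT INTO SiteImages (Url, UploadedOn) VALUES (@url, GETUTCDATE())", conn))
            {
                cmd.Parameters.AddWithValue("@url", publicUrl);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }

        // Called by the root app to get the URLs it should display.
        public static List<string> GetImageUrls()
        {
            var urls = new List<string>();
            using (var conn = new SqlConnection(ConnString))
            using (var cmd = new SqlCommand(
                "SELECT Url FROM SiteImages ORDER BY UploadedOn DESC", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        urls.Add(reader.GetString(0));
                }
            }
            return urls;
        }
    }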
You could put a web service on your root domain (www.yoursite.com) that receives a stream and saves it to disk as a file. The web service could take three parameters: the stream, the desired file name on disk, and optionally the destination folder, making it a general-purpose file writer.
Then, from your subdomain (subdomain.yoursite.com), you consume the web service: when you upload a file, convert it to a stream and send it to the web service hosted on the root site with the appropriate parameters.
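A hedged sketch of such a service, written as an ASMX-style web service that lives in the root application; the class name, the byte[] parameter (instead of a raw stream), and the path check are assumptions for illustration:

    using System;
    using System.IO;
    using System.Web.Services;

    [WebService(Namespace = "http://www.yoursite.com/")]
    public class FileWriter : WebService
    {
        [WebMethod]
        public void SaveFile(byte[] data, string fileName, string folder)
        {
            // Resolve the destination folder under the root application,
            // e.g. folder = "images" maps to www.yoursite.com/images.
            string destDir = Context.Server.MapPath("~/" + folder);

            // Basic guard so callers can't escape the intended folder.
            string destPath = Path.GetFullPath(
                Path.Combine(destDir, Path.GetFileName(fileName)));
            if (!destPath.StartsWith(Path.GetFullPath(destDir),
                    StringComparison.OrdinalIgnoreCase))
                throw new InvalidOperationException("Invalid destination.");

            File.WriteAllBytes(destPath, data);
        }
    }

On the subdomain side, the generated proxy for this service would be called with the uploaded file read into a byte array.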
Related
This relates to my previous post.
https://stackoverflow.com/questions/45937117/c-sharp-unbale-to-access-downloads-folder?noredirect=1#comment78832001_45937117
What I want is to access "Downloads" folder from an ASP.Net web application.
    string pathUser = Environment.GetFolderPath(Environment.SpecialFolder.UserProfile);
    string pathDownload = Path.Combine(pathUser, "Downloads");
    string commentImagePath = Path.Combine(pathDownload, "test.png");
Then I realized that the above code only works in a desktop application.
That is, the UserProfile folder is available in that environment, but not in the web application's environment.
I have a download happening, and the content is downloaded to the "Downloads" folder.
That is why I need the "Downloads" folder: I need to access that content.
Please help me with this.
Keep in mind that the Downloads folder is different for each user on the system, so you'll get c:\users\usera\downloads and c:\users\userb\downloads. When you say "Downloads", you actually mean a different physical path for each user.
At the same time, your ASP.NET application, if hosted in IIS, will run as (or impersonate) a specific user, so make sure that user has enough privileges on the path of interest. Also consider that the Downloads folder is one that Windows protects by default from being accessed across accounts.
I think you'd be better off saving files to a specific folder relative to the root of your website. Otherwise, try mapping specific folders from your drive as a virtual directory in IIS, so the site sees them as relative to the root while in reality they live somewhere else.
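A minimal sketch of the first option, building the path against a folder inside the site root instead of the user's Downloads folder; the ~/App_Data/downloads location is an assumption for illustration:

    using System.IO;
    using System.Web;

    public static class SitePaths
    {
        public static string GetDownloadPath(string fileName)
        {
            // Physical folder that lives under the web application root.
            string downloadDir = HttpContext.Current.Server.MapPath("~/App_Data/downloads");
            Directory.CreateDirectory(downloadDir); // no-op if it already exists

            // e.g. GetDownloadPath("test.png") replaces the Downloads-based
            // path from the question.
            return Path.Combine(downloadDir, fileName);
        }
    }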
How can I get a list of folders from a website?
Namely, I wrote a program that takes a URL and gives back a list of the folders on that website.
I tried

    Directory.GetDirectories(myURL)

but it does not work.
Generally, you will have to have the server run some code to get the list of directories. The client does not have access to the filesystem of the web server, and even using FTP or WebDAV the scope of what can be seen by the client will be limited.
The easiest way would be to create a folders.txt file in every directory on your web server with the name of all child directories. Then use your favorite HTTP API to download the file and parse its contents.
As for websites that are beyond your control: you can't. However you can check if you have access to a folder with a specific name. That should give you some ideas.
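A hedged sketch of the folders.txt idea above: each directory on your server exposes a folders.txt listing its child directories, one name per line. The file name and URL layout are a convention you would set up yourself, not something that already exists:

    using System;
    using System.Net;

    public static class RemoteFolders
    {
        public static string[] GetFolders(string baseUrl)
        {
            using (var client = new WebClient())
            {
                // Download the listing file and split it into one entry per line.
                string listing = client.DownloadString(baseUrl.TrimEnd('/') + "/folders.txt");
                return listing.Split(
                    new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries);
            }
        }
    }

    // Usage: string[] folders = RemoteFolders.GetFolders("http://www.mysite.com");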
You can't directly access the file system on the web server (a .NET security feature). You can, however, do this when you're running locally (under localhost), but I understand that's not the point. If you're talking about submitting a URL that you don't own, then typically, no, that's not possible.
I have a website sitting in a virtual directory in IIS 6.0. Within this virtual directory there is also a P12 certificate that I need to use to access an external web service. When I attempt to access this file through the site, I get a "file not found" error.
I have verified that the file is there and have mirrored my local dev environment to match production, and everything works fine there.
I'm pretty confident that this is a permissions issue.
Can anyone point me in the right direction?
Thanks!
By default, IIS 6 will only serve specific types of files (based on extension) to requesting clients. If your P12 certificate file (I'm not sure what that is) is not one of those, a 404 is exactly what you should receive.
In your IIS admin console, you can modify the list of file types which are processed and/or served.
It sounds like it's possible you're attempting to access this file programmatically, in which case you'll need to provide a bit more information: show us the code that is attempting to access the file, and maybe the exception as well.
EDIT:
Based on your comments about the location of the file, you could try doing something along the lines of this:
    File.Exists(Path.Combine(Server.MapPath("/"), "DLWSCert.p12"))
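If the file check passes, here is a hedged sketch of actually loading the certificate. The password parameter is an assumption, and MachineKeySet is only a common precaution against key-store problems under an app pool account, not something from this thread:

    using System.IO;
    using System.Security.Cryptography.X509Certificates;
    using System.Web;

    public static class CertLoader
    {
        public static X509Certificate2 LoadSiteCert(string password)
        {
            // Resolve the physical path of the certificate under the site.
            string certPath = Path.Combine(
                HttpContext.Current.Server.MapPath("/"), "DLWSCert.p12");

            // MachineKeySet avoids relying on the app pool account's user
            // profile key store when importing the private key.
            return new X509Certificate2(certPath, password,
                X509KeyStorageFlags.MachineKeySet);
        }
    }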
I found that the message of the exception was misleading. It said "File Not Found", but it was wrapped in a System.Security.Cryptography.CryptographicException. Ultimately, the problem was permissions on the app pool account. Once those were set properly, it worked fine.
We have two separate web applications for a site: One for the site itself, and one for the cms/administration side. I'm not sure why the original developer designed it this way, but whatever.
I am tasked with adding some functionality to the administration side that uploads files. These files then need to exist within the folder structure of the actual site. I was thinking I might have to write a web service that sits on the actual site that accepts the file bytes and file name from a call within the administration site, and creates the file in the correct folder, but I was wondering if anyone had any ideas about a cleaner way to accomplish the same thing.
In general, how would you tackle a scenario where you upload a file on one site, and send it to the directory structure in another?
Thanks in advance!
The solution I ended up going with is to store the full file path to the other site in the web.config. It's not the most elegant solution, but it works and I'm mildly happy with it since it is easily maintainable across dev/staging/production.
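A minimal sketch of that approach, assuming an appSettings key named MainSiteImagesPath holds the main site's physical folder (the key name is an assumption) and the app pool account has write access to it:

    using System.Configuration;
    using System.IO;
    using System.Web;

    public static class CrossSiteUpload
    {
        public static void Save(HttpPostedFile postedFile)
        {
            // e.g. <add key="MainSiteImagesPath" value="D:\sites\www\images" />
            string targetDir = ConfigurationManager.AppSettings["MainSiteImagesPath"];

            string destPath = Path.Combine(targetDir, Path.GetFileName(postedFile.FileName));
            postedFile.SaveAs(destPath);
        }
    }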
You could create a Windows Service to transfer the uploaded files from one folder to another.
After a file is uploaded on the admin site, the Windows service moves the file over to the correct location on the other site. You just need to decide how to communicate with the service: you could add details about the uploaded file to a message queue that the service monitors, or perhaps your Windows service could just watch the upload folder for any new files.
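A hedged sketch of the folder-watching variant; a Windows Service would create one of these in OnStart, and both folder paths are assumptions for illustration:

    using System.IO;

    public class UploadMover
    {
        private readonly FileSystemWatcher _watcher;
        private readonly string _targetDir;

        public UploadMover(string uploadDir, string targetDir)
        {
            _targetDir = targetDir;
            _watcher = new FileSystemWatcher(uploadDir);
            _watcher.Created += OnFileCreated;
            _watcher.EnableRaisingEvents = true;
        }

        private void OnFileCreated(object sender, FileSystemEventArgs e)
        {
            // A real service would retry here, since the file may still be
            // locked by the upload when the event fires.
            File.Move(e.FullPath, Path.Combine(_targetDir, e.Name));
        }
    }

    // Usage, e.g. in the service's OnStart:
    // new UploadMover(@"D:\sites\admin\uploads", @"D:\sites\www\images");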
The place where I work has 2 servers and a load balancer. The setup is horrible, since I have to manually make sure both servers have the same files. I know there are ways to automate this, but it has not been implemented yet; hopefully soon (I have no control over this). I wrote an application that collects a bunch of information from a user, then creates a folder named after the user's email on one of the servers. The problem is that I can't control which server the folder gets created on. Say a user comes in, fills in his information, and his folder gets created on server 1. The user goes away for a while and later comes back to the site, but this time the load balancer sends him to server 2. Now the user does something that needs to be saved into his folder, but since the folder wasn't created on this server, an error occurs. What can I do about this? Any suggestions?
Thanks
It sounds like you could solve a few issues by implementing a cloud file service for the file writes, such as Amazon S3 (http://aws.amazon.com/s3/); a sketch follows below.
Disk size management would no longer be a concern
Files are now written and read from S3 so load balancer concerns are solved
Benefits of a semi-edge network with AWS. (not truly edge but in my experience better than most internally hosted solutions)
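A minimal sketch of writing user files to S3 with the AWS SDK for .NET; the bucket name, region, and per-user key prefix are assumptions for illustration:

    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    public class UserFileStore
    {
        private readonly IAmazonS3 _s3 = new AmazonS3Client(RegionEndpoint.USEast1);
        private const string Bucket = "mysite-user-files";

        public void Save(string userEmail, string fileName, string localPath)
        {
            var request = new PutObjectRequest
            {
                BucketName = Bucket,
                Key = userEmail + "/" + fileName, // same key regardless of server
                FilePath = localPath
            };
            _s3.PutObjectAsync(request).Wait();
        }
    }

Because both servers write to and read from the same bucket, it no longer matters which server the load balancer picks for a given request.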
Don't store your data in the file system, store it in a database.
If you really can't avoid using the file system, you could look at storing the files in a network share both servers have access to. This would be a terrible hack, however.
It sounds like you may be having a session state issue. It sounds odd the way you describe it, but have a look at this article. It's old, but it covers the basics. If it doesn't help, try googling "asp.net session state web farm".
http://ondotnet.com/pub/a/dotnet/2003/03/24/sessionstate.html
Use NAS or SAN to centralize storage. That same network-accessible storage can hold the shared configuration that IIS can be set up to use.
Web Deploy v2 was just released by Microsoft; I would encourage the powers that be to investigate it, along with Application Request Routing and the greater Web Farm Framework.
This is a normal infrastructure setup. Below are the two commonly used solutions for the situation you are in.
If you have network attached storage available (e.g. Netapps), you can use this storage to centrally store all of your user files that need to be available across all servers in your web farm.
Redesign your application to store all user specific data in a database.
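A minimal sketch of the database option, assuming a shared UserFiles table that both servers behind the load balancer read and write (table and column names are assumptions):

    using System.Data.SqlClient;
    using System.IO;

    public static class UserFileRepository
    {
        public static void Save(string connString, string userEmail,
            string fileName, string localPath)
        {
            // Store the file content in the database instead of a per-server folder.
            byte[] content = File.ReadAllBytes(localPath);

            using (var conn = new SqlConnection(connString))
            using (var cmd = new SqlCommand(
                "INSERT INTO UserFiles (UserEmail, FileName, Content) " +
                "VALUES (@email, @name, @content)", conn))
            {
                cmd.Parameters.AddWithValue("@email", userEmail);
                cmd.Parameters.AddWithValue("@name", fileName);
                cmd.Parameters.AddWithValue("@content", content);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }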