I have the following link inside my ASP.NET MVC web application:
@Model.Name
But when I click on this link, I get the following error:
HTTP Error 404.8 - Not Found
The request filtering module is configured to deny a path in the URL that contains a hiddenSegment section.
So what is causing this problem, and how can I solve it?
Thanks
Create a Controller (e.g. "Streamer") and Action (e.g. "StreamUploadedImage") that streams the image (the Action will typically return a FileResult).
Change the url to reference your action, passing the image id as a parameter, e.g. (from memory so syntax may not be accurate):
@Html.ActionLink(Model.Name, "StreamUploadedImage", "Streamer", new { id = "38" })
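A minimal sketch of such a controller, assuming the uploads live under App_Data (one of the hidden segments IIS refuses to serve directly) and that the files are JPEGs; the folder name and content type are assumptions:
using System.Web.Mvc;

// Hypothetical controller; the names follow the answer above, the storage path is an assumption.
public class StreamerController : Controller
{
    public ActionResult StreamUploadedImage(int id)
    {
        // Linking straight to a hidden segment such as App_Data is what triggers the 404.8 error,
        // so the file is read server-side and streamed back instead.
        string path = Server.MapPath("~/App_Data/UploadedImages/" + id + ".jpg");
        return File(path, "image/jpeg");
    }
}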
An alternative would be to put the uploaded image in a location where it can be accessed from the client, e.g. in a subfolder of the Content folder:
@Model.Name
But using a controller gives you more control, e.g. to implement authorization.
The path is blocked by your IIS. To resolve this, move the files to another location ("~/Uploads/Images/", perhaps?).
The reason why IIS blocks some folders is that they can contain important data or files which the user should not have access to. To prevent attackers from getting at this information, IIS denies access to any of the files in those folders.
For more information: http://www.iis.net/configreference/system.webserver/security/requestfiltering/hiddensegments
I have a site that is about to be taken down in a month's time. What I need to do is place a robots.txt file to prevent search engines from indexing it any longer. However, after I placed the file at the root of the solution on the web server and tried to check whether I could access it by typing www.sitename.com/robots.txt, it just refreshes the screen or perhaps returns to the home page. My application is running on MVC 3.
Things I've already tried:
Added runAllManagedModulesForAllRequests="true" to the modules element in web.config
Used a dynamic robots.txt via an IHttpHandler and a Controller/Action approach (based on this link: robots.txt file for different domains of same site); the Controller/Action attempt is sketched below this list
Played around with MIME Types (removed .txt and back)
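That Controller/Action attempt looked roughly like this (a sketch from memory; the controller name "Seo" and the exact route registration are assumptions):
using System.Web.Mvc;

// In Global.asax.cs (MVC 3), inside RegisterRoutes, before the default route;
// the dot in "robots.txt" is why runAllManagedModulesForAllRequests="true" was also needed.
routes.MapRoute(
    "Robots",
    "robots.txt",
    new { controller = "Seo", action = "Robots" });

// Hypothetical controller that serves the robots.txt content as plain text.
public class SeoController : Controller
{
    public ContentResult Robots()
    {
        // Block all crawlers while the site is being retired.
        return Content("User-agent: *\nDisallow: /", "text/plain");
    }
}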
I am expecting to see the contents of the robots.txt file, just as when I access a CSS or JS file by appending the file name to the URL.
I use NancyFX.
I have published a folder with files.
I start the server and observe the following situation:
If there is a file '1.html' in the published folder, I can get it through the browser. If I delete this file from disk, I get a 404 error (this is correct). If I add the file back to disk, or change its contents, I again get it normally in the browser.
If, after starting the server, I try to access a nonexistent file '2.html' from the browser, I get a 404 error (this is correct). However, if I then create the file '2.html' on disk, I still get a 404 error. Only restarting the server helps.
I got the impression that on the first access to a requested file NancyFX builds some kind of cache, which subsequently prevents me from getting files added after they were unsuccessfully requested.
Please help me solve this problem. Thank you in advance.
Nancy uses a convention-based approach for figuring out what static content it is able to serve at runtime. You can check the documentation here.
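If the published folder falls outside Nancy's default conventions, you can register it explicitly in a custom bootstrapper; a minimal sketch, assuming the folder on disk is named "published":
using Nancy;
using Nancy.Conventions;

// Hypothetical bootstrapper; "published" stands in for whatever folder you actually publish.
public class CustomBootstrapper : DefaultNancyBootstrapper
{
    protected override void ConfigureConventions(NancyConventions conventions)
    {
        base.ConfigureConventions(conventions);

        // Serve requests under /published from the "published" folder on disk.
        conventions.StaticContentsConventions.Add(
            StaticContentConventionBuilder.AddDirectory("/published", "published"));
    }
}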
My project is an ASP.NET MVC 4 web application.
Currently it has a method to generate a text file and send it to the client's browser for download.
I need to modify it to force the browser to save the file in a custom (pre-defined) location on the client's computer.
This will not be possible, as it would introduce a severe security problem: the user has to decide where the file will be saved.
You can only specify a location on a server to which you have access.
If it's an internal site, you could set up the server to save the file to a network location and return that path to the user.
If you want to show a Save As prompt, add this to your ActionResult to indicate a file download:
Response.SetCookie(new HttpCookie("fileDownload", "true") { Path = "/" });
return myFileStreamResult;
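For context, a complete action might look roughly like this (a sketch; the file name and contents are placeholders, and it is the attachment disposition produced by File(...) that actually triggers the Save As prompt):
// Inside an MVC controller.
public ActionResult DownloadReport()
{
    // Cookie used by client-side helpers (e.g. jquery.fileDownload) to detect that the download started.
    Response.SetCookie(new HttpCookie("fileDownload", "true") { Path = "/" });

    byte[] bytes = System.Text.Encoding.UTF8.GetBytes("report contents");
    // Returning a named file sets Content-Disposition: attachment, which prompts the browser to save it.
    return File(bytes, "text/plain", "report.txt");
}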
I needed to download and sort files into a rigidly defined directory structure on the client machine with no possibility of user mistakes. Ideally it would be completely automatic. I couldn't make it fully automatic, but in Chrome on Windows I eliminated the possibility of typing mistakes with:
<a class="aDownload" href="file.txt" download="CTRL+V for suggested path/file">Download</a>
<textarea id="textareaClipboard"></textarea>
Using jQuery to listen for a click on the link, I call a function to generate the desired path and final file name, put it in the textarea, and transfer it to the clipboard using:
jQuery('#textareaClipboard').select();
document.execCommand('copy');
The Save As dialog pops up with "CTRL+V for suggested path/file" in the file name field. Follow the suggestion to paste the generated file name into this field, and hit Enter.
It requires a minimal amount of user action to ensure the file goes to the right directory with the right name, and the user can always reject the suggestion.
Your web application can only send the file to the client. It is impossible to force the browser to download and save to a specific location, because the download-and-save-to privilege belongs to the client's browser.
If the user has not defined a default download location, the browser will prompt for one ("Save to") when downloading something; if the user has already defined a default download location, the file will be downloaded automatically and saved there.
So I think you have a little misconception in your web logic. :D
I want to take files from a known location on disk and have ASP redirect to them from code-behind, allowing the browser/app/device to control how the content is displayed.
I tried using:
Server.Redirect(pathToFile);
But got the following exception: Invalid path for child request 'C:\ContentFolder\testImage.png'. A virtual path is expected.
How can I allow my site to redirect users to these files? I am storing the base directory in the web config, and the file names are stored in a database.
If the files reside outside of your website directory, you can't redirect to them. Think about the security implications if that were possible. You have a few options:
Move the files inside the website directory (or a sub-directory of it). Then you can redirect your users to them using a virtual path e.g. Server.Redirect("~/files/somefile.zip").
Set up a virtual directory in IIS that maps to the physical location of the files on disk. Then you can redirect to them using the virtual path. You can do this through the GUI or config file.
Create an HttpHandler that loads the file from disk and returns it in the response. You can use a querystring parameter to identify the file to load, e.g. /filehandler.ashx?filename=somefile.zip. A quick Google search revealed this example.
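A rough sketch of that third option, assuming the base directory is kept in an appSetting named "ContentFolder" (both that key and the handler class name are assumptions):
using System.Configuration;
using System.IO;
using System.Web;

// Hypothetical handler, requested as e.g. /filehandler.ashx?filename=somefile.zip
public class FileHandler : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        string baseDir = ConfigurationManager.AppSettings["ContentFolder"];
        // Strip any directory components so a client cannot walk out of the base folder.
        string fileName = Path.GetFileName(context.Request.QueryString["filename"] ?? string.Empty);
        string fullPath = Path.Combine(baseDir, fileName);

        if (fileName.Length == 0 || !File.Exists(fullPath))
        {
            context.Response.StatusCode = 404;
            return;
        }

        context.Response.ContentType = MimeMapping.GetMimeMapping(fileName);
        context.Response.TransmitFile(fullPath);
    }
}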
I am working on a website where images and other files are handled by a handler named resources.ashx. These files are not stored in any folder but are fetched from a database.
The problem is that access to some of the files is restricted, whereas some images and files are open to all.
Let's say the path to one of the restricted images is:
../website/resources.ashx/restrictedimage.jpg
If an unauthenticated user types in this URL, he will have access to the image straight away.
I want to restrict that.
P.S. I can't change the handler, as I am referencing it from another project.
Maybe an HttpModule can help you out. Handle the AuthenticateRequest event, then parse the requested URL and compare it against the user's identity/roles.
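A minimal sketch of such a module, assuming the restricted resources can be recognised from the URL (the "restricted" path check and the module name are placeholders; depending on module ordering you may prefer PostAuthenticateRequest):
using System;
using System.Web;

public class ResourceAuthorizationModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.AuthenticateRequest += (sender, e) =>
        {
            HttpContext context = ((HttpApplication)sender).Context;

            // Placeholder rule: anything served by resources.ashx whose path marks it as restricted.
            bool isRestricted = context.Request.RawUrl
                .IndexOf("/resources.ashx/restricted", StringComparison.OrdinalIgnoreCase) >= 0;

            if (isRestricted && (context.User == null || !context.User.Identity.IsAuthenticated))
            {
                context.Response.StatusCode = 401;
                context.ApplicationInstance.CompleteRequest();
            }
        };
    }

    public void Dispose() { }
}

The module still has to be registered under system.webServer/modules in web.config.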
You can use authorization rules in your web.config to control access to the files (i.e. URLs) of your choosing, based on user/group membership.
See:
using multiple authorization elements in web.config